Garde, Anne Helene; Hansen, Ase Marie; Kristiansen, Jesper
2004-01-01
When measuring biomarkers in urine, volume (and time) or concentration of creatinine are both accepted methods of standardization for diuresis. Both types of standardization contribute uncertainty to the final result. The aim of the present paper was to compare the uncertainty introduced when using the two types of standardization on 24 h samples from healthy individuals. Estimates of uncertainties were based on results from the literature supplemented with data from our own studies. Only the difference in uncertainty related to the two standardization methods was evaluated. It was found that the uncertainty associated with creatinine standardization (19-35%) was higher than the uncertainty related to volume standardization (up to 10%, when not correcting for deviations from 24 h) for 24 h urine samples. However, volume standardization introduced an average bias of 4% due to missed volumes … in population studies. When studying a single 24 h sample from one individual, there was a 15-20% risk that the sample was incomplete. In this case a bias of approximately 25% was introduced when using volume standardization, whereas the uncertainty related to creatinine standardization was independent … increase in convenience for the participants, when collecting small volumes rather than complete 24 h samples.
Generalized uncertainty relations
Herdegen, Andrzej; Ziobro, Piotr
2017-04-01
The standard uncertainty relations (UR) in quantum mechanics are typically used for unbounded operators (like the canonical pair). This implies the need to control domain problems. On the other hand, the use of (possibly bounded) functions of basic observables usually leads to more complex and less readily interpretable relations. In addition, UR may turn trivial for certain states if the commutator of observables is not proportional to a positive operator. In this letter we consider a generalization of standard UR resulting from the use of two, instead of one, vector states. The possibility of linking these states to each other in various ways adds flexibility to UR, which may compensate for some of the above-mentioned drawbacks. We discuss applications of the general scheme, leading not only to technical improvements, but also to interesting new insights.
Prior information: how to beat the standard joint-measurement uncertainty relation
Hall, M J W
2003-01-01
The canonical joint measurement of position X and momentum P corresponds to measuring the commuting operators X_J = X + X', P_J = P - P', where the primed variables refer to an auxiliary system in a minimum-uncertainty state. It is well known that Delta X_J Delta P_J >= hbar. Here it is shown that given the _same_ physical experimental setup, and information about the system _prior_ to measurement, one can make improved joint estimates X_est and P_est of X and P. These improved estimates are not only statistically closer to X and P: they satisfy Delta X_est Delta P_est >= hbar/4, where equality can be achieved in certain cases. Thus one can do up to four times better than the standard lower bound (where the latter corresponds to the limit of _no_ prior information). A formula is given for the optimal estimate of any observable, based on arbitrary measurement data and prior information about the state of the system, which generalises and provides a more robust interpretation of previous formulas for `local expectations' ...
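The relations quoted in this abstract can be set out explicitly (notation as above; a sketch only, with primes denoting the auxiliary system):

```latex
X_J = X + X', \qquad P_J = P - P', \qquad
[X_J, P_J] = [X, P] - [X', P'] = 0,
```

so the joint observables commute, while the two bounds compared in the abstract are

```latex
\Delta X_J \,\Delta P_J \;\ge\; \hbar
\qquad \text{versus} \qquad
\Delta X_{\mathrm{est}} \,\Delta P_{\mathrm{est}} \;\ge\; \hbar/4 .
```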
Optimal Universal Uncertainty Relations
Li, Tao; Xiao, Yunlong; Ma, Teng; Fei, Shao-Ming; Jing, Naihuan; Li-Jost, Xianqing; Wang, Zhi-Xi
2016-01-01
We study universal uncertainty relations and present a method called the joint probability distribution diagram to improve the majorization bounds constructed independently in [Phys. Rev. Lett. 111, 230401 (2013)] and [J. Phys. A. 46, 272002 (2013)]. The results give rise to state-independent uncertainty relations satisfied by any nonnegative Schur-concave functions. On the other hand, a remarkable recent result on entropic uncertainty relations is the direct-sum majorization relation. In this paper, we illustrate our bounds by showing how they provide a complement to that in [Phys. Rev. A. 89, 052115 (2014)]. PMID:27775010
Measurement uncertainty relations
Busch, Paul, E-mail: paul.busch@york.ac.uk [Department of Mathematics, University of York, York (United Kingdom); Lahti, Pekka, E-mail: pekka.lahti@utu.fi [Turku Centre for Quantum Physics, Department of Physics and Astronomy, University of Turku, FI-20014 Turku (Finland); Werner, Reinhard F., E-mail: reinhard.werner@itp.uni-hannover.de [Institut für Theoretische Physik, Leibniz Universität, Hannover (Germany)
2014-04-15
Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.
Transfer Standard Uncertainty Can Cause Inconclusive Inter-Laboratory Comparisons.
Wright, John; Toman, Blaza; Mickan, Bodo; Wübbeler, Gerd; Bodnar, Olha; Elster, Clemens
2016-12-01
Inter-laboratory comparisons use the best available transfer standards to check the participants' uncertainty analyses, identify underestimated uncertainty claims or unknown measurement biases, and improve the global measurement system. For some measurands, instability of the transfer standard can lead to an inconclusive comparison result. If the transfer standard uncertainty is large relative to a participating laboratory's uncertainty, the commonly used standardized degree of equivalence ≤ 1 criterion does not always correctly assess whether a participant is working within their uncertainty claims. We show comparison results that demonstrate this issue and propose several criteria for assessing a comparison result as passing, failing, or inconclusive. We investigate the behavior of the standardized degree of equivalence and alternative comparison measures for a range of values of the transfer standard uncertainty relative to the individual laboratory uncertainty values. The proposed alternative criteria successfully discerned between passing, failing, and inconclusive comparison results for the cases we examined.
Pauli effects in uncertainty relations
Toranzo, I V; Esquivel, R O; Dehesa, J S
2014-01-01
In this letter we analyze the effect of the spin dimensionality of a physical system in two mathematical formulations of the uncertainty principle: a generalized Heisenberg uncertainty relation valid for all antisymmetric N-fermion wavefunctions, and the Fisher-information- based uncertainty relation valid for all antisymmetric N-fermion wavefunctions of central potentials. The accuracy of these spin-modified uncertainty relations is examined for all atoms from Hydrogen to Lawrencium in a self-consistent framework.
Variance-based uncertainty relations
Huang, Yichen
2010-01-01
It is hard to overestimate the fundamental importance of uncertainty relations in quantum mechanics. In this work, we propose state-independent variance-based uncertainty relations for arbitrary observables in both finite- and infinite-dimensional spaces. We recover the Heisenberg uncertainty principle as a special case. By studying examples, we find that the lower bounds provided by our new uncertainty relations are optimal or near-optimal. We illustrate the uses of our new uncertainty relations by showing that they eliminate one common obstacle in a sequence of well-known works on entanglement detection, and thus make these works much easier to apply.
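As an illustrative check (not the new bounds of this paper), the textbook Robertson relation ΔA ΔB ≥ ½|⟨[A,B]⟩| that underlies such variance-based relations can be verified numerically on random qubit states:

```python
import numpy as np

# Pauli observables; their commutator [X, Y] = 2iZ has a state-dependent expectation
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def variance(op, psi):
    mean = np.vdot(psi, op @ psi).real
    # clip tiny negative values caused by floating-point round-off
    return max(np.vdot(psi, op @ op @ psi).real - mean**2, 0.0)

rng = np.random.default_rng(0)
for _ in range(1000):
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi /= np.linalg.norm(psi)
    lhs = np.sqrt(variance(X, psi) * variance(Y, psi))    # ΔX ΔY
    rhs = 0.5 * abs(np.vdot(psi, (X @ Y - Y @ X) @ psi))  # ½|⟨[X,Y]⟩|
    assert lhs >= rhs - 1e-9
print("Robertson bound holds on 1000 random qubit states")
```

The state-independent relations of the paper go further, bounding the variances from below by constants that hold for every state.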
Relational uncertainty in service dyads
Kreye, Melanie
2016-01-01
Purpose: Relational uncertainty determines how relationships develop because it enables the building of trust and commitment. However, relational uncertainty has not been explored in an inter-organisational setting. This paper investigates how organisations experience relational uncertainty in service dyads and how they resolve it through suitable organisational responses to increase the level of service quality. Design/methodology/approach: We apply the overall logic of Organisational Information-Processing Theory (OIPT) and present empirical insights from two industrial case studies collected via semi-structured interviews and secondary data. Findings: The findings suggest that relational uncertainty is caused by the partner’s unresolved organisational uncertainty, i.e. their lacking capabilities to deliver or receive (parts of) the service. Furthermore, we found that resolving the relational uncertainty increased the functional quality while resolving the partner’s organisational uncertainty increased the technical quality of the delivered service. Originality: We make two contributions. First, we introduce relational uncertainty to the OM literature as the inability to predict and explain the actions of a partnering organisation due to a lack of knowledge about their abilities and intentions. Second, we present suitable organisational responses to relational uncertainty and their effect on service quality.
Uncertainty relation in Schwarzschild spacetime
Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng
2015-04-01
We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer, which could therefore be witnessed experimentally through a proper uncertainty game. We first investigate an uncertainty game between a freely falling observer and his static partner, who holds a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which depends on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., time-energy uncertainty. In an alternative game between two static players, we show that the quantum information of a qubit can be transferred to the quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that Hawking decoherence cannot counteract entanglement generation after the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit -log2 c. Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion of the black hole firewall paradox in the context of the entropic uncertainty relation is given.
Thermodynamic and relativistic uncertainty relations
Artamonov, A. A.; Plotnikov, E. M.
2017-01-01
The thermodynamic uncertainty relation (UR) was verified experimentally. The experiments have shown the validity of the quantum analogue of the zeroth law of stochastic thermodynamics in the form of the saturated Schrödinger UR. We have also proposed a new type of UR for relativistic mechanics. These relations allow us to consider macroscopic phenomena within the limits of the ratio of the uncertainty relations for different physical quantities.
Standardization of Uncertainty Situations in Training Modules
Sergey A. Safontsev
2015-01-01
The aim of this study is the description of the modular structure of an academic discipline in accordance with the requirements of the Federal State Educational Standards. Methods. The authors use methods of standardization of the educational system that are based on educational quality measurement theory. As the process of learning does not depend on the perspective of the diagnostician, objectification of the results has been achieved by using relative units, allowing the authors to compare the effectiveness of different stages of education quality assessment with quantitative methods. Furthermore, a sampling method together with correlation and comparative analysis of the statistical significance of the obtained distributions has been used to exclude from the analytical data those results that were not confirmed experimentally. Results. Statistical methods are presented as a complex, allowing the authors to obtain experimental results at the level of statistical significance expected of psychological and pedagogical research. Following the ideas of competence-based education, the general theory of systems, and educational qualimetry, it is shown that the constructs of vocational training are the problem, test and detailed designs included in the structure of educational modules. Funds of assessment tools have been used to measure the level of trainees’ competences: problematic situations of uncertainty orientation for current control; situations of test and project orientation for boundary control at the end of each module; intermediate control in the form of an exam held at the end of the semester; and state certification control at the end of study at the university. Validity is considered in terms of the degree of students’ interest in the learning process, reliability in terms of the coincidence of students’ constructive reflection on their own achievements with independent project performance, and efficiency as the ratio of the result obtained to the costs of implementing the target function of the educational…
Uncertainty relation for mutual information
Schneeloch, James; Broadbent, Curtis J.; Howell, John C.
2014-12-01
We postulate the existence of a universal uncertainty relation between the quantum and classical mutual informations between pairs of quantum systems. Specifically, we propose that the sum of the classical mutual information, determined by two mutually unbiased pairs of observables, never exceeds the quantum mutual information. We call this the complementary-quantum correlation (CQC) relation and prove its validity for pure states, for states with one maximally mixed subsystem, and for all states when one measurement is minimally disturbing. We provide results of a Monte Carlo simulation suggesting that the CQC relation is generally valid. Importantly, we also show that the CQC relation represents an improvement to an entropic uncertainty principle in the presence of a quantum memory, and that it can be used to verify an achievable secret key rate in the quantum one-time pad cryptographic protocol.
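Schematically, the CQC relation described here can be written as follows (the symbols are our own shorthand for the abstract's quantities, with Q and R two mutually unbiased observables measured on subsystems A and B):

```latex
I(Q_A : Q_B) + I(R_A : R_B) \;\le\; I(A : B),
```

where the left-hand side sums the classical mutual informations of the two measurement records and the right-hand side is the quantum mutual information of the joint state.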
Uncertainty Relation from Holography Principle
Chen, Jia-Zhong; Jia, Duoje
2004-01-01
We propose that the information and entropy of an isolated system are two sides of one coin, in the sense that they can convert into each other through measurement and evolution of the system while their sum is identically conserved. The holographic principle is reformulated such that this conserved sum is bounded by a quarter of the area A of the system boundary. The uncertainty relation is then derived from the holographic principle.
Strong majorization entropic uncertainty relations
Rudnicki, Lukasz [Freiburg Institute for Advanced Studies, Albert-Ludwigs University of Freiburg, Albertstrasse 19, 79104 Freiburg (Germany); Center for Theoretical Physics, Polish Academy of Sciences, Aleja Lotnikow 32/46, PL-02-668 Warsaw (Poland); Puchala, Zbigniew [Institute of Theoretical and Applied Informatics, Polish Academy of Sciences, Baltycka 5, 44-100 Gliwice (Poland); Institute of Physics, Jagiellonian University, ul Reymonta 4, 30-059 Krakow (Poland); Zyczkowski, Karol [Center for Theoretical Physics, Polish Academy of Sciences, Aleja Lotnikow 32/46, PL-02-668 Warsaw (Poland); Institute of Physics, Jagiellonian University, ul Reymonta 4, 30-059 Krakow (Poland)
2014-07-01
We present new entropic uncertainty relations in a finite-dimensional Hilbert space. Using the majorization technique we derive several explicit lower bounds for the sum of two Rényi entropies of the same order. The obtained bounds are expressed in terms of the largest singular values of given unitary matrices. Numerical simulations with random unitary matrices show that our bound is almost always stronger than the well-known result of Maassen and Uffink.
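For context, the Maassen-Uffink bound mentioned above, H(A) + H(B) ≥ -2 log c with c the largest overlap between the two measurement bases, can be checked numerically. This is an illustrative sketch using the computational and Fourier bases; it does not implement the paper's majorization bounds:

```python
import numpy as np

d = 5
# Discrete Fourier basis as columns; every overlap with the computational basis is 1/sqrt(d)
F = np.exp(2j * np.pi * np.outer(np.arange(d), np.arange(d)) / d) / np.sqrt(d)

def shannon(p):
    p = p[p > 1e-15]          # drop zero probabilities (0 log 0 = 0)
    return -np.sum(p * np.log(p))

c = np.abs(F).max()           # largest overlap between the two bases
bound = -2 * np.log(c)        # Maassen-Uffink lower bound (= log d here)

rng = np.random.default_rng(1)
for _ in range(500):
    psi = rng.normal(size=d) + 1j * rng.normal(size=d)
    psi /= np.linalg.norm(psi)
    H1 = shannon(np.abs(psi) ** 2)               # entropy in the computational basis
    H2 = shannon(np.abs(F.conj().T @ psi) ** 2)  # entropy in the Fourier basis
    assert H1 + H2 >= bound - 1e-9
print("Maassen-Uffink bound holds on 500 random states")
```

The majorization bounds of the paper replace this single-overlap constant with quantities built from the largest singular values of submatrices of the basis-change unitary.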
Uncertainty Relations and Possible Experience
Gregg Jaeger
2016-06-01
The uncertainty principle can be understood as a condition of joint indeterminacy of classes of properties in quantum theory. The mathematical expressions most closely associated with this principle have been the uncertainty relations, various inequalities exemplified by the well-known expression regarding position and momentum introduced by Heisenberg. Here, recent work involving a new sort of “logical” indeterminacy principle and associated relations introduced by Pitowsky, expressible directly in terms of probabilities of outcomes of measurements of sharp quantum observables, is reviewed and its quantum nature is discussed. These novel relations are derivable from Boolean “conditions of possible experience” of the quantum realm and have been considered both as fundamentally logical and as fundamentally geometrical. This work focuses on the relationship of indeterminacy to propositions regarding the values of discrete, sharp observables of quantum systems. Here, reasons for favoring each of these two positions are considered. Finally, with an eye toward future research related to indeterminacy relations, further novel approaches grounded in category theory, intended to capture and reconceptualize the complementarity characteristics of quantum propositions, are discussed in relation to the former.
Uncertainty Relation and Inseparability Criterion
Goswami, Ashutosh K.; Panigrahi, Prasanta K.
2016-11-01
We investigate the Peres-Horodecki positive partial transpose criterion in the context of conserved quantities and derive a condition of inseparability for a composite bipartite system depending only on the dimensions of its subsystems, which leads to a bi-linear entanglement witness for the two qubit system. A separability inequality using generalized Schrodinger-Robertson uncertainty relation taking suitable operators, has been derived, which proves to be stronger than the bi-linear entanglement witness operator. In the case of mixed density matrices, it identically distinguishes the separable and non separable Werner states.
Aspects of universally valid Heisenberg uncertainty relation
Fujikawa, Kazuo
2012-01-01
A numerical illustration of a universally valid Heisenberg uncertainty relation, which was proposed recently, is presented using the experimental data on spin measurements by J. Erhart et al. [Nature Phys. 8, 185 (2012)]. This uncertainty relation is closely related to a modified form of the Arthurs-Kelly uncertainty relation, which is also tested by the spin measurements. The universally valid Heisenberg uncertainty relation always holds, but both the modified Arthurs-Kelly uncertainty relation and Heisenberg's error-disturbance relation proposed by Ozawa, which was analyzed in the original experiment, fail in the present context of spin measurements; the cause of their failure is identified with the assumptions of unbiased measurement and disturbance. It is also shown that all the universally valid uncertainty relations are derived from Robertson's relation, and thus the essence of the uncertainty relation is exhausted by Robertson's relation, as is widely accepted.
Uncertainty Relations for Analog Signals
Eldar, Yonina C
2008-01-01
In the past several years there has been a surge of research investigating various aspects of sparse representations and compressed sensing. Most of this work has focused on the finite-dimensional setting in which the goal is to decompose a finite-length vector into a given finite dictionary. Underlying many of these results is the conceptual notion of an uncertainty principle: a signal cannot be sparsely represented in two different bases. Here, we extend these ideas and results to the analog, infinite-dimensional setting by considering signals that lie in a finitely-generated shift-invariant (SI) space. This class of signals is rich enough to include many interesting special cases such as multiband signals and splines. By adapting the notion of coherence defined for finite dictionaries to infinite SI representations, we develop an uncertainty principle similar in spirit to its finite counterpart. We demonstrate tightness of our bound by considering a bandlimited low-pass comb that achieves the uncertainty p...
Uncertainty relations for general unitary operators
Bagchi, Shrobona; Pati, Arun Kumar
2016-10-01
We derive several uncertainty relations for two arbitrary unitary operators acting on physical states of a Hilbert space. We show that our bounds are tighter in various cases than the ones existing in the current literature. Using the uncertainty relation for the unitary operators, we obtain the tight state-independent lower bound for the uncertainty of two Pauli observables and anticommuting observables in higher dimensions. With regard to the minimum-uncertainty states, we derive the minimum-uncertainty state equation by the analytic method and relate this to the ground-state problem of the Harper Hamiltonian. Furthermore, the higher-dimensional limit of the uncertainty relations and minimum-uncertainty states are explored. From an operational point of view, we show that the uncertainty in the unitary operator is directly related to the visibility of quantum interference in an interferometer where one arm of the interferometer is affected by a unitary operator. This shows a principle of preparation uncertainty, i.e., for any quantum system, the amount of visibility for two general noncommuting unitary operators is nontrivially upper bounded.
Nonclassicality in phase-number uncertainty relations
Matia-Hernando, Paloma; Luis, Alfredo [Departamento de Optica, Facultad de Ciencias Fisicas, Universidad Complutense, 28040 Madrid (Spain)
2011-12-15
We show that there are nonclassical states with lesser joint fluctuations of phase and number than any classical state. This is rather paradoxical since one would expect classical coherent states to be always of minimum uncertainty. The same result is obtained when we replace phase by a phase-dependent field quadrature. Number and phase uncertainties are assessed using variance and Holevo relation.
Accounting for the Uncertainty in Performance Standards.
deGruijter, Dato N. M.
The setting of standards involves subjective value judgments. The inherent arbitrariness of specific standards has been severely criticized by Glass. His antagonists agree that standard setting is a judgmental task but they have pointed out that arbitrariness in the positive sense of serious judgmental decisions is unavoidable. Further, small…
Uncertainty Relations in Terms of Fisher Information
LUO Shun-Long
2001-01-01
By virtue of the well-known concept of Fisher information in the theory of statistical inference, we obtain an inequality chain which generalizes and refines the conventional Heisenberg uncertainty relations.
Positive phase space distributions and uncertainty relations
Kruger, Jan
1993-01-01
In contrast to a widespread belief, Wigner's theorem allows the construction of true joint probabilities in phase space for distributions describing the object system as well as for distributions depending on the measurement apparatus. The fundamental role of Heisenberg's uncertainty relations in Schrödinger form (including correlations) is pointed out for these two possible interpretations of joint probability distributions. Hence, in order that a multivariate normal probability distribution in phase space may correspond to a Wigner distribution of a pure or a mixed state, it is necessary and sufficient that Heisenberg's uncertainty relation in Schrödinger form should be satisfied.
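The Schrödinger form referred to here is the standard correlation-strengthened inequality; for observables A and B it reads

```latex
(\Delta A)^2 (\Delta B)^2 \;\ge\;
\left( \tfrac{1}{2}\langle \{A, B\} \rangle - \langle A\rangle\langle B\rangle \right)^2
+ \left( \tfrac{1}{2i}\langle [A, B] \rangle \right)^2 ,
```

which for position and momentum becomes \(\sigma_x^2 \sigma_p^2 - \sigma_{xp}^2 \ge \hbar^2/4\), the covariance term being exactly the "including correlations" part.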
Detecting multimode entanglement by symplectic uncertainty relations
Serafini, A
2005-01-01
Quantities invariant under symplectic (i.e. linear and canonical) transformations are constructed as functions of the second moments of N pairs of bosonic field operators. A general multimode uncertainty relation is derived as a necessary constraint on such symplectic invariants. In turn, necessary conditions for the separability of multimode continuous variable states under (MxN)-mode bipartitions are derived from the uncertainty relation. These conditions are proven to be necessary and sufficient for (1+N)-mode Gaussian states and for (M+N)-mode bisymmetric Gaussian states.
Exact Discrete Analogs of Canonical Commutation and Uncertainty Relations
Vasily E. Tarasov
2016-06-01
An exact discretization of the canonical commutation and corresponding uncertainty relations is suggested. We prove that the canonical commutation relations of discrete quantum mechanics based on the standard finite difference hold for constant wave functions only. In this paper, we use the recently proposed exact discretization of derivatives, which is based on differences represented by infinite series. This new mathematical tool allows us to build a sensible discrete quantum mechanics based on the suggested differences that includes the correct canonical commutation and uncertainty relations.
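The failure of the standard finite difference claimed in this abstract is easy to demonstrate. In the sketch below (our own conventions: unit lattice spacing, ħ omitted), the discrete commutator [X, D] acts as -f(n+1) on interior sites, so the continuum relation [x, d/dx] f = -f survives discretization only for constant wave functions:

```python
import numpy as np

def D(f):
    # standard forward finite difference, unit step (the last point wraps; we ignore it)
    return np.roll(f, -1) - f

def X(f):
    # discrete "position" operator: multiply by the lattice site index
    return np.arange(len(f)) * f

def commutator(f):
    # [X, D] f ; the continuum analog is [x, d/dx] f = -f
    return X(D(f)) - D(X(f))

f_const = np.ones(8)                    # constant wave function
f_lin = np.arange(8, dtype=float) + 1   # non-constant wave function

comm_const = commutator(f_const)
comm_lin = commutator(f_lin)

# On interior sites [X, D] f (n) = -f(n+1), so the canonical relation
# [X, D] f = -f holds exactly iff f(n+1) = f(n), i.e. for constant f.
print(np.allclose(comm_const[:-1], -f_const[:-1]))  # True
print(np.allclose(comm_lin[:-1], -f_lin[:-1]))      # False
```

The paper's exact discretization replaces D with a nonlocal difference (an infinite series over lattice sites) chosen so that the commutation relation holds for all wave functions.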
Uncertainty relations and approximate quantum error correction
Renes, Joseph M.
2016-09-01
The uncertainty principle can be understood as constraining the probability of winning a game in which Alice measures one of two conjugate observables, such as position or momentum, on a system provided by Bob, and he is to guess the outcome. Two variants are possible: either Alice tells Bob which observable she measured, or he has to furnish guesses for both cases. Here I derive uncertainty relations for both, formulated directly in terms of Bob's guessing probabilities. For the former these relate to the entanglement that can be recovered by action on Bob's system alone. This gives an explicit quantum circuit for approximate quantum error correction using the guessing measurements for "amplitude" and "phase" information, implicitly used in the recent construction of efficient quantum polar codes. I also find a relation on the guessing probabilities for the latter game, which has application to wave-particle duality relations.
Uncertainty relations based on skew information with quantum memory
Ma, ZhiHao; Chen, ZhiHua; Fei, Shao-Ming
2017-01-01
We present a new uncertainty relation by defining a measure of uncertainty based on skew information. For bipartite systems, we establish uncertainty relations with the existence of a quantum memory. A general relation between quantum correlations and tight bounds of uncertainty has been presented.
Approaches to handling uncertainty when setting environmental exposure standards
Budtz-Jørgensen, Esben; Keiding, Niels; Grandjean, Philippe
2009-01-01
Mathematical modelling has become in recent years an essential tool for the prediction of environmental change and for the development of sustainable policies. Yet, many of the uncertainties associated with modelling efforts appear poorly understood by many, especially by policy makers. This book...... attempts for the first time to cover the full range of issues related to model uncertainties, from the subjectivity of setting up a conceptual model of a given system, all the way to communicating the nature of model uncertainties to non-scientists and accounting for model uncertainties in policy decisions....... Theoretical chapters, providing background information on specific steps in the modelling process and in the adoption of models by end-users, are complemented by illustrative case studies dealing with soils and global climate change. All the chapters are authored by recognized experts in their respective...
Elcio Cruz de Oliveira
2010-01-01
Traditionally, in the cigarette industry, the determination of ammonium ion in mainstream smoke is performed by ion chromatography. This work studies this determination and compares the results of the technique using external and internal standard calibration. A reference cigarette sample presented a measurement uncertainty of 2.0 μg/cigarette and 1.5 μg/cigarette with external and internal standards, respectively. It is observed that the greatest source of uncertainty is the bias correction factor, and that it is even more significant when using an external standard, thus confirming the importance of internal standardization for this correction.
Error-disturbance uncertainty relations studied in neutron optics
Sponar, Stephan; Sulyok, Georg; Demirel, Bulent; Hasegawa, Yuji
2016-09-01
Heisenberg's uncertainty principle is probably the most famous statement of quantum physics, and its essential aspects are well described by formulations in terms of standard deviations. However, a naive Heisenberg-type error-disturbance relation is not valid. An alternative universally valid relation was derived by Ozawa in 2003. Though universally valid, Ozawa's relation is not optimal. Recently, Branciard derived a tight error-disturbance uncertainty relation (EDUR) describing the optimal trade-off between error and disturbance. Here, we report a neutron-optical experiment that records the error of a spin-component measurement, as well as the disturbance caused on another spin component, to test EDURs. We demonstrate that Heisenberg's original EDUR is violated, while Ozawa's and Branciard's EDURs are valid in a wide range of experimental parameters, applying a new measurement procedure referred to as the two-state method.
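For reference, the naive Heisenberg-type relation tested here and Ozawa's universally valid alternative read as follows (ε the measurement error, η the disturbance, σ the standard deviation in the probed state):

```latex
\varepsilon(A)\,\eta(B) \;\ge\; \tfrac{1}{2}\,\bigl|\langle [A, B] \rangle\bigr|
\quad \text{(naive Heisenberg form, violated)},
```

```latex
\varepsilon(A)\,\eta(B) + \varepsilon(A)\,\sigma(B) + \sigma(A)\,\eta(B)
\;\ge\; \tfrac{1}{2}\,\bigl|\langle [A, B] \rangle\bigr|
\quad \text{(Ozawa 2003)}.
```

Branciard's tight EDUR sharpens the second inequality to the optimal trade-off curve between ε and η.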
Uncertainty relation based on unbiased parameter estimations
Sun, Liang-Liang; Song, Yong-Shun; Qiao, Cong-Feng; Yu, Sixia; Chen, Zeng-Bing
2017-02-01
Heisenberg's uncertainty relation has been extensively studied in the spirit of its well-known original form, in which the inaccuracy measures used exhibit some controversial properties and do not conform with quantum metrology, where measurement precision is well defined in terms of estimation theory. In this paper, we treat the joint measurement of incompatible observables as a parameter estimation problem, i.e., estimating the parameters characterizing the statistics of the incompatible observables. Our crucial observation is that, in a sequential measurement scenario, the bias induced by the first unbiased measurement in the subsequent measurement can be eradicated using the information acquired, allowing one to extract unbiased information about the second measurement of an incompatible observable. In terms of Fisher information, we propose a kind of information comparison measure and explore various types of trade-offs between information gains and measurement precisions, which interpret the uncertainty relation as a surplus-variance trade-off over individual perfect measurements rather than a constraint on extracting complete information about incompatible observables.
Entanglement and discord assisted entropic uncertainty relations under decoherence
Yao, ChunMei; Chen, ZhiHua; Ma, ZhiHao; Severini, Simone; Serafini, Alessio
2014-09-01
The uncertainty principle is a crucial aspect of quantum mechanics. It has been shown that quantum entanglement, as well as more general notions of correlations such as quantum discord, can relax or tighten the entropic uncertainty relation in the presence of an ancillary system. We explore the behaviour of entropic uncertainty relations for a system of two qubits, one of which is subject to several forms of independent quantum noise, in both Markovian and non-Markovian regimes. The uncertainties and their lower bounds, identified by the entropic uncertainty relations, increase under independent local unital Markovian noisy channels, but they may decrease under non-unital channels. The uncertainties (and lower bounds) exhibit periodic oscillations due to correlation dynamics under independent non-Markovian reservoirs. In addition, we compare different entropic uncertainty relations in several special cases and find that discord-tightened entropic uncertainty relations offer, in general, a better estimate of the uncertainties in play.
Uncertainty Relation between Angular Momentum and Angle Variable.
Roy, C. L.; Sannigrahi, A. B.
1979-01-01
Discusses certain pitfalls regarding the uncertainty relation between angular momentum and the angle variable from a pedagogic point of view. Further, an uncertainty relation is derived for these variables in a simple and consistent manner. (Author/HM)
Role of information theoretic uncertainty relations in quantum theory
Jizba, Petr, E-mail: p.jizba@fjfi.cvut.cz [FNSPE, Czech Technical University in Prague, Břehová 7, 115 19 Praha 1 (Czech Republic); ITP, Freie Universität Berlin, Arnimallee 14, D-14195 Berlin (Germany); Dunningham, Jacob A., E-mail: J.Dunningham@sussex.ac.uk [Department of Physics and Astronomy, University of Sussex, Falmer, Brighton, BN1 9QH (United Kingdom); Joo, Jaewoo, E-mail: j.joo@surrey.ac.uk [Advanced Technology Institute and Department of Physics, University of Surrey, Guildford, GU2 7XH (United Kingdom)
2015-04-15
Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.
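The Shannon-entropy bound mentioned above (and its Rényi generalization) is easy to check numerically. The sketch below is illustrative only and not taken from the paper: it uses a qubit measured in the Pauli Z and X bases, for which the Maassen-Uffink bound H(Z) + H(X) ≥ 1 bit holds, together with a Rényi-entropy variant with conjugate orders 1/α + 1/β = 2; the state and angle are arbitrary.

```python
import numpy as np

def shannon(p):
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))

def renyi(p, alpha):
    # Rényi entropy H_alpha(p) = log2(sum_i p_i^alpha) / (1 - alpha)
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

# Qubit state |psi> = cos(t)|0> + sin(t)|1> (arbitrary illustrative angle)
t = 0.3
psi = np.array([np.cos(t), np.sin(t)])

# Outcome probabilities in the Pauli-Z and Pauli-X eigenbases
z_basis = np.eye(2)
x_basis = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
pz = np.abs(z_basis @ psi) ** 2
px = np.abs(x_basis @ psi) ** 2

# Maassen-Uffink: H(Z) + H(X) >= -2*log2(c), with basis overlap c = 1/sqrt(2)
bound = 1.0  # -2*log2(1/sqrt(2)) = 1 bit
assert shannon(pz) + shannon(px) >= bound - 1e-9

# Rényi generalization with conjugate orders (alpha, beta) = (2, 2/3)
assert renyi(pz, 2.0) + renyi(px, 2.0 / 3.0) >= bound - 1e-9
```

For t = 0.3 the Shannon entropies sum to about 1.18 bits, comfortably above the 1-bit bound, and the Rényi pair stays above it as well.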
Spin and localization of relativistic fermions and uncertainty relations
Céleri, Lucas C.; Kiosses, Vasilis; Terno, Daniel R.
2016-12-01
We discuss relations between several relativistic spin observables and derive a Lorentz-invariant characteristic of a reduced spin density matrix. A relativistic position operator that satisfies all the properties of its nonrelativistic analog does not exist. Instead we propose two causality-preserving positive operator-valued measures (POVMs) that are based on projections onto one-particle and antiparticle spaces, and on the normalized energy density. They predict identical expectation values for position. The variances differ by less than a quarter of the squared de Broglie wavelength and coincide in the nonrelativistic limit. Since the resulting statistical moment operators are not canonical conjugates of momentum, the Heisenberg uncertainty relations need not hold. Indeed, the energy density POVM leads to a lower uncertainty. We reformulate the standard equations of the spin dynamics by explicitly considering the charge-independent acceleration, allowing a consistent treatment of backreaction and inclusion of a weak gravitational field.
Conventionalism and Methodological Standards in Contending with Skepticism about Uncertainty
Brumble, K. C.
2012-12-01
What it means to measure and interpret confidence and uncertainty in a result is often particular to a specific scientific community and its methodology of verification. Additionally, methodology in the sciences varies greatly across disciplines and scientific communities. Understanding the accuracy of predictions of a particular science thus depends largely upon having an intimate working knowledge of the methods, standards, and conventions utilized and underpinning discoveries in that scientific field. Thus, valid criticism of scientific predictions and discoveries must be conducted by those who are literate in the field in question: they must have intimate working knowledge of the methods of the particular community and of the particular research under question. The interpretation and acceptance of uncertainty is one such shared, community-based convention. In the philosophy of science, this methodological and community-based way of understanding scientific work is referred to as conventionalism. By applying the conventionalism of historian and philosopher of science Thomas Kuhn to recent attacks upon methods of multi-proxy mean temperature reconstructions, I hope to illuminate how climate skeptics and their adherents fail to appreciate the need for community-based fluency in the methodological standards for understanding uncertainty shared by the wider climate science community. Further, I will flesh out a picture of climate science community standards of evidence and statistical argument following the work of philosopher of science Helen Longino. I will describe how failure to appreciate the conventions of professionalism and standards of evidence accepted in the climate science community results in the application of naïve falsification criteria. Appeal to naïve falsification in turn has allowed scientists outside the standards and conventions of the mainstream climate science community to consider themselves and to be judged by climate skeptics as valid
Fine-grained uncertainty relation under the relativistic motion
Feng, Jun; Gould, Mark D; Fan, Heng
2014-01-01
One of the most important features of quantum theory is the uncertainty principle. Among the various uncertainty relations, the fine-grained uncertainty relation (FGUR) is used to distinguish the uncertainty inherent in obtaining any combination of outcomes for different measurements. In this paper, we explore this uncertainty relation in the relativistic regime. For an observer undergoing uniform acceleration, immersed in an Unruh thermal bath, we show that the uncertainty bound depends on the acceleration parameter and the choice of Unruh modes. Dramatically, we find that measurements in mutually unbiased bases (MUBs), which share the same uncertainty bound in an inertial frame, can be distinguished from each other for a noninertial observer. On the other hand, once Unruh decoherence is prevented by utilizing a cavity, entanglement can be generated from nonuniform motion. We show that, for an observer restricted to a single rigid cavity, the uncertainty exhibits a periodic evolution with respec...
Some applications of uncertainty relations in quantum information
Majumdar, A. S.; Pramanik, T.
2016-08-01
We discuss some applications of various versions of uncertainty relations for both discrete and continuous variables in the context of quantum information theory. The Heisenberg uncertainty relation enables demonstration of the Einstein, Podolsky and Rosen (EPR) paradox. Entropic uncertainty relations (EURs) are used to reveal quantum steering for non-Gaussian continuous variable states. EURs for discrete variables are studied in the context of quantum memory where fine-graining yields the optimum lower bound of uncertainty. The fine-grained uncertainty relation is used to obtain connections between uncertainty and the nonlocality of retrieval games for bipartite and tripartite systems. The Robertson-Schrödinger (RS) uncertainty relation is applied for distinguishing pure and mixed states of discrete variables.
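The Robertson-Schrödinger bound used above for distinguishing pure and mixed states, Var(A)·Var(B) ≥ |⟨[A,B]⟩/2|² + |⟨{A,B}⟩/2 − ⟨A⟩⟨B⟩|², can be verified numerically. The following sketch (purely illustrative, not the paper's construction) checks it for the Pauli observables σx and σy on a randomly generated mixed qubit state:

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])

def expect(op, rho):
    # Expectation value Tr(op * rho); real for Hermitian observables
    return np.trace(op @ rho).real

def rs_terms(A, B, rho):
    varA = expect(A @ A, rho) - expect(A, rho) ** 2
    varB = expect(B @ B, rho) - expect(B, rho) ** 2
    comm = np.trace((A @ B - B @ A) @ rho) / 2      # <[A,B]>/2 (imaginary)
    anti = expect(A @ B + B @ A, rho) / 2 - expect(A, rho) * expect(B, rho)
    return varA * varB, abs(comm) ** 2 + anti ** 2  # LHS and RS lower bound

# Random mixed state: rho = M M† / Tr(M M†) is Hermitian, positive, unit trace
rng = np.random.default_rng(0)
M = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
rho = M @ M.conj().T
rho /= np.trace(rho).real

lhs, rhs = rs_terms(sx, sy, rho)
assert lhs >= rhs - 1e-10  # Robertson-Schrödinger inequality holds
```

For a pure state the bound tends to be tight (or nearly so), while for highly mixed states the gap between the two sides grows, which is the intuition behind using the RS relation to distinguish pure from mixed states.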
EDITORIAL: Squeezed states and uncertainty relations
Jauregue-Renaud, Rocio; Kim, Young S.; Man'ko, Margarita A.; Moya-Cessa, Hector
2004-06-01
This special issue of Journal of Optics B: Quantum and Semiclassical Optics is composed mainly of extended versions of talks and papers presented at the Eighth International Conference on Squeezed States and Uncertainty Relations held in Puebla, Mexico on 9-13 June 2003. The Conference was hosted by Instituto de Astrofísica, Óptica y Electrónica, and the Universidad Nacional Autónoma de México. This series of meetings began at the University of Maryland, College Park, USA, in March 1991. The second and third workshops were organized by the Lebedev Physical Institute in Moscow, Russia, in 1992 and by the University of Maryland Baltimore County, USA, in 1993, respectively. Afterwards, it was decided that the workshop series should be held every two years. Thus the fourth meeting took place at the University of Shanxi in China and was supported by the International Union of Pure and Applied Physics (IUPAP). The next three meetings in 1997, 1999 and 2001 were held in Lake Balatonfüred, Hungary, in Naples, Italy, and in Boston, USA, respectively. All of them were sponsored by IUPAP. The ninth workshop will take place in Besançon, France, in 2005. The conference has now become one of the major international meetings on quantum optics and the foundations of quantum mechanics, where most of the active research groups throughout the world present their new results. Accordingly this conference has been able to align itself to the current trend in quantum optics and quantum mechanics. The Puebla meeting covered most extensively the following areas: quantum measurements, quantum computing and information theory, trapped atoms and degenerate gases, and the generation and characterization of quantum states of light. The meeting also covered squeeze-like transformations in areas other than quantum optics, such as atomic physics, nuclear physics, statistical physics and relativity, as well as optical devices. There were many new participants at this meeting, particularly
The Asymptotic Standard Errors of Some Estimates of Uncertainty in the Two-Way Contingency Table
Brown, Morton B.
1975-01-01
Estimates of conditional uncertainty, contingent uncertainty, and normed modifications of contingent uncertainty have been proposed for the two-way contingency table. The asymptotic standard errors of the estimates are derived. (Author)
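The point estimates themselves (though not the asymptotic standard errors the paper derives) can be computed directly from a table of counts. In the sketch below the counts are invented for illustration; conditional uncertainty corresponds to the conditional entropy H(Y|X), contingent uncertainty to the mutual information I(X;Y), and the normed version to I(X;Y)/H(Y) (often called the uncertainty coefficient):

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical 2x3 contingency table of counts (illustrative values only)
N = np.array([[30.0, 10.0, 5.0],
              [10.0, 20.0, 25.0]])
p = N / N.sum()                     # joint cell probabilities
px, py = p.sum(axis=1), p.sum(axis=0)

Hx, Hy, Hxy = entropy(px), entropy(py), entropy(p.ravel())
cond = Hxy - Hx                     # conditional uncertainty H(Y|X)
contingent = Hx + Hy - Hxy          # contingent uncertainty, I(X;Y)
normed = contingent / Hy            # normed contingent uncertainty

assert cond >= 0 and contingent >= 0
assert normed <= 1 + 1e-12          # I(X;Y) can never exceed H(Y)
```

The asymptotic standard errors would then be obtained by the delta method applied to these functions of the cell proportions, which is the paper's contribution.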
New entropic uncertainty relations for prime power dimensions
Funder, Jakob Løvstad
2011-01-01
We consider the question of entropic uncertainty relations for prime power dimensions. In order to improve upon such uncertainty relations for higher-dimensional quantum systems, we derive a tight lower bound on the entropy of multiple probability distributions under the constraint that the sum...
Uncertainty relations, zero point energy and the linear canonical group
Sudarshan, E. C. G.
1993-01-01
The close relationship between the zero point energy, the uncertainty relations, coherent states, squeezed states, and correlated states for one mode is investigated. This group-theoretic perspective enables the parametrization and identification of their multimode generalization. In particular the generalized Schroedinger-Robertson uncertainty relations are analyzed. An elementary method of determining the canonical structure of the generalized correlated states is presented.
Relating confidence to measured information uncertainty in qualitative reasoning
Chavez, Gregory M [Los Alamos National Laboratory; Zerkle, David K [Los Alamos National Laboratory; Key, Brian P [Los Alamos National Laboratory; Shevitz, Daniel W [Los Alamos National Laboratory
2010-10-07
Qualitative reasoning makes use of qualitative assessments provided by subject matter experts to model factors such as security risk. Confidence in a result is important and useful when comparing competing results. Quantifying the confidence in an evidential reasoning result must be consistent and based on the available information. A novel method is proposed to relate confidence to the available information uncertainty in the result using fuzzy sets. Information uncertainty can be quantified through measures of non-specificity and conflict. Fuzzy values for confidence are established from information uncertainty values that lie between the measured minimum and maximum information uncertainty values.
Improved entropic uncertainty relations and information exclusion relations
Coles, Patrick J.; Piani, Marco
2013-01-01
The uncertainty principle can be expressed in entropic terms, also taking into account the role of entanglement in reducing uncertainty. The information exclusion principle bounds instead the correlations that can exist between the outcomes of incompatible measurements on one physical system, and a second reference system. We provide a more stringent formulation of both the uncertainty principle and the information exclusion principle, with direct applications for, e.g., the security analysis...
Fractional revivals through Rényi uncertainty relations
Romera, E.; de Los Santos, F.
2008-07-01
We show that the Rényi uncertainty relations give a good description of the dynamical behavior of wave packets and constitute a sound approach to revival phenomena by analyzing three model systems: the simple harmonic oscillator, the infinite square well, and the quantum bouncer. We prove the usefulness of entropic uncertainty relations as a tool for identifying fractional revivals by providing a comparison in different contexts with the usual Heisenberg uncertainty relation and with the common approach in terms of the autocorrelation function.
Multimode uncertainty relations and separability of continuous variable states
Serafini, A
2006-01-01
A multimode uncertainty relation (generalising the Robertson-Schroedinger relation) is derived as a necessary constraint on the second moments of n pairs of canonical operators. In turn, necessary conditions for the separability of multimode continuous variable states under (m+n)-mode bipartitions are derived from the uncertainty relation. These conditions are proven to be necessary and sufficient for (1+n)-mode Gaussian states and for (m+n)-mode bisymmetric Gaussian states.
Do the Uncertainty Relations Really have Crucial Significances for Physics?
Dumitru S.
2010-10-01
The falsity of the idea that the Uncertainty Relations (UR) have crucial significance for physics is proved. Additionally, one argues for the necessity of a UR-disconnected quantum philosophy.
Optimal uncertainty relations in a modified Heisenberg algebra
Abdelkhalek, Kais; Fiedler, Leander; Mangano, Gianpiero; Schwonnek, René
2016-01-01
Various theories that aim at unifying gravity with quantum mechanics suggest modifications of the Heisenberg algebra for position and momentum. From the perspective of quantum mechanics, such modifications lead to new uncertainty relations which are thought (but not proven) to imply the existence of a minimal observable length. Here we prove this statement in a framework of sufficient physical and structural assumptions. Moreover, we present a general method that allows us to formulate optimal and state-independent variance-based uncertainty relations. In addition, instead of variances, we make use of entropies as a measure of uncertainty and provide uncertainty relations in terms of min- and Shannon entropies. We compute the corresponding entropic minimal lengths and find that the minimal length in terms of min-entropy is exactly one bit.
Optimal uncertainty relations in a modified Heisenberg algebra
Abdelkhalek, Kais; Chemissany, Wissam; Fiedler, Leander; Mangano, Gianpiero; Schwonnek, René
2016-12-01
Various theories that aim at unifying gravity with quantum mechanics suggest modifications of the Heisenberg algebra for position and momentum. From the perspective of quantum mechanics, such modifications lead to new uncertainty relations that are thought (but not proven) to imply the existence of a minimal observable length. Here we prove this statement in a framework of sufficient physical and structural assumptions. Moreover, we present a general method that allows us to formulate optimal and state-independent variance-based uncertainty relations. In addition, instead of variances, we make use of entropies as a measure of uncertainty and provide uncertainty relations in terms of min and Shannon entropies. We compute the corresponding entropic minimal lengths and find that the minimal length in terms of min entropy is exactly 1 bit.
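As background on the two entropy measures used here: the min-entropy depends only on the most likely outcome and never exceeds the Shannon entropy, which makes min-entropy bounds the more conservative of the two. A quick numerical check (the distribution is arbitrary, chosen only for illustration):

```python
import numpy as np

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def min_entropy(p):
    # H_min(p) = -log2(max_i p_i): determined solely by the most likely outcome
    return -np.log2(np.max(p))

p = np.array([0.5, 0.25, 0.125, 0.125])  # arbitrary illustrative distribution
assert np.isclose(min_entropy(p), 1.0)   # most likely outcome has p = 1/2
assert np.isclose(shannon(p), 1.75)
assert min_entropy(p) <= shannon(p)      # H_min lower-bounds the Shannon entropy

# The ordering holds for any distribution
rng = np.random.default_rng(1)
for _ in range(100):
    q = rng.dirichlet(np.ones(5))
    assert min_entropy(q) <= shannon(q) + 1e-12
```

This ordering is why a minimal length of "exactly one bit" in min-entropy terms is a meaningful, worst-case statement: any relation proven for min-entropy automatically constrains the Shannon entropy from below.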
Physics-related epistemic uncertainties in proton depth dose simulation
Pia, Maria Grazia; Lechner, Anton; Quintieri, Lina; Saracco, Paolo
2010-01-01
A set of physics models and parameters pertaining to the simulation of proton energy deposition in matter is evaluated in the energy range up to approximately 65 MeV, based on their implementations in the Geant4 toolkit. The analysis assesses several features of the models and the impact of their associated epistemic uncertainties, i.e. uncertainties due to lack of knowledge, on the simulation results. Possible systematic effects deriving from uncertainties of this kind are highlighted; their relevance in relation to the application environment and different experimental requirements is discussed, with emphasis on the simulation of radiotherapy set-ups. By documenting quantitatively the features of a wide set of simulation models and the related intrinsic uncertainties affecting the simulation results, this analysis provides guidance regarding the use of the concerned simulation tools in experimental applications; it also provides indications for further experimental measurements addressing the sources of s...
New Inequalities and Uncertainty Relations on Linear Canonical Transform Revisit
Xu Guanlei
2009-01-01
The uncertainty principle plays an important role in mathematics, physics, signal processing, and so on. First, based on the definition of the linear canonical transform (LCT) and the traditional Pitt's inequality, a novel Pitt's inequality in the LCT domains is obtained, which is connected with the LCT parameters a and b. Then a novel logarithmic uncertainty principle is derived from this novel Pitt's inequality in the LCT domains, which is associated with the parameters of the two LCTs. Second, from the relation between the original function and the LCT, an entropic uncertainty principle and a Heisenberg uncertainty principle in the LCT domains are derived, which are associated with the LCT parameters a and b. The reason why the three lower bounds are associated only with the LCT parameters a and b, and are independent of c and d, is presented. The results show it is possible that the bounds tend to zero.
The Second International Workshop on Squeezed States and Uncertainty Relations
Han, D. (Editor); Kim, Y. S.; Manko, V. I.
1993-01-01
This conference publication contains the proceedings of the Second International Workshop on Squeezed States and Uncertainty Relations held in Moscow, Russia, on 25-29 May 1992. The purpose of this workshop was to study possible applications of squeezed states of light. The Workshop brought together many active researchers in squeezed states of light and those who may find the concept of squeezed states useful in their research, particularly in understanding the uncertainty relations. It was found at this workshop that the squeezed state has a much broader implication than the two-photon coherent states in quantum optics, since the squeeze transformation is one of the most fundamental transformations in physics.
Approaches to handling uncertainty when setting environmental exposure standards
Budtz-Jørgensen, Esben; Keiding, Niels; Grandjean, Philippe
2009-01-01
Mathematical modelling has become in recent years an essential tool for the prediction of environmental change and for the development of sustainable policies. Yet, many of the uncertainties associated with modelling efforts appear poorly understood by many, especially by policy makers. This book...... Theoretical chapters, providing background information on specific steps in the modelling process and in the adoption of models by end-users, are complemented by illustrative case studies dealing with soils and global climate change. All the chapters are authored by recognized experts in their respective...... disciplines, and provide a timely and uniquely comprehensive coverage of an important field. Written for: Environmental scientists, global climate change experts, environmental policy makers, environmental activists......
Error-disturbance uncertainty relations in neutron spin measurements
Sponar, Stephan
2016-05-01
Heisenberg’s uncertainty principle, in a formulation of uncertainties intrinsic to any quantum system, is rigorously proven and has been demonstrated in various quantum systems. Nevertheless, Heisenberg’s original formulation of the uncertainty principle was given in terms of a reciprocal relation between the error of a position measurement and the disturbance thereby induced on a subsequent momentum measurement. However, a naive generalization of a Heisenberg-type error-disturbance relation to arbitrary observables is not valid. An alternative universally valid relation was derived by Ozawa in 2003. Though universally valid, Ozawa’s relation is not optimal. Recently, Branciard derived a tight error-disturbance uncertainty relation (EDUR), describing the optimal trade-off between error and disturbance under certain conditions. Here, we report a neutron-optical experiment that records the error of a spin-component measurement, as well as the disturbance caused on another spin component, to test EDURs. We demonstrate that Heisenberg’s original EDUR is violated and that Ozawa’s and Branciard’s EDURs are valid in a wide range of experimental parameters, as well as the tightness of Branciard’s relation.
Fifth International Conference on Squeezed States and Uncertainty Relations
Han, D. (Editor); Janszky, J. (Editor); Kim, Y. S. (Editor); Man'ko, V. I. (Editor)
1998-01-01
The Fifth International Conference on Squeezed States and Uncertainty Relations was held at Balatonfured, Hungary, on 27-31 May 1997. This series was initiated in 1991 at the College Park Campus of the University of Maryland as the Workshop on Squeezed States and Uncertainty Relations. The scientific purpose of this series was to discuss squeezed states of light, but in recent years the scope has become broad enough to include studies of uncertainty relations and squeeze transformations in all branches of physics, including quantum optics and the foundations of quantum mechanics. Quantum optics will continue playing the pivotal role in the future, but future meetings will include all branches of physics where squeeze transformations are basic. As the meeting attracted more participants and started covering more diversified subjects, the fourth meeting was called an international conference. The Fourth International Conference on Squeezed States and Uncertainty Relations, held in 1995, was hosted by Shanxi University in Taiyuan, China. The fifth meeting of this series, held at Balatonfured, Hungary, was also supported by the IUPAP. The Sixth International Conference will be hosted by the University of Naples in 1999; the meeting will take place in Ravello, near Naples.
Uncertainty related to Environmental Data and Estimated Extreme Events
Burcharth, H. F.
The design loads on rubble mound breakwaters are almost entirely determined by the environmental conditions, i.e. sea state, water levels, sea bed characteristics, etc. It is the objective of sub-group B to identify the most important environmental parameters and evaluate the related uncertaintie...
Uncertainty relations in quantum optics. Is the photon intelligent?
Przanowski, Maciej; García-Compeán, Hugo; Tosiek, Jaromir; Turrubiates, Francisco J.
2016-10-01
The Robertson-Schrödinger, Heisenberg-Robertson and Trifonov uncertainty relations for two arbitrary functions f1 and f2, depending on the quantum phase and the number of photons respectively, are given. Intelligent states and states which locally minimize the product of uncertainties (Δf1)²·(Δf2)² or the sum (Δf1)² + (Δf2)² are investigated for the cases f1 = ϕ, exp(iϕ), exp(−iϕ), cos ϕ, sin ϕ and f2 = n.
Implementation of unscented transform to estimate the uncertainty of a liquid flow standard system
Chun, Sejong; Choi, Hae-Man; Yoon, Byung-Ro; Kang, Woong [Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of)
2017-03-15
First-order partial derivatives of a mathematical model are an essential part of evaluating the measurement uncertainty of a liquid flow standard system according to the Guide to the expression of uncertainty in measurement (GUM). Although the GUM provides a straightforward method to evaluate the measurement uncertainty of volume flow rate, the first-order partial derivatives can be complicated. The mathematical model of volume flow rate in a liquid flow standard system has a cross-correlation between liquid density and the buoyancy correction factor, which can make derivation of the first-order partial derivatives difficult. Monte Carlo simulation can be used as an alternative method to circumvent this difficulty. However, Monte Carlo simulation requires large computational resources for a correct simulation, because it must account for whether an ideal or a real operator conducts the experiment to evaluate the measurement uncertainty; it therefore needs a large number of samples to ensure that the uncertainty evaluation is as close to the GUM as possible. The unscented transform alleviates this problem because it can be regarded as a Monte Carlo simulation with an infinite number of samples, i.e., it considers the uncertainty evaluation with respect to the ideal operator. Thus, the unscented transform evaluates the measurement uncertainty consistently with the GUM.
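The mechanics of the unscented transform can be sketched in a few lines. The example below is illustrative only: the flow model Q = m/(ρt), the nominal values and the standard uncertainties are all invented, and the inputs are taken as uncorrelated for simplicity (the paper's actual model includes the density/buoyancy cross-correlation, which would enter through off-diagonal covariance terms):

```python
import numpy as np

def unscented_transform(f, mean, cov, kappa=1.0):
    """Propagate a Gaussian (mean, cov) through f using 2n+1 sigma points."""
    n = len(mean)
    S = np.linalg.cholesky((n + kappa) * cov)  # matrix square root of scaled covariance
    pts = [mean] + [mean + S[:, i] for i in range(n)] + [mean - S[:, i] for i in range(n)]
    w = np.array([kappa / (n + kappa)] + [0.5 / (n + kappa)] * (2 * n))
    y = np.array([f(p) for p in pts])
    m = w @ y
    return m, np.sqrt(w @ (y - m) ** 2)           # mean and standard uncertainty

# Hypothetical gravimetric flow model: Q = m / (rho * t)
f = lambda x: x[0] / (x[1] * x[2])
mean = np.array([10.0, 998.0, 60.0])   # mass [kg], density [kg/m^3], time [s]
u = np.array([0.01, 0.5, 0.05])        # standard uncertainties of the inputs
mQ, uQ = unscented_transform(f, mean, np.diag(u ** 2))

# First-order GUM result: for a product/quotient model, relative
# uncertainties add in quadrature -- no explicit derivatives needed here.
uQ_gum = f(mean) * np.sqrt(np.sum((u / mean) ** 2))
assert abs(uQ - uQ_gum) / uQ_gum < 0.01  # near agreement for this mildly nonlinear model
```

The appeal is exactly what the abstract describes: 2n+1 deterministic model evaluations replace both the symbolic partial derivatives of the GUM approach and the millions of samples of a Monte Carlo run, while reproducing the GUM result when the model is close to linear.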
String theory, scale relativity and the generalized uncertainty principle
Castro, C
1995-01-01
An extension/modification of the stringy Heisenberg uncertainty principle is derived within the framework of the theory of Special Scale-Relativity proposed by Nottale. Based on the fractal structure of two-dimensional quantum gravity, which has attracted considerable interest recently, we conjecture that the underlying fundamental principle behind string theory should be based on an extension of scale relativity in which both dynamics and scales are incorporated on the same footing.
Comment on the uncertainty relation with periodic boundary conditions
Fujikawa, Kazuo
2010-01-01
The Kennard-type uncertainty relation $\Delta x \Delta p > \frac{\hbar}{2}$ is formulated for a free particle with given momentum inside a box with periodic boundary conditions, in the large-box limit. Our construction of a free particle state is analogous to that of the Bloch wave in a periodic potential. A simple Robertson-type relation, which minimizes the effect of the box boundary and may be useful in some practical applications, is also presented.
Multiplatform application for calculating a combined standard uncertainty using a Monte Carlo method
Niewinski, Marek; Gurnecki, Pawel
2016-12-01
The paper presents a new computer program for calculating a combined standard uncertainty. It implements the algorithm described in JCGM 101:2008, which concerns the use of a Monte Carlo method as an implementation of the propagation of distributions for uncertainty evaluation. The accuracy of the calculation is ensured by using high-quality random number generators. The paper describes the main principles of the program and compares the obtained results with example problems presented in JCGM Supplement 1.
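The core of the JCGM 101 algorithm is simple to sketch: draw from each input quantity's assigned distribution, evaluate the measurement model for every draw, and summarize the resulting output distribution. The measurement model and input values below are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(12345)
M = 200_000  # number of Monte Carlo trials

# Hypothetical measurement model Y = X1 + X2*X3, with assigned input distributions
x1 = rng.normal(1.0, 0.1, M)       # Gaussian: estimate 1.0, standard uncertainty 0.1
x2 = rng.uniform(1.9, 2.1, M)      # rectangular: limits as assigned, e.g., from a data sheet
x3 = rng.normal(0.5, 0.05, M)

y = x1 + x2 * x3
y_est = y.mean()                        # estimate of the measurand
u_c = y.std(ddof=1)                     # combined standard uncertainty
lo, hi = np.percentile(y, [2.5, 97.5])  # 95% coverage interval

assert abs(y_est - 2.0) < 0.01          # analytic mean is 1 + 2*0.5 = 2.0
assert 0.13 < u_c < 0.16                # analytic u_c is about 0.144
```

Unlike the analytic GUM route, no partial derivatives are needed, and the coverage interval is read directly from the empirical distribution rather than assumed Gaussian; the price is the dependence on trial count and random-number quality that the paper addresses.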
Fourth International Conference on Squeezed States and Uncertainty Relations
Han, D. (Editor); Peng, Kunchi (Editor); Kim, Y. S. (Editor); Manko, V. I. (Editor)
1996-01-01
The Fourth International Conference on Squeezed States and Uncertainty Relations was held at Shanxi University, Taiyuan, Shanxi, China, on June 5-9, 1995. This conference was jointly organized by Shanxi University, the University of Maryland (USA), and the Lebedev Physical Institute (Russia). The first meeting of this series was called the Workshop on Squeezed States and Uncertainty Relations, and was held in 1991 at College Park, Maryland. The second and third meetings in this series were hosted in 1992 by the Lebedev Institute in Moscow, and in 1993 by the University of Maryland Baltimore County, respectively. The scientific purpose of this series was initially to discuss squeezed states of light, but in recent years the scope has become broad enough to include studies of uncertainty relations and squeeze transformations in all branches of physics, including, of course, quantum optics and the foundations of quantum mechanics. Quantum optics will continue playing the pivotal role in the future, but future meetings will include all branches of physics where squeeze transformations are a basic transformation. This transition took place at the fourth meeting of this series, held at Shanxi University in 1995. The fifth meeting in this series will be held in Budapest (Hungary) in 1997, and the principal organizer will be Jozsef Janszky of the Laboratory of Crystal Physics, P.O. Box 132, H-1052 Budapest, Hungary.
The Physics of Interdependence, Social Uncertainty Relations, and Incompleteness
W.F. Lawless
2015-01-01
We report on the development of a mathematical model of social uncertainty relations to replace traditional models of the interaction, as well as a model of complexity from econophysics. Our goal with this mathematics is to control hybrid teams, firms and systems (where "hybrids" are arbitrary combinations of humans, robots and machines). But uncertainty is created by states of interdependence between social objects: at one extreme, interdependence reduces to independence between agents, producing rational but asocial effects; at the other extreme, interdependence de-individuates a group's members until individual identity dissolves into the group (e.g., strong cults, mobs, gangs, and well-run teams and firms). In other studies, we have reviewed the structure of teams; in this report, we focus on how interdependence impedes efforts at direct control by making meaning incomplete. We begin with bistability to simplify interdependence, and generalize to full interdependence.
Realistic Approach of the Relations of Uncertainty of Heisenberg
Paul E. Sterian
2013-01-01
Due to the requirements of the principle of causality in the theory of relativity, one cannot build a device for the simultaneous measurement of canonically conjugate variables in the conjugate Fourier spaces. Instead of admitting that a particle’s position and its conjugate momentum cannot be accurately measured at the same time, we consider valid only the probabilities that can be determined when working at the subatomic level. On the other hand, based on Schwinger's action principle and using the four-dimensional form of the generator function of the unitary transformation of the quantum operators, the general form of the evolution equation for these operators is established in the paper. In the nonrelativistic case one obtains Heisenberg-type evolution equations, which can be particularized to derive Heisenberg's uncertainty relations. The analysis of the uncertainty relations as implicit evolution equations allows us to bring out the intrinsic nature of the correlations expressed by these equations, in direct relation with the measuring process. The independence of the quantisation postulate from the causal evolution postulate of quantum mechanics is also discussed.
Quantum-memory-assisted entropic uncertainty relations under weak measurements
Li, Lei; Wang, Qing-Wen; Shen, Shu-Qian; Li, Ming
2017-08-01
We investigate quantum-memory-assisted entropic uncertainty relations (EURs) based on weak measurements. It is shown that the lower bound of EUR revealed by weak measurements is always larger than that revealed by the corresponding projective measurements. A series of lower bounds of EUR under both weak measurements and projective measurements are presented. Interestingly, the quantum-memory-assisted EUR based on weak measurements is a monotonically decreasing function of the strength parameter. Furthermore, some information-theoretic inequalities associated with weak measurements are also derived.
Uncertainty relation and black hole entropy of Kerr spacetime
Hu Shuang-Qi; Zhao Ren
2005-01-01
The properties of thermal radiation are discussed by using a new equation of state density, which is motivated by the generalized uncertainty relation in quantum gravity. There is no burst at the last stage of the emission of a Kerr black hole. When the new equation of state density is utilized to investigate the entropy of bosonic and fermionic fields outside the horizon of a stationary Kerr black hole, the divergence appearing in the brick-wall model is removed, without any cutoff. The entropy proportional to the horizon area is derived from the contribution of the vicinity of the horizon.
Heisenberg uncertainty relation and statistical measures in the square well
Jaime Sañudo
2012-07-01
A nonstationary state in the one-dimensional infinite square well, formed by a combination of the ground state and the first excited state, is considered. The statistical complexity and the Fisher-Shannon entropy in position and momentum are calculated as functions of time for this system. These measures are compared with the Heisenberg uncertainty product, $\Delta x\,\Delta p$. It is observed that the extreme values of $\Delta x\,\Delta p$ coincide in time with extreme values of the other two statistical magnitudes.
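As a numerical illustration of the kind of time-dependent uncertainty product studied in this abstract, the sketch below tracks Δx·Δp over one beat period for the equal-weight superposition of the two lowest well states. The natural units (ħ = m = L = 1), grid size, and sampling are our own choices for illustration, not the paper's setup:

```python
import numpy as np

# Superposition of the ground and first excited states of the infinite
# square well, in natural units hbar = m = L = 1 (assumed for illustration).
hbar, m, L = 1.0, 1.0, 1.0
x = np.linspace(0.0, L, 4001)
dx = x[1] - x[0]
integrate = lambda f: np.sum(f) * dx  # rectangle rule on a dense grid

phi1 = np.sqrt(2.0 / L) * np.sin(np.pi * x / L)
phi2 = np.sqrt(2.0 / L) * np.sin(2.0 * np.pi * x / L)
E1 = (np.pi * hbar) ** 2 / (2.0 * m * L**2)
E2 = (2.0 * np.pi * hbar) ** 2 / (2.0 * m * L**2)

def uncertainty_product(t):
    """Delta x * Delta p for psi(t) = (phi1 e^{-iE1 t} + phi2 e^{-iE2 t}) / sqrt(2)."""
    psi = (phi1 * np.exp(-1j * E1 * t / hbar)
           + phi2 * np.exp(-1j * E2 * t / hbar)) / np.sqrt(2.0)
    dpsi = np.gradient(psi, x)
    ex = integrate(np.abs(psi) ** 2 * x)
    ex2 = integrate(np.abs(psi) ** 2 * x**2)
    ep = np.real(integrate(np.conj(psi) * (-1j * hbar) * dpsi))
    ep2 = hbar**2 * integrate(np.abs(dpsi) ** 2)  # <p^2> by parts; psi = 0 at the walls
    return np.sqrt(ex2 - ex**2) * np.sqrt(ep2 - ep**2)

period = 2.0 * np.pi * hbar / (E2 - E1)  # beat period of the superposition
products = [uncertainty_product(t) for t in np.linspace(0.0, period, 60)]
print(min(products), max(products))  # the product oscillates but never drops below hbar/2
```

The product oscillates with the beat frequency of the two levels while always respecting the Heisenberg bound ħ/2.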
Einstein-Podolsky-Rosen steering inequalities from entropic uncertainty relations
Schneeloch, James; Broadbent, Curtis J.; Walborn, Stephen P.; Cavalcanti, Eric G.; Howell, John C.
2013-06-01
We use entropic uncertainty relations to formulate inequalities that witness Einstein-Podolsky-Rosen (EPR)-steering correlations in diverse quantum systems. We then use these inequalities to formulate symmetric EPR-steering inequalities using the mutual information. We explore the differing natures of the correlations captured by one-way and symmetric steering inequalities and examine the possibility of exclusive one-way steerability in two-qubit states. Furthermore, we show that steering inequalities can be extended to generalized positive operator-valued measures, and we also derive hybrid steering inequalities between alternate degrees of freedom.
Characterizing quantum correlations. Entanglement, uncertainty relations and exponential families
Niekamp, Soenke
2012-04-20
This thesis is concerned with different characterizations of multi-particle quantum correlations and with entropic uncertainty relations. The effect of statistical errors on the detection of entanglement is investigated. First, general results on the statistical significance of entanglement witnesses are obtained. Then, using an error model for experiments with polarization-entangled photons, it is demonstrated that Bell inequalities with lower violation can have higher significance. The question of which observables best discriminate between a state and the equivalence class of another state is addressed. Two measures for the discrimination strength of an observable are defined, and optimal families of observables are constructed for several examples. A property of stabilizer bases is shown which is a natural generalization of mutual unbiasedness. For sets of several dichotomic, pairwise anticommuting observables, uncertainty relations using different entropies are constructed in a systematic way. Exponential families provide a classification of states according to their correlations. In this classification scheme, a state is considered k-correlated if it can be written as a thermal state of a k-body Hamiltonian. Witness operators for the detection of higher-order interactions are constructed, and an algorithm for the computation of the nearest k-correlated state is developed.
Sixth International Conference on Squeezed States and Uncertainty Relations
Han, D. (Editor); Kim, Y. S. (Editor); Solimento, S. (Editor)
2000-01-01
These proceedings contain contributions from about 200 participants to the 6th International Conference on Squeezed States and Uncertainty Relations (ICSSUR'99) held in Naples May 24-29, 1999, and organized jointly by the University of Naples "Federico II," the University of Maryland at College Park, and the Lebedev Institute, Moscow. This was the sixth of a series of very successful meetings started in 1990 at the College Park Campus of the University of Maryland. The other meetings in the series were held in Moscow (1992), Baltimore (1993), Taiyuan, P.R.C. (1995) and Balatonfüred, Hungary (1997). The present one was held at the campus Monte Sant'Angelo of the University "Federico II" of Naples. The meeting sought to provide a forum for updating and reviewing a wide range of quantum optics disciplines, including device developments and applications, and related areas of quantum measurements and quantum noise. Over the years, the ICSSUR Conference evolved from a meeting on the quantum-measurement sector of quantum optics to one covering a wide range of quantum optics themes, including multifaceted aspects of the generation, measurement, and applications of nonclassical light (squeezed and Schrödinger-cat radiation fields, etc.), and encompassing several related areas, ranging from quantum measurement to quantum noise. ICSSUR'99 brought together about 250 people active in the field of quantum optics, with special emphasis on nonclassical light sources and related areas. The Conference was organized in 8 Sections: Squeezed states and uncertainty relations; Harmonic oscillators and squeeze transformations; Methods of quantum interference and correlations; Quantum measurements; Generation and characterisation of non-classical light; Quantum noise; Quantum communication and information; and Quantum-like systems.
Solvable Models on Noncommutative Spaces with Minimal Length Uncertainty Relations
Dey, Sanjib
2014-01-01
Our main focus is to explore different models in noncommutative spaces in higher dimensions. We provide a procedure to relate a three-dimensional q-deformed oscillator algebra to the corresponding algebra satisfied by canonical variables describing noncommutative spaces. The representations for the corresponding operators obey algebras whose uncertainty relations lead to minimal lengths, areas and volumes in phase space, which are in principle natural candidates of many different approaches of quantum gravity. We study some explicit models on these types of noncommutative spaces, first by utilising perturbation theory and later in an exact manner. In many cases the operators are not Hermitian; therefore we use PT-symmetry and the pseudo-Hermiticity property, wherever applicable, to make them self-consistent. Apart from building mathematical models, we focus on the physical implications of noncommutative theories too. We construct Klauder coherent states for the perturbative and nonperturbative noncommutative ha...
Reconsidering the conventional interpretation of the uncertainty relations
Dumitru, S
2000-01-01
The Conventional Interpretation of the Uncertainty Relations (CIUR) is reconsidered through a re-evaluation of its main assertions. It is shown that all the respective assertions are troubled by insurmountable defects. This reveals the indubitable failure of CIUR and the necessity of its abandonment as an unjustified doctrine which generates misconceptions and confusions. Consequently, the UR must be deprived of their quality of crucial formulae in the distinction between quantum and classical physics. The theoretical UR are shown to be not indicators of measuring accuracies but simple fluctuation formulae with natural analogues in classical (non-quantum) physics. That is why we propose that they be named simply Heisenberg's relations. It is argued that the description of measurements must be discriminated from quantum mechanics and carried out in a distinct scientific branch.
Kusec, Andrea; Tallon, Kathleen; Koerner, Naomi
2016-06-01
Although numerous studies have provided support for the notion that intolerance of uncertainty plays a key role in pathological worry (the hallmark feature of generalized anxiety disorder (GAD)), other uncertainty-related constructs may also have relevance for the understanding of individuals who engage in pathological worry. Three constructs from the social cognition literature, causal uncertainty, causal importance, and self-concept clarity, were examined in the present study to assess the degree to which these explain unique variance in GAD, over and above intolerance of uncertainty. N = 235 participants completed self-report measures of trait worry, GAD symptoms, and uncertainty-relevant constructs. A subgroup was subsequently classified as low in GAD symptoms (n = 69) or high in GAD symptoms (n = 54) based on validated cut scores on measures of trait worry and GAD symptoms. In logistic regressions, only elevated intolerance of uncertainty and lower self-concept clarity emerged as unique correlates of high (vs. low) GAD symptoms. The possible role of self-concept uncertainty in GAD and the utility of integrating social cognition theories and constructs into clinical research on intolerance of uncertainty are discussed.
Quantised inertia from relativity and the uncertainty principle
McCulloch, M E
2016-01-01
It is shown here that if we assume that what is conserved in nature is not simply mass-energy, but rather mass-energy plus the energy uncertainty of the uncertainty principle, and if we also assume that position uncertainty is reduced by the formation of relativistic horizons, then the resulting increase of energy uncertainty is close to that needed for a new model for inertial mass (MiHsC, quantised inertia) which has been shown to predict galaxy rotation without dark matter and cosmic acceleration without dark energy. The same principle can also be used to model the inverse square law of gravity, and predicts the mass of the electron.
Bhogal, Nirmala; Combes, Robert
2006-10-01
At the request of the Food Standards Agency, the Committee on Toxicity of Chemicals in Food, Consumer Products and the Environment (COT) established a Working Group on Variation and Uncertainty in Toxicology (WG VUT). In April 2006, the WG VUT produced a draft report for public consultation. FRAME made a submission in response to this consultation in July 2006. We commend the WG VUT for its comprehensive account of many of the problems associated with risk assessment, and for making recommendations about the problems that need to be addressed. We were particularly encouraged by the WG VUT's recognition of the need for guidelines on how toxicological studies should be conducted and data analysed. However, we believe that the report has not achieved all of its objectives. It does not adequately consider how modern technologies, experimental design, statistical analysis and species extrapolation can be used in practice to address variability and uncertainty. There is a disproportionate focus on the sources of variability and uncertainty in human data, with relatively little consideration of how variation and uncertainty due to animal tests can be addressed. Furthermore, it is clear that, until the advantages and limitations of all toxicological methods are fully appraised and testing strategies and guidelines are agreed, the scope for improving the existing approaches to risk assessment will be severely limited. Hence, the use of alternative methods for hazard identification and characterisation merits more consideration than it was given in the draft report.
Leblanc, Thierry; Sica, Robert J.; van Gijsel, Joanna A. E.; Godin-Beekmann, Sophie; Haefele, Alexander; Trickl, Thomas; Payen, Guillaume; Liberti, Gianluigi
2016-08-01
A standardized approach for the definition, propagation, and reporting of uncertainty in the ozone differential absorption lidar data products contributing to the Network for the Detection for Atmospheric Composition Change (NDACC) database is proposed. One essential aspect of the proposed approach is the propagation in parallel of all independent uncertainty components through the data processing chain before they are combined together to form the ozone combined standard uncertainty. The independent uncertainty components contributing to the overall budget include random noise associated with signal detection, uncertainty due to saturation correction, background noise extraction, the absorption cross sections of O3, NO2, SO2, and O2, the molecular extinction cross sections, and the number densities of the air, NO2, and SO2. The expression of the individual uncertainty components and their step-by-step propagation through the ozone differential absorption lidar (DIAL) processing chain are thoroughly estimated. All sources of uncertainty except detection noise imply correlated terms in the vertical dimension, which requires knowledge of the covariance matrix when the lidar signal is vertically filtered. In addition, the covariance terms must be taken into account if the same detection hardware is shared by the lidar receiver channels at the absorbed and non-absorbed wavelengths. The ozone uncertainty budget is presented as much as possible in a generic form (i.e., as a function of instrument performance and wavelength) so that all NDACC ozone DIAL investigators across the network can estimate, for their own instrument and in a straightforward manner, the expected impact of each reviewed uncertainty component. In addition, two actual examples of full uncertainty budget are provided, using nighttime measurements from the tropospheric ozone DIAL located at the Jet Propulsion Laboratory (JPL) Table Mountain Facility, California, and nighttime measurements from the JPL
Uncertainty of Water-hammer Loads for Safety Related Systems
Lee, Seung Chan; Yoon, Duk Joo [Korea Hydro and Nuclear Power Co., LT., Daejeon (Korea, Republic of)
2013-10-15
In this study, the basic methodology is based on the ISO GUM (Guide to the Expression of Uncertainty in Measurement). For a given gas void volume in the discharge piping, the maximum water-hammer pressure is defined by an equation whose uncertainty parameter is selected as U_s (the superficial velocity for the specific pipe size and corresponding area). The main uncertainty parameter (U_s) is estimated by a measurement method and by Monte Carlo simulation. The two methods are in good agreement with respect to the extended uncertainty: the extended uncertainty of the measurement and of the Monte Carlo simulation is 1.30 and 1.34, respectively, at the 95% confidence interval; at the 99% confidence interval, the uncertainties are 1.95 and 1.97, respectively. NRC Generic Letter 2008-01 requires nuclear power plant operators to evaluate the possibility of noncondensable gas accumulation in the Emergency Core Cooling System. In particular, gas accumulation can result in a system pressure transient in the pump discharge piping at pump start. Consequently, this evolves into a gas-water transient, a water-hammer event, and force imbalances on the piping segments. In this paper, the MCS (Monte Carlo Simulation) method is introduced for estimating the uncertainty of water hammer. The aim is to evaluate the uncertainty of the water-hammer estimation results carried out by KHNP CRI in 2013.
A Monte Carlo approach for estimating measurement uncertainty using standard spreadsheet software.
Chew, Gina; Walczyk, Thomas
2012-03-01
Despite the importance of stating the measurement uncertainty in chemical analysis, concepts are still not widely applied by the broader scientific community. The Guide to the expression of uncertainty in measurement approves the use of both the partial derivative approach and the Monte Carlo approach. There are two limitations to the partial derivative approach. Firstly, it involves the computation of first-order derivatives of each component of the output quantity. This requires some mathematical skills and can be tedious if the mathematical model is complex. Secondly, it is not able to predict the probability distribution of the output quantity accurately if the input quantities are not normally distributed. Knowledge of the probability distribution is essential to determine the coverage interval. The Monte Carlo approach performs random sampling from probability distributions of the input quantities; hence, there is no need to compute first-order derivatives. In addition, it gives the probability density function of the output quantity as the end result, from which the coverage interval can be determined. Here we demonstrate how the Monte Carlo approach can be easily implemented to estimate measurement uncertainty using a standard spreadsheet software program such as Microsoft Excel. It is our aim to provide the analytical community with a tool to estimate measurement uncertainty using software that is already widely available and that is so simple to apply that it can even be used by students with basic computer skills and minimal mathematical knowledge.
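The workflow the authors implement in a spreadsheet can equally be sketched in a few lines of code. The measurement model below (a concentration computed from a mass, a volume and a recovery factor) and all input values and standard uncertainties are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000  # number of Monte Carlo trials

# Hypothetical measurement model: c = m / (V * R), with normally
# distributed inputs (values and standard uncertainties are made up).
m = rng.normal(10.0, 0.05, N)   # mass, mg
V = rng.normal(25.0, 0.10, N)   # volume, mL
R = rng.normal(0.98, 0.01, N)   # recovery factor, dimensionless

c = m / (V * R)  # one simulated result per trial

# The empirical distribution of c gives the combined standard uncertainty
# and a coverage interval directly, with no partial derivatives required.
u = c.std(ddof=1)
lo, hi = np.percentile(c, [2.5, 97.5])
print(f"c = {c.mean():.4f} mg/mL, u = {u:.4f}, 95% interval [{lo:.4f}, {hi:.4f}]")
```

The same recipe works in a spreadsheet by filling one row per trial with `NORMINV(RAND(), mean, sd)` draws and summarising the column of computed results.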
Regeneration decisions in forestry under climate change related uncertainties and risks
Schou, Erik; Thorsen, Bo Jellesmark; Jacobsen, Jette Bredahl
2015-01-01
Future climate development and its effects on forest ecosystems are not easily predicted or described in terms of standard probability concepts. Nevertheless, forest managers continuously make long-term decisions that will be subject to climate change impacts. The manager's assessment of possible...... developments and impacts and the related uncertainty will affect the combined decision on timing of final harvest and the choice of species for regeneration. We analyse harvest of a Norway spruce stand with the option to regenerate with Norway spruce or oak. We use simulated variations in biophysical risks......) assigned to each outcome. Results show that the later a forest manager expects to obtain certainty about climate change or the more skewed their belief distribution, the more will decisions be based on ex ante assessments — suggesting that if forest managers believe that climate change uncertainty...
Reconsideration of the Uncertainty Relations and Quantum Measurements
Dumitru S.
2008-04-01
Discussions on uncertainty relations (UR) and quantum measurements (QMS) have persisted until nowadays in publications about quantum mechanics (QM). They originate mainly from the conventional interpretation of UR (CIUR). In most of the QM literature, it is underestimated that, over the years, a lot of deficiencies regarding CIUR have been signaled. As a rule, the alluded deficiencies were remarked disparately and discussed as punctual and non-essential questions. Here we investigate the mentioned deficiencies collected in a conclusive ensemble. Subsequently we expose a reconsideration of the major problems referring to UR and QMS. We reveal that all the basic presumptions of CIUR are troubled by insurmountable deficiencies which require the indubitable failure of CIUR and its necessary abandonment. Therefore the UR must be deprived of their status of crucial pieces for physics. So, the original versions of UR appear as being in postures of either (i) thought-experimental fictions or (ii) simple QM formulae, and any other versions of them have no connection with QMS. Then QMS must be viewed as an additional subject compared with the usual questions of QM. For a theoretical description of QMS we propose an information-transmission model, in which the quantum observables are considered as random variables. Our approach leads to natural solutions and simplifications for many problems regarding UR and QMS.
On the Role of Information Theoretic Uncertainty Relations in Quantum Theory
Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo
2014-01-01
Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson-Schröding...
Energy levels of one-dimensional systems satisfying the minimal length uncertainty relation
Bernardo, Reginald Christian S.; Esguerra, Jose Perico H.
2016-10-01
The standard approach to calculating the energy levels for quantum systems satisfying the minimal length uncertainty relation is to solve an eigenvalue problem involving a fourth- or higher-order differential equation in quasiposition space. It is shown that the problem can be reformulated so that the energy levels of these systems can be obtained by solving only a second-order quasiposition eigenvalue equation. Through this formulation the energy levels are calculated for the following potentials: particle in a box, harmonic oscillator, Pöschl-Teller well, Gaussian well, and double-Gaussian well. For the particle in a box, the second-order quasiposition eigenvalue equation is a second-order differential equation with constant coefficients. For the harmonic oscillator, Pöschl-Teller well, Gaussian well, and double-Gaussian well, a method that involves using Wronskians has been used to solve the second-order quasiposition eigenvalue equation. It is observed for all of these quantum systems that the introduction of a nonzero minimal length uncertainty induces a positive shift in the energy levels. It is shown that the calculation of energy levels in systems satisfying the minimal length uncertainty relation is not limited to a small number of problems like particle in a box and the harmonic oscillator but can be extended to a wider class of problems involving potentials such as the Pöschl-Teller and Gaussian wells.
Victor G. Gorb
2015-01-01
The aim of the investigation is to determine the ways of overcoming methodological uncertainties included in the teacher’s (tutor’s) professional standards. Methods. The system-activity approach to the development of the teacher’s activity was used by the author. Results. The author has developed, on the basis of the system-activity approach, the structure and content of the teacher’s personnel administration plan that allows realizing the demands of the teacher’s (tutor’s) professional standards and solving its main methodological uncertainties. Scientific novelty. The author presents an original systematic and activity-based methodology for the development of a personnel administration plan for educational-sphere personnel, in order to enhance the pedagogical potential of educational activity and create organizational arrangements for its effectiveness, quality and social efficiency. Practical significance. The proposed system-activity methodology can be used in the modernization of personnel administration plans for teachers within the context of the realization of the teacher’s professional standard.
Van Otterlo, M
2009-01-01
Markov decision processes have become the de facto standard in modeling and solving sequential decision making problems under uncertainty. This book studies lifting Markov decision processes, reinforcement learning and dynamic programming to the first-order (or, relational) setting.
Review of studies related to uncertainty in risk analysis
Rish, W.R.; Marnicio, R.J.
1988-08-01
The Environmental Protection Agency's Office of Radiation Programs (ORP) is responsible for regulating, on a national level, the risks associated with technological sources of ionizing radiation in the environment. A critical activity of the ORP is analyzing and evaluating risk. The ORP believes that the analysis of uncertainty should be an integral part of any risk assessment; therefore, the ORP has initiated a project to develop a framework for the treatment of uncertainty in risk analysis. Summaries of recent studies in five areas are presented.
Leblanc, Thierry; Sica, Robert J.; van Gijsel, Joanna A. E.; Haefele, Alexander; Payen, Guillaume; Liberti, Gianluigi
2016-08-01
A standardized approach for the definition, propagation, and reporting of uncertainty in the temperature lidar data products contributing to the Network for the Detection for Atmospheric Composition Change (NDACC) database is proposed. One important aspect of the proposed approach is the ability to propagate all independent uncertainty components in parallel through the data processing chain. The individual uncertainty components are then combined together at the very last stage of processing to form the temperature combined standard uncertainty. The identified uncertainty sources comprise major components such as signal detection, saturation correction, background noise extraction, temperature tie-on at the top of the profile, and absorption by ozone if working in the visible spectrum, as well as other components such as molecular extinction, the acceleration of gravity, and the molecular mass of air, whose magnitudes depend on the instrument, data processing algorithm, and altitude range of interest. The expression of the individual uncertainty components and their step-by-step propagation through the temperature data processing chain are thoroughly estimated, taking into account the effect of vertical filtering and the merging of multiple channels. All sources of uncertainty except detection noise imply correlated terms in the vertical dimension, which means that covariance terms must be taken into account when vertical filtering is applied and when temperature is integrated from the top of the profile. Quantitatively, the uncertainty budget is presented in a generic form (i.e., as a function of instrument performance and wavelength), so that any NDACC temperature lidar investigator can easily estimate the expected impact of individual uncertainty components in the case of their own instrument. Using this standardized approach, an example of uncertainty budget is provided for the Jet Propulsion Laboratory (JPL) lidar at Mauna Loa Observatory, Hawai'i, which is
Minimal length uncertainty relation and gravitational quantum well
Brau, F.; Buisseret, F.
2006-01-01
The dynamics of a particle in a gravitational quantum well is studied in the context of nonrelativistic quantum mechanics with a particular deformation of a two-dimensional Heisenberg algebra. This deformation yields a new short-distance structure characterized by a finite minimal uncertainty in pos
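Although the abstract is truncated, the kind of deformed algebra it refers to is commonly written in the Kempf form (a sketch; the paper's particular two-dimensional deformation may differ in detail):

```latex
[\hat{x},\hat{p}] = i\hbar\left(1+\beta\hat{p}^{2}\right)
\quad\Longrightarrow\quad
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1+\beta(\Delta p)^{2}+\beta\langle\hat{p}\rangle^{2}\right)
\quad\Longrightarrow\quad
(\Delta x)_{\min} = \hbar\sqrt{\beta}
```

Minimizing the right-hand side over \(\Delta p\) yields the finite minimal position uncertainty \(\hbar\sqrt{\beta}\), the "new short-distance structure" the abstract mentions.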
Psychological Entropy: A Framework for Understanding Uncertainty-Related Anxiety
Hirsh, Jacob B.; Mar, Raymond A.; Peterson, Jordan B.
2012-01-01
Entropy, a concept derived from thermodynamics and information theory, describes the amount of uncertainty and disorder within a system. Self-organizing systems engage in a continual dialogue with the environment and must adapt themselves to changing circumstances to keep internal entropy at a manageable level. We propose the entropy model of…
Meija, Juris; Pagliano, Enea; Mester, Zoltán
2014-09-02
Uncertainty of the result from the method of standard addition is often underestimated due to neglect of the covariance between the intercept and the slope. In order to simplify the data analysis from standard addition experiments, we propose x-y coordinate swapping in conventional linear regression. Unlike the ratio of the intercept and slope, which is the result of the traditional method of standard addition, the result of the inverse standard addition is obtained directly from the intercept of the swapped calibration line. Consequently, the uncertainty evaluation becomes markedly simpler. The method is also applicable to nonlinear curves, such as the quadratic model, without incurring any additional complexity.
A discussion on the Heisenberg uncertainty principle from the perspective of special relativity
Nanni, Luca
2016-09-01
In this note, we consider the implications of the Heisenberg uncertainty principle (HUP) when computing uncertainties that affect the main dynamical quantities, from the perspective of special relativity. Using the well-known formula for propagating statistical errors, we prove that the uncertainty relations between the moduli of conjugate observables are not relativistically invariant. The new relationships show that, in experiments involving relativistic particles, limitations of the precision of a quantity obtained by indirect calculations may affect the final result.
Vergni, L.; Di Lena, B.; Todisco, F.; Mannocchi, F.
2017-04-01
As shown by several authors, drought monitoring by the Standardized Precipitation Index (SPI) presents some uncertainties, mainly dependent on the choice of the probability distribution used to describe the cumulative precipitation and on the characteristics (e.g., length and variability) of the dataset. In this paper, the uncertainty related to SPI estimates has been quantified and analyzed with regards to the case study of the Abruzzo region (Central Italy), by using monthly precipitation recorded at 75 stations during the period 1951-2009. First, a set of distributions suitable to describe the cumulative precipitation at the 3-, 6-, and 12-month time scales was identified by using L-moments ratio diagrams. The goodness-of-fit was evaluated by applying the Kolmogorov-Smirnov test, and the Normality test on the derived SPI series. Then the confidence intervals of SPI have been calculated by applying a bootstrap procedure. The size of the confidence intervals has been considered as a measure of uncertainty, and its dependence on several factors such as the distribution type, the time scale, the record length, and the season has been examined. Results show that the distributions Pearson type III (PE3), Weibull (WEI), Generalized Normal (GNO), Generalized Extreme Value (GEV), and Gamma (GA2) are all suitable to describe the cumulative precipitation, with a slightly better performance of the PE3 and GNO distributions. As expected, the uncertainty increases as the record length and time scale decrease. The leading source of uncertainty is the record length while the effects due to seasonality and time scale are negligible. Two-parameter distributions make it possible to obtain confidence intervals of SPI (particularly for extreme values) narrower than those obtained by three-parameter distributions. Nevertheless, due to a poorer goodness of fit, two-parameter distributions can provide less reliable estimates of the precipitation probability. In any event, independently
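The bootstrap confidence-interval procedure used in this study can be sketched in a greatly simplified form. Everything here is illustrative: the precipitation record is invented, and a plain normal fit stands in for the paper's L-moments-based candidate distributions (normal is only a crude stand-in for two-parameter fits such as GA2); the interval width is the uncertainty measure.

```python
import random
import statistics as st

random.seed(42)

# Illustrative 3-month cumulative precipitation record (mm), one value per year.
precip = [210, 180, 250, 300, 160, 220, 270, 190, 240, 205,
          285, 175, 230, 260, 199, 215, 245, 170, 255, 225]

def spi_normal(sample, x):
    """SPI of value x under a fitted normal distribution (two-parameter case)."""
    mu, sigma = st.mean(sample), st.stdev(sample)
    return (x - mu) / sigma

x_obs = 160.0  # a dry-year total whose SPI we want, with its uncertainty

# Bootstrap: refit the distribution on resampled records and recompute SPI.
boot = sorted(
    spi_normal(random.choices(precip, k=len(precip)), x_obs)
    for _ in range(2000)
)
ci_low, ci_high = boot[int(0.025 * 2000)], boot[int(0.975 * 2000)]
ci_width = ci_high - ci_low   # the paper's measure of SPI uncertainty
```

Shortening the record (smaller `k`) widens `ci_width`, mirroring the paper's finding that record length dominates the uncertainty.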
Vergni, L.; Di Lena, B.; Todisco, F.; Mannocchi, F.
2015-12-01
As shown by several authors, drought monitoring by the Standardized Precipitation Index (SPI) presents some uncertainties, mainly dependent on the choice of the probability distribution used to describe the cumulative precipitation and on the characteristics (e.g., length and variability) of the dataset. In this paper, the uncertainty related to SPI estimates has been quantified and analyzed with regards to the case study of the Abruzzo region (Central Italy), by using monthly precipitation recorded at 75 stations during the period 1951-2009. First, a set of distributions suitable to describe the cumulative precipitation at the 3-, 6-, and 12-month time scales was identified by using L-moments ratio diagrams. The goodness-of-fit was evaluated by applying the Kolmogorov-Smirnov test, and the Normality test on the derived SPI series. Then the confidence intervals of SPI have been calculated by applying a bootstrap procedure. The size of the confidence intervals has been considered as a measure of uncertainty, and its dependence on several factors such as the distribution type, the time scale, the record length, and the season has been examined. Results show that the distributions Pearson type III (PE3), Weibull (WEI), Generalized Normal (GNO), Generalized Extreme Value (GEV), and Gamma (GA2) are all suitable to describe the cumulative precipitation, with a slightly better performance of the PE3 and GNO distributions. As expected, the uncertainty increases as the record length and time scale decrease. The leading source of uncertainty is the record length while the effects due to seasonality and time scale are negligible. Two-parameter distributions make it possible to obtain confidence intervals of SPI (particularly for extreme values) narrower than those obtained by three-parameter distributions. Nevertheless, due to a poorer goodness of fit, two-parameter distributions can provide less reliable estimates of the precipitation probability. In any event, independently
A genuine reinterpretation of the Heisenberg's ("uncertainty") relations
Dumitru, S
2000-01-01
In spite of their popularity, Heisenberg's ("uncertainty") Relations (HR) still generate controversies. The Traditional Interpretation of HR (TIHR) dominates present-day science, although over the years many of its defects have been signaled. These facts justify a reinvestigation of the questions connected with the interpretation/significance of HR. Here such a reinvestigation is developed, starting with a re-evaluation of the main elements of TIHR. It is found that all the respective elements are troubled by insurmountable defects. From this follows the failure of TIHR and the necessity of its abandonment. Consequently, the HR must be deprived of their quality of crucial physical formulae. Moreover, the HR are shown to be nothing but simple fluctuation formulae with natural analogues in classical (non-quantum) physics. The description of the measuring uncertainties (traditionally associated with HR) is approached from a new informational perspectiv...
Climate dynamics and fluid mechanics: Natural variability and related uncertainties
Ghil, Michael; Simonnet, Eric; 10.1016/j.physd.2008.03.036
2010-01-01
The purpose of this review-and-research paper is twofold: (i) to review the role played in climate dynamics by fluid-dynamical models; and (ii) to contribute to the understanding and reduction of the uncertainties in future climate-change projections. To illustrate the first point, we focus on the large-scale, wind-driven flow of the mid-latitude oceans which contribute in a crucial way to Earth's climate, and to changes therein. We study the low-frequency variability (LFV) of the wind-driven, double-gyre circulation in mid-latitude ocean basins, via the bifurcation sequence that leads from steady states through periodic solutions and on to the chaotic, irregular flows documented in the observations. This sequence involves local, pitchfork and Hopf bifurcations, as well as global, homoclinic ones. The natural climate variability induced by the LFV of the ocean circulation is but one of the causes of uncertainties in climate projections. Another major cause of such uncertainties could reside in the structural ...
Verburg, J.; Habib, M. B.; Keulen, M. van
2015-01-01
Relation extraction involves different types of uncertainty due to the imperfection of the extraction tools and the inherent ambiguity of unstructured text. In this paper, we discuss several ways of handling uncertainties in relation extraction from social media. Our study case is to extract tennis
Fan, Ya-Jing; Cao, Huai-Xin; Meng, Hui-Xian; Chen, Liang
2016-12-01
The uncertainty principle in quantum mechanics is a fundamental relation with different forms, including Heisenberg's uncertainty relation and Schrödinger's uncertainty relation. In this paper, we prove a Schrödinger-type uncertainty relation in terms of generalized metric adjusted skew information and correlation measure by using operator monotone functions, which reads U_ρ^{(g,f)}(A) U_ρ^{(g,f)}(B) ≥ (f(0)^2 l/k) |Corr_ρ^{s(g,f)}(A,B)|^2 for some operator monotone functions f and g, all n-dimensional observables A, B and a non-singular density matrix ρ. As applications, we derive some new uncertainty relations for Wigner-Yanase skew information and Wigner-Yanase-Dyson skew information.
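For context, the standard Schrödinger uncertainty relation that results of this type generalize can be written as:

```latex
\sigma_A^{2}\,\sigma_B^{2} \;\ge\;
\left|\tfrac{1}{2}\langle\{A,B\}\rangle-\langle A\rangle\langle B\rangle\right|^{2}
+\left|\tfrac{1}{2i}\langle[A,B]\rangle\right|^{2}
```

The paper's bound replaces the variances by metric adjusted skew information quantities and the right-hand side by a correlation measure built from the operator monotone functions f and g.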
Fan, Ya-Jing; Cao, Huai-Xin; Meng, Hui-Xian; Chen, Liang
2016-09-01
The uncertainty principle in quantum mechanics is a fundamental relation with different forms, including Heisenberg's uncertainty relation and Schrödinger's uncertainty relation. In this paper, we prove a Schrödinger-type uncertainty relation in terms of generalized metric adjusted skew information and correlation measure by using operator monotone functions, which reads U_ρ^{(g,f)}(A) U_ρ^{(g,f)}(B) ≥ (f(0)^2 l/k) |Corr_ρ^{s(g,f)}(A,B)|^2 for some operator monotone functions f and g, all n-dimensional observables A, B and a non-singular density matrix ρ. As applications, we derive some new uncertainty relations for Wigner-Yanase skew information and Wigner-Yanase-Dyson skew information.
Visualizing seismic risk and uncertainty: a review of related research.
Bostrom, Ann; Anselin, Luc; Farris, Jeremy
2008-04-01
Government agencies and other authorities often communicate earthquake risks using maps derived from geographic information systems. Yet, little is known about the effects of these maps on risk perceptions. While mental models research and other approaches are available to inform risk communication text design, similar empirically derived guidance is lacking for visual risk communications, such as maps, which are likely to trump text in their impact and appeal. This paper reviews the empirical research that might inform such guidance. Research on graphs, spatial and visual perception, and map design suggests that graphics increase risk avoidance over numerical risk representations, and countable visuals, like dots, can increase the accuracy of perceived risks, but not always. Cartographic design features, such as color, animation, interactivity, and depth cues, are all candidates to represent risk and uncertainty and to influence risk perception. While there are robust known effects of color (e.g., red = danger), with some cultural variability, animation can increase the salience of otherwise obscure features but is not uniformly effective. Depth cues, dimensionality, and the extent to which a representation depicts versus symbolizes a scene will influence the viewer's perspective and perception, depending on the viewer's familiarity with the scene; their effects on risk perception remain unclear. The translation and representation of technical information about risk and uncertainty is critical to risk communication effectiveness. Our review suggests a handful of candidate criteria for evaluating the effects of risk visualizations, short of changes in behavior: accuracy, accessibility, retention, and perceived risk and usefulness.
Kolarik, Jakub; Olesen, Bjarne W.
2015-01-01
European Standard EN 15251 in its current version does not provide any guidance on how to handle uncertainty of long-term measurements of indoor environmental parameters used for classification of buildings. The objective of the study was to analyse the uncertainty for field measurements...... measurements of operative temperature at two measuring points (south/south-west and north/north-east orientation). Results of the present study suggest that measurement uncertainty needs to be considered during assessment of the thermal environment in existing buildings. When expanded standard uncertainty was taken...... of operative temperature and evaluate its effect on categorization of the thermal environment according to EN 15251. A data set of field measurements of operative temperature in four office buildings situated in Denmark, Italy and Spain was used. Data for each building included approx. one year of continuous...
Position-momentum uncertainty relations in the presence of quantum memory
Furrer, Fabian; Berta, Mario; Tomamichel, Marco
2014-01-01
A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear...... operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused....... As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states....
Do the Uncertainty Relations Really have Crucial Significances for Physics?
Dumitru S.
2010-10-01
Full Text Available The falsity of the idea that the Uncertainty Relations (UR) have crucial significances for physics is proved. Additionally, one argues for the necessity of an UR-disconnected quantum philosophy.
Time-dependent q-deformed bi-coherent states for generalized uncertainty relations
Gouba, Laure
2015-07-01
We consider the time-dependent bi-coherent states that are essentially the Gazeau-Klauder coherent states for the two dimensional noncommutative harmonic oscillator. Starting from some q-deformations of the oscillator algebra for which the entire deformed Fock space can be constructed explicitly, we define the q-deformed bi-coherent states. We verify the generalized Heisenberg's uncertainty relations projected onto these states. For the initial value in time, the states are shown to satisfy a generalized version of Heisenberg's uncertainty relations. For the initial value in time and for the parameter of noncommutativity θ = 0, the inequalities are saturated for the simultaneous measurement of the position-momentum observables. When the time evolves, the uncertainty products are different from their values at the initial time and do not always respect the generalized uncertainty relations.
Ahn, Kuk-Hyun; Merwade, Venkatesh; Ojha, C. S. P.; Palmer, Richard N.
2016-11-01
In spite of recent popularity for investigating human-induced climate change in regional areas, understanding the contributors to the relative uncertainties in the process remains unclear. To remedy this, this study presents a statistical framework to quantify relative uncertainties in a detection and attribution study. Primary uncertainty contributors are categorized into three types: climate data, hydrologic, and detection uncertainties. While an ensemble of climate models is used to define climate data uncertainty, hydrologic uncertainty is defined using a Bayesian approach. Before relative uncertainties in the detection and attribution study are quantified, an optimal fingerprint-based detection and attribution analysis is employed to investigate changes in winter streamflow in the Connecticut River Basin, which is located in the Eastern United States. Results indicate that winter streamflow over a period of 64 years (1950-2013) lies outside the range expected from natural variability of climate alone with a 90% confidence interval in the climate models. Investigation of relative uncertainties shows that the uncertainty linked to the climate data is greater than the uncertainty induced by hydrologic modeling. Detection uncertainty, defined as the uncertainty related to time evolution of the anthropogenic climate change in the historical data (signal) above the natural internal climate variability (noise), shows that uncertainties in natural internal climate variability (piControl) scenarios may be the source of the significant degree of uncertainty in the regional Detection and Attribution study.
Position-momentum uncertainty relations in the presence of quantum memory
Furrer, Fabian, E-mail: furrer@eve.phys.s.u-tokyo.ac.jp [Department of Physics, Graduate School of Science, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Berta, Mario [Institute for Quantum Information and Matter, Caltech, Pasadena, California 91125 (United States); Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich (Switzerland); Tomamichel, Marco [School of Physics, The University of Sydney, Sydney 2006 (Australia); Centre for Quantum Technologies, National University of Singapore, Singapore 117543 (Singapore); Scholz, Volkher B. [Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich (Switzerland); Christandl, Matthias [Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich (Switzerland); Department of Mathematical Sciences, University of Copenhagen, Universitetsparken 5, 2100 Copenhagen (Denmark)
2014-12-15
A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes not including Heisenberg’s original setting of position and momentum observables. Here, we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.
Le, Yen-Chi; Aune, Krystyna S.
2012-01-01
This study examined the relationship between relational uncertainty and perceptions of division of household labor (DHL) in cohabiting and married couples. Specifically, research questions explored perceived fairness in DHL and relational uncertainty, perceptual convergence of contributions, convergence of perceptions and relational uncertainty, and convergence of perceptions and relationship satisfaction. A behavioral methodology called the Household Portrait Technique was employed to examine how couples discuss how they decide who does what in the household. A total of 33 couples independently completed a self-report instrument and jointly participated in the Household Portrait activity. Results showed that husbands and wives agreed in their perceptions of fairness. Couples agreed that husbands do more of the outdoor work and automobile maintenance whereas wives do more of the childcare. Convergent perceptions regarding DHL were positively associated with relational certainty and marginally associated with relationship satisfaction. PMID:25083172
Krystyna S. Aune
2011-12-01
Full Text Available This study examined the relationship between relational uncertainty and perceptions of division of household labor (DHL) in cohabiting and married couples. Specifically, research questions explored perceived fairness in DHL and relational uncertainty, perceptual convergence of contributions, convergence of perceptions and relational uncertainty, and convergence of perceptions and relationship satisfaction. A behavioral methodology called the Household Portrait Technique was employed to examine how couples discuss how they decide who does what in the household. A total of 33 couples independently completed a self-report instrument and jointly participated in the Household Portrait activity. Results showed that husbands and wives agreed in their perceptions of fairness. Couples agreed that husbands do more of the outdoor work and automobile maintenance whereas wives do more of the childcare. Convergent perceptions regarding DHL were positively associated with relational certainty and marginally associated with relationship satisfaction.
Kang, Namgoo; Jung, Min-Ho; Jeong, Hyun-Cheol; Lee, Yung-Seop
2015-06-01
The general sample standard deviation and Monte Carlo methods are frequently used to estimate confidence intervals for uncertainties in greenhouse gas emissions, based on the critical assumption that a given data set follows a normal (Gaussian) or statistically known probability distribution. However, uncertainties estimated using those methods are severely limited in practical applications where it is challenging to assume the probability distribution of a data set or where the real data distribution appears to deviate significantly from statistically known probability distribution models. In order to solve these issues, encountered especially in the reasonable estimation of uncertainty about the average of greenhouse gas emissions, we present two statistical methods, the pooled standard deviation method (PSDM) and the standardized-t bootstrap method (STBM), based upon statistical theory. We also report interesting results on the uncertainties about the average of a data set of methane (CH4) emissions from rice cultivation under four different irrigation conditions in Korea, measured by gas sampling and subsequent gas analysis. Results from the applications of the PSDM and the STBM to these rice cultivation methane emission data sets clearly demonstrate that the uncertainties estimated by the PSDM were significantly smaller than those by the STBM. We found that the PSDM should be adopted in the many cases where the data probability distribution appears to follow an assumed normal distribution, with both spatial and temporal variations taken into account. The STBM, however, is a more appropriate method, widely applicable to practical situations where it is realistically impossible with the given data set to reasonably assume or determine a probability distribution model, the data showing evidence of fairly asymmetric distribution and severely deviating from known probability distribution models.
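The two methods named in this abstract can be sketched in stdlib Python. This is a toy illustration under invented data: the replicate values, group structure, and resample count are all hypothetical, and the paper's actual computations are more involved.

```python
import math
import random
import statistics as st

random.seed(7)

def pooled_sd(groups):
    """Pooled standard deviation across groups (the PSDM idea):
    sqrt( sum_i (n_i - 1) s_i^2 / sum_i (n_i - 1) )."""
    num = sum((len(g) - 1) * st.variance(g) for g in groups)
    den = sum(len(g) - 1 for g in groups)
    return math.sqrt(num / den)

def bootstrap_t_ci(data, b=2000, alpha=0.05):
    """Standardized-t bootstrap CI for the mean (the STBM idea), which avoids
    assuming a known probability distribution for the data."""
    n, m, s = len(data), st.mean(data), st.stdev(data)
    t_stars = []
    for _ in range(b):
        r = random.choices(data, k=n)
        rs = st.stdev(r)
        if rs == 0:              # degenerate resample: skip it
            continue
        t_stars.append((st.mean(r) - m) / (rs / math.sqrt(n)))
    t_stars.sort()
    lo = t_stars[int((alpha / 2) * len(t_stars))]
    hi = t_stars[int((1 - alpha / 2) * len(t_stars))]
    # note the quantile swap: the upper bound uses the lower t* quantile
    return m - hi * s / math.sqrt(n), m - lo * s / math.sqrt(n)

# Illustrative CH4-emission replicates (arbitrary units) for two plots.
plot_a = [4.1, 3.8, 4.4, 4.0, 3.9]
plot_b = [5.0, 4.7, 5.3, 4.9]
s_pooled = pooled_sd([plot_a, plot_b])
ci = bootstrap_t_ci(plot_a + plot_b)
```

Unlike the percentile bootstrap, the standardized-t variant studentizes each resample, which makes the interval adapt to skewed data, the situation for which the abstract recommends the STBM.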
Madaniyazi, Lina; Guo, Yuming; Yu, Weiwei; Tong, Shilu
2015-02-01
Climate change may affect mortality associated with air pollutants, especially fine particulate matter (PM2.5) and ozone (O3). Projection studies of this kind involve complicated modelling approaches with uncertainties. We conducted a systematic review of research and methods for projecting future PM2.5-/O3-related mortality to identify the uncertainties and optimal approaches for handling uncertainty. A literature search was conducted in October 2013, using the electronic databases: PubMed, Scopus, ScienceDirect, ProQuest, and Web of Science. The search was limited to peer-reviewed journal articles published in English from January 1980 to September 2013. Fifteen studies fulfilled the inclusion criteria. Most studies reported that an increase in climate-change-induced PM2.5 and O3 may result in an increase in mortality. However, little research has been conducted in developing countries with high emissions and dense populations. Additionally, health effects induced by PM2.5 may dominate compared to those caused by O3, but projection studies of PM2.5-related mortality are fewer than those of O3-related mortality. There is considerable variation in the approaches of scenario-based projection research, which makes it difficult to compare results. Multiple scenarios, models and downscaling methods have been used to reduce uncertainties. However, few studies have discussed what the main source of uncertainties is and which uncertainty could be most effectively reduced. Projecting air pollution-related mortality requires a systematic consideration of assumptions and uncertainties, which will significantly aid policymakers in efforts to manage potential impacts of PM2.5 and O3 on mortality in the context of climate change. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
D. Papale
2006-01-01
Full Text Available The eddy covariance technique for measuring CO2, water and energy fluxes between the biosphere and the atmosphere is widespread and used in various regional networks. Currently more than 250 eddy covariance sites are active around the world, measuring carbon exchange at high temporal resolution for different biomes and climatic conditions. In this paper a new standardized set of corrections is introduced, and the uncertainties associated with these corrections are assessed for eight different forest sites in Europe with a total of 12 yearly datasets. The uncertainties introduced on the two components GPP (Gross Primary Production) and TER (Terrestrial Ecosystem Respiration) are also discussed and a quantitative analysis presented. Through a factorial analysis we find that, generally, uncertainties from different corrections are additive without interactions, and that the heuristic u*-correction introduces the largest uncertainty. The results show that standardized data processing is needed for an effective comparison across biomes and for underpinning inter-annual variability. The methodology presented in this paper has also been integrated in the European database of eddy covariance measurements.
Standards for Educational Public Relations and Communications Professionals.
Chappelow, Marsha A.
2003-01-01
Describes National School Public Relations Association standards for school public relations and communications professionals and program. Includes reactions and comments about new Association standards from seven superintendents and four school public-relations professionals. (PKP)
Mentoring Support and Relational Uncertainty in the Advisor-Advisee Relationship
Mansson, Daniel H.; Myers, Scott A.
2013-01-01
We examine the extent to which career mentoring and psychosocial mentoring received from their advisors relates to advisee perceptions of advisor-advisee relational uncertainty. Doctoral students (N = 378) completed the "Academic Mentoring Behaviors Scale" (Schrodt, Cawyer, & Sanders, 2003), the "Mentoring and Communication…
Ando, Amy W; Mallory, Mindy L
2012-04-24
Climate change is likely to alter the spatial distributions of species and habitat types but the nature of such change is uncertain. Thus, climate change makes it difficult to implement standard conservation planning paradigms. Previous work has suggested some approaches to cope with such uncertainty but has not harnessed all of the benefits of risk diversification. We adapt Modern Portfolio Theory (MPT) to optimal spatial targeting of conservation activity, using wetland habitat conservation in the Prairie Pothole Region (PPR) as an example. This approach finds the allocations of conservation activity among subregions of the planning area that maximize the expected conservation returns for a given level of uncertainty or minimize uncertainty for a given expected level of returns. We find that using MPT instead of simple diversification in the PPR can achieve a value of the conservation objective per dollar spent that is 15% higher for the same level of risk. MPT-based portfolios can also have 21% less uncertainty over benefits or 6% greater expected benefits than the current portfolio of PPR conservation. Total benefits from conservation investment are higher if returns are defined in terms of benefit-cost ratios rather than benefits alone. MPT-guided diversification can work to reduce the climate-change-induced uncertainty of future ecosystem-service benefits from many land policy and investment initiatives, especially when outcomes are negatively correlated between subregions of a planning area.
Ando, Amy W.; Mallory, Mindy L.
2012-01-01
Climate change is likely to alter the spatial distributions of species and habitat types but the nature of such change is uncertain. Thus, climate change makes it difficult to implement standard conservation planning paradigms. Previous work has suggested some approaches to cope with such uncertainty but has not harnessed all of the benefits of risk diversification. We adapt Modern Portfolio Theory (MPT) to optimal spatial targeting of conservation activity, using wetland habitat conservation in the Prairie Pothole Region (PPR) as an example. This approach finds the allocations of conservation activity among subregions of the planning area that maximize the expected conservation returns for a given level of uncertainty or minimize uncertainty for a given expected level of returns. We find that using MPT instead of simple diversification in the PPR can achieve a value of the conservation objective per dollar spent that is 15% higher for the same level of risk. MPT-based portfolios can also have 21% less uncertainty over benefits or 6% greater expected benefits than the current portfolio of PPR conservation. Total benefits from conservation investment are higher if returns are defined in terms of benefit–cost ratios rather than benefits alone. MPT-guided diversification can work to reduce the climate-change–induced uncertainty of future ecosystem-service benefits from many land policy and investment initiatives, especially when outcomes are negatively correlated between subregions of a planning area. PMID:22451914
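The mean-variance idea adapted in this work can be sketched with a toy two-subregion portfolio. All numbers below are invented (they are not the PPR estimates): the point is only that with negatively correlated returns, an intermediate allocation has lower uncertainty than putting everything in the single least-risky subregion.

```python
import math

mu = (1.0, 0.8)   # expected conservation return per dollar, by subregion (illustrative)
sd = (0.30, 0.20) # uncertainty (std dev) of those returns (illustrative)
rho = -0.5        # negative correlation between subregions: diversification helps most

def portfolio(w):
    """Expected return and std dev for weight w on subregion 1, (1-w) on subregion 2."""
    m = w * mu[0] + (1 - w) * mu[1]
    var = (w * sd[0]) ** 2 + ((1 - w) * sd[1]) ** 2 \
          + 2 * w * (1 - w) * rho * sd[0] * sd[1]
    return m, math.sqrt(var)

# Scan allocation weights and keep the minimum-risk portfolio.
best_w, best_sd = min(
    ((w / 100, portfolio(w / 100)[1]) for w in range(101)),
    key=lambda t: t[1],
)
```

The full MPT approach instead traces the whole frontier (maximum expected return for each risk level), but the minimum-variance point already shows the diversification benefit: `best_sd` is below both individual subregion uncertainties.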
Uncertainties Related to Extreme Event Statistics of Sewer System Surcharge and Overflow
Schaarup-Jensen, Kjeld; Johansen, C.; Thorndahl, Søren Liedtke
2005-01-01
by performing long term simulations - using a sewer flow simulation model - and draw up extreme event statistics from the model simulations. In this context it is important to realize that uncertainties related to the input parameters of rainfall runoff models will give rise to uncertainties related...... to draw up extreme event statistics covering return periods of as much as 33 years. By comparing these two different extreme event statistics it is evident that these to a great extent depend on the uncertainties related to the input parameters of the rainfall runoff model....... proceeding in an acceptable manner, if flooding of these levels is having an average return period bigger than a predefined value. This practice is also often used in functional analysis of existing sewer systems. If a sewer system can fulfil recommended flooding frequencies or not, can only be verified...
Uncertainty Relations and Sparse Signal Recovery for Pairs of General Signal Sets
Kuppinger, Patrick; Bölcskei, Helmut
2011-01-01
We present an uncertainty relation for the representation of signals in two different general (possibly redundant or incomplete) signal sets. This uncertainty relation is relevant for the analysis of signals containing two distinct features each of which can be described sparsely in a suitable general signal set. Furthermore, the new uncertainty relation is shown to lead to improved sparsity thresholds for recovery of signals that are sparse in general dictionaries. Specifically, our results improve on the well-known $(1+1/d)/2$-threshold for dictionaries with coherence $d$ by up to a factor of two. Furthermore, we provide probabilistic recovery guarantees for pairs of general dictionaries that also allow us to understand which parts of a general dictionary one needs to randomize over to "weed out" the sparsity patterns that prohibit breaking the square-root bottleneck.
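For orientation, the classical coherence-based threshold the paper improves on is easy to state in code; the improved pair-of-dictionaries thresholds themselves depend on the structure of the pair and are not reproduced here:

```python
def classical_sparsity_threshold(d):
    """Classical (Donoho-Elad) threshold for a dictionary with coherence d:
    any representation using fewer than (1 + 1/d)/2 atoms is the unique
    sparsest one and is recovered by standard sparse-recovery algorithms."""
    return 0.5 * (1.0 + 1.0 / d)

# For coherence d = 0.1 the classical bound allows sparsity < 5.5;
# the paper's improved thresholds for pairs of general dictionaries
# raise such bounds by up to a factor of two.
print(classical_sparsity_threshold(0.1))  # -> 5.5
```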
Smith, Brennan T [ORNL
2015-01-01
Turbine discharges at low-head short converging intakes are difficult to measure accurately. The proximity of the measurement section to the intake entrance admits large uncertainties related to asymmetry of the velocity profile, swirl, and turbulence. Existing turbine performance codes [10, 24] do not address this special case, and the published literature is largely silent on rigorous evaluation of uncertainties associated with this measurement context. The American Society of Mechanical Engineers (ASME) Committee investigated the use of acoustic transit time (ATT), acoustic scintillation (AS), and current meter (CM) methods in a short converging intake at the Kootenay Canal Generating Station in 2009. Based on those findings, a standardized uncertainty analysis (UA) framework for the velocity-area method (specifically for CM measurements) is presented in this paper, since CM remains the most fundamental and common type of measurement system. Typical sources of systematic and random errors associated with CM measurements are investigated, and the major sources of uncertainty, associated with turbulence and velocity fluctuations, the numerical velocity integration technique (bi-cubic spline), and the number and placement of current meters, are considered in the evaluation. Since velocity measurements in a short converging intake involve complex, nonlinear and time-varying uncertainties (e.g., Reynolds stresses), simply applying the law of propagation of uncertainty is known to overestimate the measurement variance, while the Monte Carlo method does not. Therefore, a pseudo-Monte Carlo simulation method (the random flow generation technique [8]), initially developed for establishing upstream or initial conditions in Large-Eddy Simulation (LES) and Direct Numerical Simulation (DNS), is used to statistically determine the uncertainties associated with turbulence and velocity fluctuations. This technique is then
Time-dependent q-deformed coherent states for generalized uncertainty relations
Dey, Sanjib; Gouba, Laure; Castro, Paulo G
2012-01-01
We investigate properties of generalized time-dependent q-deformed coherent states for a noncommutative harmonic oscillator. The states are shown to satisfy a generalized version of Heisenberg's uncertainty relations. At the initial time the states are demonstrated to be squeezed, i.e. the inequalities are saturated, whereas as time evolves the uncertainty product oscillates away from this value, albeit still respecting the relations. For the canonical variables on a noncommutative space we verify explicitly that Ehrenfest's theorem holds at all times. We conjecture that the model exhibits revival times to infinite order. Explicit sample computations for the fractional revival times and superrevival times are presented.
One-parameter class of uncertainty relations based on entropy power
Jizba, Petr; Ma, Yue; Hayes, Anthony; Dunningham, Jacob A.
2016-06-01
We use the concept of entropy power to derive a one-parameter class of information-theoretic uncertainty relations for pairs of conjugate observables in an infinite-dimensional Hilbert space. This class constitutes an infinite tower of higher-order statistics uncertainty relations, which allows one in principle to determine the shape of the underlying information-distribution function by measuring the relevant entropy powers. We illustrate the capability of this class by discussing two examples: superpositions of vacuum and squeezed states and the Cauchy-type heavy-tailed wave function.
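For orientation, the Shannon-entropy member of such relations can be written via the entropy power; this is the standard form equivalent to the Białynicki-Birula–Mycielski inequality, not necessarily the paper's one-parameter generalization:

```latex
% Entropy power of a random variable X with differential entropy h(X):
N(X) \;=\; \frac{1}{2\pi e}\, e^{2 h(X)}
% Entropy-power uncertainty relation for a conjugate pair (X, P):
N(X)\, N(P) \;\ge\; \frac{\hbar^{2}}{4}
% Since N(X) \le \sigma_X^{2}, this implies the Heisenberg--Kennard bound:
\sigma_X\, \sigma_P \;\ge\; \frac{\hbar}{2}
```

The higher-order members of the class replace the Shannon entropies with Rényi-type entropy powers, which is what lets the tower of relations constrain the shape of the underlying distribution.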
LI Zuoyong; PENG Lihong
2004-01-01
This paper analyses the intrinsic relationship between the learning ability and the generalization ability of a BP network, together with other influencing factors, when overfitting occurs, and introduces the multiple correlation coefficient to describe the complexity of samples. Following the calculation uncertainty principle and the minimum principle of neural network structural design, it draws an analogy with the general uncertainty relation in the information transfer process and establishes an uncertainty relation between the training relative error of the training sample set, which reflects the network's learning ability, and the test relative error of the test sample set, which represents the network's generalization ability. Through simulations of BP network overfitting in numerical modeling tests with different types of functions, the overfit parameter q in this relation is found to generally span 7×10⁻³ to 7×10⁻². The uncertainty relation then yields a formula for the number of hidden nodes of a network with good generalization ability, under the conditions that the multiple correlation coefficient is used to describe sample complexity and a given approximation error requirement is satisfied; the rationality of this formula is verified. The paper also points out that applying the BP network to the training process of the given sample set is the best method of stopping training to improve generalization ability.
Complementarity and the Nature of Uncertainty Relations in Einstein–Bohr Recoiling Slit Experiment
Shogo Tanimura
2015-07-01
A model of the Einstein–Bohr recoiling slit experiment is formulated in a fully quantum theoretical setting. In this model, the state and dynamics of a movable wall that has two slits in it, as well as the state of a particle incoming to the two slits, are described by quantum mechanics. Using this model, we analyzed complementarity between exhibiting an interference pattern and distinguishing the particle path. Comparing the Kennard–Robertson-type and the Ozawa-type uncertainty relations, we conclude that the uncertainty relation involved in the double-slit experiment is not the Ozawa-type uncertainty relation but the Kennard-type uncertainty relation of the position and the momentum of the double-slit wall. A possible experiment to test the complementarity relation is suggested. It is also argued that various phenomena which occur at the interface of a quantum system and a classical system, including distinguishability, interference, decoherence, the quantum eraser, and weak values, can be understood as aspects of entanglement. Quanta 2015; 4: 1–9.
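The Kennard relation σₓσₚ ≥ ħ/2 singled out here can be checked numerically for a Gaussian wavepacket, which saturates it. This is a generic illustration (units with ħ = 1, arbitrary packet width), not the paper's slit model:

```python
import numpy as np

hbar = 1.0
s = 0.7                                       # position spread of the packet
x = np.linspace(-20, 20, 4001)
dx = x[1] - x[0]
psi = (2 * np.pi * s**2) ** -0.25 * np.exp(-x**2 / (4 * s**2))

var_x = np.sum(x**2 * np.abs(psi)**2) * dx    # <x^2>; <x> = 0 by symmetry

# Momentum-space amplitude via FFT (only the magnitude matters here)
p = 2 * np.pi * hbar * np.fft.fftshift(np.fft.fftfreq(len(x), d=dx))
dp = p[1] - p[0]
phi = np.fft.fftshift(np.fft.fft(psi))
phi /= np.sqrt(np.sum(np.abs(phi)**2) * dp)   # renormalize numerically
var_p = np.sum(p**2 * np.abs(phi)**2) * dp

print(np.sqrt(var_x * var_p) / (hbar / 2))    # -> 1.0: bound saturated
```

Any non-Gaussian packet substituted for `psi` yields a ratio strictly above 1, which is the content of the Kennard bound.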
Bird-landscape relations in the Chihuahuan Desert: Coping with uncertainties about predictive models
Gutzwiller, K.J.; Barrow, W.C.
2001-01-01
During the springs of 1995-1997, we studied birds and landscapes in the Chihuahuan Desert along part of the Texas-Mexico border. Our objectives were to assess bird-landscape relations and their interannual consistency and to identify ways to cope with associated uncertainties that undermine confidence in using such relations in conservation decision processes. Bird distributions were often significantly associated with landscape features, and many bird-landscape models were valid and useful for predictive purposes. Differences in early spring rainfall appeared to influence bird abundance, but there was no evidence that annual differences in bird abundance affected model consistency. Model consistency for richness (42%) was higher than mean model consistency for 26 focal species (mean 30%, range 0-67%), suggesting that relations involving individual species are, on average, more subject to factors that cause variation than are richness-landscape relations. Consistency of bird-landscape relations may be influenced by such factors as plant succession, exotic species invasion, bird species' tolerances for environmental variation, habitat occupancy patterns, and variation in food density or weather. The low model consistency that we observed for most species indicates the high variation in bird-landscape relations that managers and other decision makers may encounter. The uncertainty of interannual variation in bird-landscape relations can be reduced by using projections of bird distributions from different annual models to determine the likely range of temporal and spatial variation in a species' distribution. Stochastic simulation models can be used to incorporate the uncertainty of random environmental variation into predictions of bird distributions based on bird-landscape relations and to provide probabilistic projections with which managers can weigh the costs and benefits of various decisions. Uncertainty about the true structure of bird-landscape relations
Event-by-event simulation of single-neutron experiments to test uncertainty relations
De Raedt, H.; Michielsen, K.
2014-01-01
Results from a discrete-event simulation of a recent single-neutron experiment that tests Ozawa's generalization of Heisenberg's uncertainty relation are presented. The event-based simulation algorithm reproduces the results of the quantum theoretical description of the experiment but does not requi
Experimental test of error-disturbance uncertainty relations by weak measurement.
Kaneda, Fumihiro; Baek, So-Young; Ozawa, Masanao; Edamatsu, Keiichi
2014-01-17
We experimentally test the error-disturbance uncertainty relation (EDR) in generalized, strength-variable measurement of a single photon polarization qubit, making use of weak measurement that keeps the initial signal state practically unchanged. We demonstrate that the Heisenberg EDR is violated, yet the Ozawa and Branciard EDRs are valid throughout the range of our measurement strength.
Damage functions for climate-related hazards: unification and uncertainty analysis
Prahl, Boris F.; Rybski, Diego; Boettle, Markus; Kropp, Jürgen P.
2016-05-01
Most climate change impacts manifest in the form of natural hazards. Damage assessment typically relies on damage functions that translate the magnitude of extreme events to a quantifiable damage. In practice, the availability of damage functions is limited due to a lack of data sources and a lack of understanding of damage processes. The study of the characteristics of damage functions for different hazards could strengthen the theoretical foundation of damage functions and support their development and validation. Accordingly, we investigate analogies of damage functions for coastal flooding and for wind storms and identify a unified approach. This approach has general applicability for granular portfolios and may also be applied, for example, to heat-related mortality. Moreover, the unification enables the transfer of methodology between hazards and a consistent treatment of uncertainty. This is demonstrated by a sensitivity analysis on the basis of two simple case studies (for coastal flood and storm damage). The analysis reveals the relevance of the various uncertainty sources at varying hazard magnitude and on both the microscale and the macroscale level. Main findings are the dominance of uncertainty from the hazard magnitude and the persistent behaviour of intrinsic uncertainties on both scale levels. Our results shed light on the general role of uncertainties and provide useful insight for the application of the unified approach.
Liang Xue
2017-02-01
Reservoir simulations always involve a large number of parameters to characterize the properties of formation and fluid, many of which are subject to uncertainties owing to spatial heterogeneity and insufficient measurements. To provide solutions to uncertainty-related issues in reservoir simulations, a general package called GenPack has been developed. GenPack includes three main functions required for full stochastic analysis in petroleum engineering: generation of random parameter fields, predictive uncertainty quantification and automatic history matching. GenPack, which was developed in a modularized manner, is a non-intrusive package that can be integrated with any existing commercial simulator in petroleum engineering to facilitate its application. Computational efficiency can be improved both theoretically, by introducing a surrogate-model-based probabilistic collocation method, and technically, by using parallel computing. A series of synthetic cases is designed to demonstrate the capability of GenPack. The test results show that random parameter fields can be flexibly generated in a customized manner for petroleum engineering applications. The predictive uncertainty can be reasonably quantified and the computational efficiency is significantly improved. The ensemble Kalman filter (EnKF)-based automatic history matching method can improve predictive accuracy and reduce the corresponding predictive uncertainty by accounting for observations.
Palenčár, Rudolf; Sopkuliak, Peter; Palenčár, Jakub; Ďuriš, Stanislav; Suroviak, Emil; Halaj, Martin
2017-06-01
Evaluation of uncertainties of the temperature measurement by standard platinum resistance thermometer calibrated at the defining fixed points according to ITS-90 is a problem that can be solved in different ways. The paper presents a procedure based on the propagation of distributions using the Monte Carlo method. The procedure employs generation of pseudo-random numbers for the input variables of resistances at the defining fixed points, supposing the multivariate Gaussian distribution for input quantities. This allows taking into account the correlations among resistances at the defining fixed points. Assumption of Gaussian probability density function is acceptable, with respect to the several sources of uncertainties of resistances. In the case of uncorrelated resistances at the defining fixed points, the method is applicable to any probability density function. Validation of the law of propagation of uncertainty using the Monte Carlo method is presented on the example of specific data for 25 Ω standard platinum resistance thermometer in the temperature range from 0 to 660 °C. Using this example, we demonstrate suitability of the method by validation of its results.
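The propagation-of-distributions step described here, drawing correlated multivariate Gaussian resistances at the fixed points and propagating them through the measurement model, can be sketched as follows. The resistances, uncertainties and correlation below are illustrative placeholders, not calibration data for a real 25 Ω SPRT, and the propagated quantity is simplified to a resistance ratio:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical SPRT resistances (ohm) at two defining fixed points, with
# correlated calibration uncertainties (multivariate Gaussian assumption).
mean = np.array([25.000, 27.500])
cov = np.array([[1e-8, 6e-9],
                [6e-9, 1e-8]])          # off-diagonal terms = correlation

draws = rng.multivariate_normal(mean, cov, size=200_000)

# Propagate through the measurement model; here simply the resistance
# ratio W = R(T) / R(fixed point) used in ITS-90 interpolation.
W = draws[:, 1] / draws[:, 0]
print(W.mean(), W.std())                # Monte Carlo estimate and spread
```

Because the correlation enters through `cov`, dropping the off-diagonal terms and re-running directly shows how much the correlation between fixed-point resistances reduces (or inflates) the propagated uncertainty.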
Reactions to the New Standards for School Public Relations Specialists.
Kowalski, Theodore J.
2002-01-01
Reactions by 10 individuals associated with the "Journal of School Public Relations" to new National School Public Relations Association standards for school public relations and communications professionals and programs. Includes general reactions, impact of the standards, possible ambiguity, adding or eliminating standards, and…
Reactions to the New Standards for School Public Relations Specialists.
Kowalski, Theodore J.
2002-01-01
Reactions by 10 individuals associated with the "Journal of School Public Relations" to new National School Public Relations Association standards for school public relations and communications professionals and programs. Includes general reactions, impact of the standards, possible ambiguity, adding or eliminating standards, and influence on…
Farah, J., E-mail: jad.farah@irsn.fr; Clairand, I.; Huet, C. [External Dosimetry Department, Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP-17, 92260 Fontenay-aux-Roses (France); Trianni, A. [Medical Physics Department, Udine University Hospital S. Maria della Misericordia (AOUD), p.le S. Maria della Misericordia, 15, 33100 Udine (Italy); Ciraj-Bjelac, O. [Vinca Institute of Nuclear Sciences (VINCA), P.O. Box 522, 11001 Belgrade (Serbia); De Angelis, C. [Department of Technology and Health, Istituto Superiore di Sanità (ISS), Viale Regina Elena 299, 00161 Rome (Italy); Delle Canne, S. [Fatebenefratelli San Giovanni Calibita Hospital (FBF), UOC Medical Physics - Isola Tiberina, 00186 Rome (Italy); Hadid, L.; Waryn, M. J. [Radiology Department, Hôpital Jean Verdier (HJV), Avenue du 14 Juillet, 93140 Bondy Cedex (France); Jarvinen, H.; Siiskonen, T. [Radiation and Nuclear Safety Authority (STUK), P.O. Box 14, 00881 Helsinki (Finland); Negri, A. [Veneto Institute of Oncology (IOV), Via Gattamelata 64, 35124 Padova (Italy); Novák, L. [National Radiation Protection Institute (NRPI), Bartoškova 28, 140 00 Prague 4 (Czech Republic); Pinto, M. [Istituto Nazionale di Metrologia delle Radiazioni Ionizzanti (ENEA-INMRI), C.R. Casaccia, Via Anguillarese 301, I-00123 Santa Maria di Galeria (RM) (Italy); Knežević, Ž. [Ruđer Bošković Institute (RBI), Bijenička c. 54, 10000 Zagreb (Croatia)
2015-07-15
Purpose: To investigate the optimal use of XR-RV3 GafChromic® films to assess patient skin dose in interventional radiology while addressing the means to reduce uncertainties in dose assessment. Methods: XR-type R GafChromic films have been shown to represent the most efficient and suitable solution to determine patient skin dose in interventional procedures. As film dosimetry can be associated with high uncertainty, this paper presents the EURADOS WG 12 initiative to carry out a comprehensive study of film characteristics with a multisite approach. The considered sources of uncertainty include scanner-, film-, and fitting-related errors. The work focused on studying film behavior with clinical high-dose-rate pulsed beams (previously unavailable in the literature) together with reference standard laboratory beams. Results: First, the performance analysis of six different scanner models has shown that scan uniformity perpendicular to the lamp motion axis and long-term stability are the main sources of scanner-related uncertainty. These could induce errors of up to 7% on the film readings unless regularly checked and corrected. Typically, scan uniformity correction matrices and reading normalization to the scanner-specific and daily background reading should be applied. In addition, the analysis of multiple film batches has shown that XR-RV3 films have generally good uniformity within one batch (<1.5%), require 24 h to stabilize after irradiation, and that their response is roughly independent of dose rate (<5%). However, XR-RV3 films showed large variations (up to 15%) with radiation quality both in standard laboratory and in clinical conditions. As such, and prior to conducting patient skin dose measurements, it is mandatory to choose the appropriate calibration beam quality depending on the characteristics of the x-ray systems that will be used clinically. In addition, yellow side film irradiations should be preferentially used since they showed a lower
Li, B.; Lee, H. C.; Duan, X.; Shen, C.; Zhou, L.; Jia, X.; Yang, M.
2017-09-01
The dual-energy CT-based (DECT) approach holds promise in reducing the overall uncertainty in proton stopping-power-ratio (SPR) estimation as compared to the conventional stoichiometric calibration approach. The objective of this study was to analyze the factors contributing to uncertainty in SPR estimation using the DECT-based approach and to derive a comprehensive estimate of the range uncertainty associated with SPR estimation in treatment planning. Two state-of-the-art DECT-based methods were selected and implemented on a Siemens SOMATOM Force DECT scanner. The uncertainties were first divided into five independent categories. The uncertainty associated with each category was estimated for lung, soft and bone tissues separately. A single composite uncertainty estimate was eventually determined for three tumor sites (lung, prostate and head-and-neck) by weighting the relative proportion of each tissue group for that specific site. The uncertainties associated with the two selected DECT methods were found to be similar, therefore the following results applied to both methods. The overall uncertainty (1σ) in SPR estimation with the DECT-based approach was estimated to be 3.8%, 1.2% and 2.0% for lung, soft and bone tissues, respectively. The dominant factor contributing to uncertainty in the DECT approach was the imaging uncertainties, followed by the DECT modeling uncertainties. Our study showed that the DECT approach can reduce the overall range uncertainty to approximately 2.2% (2σ) in clinical scenarios, in contrast to the previously reported 1%.
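The final step described in this abstract, combining per-tissue SPR uncertainties into a single site-specific estimate, can be sketched as below. The per-tissue 1σ values are taken from the abstract; the tissue fractions and the root-sum-square combination rule (which assumes independence) are illustrative assumptions, and the paper's exact weighting may differ:

```python
import math

# 1-sigma SPR estimation uncertainties per tissue group (from the abstract)
sigma = {"lung": 0.038, "soft": 0.012, "bone": 0.020}

# Hypothetical relative proportions of each tissue group along a beam
# path for some tumor site (must sum to 1).
frac = {"lung": 0.4, "soft": 0.5, "bone": 0.1}

# Root-sum-square combination of the weighted contributions,
# assuming the per-tissue errors are independent.
composite = math.sqrt(sum((frac[t] * sigma[t]) ** 2 for t in sigma))
print(f"{composite:.4f}")  # -> 0.0165 (1-sigma composite for this mix)
```

Re-weighting `frac` toward soft tissue (as for a prostate site) immediately shows why the composite uncertainty is lowest there.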
Parameters-related uncertainty in modeling sugar cane yield with an agro-Land Surface Model
Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Ruget, F.; Gabrielle, B.
2012-12-01
Agro-Land Surface Models (agro-LSM) have been developed from the coupling of specific crop models and large-scale generic vegetation models. They aim at accounting for the spatial distribution and variability of energy, water and carbon fluxes within soil-vegetation-atmosphere continuum with a particular emphasis on how crop phenology and agricultural management practice influence the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty in these models is related to the many parameters included in the models' equations. In this study, we quantify the parameter-based uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS on a multi-regional approach with data from sites in Australia, La Reunion and Brazil. First, the main source of uncertainty for the output variables NPP, GPP, and sensible heat flux (SH) is determined through a screening of the main parameters of the model on a multi-site basis leading to the selection of a subset of most sensitive parameters causing most of the uncertainty. In a second step, a sensitivity analysis is carried out on the parameters selected from the screening analysis at a regional scale. For this, a Monte-Carlo sampling method associated with the calculation of Partial Ranked Correlation Coefficients is used. First, we quantify the sensitivity of the output variables to individual input parameters on a regional scale for two regions of intensive sugar cane cultivation in Australia and Brazil. Then, we quantify the overall uncertainty in the simulation's outputs propagated from the uncertainty in the input parameters. Seven parameters are identified by the screening procedure as driving most of the uncertainty in the agro-LSM ORCHIDEE-STICS model output at all sites. These parameters control photosynthesis (optimal temperature of photosynthesis, optimal carboxylation rate), radiation interception (extinction coefficient), root
Standard Terminology Relating to Wear and Erosion
American Society for Testing and Materials. Philadelphia
2010-01-01
1.1 The terms and their definitions given herein represent terminology relating to wear and erosion of solid bodies due to mechanical interactions such as occur with cavitation, impingement by liquid jets or drops or by solid particles, or relative motion against contacting solid surfaces or fluids. This scope interfaces with but generally excludes those processes where material loss is wholly or principally due to chemical action and other related technical fields as, for instance, lubrication. 1.2 This terminology is not exhaustive; the absence of any particular term from this collection does not necessarily imply that its use within this scope is discouraged. However, the terms given herein are the recommended terms for the concepts they represent unless otherwise noted. 1.3 Certain general terms and definitions may be restricted and interpreted, if necessary, to make them particularly applicable to the scope as defined herein. 1.4 The purpose of this terminology is to encourage uniformity and accuracy ...
Le, Thinh Phuc; Scarani, Valerio
2011-01-01
We define a family of reference-frame-independent quantum cryptography protocols for arbitrary dimensional signals. The generalized entropic uncertainty relations [M. Tomamichel and R. Renner, Phys. Rev. Lett. 106, 110506 (2011)] are used for the first time to derive security bounds for protocols which use more than two measurements and combine the statistics in a non-linear parameter. This shows the power and versatility of this technique compared to the heavier, though usually tighter, conventional techniques.
The small sample uncertainty aspect in relation to bullwhip effect measurement
Nielsen, Erland Hejn
2009-01-01
a conceptual phenomenon. This paper intends primarily to investigate why this might be so and thereby investigate the various aspects, possibilities and obstacles that must be taken into account, when considering the potential practical use and measure of the bullwhip effect in order to actually get the supply...... chain under control. This paper will put special emphasis on the unavoidable small-sample uncertainty aspects relating to the measurement or estimation of the bullwhip effect. ...
Comment on ''Improved bounds on entropic uncertainty relations''
Bosyk, G. M.; Portesi, M.; Plastino, A.; Zozor, S. [Instituto de Física La Plata (IFLP, CONICET), and Departamento de Física, Facultad de Ciencias Exactas, Universidad Nacional de La Plata, C.C. 67, 1900 La Plata (Argentina); Laboratoire Grenoblois d'Image, Parole, Signal et Automatique (GIPSA-Lab, CNRS), 961 rue de la Houille Blanche, F-38402 Saint Martin d'Hères (France)
2011-11-15
We provide an analytical proof of the entropic uncertainty relations presented by J. I. de Vicente and J. Sanchez-Ruiz [Phys. Rev. A 77, 042110 (2008)] and also show that the replacement of Eq. (27) by Eq. (29) in that reference introduces solutions that do not take fully into account the constraints of the problem, which in turn lead to some mistakes in their treatment.
Standardized Relative Quantification of Immunofluorescence Tissue Staining
sprotocols
2015-01-01
Authors: Oriol Arqués, Irene Chicote, Stephan Tenbaum, Isabel Puig & Héctor G. Palmer ### Abstract The detection of correlations between the expression levels or sub-cellular localization of different proteins with specific characteristics of human tumors, such as e.g. grade of malignancy, may give important hints of functional associations. Here we describe the method we use for relative quantification of immunofluorescence staining of tumor tissue sections, which allows us to co...
Relating Streamflow Depletion to Groundwater Pumpage in the Context of Uncertainty
Brakefield, L. K.; White, J.
2015-12-01
Interaction between groundwater and surface-water systems is an inherently complex process, especially in the context of relating groundwater use to local- scale stream flow characteristics. An integrated hydrologic model (MODFLOW with SWR process) is being developed for the lower San Antonio River Basin in Texas to assess how water use affects the groundwater contribution to stream flow under hypothetical and observed climate scenarios. The basin traverses several dipping aquifer systems and water from these systems is relied on for residential, recreational, industrial, and agricultural uses, as well as oil and gas activities. The current (2015) understanding of interaction between the two hydrologic systems within the basin is limited, but indicates considerable variability in space and time. The modeling analysis is designed to provide improved understanding of spatial and temporal characteristics of interaction under different hydrologic conditions, such as the drought conditions experienced from 2011 through 2013. Due to paucity of data and limited current understanding of interaction between the two hydrologic systems, model results will be inherently uncertain, making uncertainty analysis of paramount importance in the model development and use. As such, linear-based uncertainty analyses, also known as first-order, second-moment analysis, are being used to design the parameterization and objective function prior to the computationally-expensive history-matching process. For this process, we use pyEMU, an open-source python module for linear-based computer model uncertainty analysis. Results show that the prior uncertainty in model inputs yield largely uncertain local-scale interaction estimates. However, by appropriately designing the objective function and parameterization to be aligned with the focus of the modeling analysis, the posterior uncertainty of many local-scale interaction estimates can be reduced.
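The linear-based (first-order, second-moment) uncertainty analysis mentioned here amounts to a Schur-complement update of the prior parameter covariance using a sensitivity (Jacobian) matrix. A generic numpy sketch follows; this is not pyEMU's API, and all matrices are hypothetical:

```python
import numpy as np

# FOSM posterior parameter covariance (linearized Bayes update):
#   Sigma_post = C - C J^T (J C J^T + R)^-1 J C
# J: observation sensitivities to parameters, C: prior parameter
# covariance, R: observation noise covariance.
J = np.array([[1.0, 0.5],
              [0.2, 1.0],
              [0.0, 0.3]])           # 3 observations, 2 parameters
C = np.diag([1.0, 2.0])              # prior parameter covariance
R = 0.1 * np.eye(3)                  # observation noise covariance

S = J @ C @ J.T + R                  # predicted observation covariance
post = C - C @ J.T @ np.linalg.solve(S, J @ C)
print(np.diag(C), np.diag(post))     # posterior variances shrink
```

The same machinery run *before* history matching is what lets the objective function and parameterization be designed around which parameter combinations the data can actually constrain.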
Rami, El-Nabulsi Ahmad [Department of Nuclear and Energy Engineering, Cheju National University, Ara-dong 1, Jeju 690-756 (Korea, Republic of)], E-mail: nabulsiahmadrami@yahoo.fr
2009-10-15
It was shown that the minimal-length Heisenberg-Weyl uncertainty relation may be obtained if the ordinary momentum differentiation operator is extended to its fractional counterpart, namely the generalized fractional Riccati momentum operator of order 0 < β ≤ 1. Some interesting consequences are exposed, in concordance with the UV/IR correspondence obtained within the framework of non-commutative C-space geometry, string theory, Rovelli loop quantum gravity, Amelino-Camelia doubly special relativity, Nottale scale relativity and El-Naschie Cantorian fractal spacetime. The fractional theory integrates an absolute minimal length and, surprisingly, a non-commutative position space.
Standard Penetration Test and Relative Density
1971-02-01
...laboratory tests performed with a small static penetrometer. INTRODUCTION One of the main problems encountered in subsoil exploration is in situ...would be more valid. REFERENCES Burmister, D. M., "The Grading-Density Relation of Granular Materials." Proceedings of the American Society for...Reclamation (1953), "Second Progress Report of Research of the Penetration Resistance Method of Subsurface Exploration," Report No. EM-356, Design and Construction Division, Earth Materials Laboratory, Denver.
Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.
2016-11-01
Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
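The analysis-of-variance decomposition described here, splitting total streamflow-simulation variance into forcing, parameter and interaction contributions, can be sketched on a synthetic ensemble. The effect magnitudes below are invented for illustration; in the study the factors would be actual rainfall ensembles and behavioural parameter sets:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ensemble: n_f rainfall realisations x n_p behavioural
# parameter sets, with additive main effects plus interaction noise.
n_f, n_p = 20, 30
forcing = rng.normal(0, 1.0, n_f)[:, None]     # forcing-data effect
params = rng.normal(0, 0.5, n_p)[None, :]      # parameter-identification effect
inter = rng.normal(0, 0.2, (n_f, n_p))         # interaction
Q = 10 + forcing + params + inter              # simulated flows

grand = Q.mean()
ss_f = n_p * ((Q.mean(axis=1) - grand) ** 2).sum()   # forcing main effect
ss_p = n_f * ((Q.mean(axis=0) - grand) ** 2).sum()   # parameter main effect
ss_tot = ((Q - grand) ** 2).sum()
ss_int = ss_tot - ss_f - ss_p                        # interaction (+ residual)

print(ss_f / ss_tot, ss_p / ss_tot, ss_int / ss_tot)  # variance fractions
```

Repeating the decomposition per month (rather than over the whole series) is what exposes the intra-annual shifts in relative uncertainty that the abstract attributes to the choice of performance criterion.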
Yu, Min; Fang, Mao-Fa
2017-09-01
The dynamic properties of the quantum-memory-assisted entropic uncertainty relation for a system comprised of a qubit to be measured and a memory qubit are investigated. We explore the behaviors of the entropic uncertainty and its lower bound in three different cases: Only one of the two qubits interacts with an external environment and subjects to quantum-jump-based feedback control, or both of the two qubits independently experience their own environments and local quantum-jump-based feedback control. Our results reveal that the quantum-jump-based feedback control with an appropriate feedback parameter can reduce the entropic uncertainty and its lower bound, and for the three different scenarios, the reduction in the uncertainty relates to different physical quantities. Besides, we find out that the quantum-jump-based feedback control not only can remarkably decrease the entropic uncertainty, but also can make the uncertainty reach its lower bound where the dynamical map becomes unital.
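For reference, the quantum-memory-assisted entropic uncertainty relation studied in this kind of analysis is commonly written (following Berta et al.) as:

```latex
S(Q|B) + S(R|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad
c = \max_{i,j}\,\bigl|\langle \psi_i | \phi_j \rangle\bigr|^2
```

where S(·|B) are conditional von Neumann entropies given the memory qubit B, |ψ_i⟩ and |φ_j⟩ are eigenstates of the two measured observables Q and R, and S(A|B) can be negative for entangled states, which lowers the bound.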
van Dongen, Jeroen
2015-01-01
The Einstein-Rupp experiments have been unduly neglected in the history of quantum mechanics. This neglect is usually explained by the fact that Emil Rupp was later exposed as a fraud who had fabricated his results, but it is not justified, given the importance attached to the experiments at the time. This paper discusses Rupp's fraud, the relation between Albert Einstein and Rupp, and the Einstein-Rupp experiments, and argues that these experiments influenced Niels Bohr's development of complementarity and Werner Heisenberg's formulation of the uncertainty relations.
Experimental violation and reformulation of the Heisenberg's error-disturbance uncertainty relation.
Baek, So-Young; Kaneda, Fumihiro; Ozawa, Masanao; Edamatsu, Keiichi
2013-01-01
The uncertainty principle formulated by Heisenberg in 1927 describes a trade-off between the error of a measurement of one observable and the disturbance caused on another complementary observable such that their product should be no less than the limit set by Planck's constant. However, Ozawa in 1988 showed a model of position measurement that breaks Heisenberg's relation and in 2003 revealed an alternative relation for error and disturbance to be proven universally valid. Here, we report an experimental test of Ozawa's relation for a single-photon polarization qubit, exploiting a more general class of quantum measurements than the class of projective measurements. The test is carried out by linear optical devices and realizes an indirect measurement model that breaks Heisenberg's relation throughout the range of our experimental parameter and yet validates Ozawa's relation.
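For reference, Heisenberg's error-disturbance relation and Ozawa's universally valid alternative tested in this experiment read:

```latex
% Heisenberg (violable):
\varepsilon(A)\,\eta(B) \;\ge\; \tfrac{1}{2}\bigl|\langle[A,B]\rangle\bigr|

% Ozawa (universally valid):
\varepsilon(A)\,\eta(B) + \varepsilon(A)\,\sigma(B) + \sigma(A)\,\eta(B)
\;\ge\; \tfrac{1}{2}\bigl|\langle[A,B]\rangle\bigr|
```

Here ε(A) is the error of the A measurement, η(B) the disturbance caused on B, and σ(A), σ(B) the standard deviations in the initial state; the two extra terms in Ozawa's relation are what allow the product ε(A)η(B) alone to fall below the Heisenberg bound.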
Leblanc, T.; Godin-Beekmann, S.; Payen, Godin-Beekmann; Gabarrot, Franck; vanGijsel, Anne; Bandoro, J.; Sica, R.; Trickl, T.
2012-01-01
The international Network for the Detection of Atmospheric Composition Change (NDACC) is a global network of high-quality, remote-sensing research stations for observing and understanding the physical and chemical state of the Earth's atmosphere. As part of NDACC, over 20 ground-based lidar instruments are dedicated to the long-term monitoring of atmospheric composition and to the validation of space-borne measurements of the atmosphere from environmental satellites such as Aura and ENVISAT. One caveat of large networks such as NDACC is the difficulty of archiving measurement and analysis information consistently from one research group (or instrument) to another [1][2][3]. Yet the need for consistent definitions has strengthened as datasets of various origins (e.g., satellite and ground-based) are increasingly used for intercomparisons and validation, and ingested together into global assimilation systems. In the framework of the 2010 Call for Proposals by the International Space Science Institute (ISSI) in Bern, Switzerland, a team of lidar experts was created to address existing issues in three critical aspects of the NDACC lidar ozone and temperature data retrievals: signal filtering and the vertical filtering of the retrieved profiles, the quantification and propagation of the uncertainties, and the consistent definition and reporting of filtering and uncertainties in the NDACC-archived products. Additional experts from the satellite and global data standards communities complement the team to help address issues specific to the latter aspect.
梁丽军; 薛锦锋; 田琳琳; 刘明明; 沈磊
2016-01-01
ABSTRACT: Objective: To evaluate the uncertainty in the determination of methamphetamine by gas chromatography (GC) with the internal standard method. Methods: Each source of uncertainty arising from the testing procedure was analyzed and confirmed according to the guidelines on measurement uncertainty. The measurement uncertainty arising during the experiment was analyzed, including the uncertainty from calculation, measurement repeatability, the standard substance, the internal standard substance, the sample and the gas chromatography. After each uncertainty component was evaluated, the combined standard uncertainty and the expanded uncertainty of the result were calculated. Results: The relative uncertainty from measurement repeatability was 1.4%, that from preparation of the SKF525A internal standard solution was 0.89%, that from preparation of the methamphetamine standard solution was 0.85%, that from gas chromatography was 0.78%, and that from preparation of the methamphetamine sample was 0.081%. The expanded uncertainty was 0.4% when the methamphetamine content was 10.4%. Conclusions: The measurement uncertainty of methamphetamine comes primarily from the measurement repeatability of the sample, preparation of the methamphetamine standard solution, GC, sample preparation and the internal standard. The uncertainty from sample preparation was relatively minimal; the uncertainty from the uniformity of the sample and the peak area ratio can be ignored.
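The combination step described above follows the usual GUM recipe: relative standard uncertainty components are combined in quadrature and then expanded with a coverage factor. A minimal sketch, using the component values quoted in the abstract (the arithmetic here is generic, not the paper's exact budget):

```python
# GUM-style combination: relative standard uncertainty components are
# combined in quadrature, then expanded with coverage factor k = 2 (~95 %).
# Component values are taken from the abstract above.
import math

components = {                        # relative standard uncertainties, %
    "repeatability": 1.4,
    "internal standard solution": 0.89,
    "standard solution": 0.85,
    "gas chromatography": 0.78,
    "sample preparation": 0.081,
}

u_c = math.sqrt(sum(u ** 2 for u in components.values()))  # combined, %
U = 2 * u_c                                                # expanded, k = 2
print(f"u_c = {u_c:.2f} %, U = {U:.2f} %")
```

Note that the quadrature sum of the quoted components is about 2 %, so the expanded (k = 2) relative uncertainty is about 4 %; on a content of 10.4 % that corresponds to roughly 0.4 percentage points, which may be how the reported "0.4 %" should be read.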
Frutiger, Jérôme; Marcarie, Camille; Abildskov, Jens; Sin, Gürkan
2016-11-15
This study presents new group contribution (GC) models for the prediction of Lower and Upper Flammability Limits (LFL and UFL), Flash Point (FP) and Auto Ignition Temperature (AIT) of organic chemicals applying the Marrero/Gani (MG) method. Advanced methods for parameter estimation using robust regression and outlier treatment have been applied to achieve high accuracy. Furthermore, linear error propagation based on covariance matrix of estimated parameters was performed. Therefore, every estimated property value of the flammability-related properties is reported together with its corresponding 95%-confidence interval of the prediction. Compared to existing models the developed ones have a higher accuracy, are simple to apply and provide uncertainty information on the calculated prediction. The average relative error and correlation coefficient are 11.5% and 0.99 for LFL, 15.9% and 0.91 for UFL, 2.0% and 0.99 for FP as well as 6.4% and 0.76 for AIT. Moreover, the temperature-dependence of LFL property was studied. A compound specific proportionality constant (K(LFL)) between LFL and temperature is introduced and an MG GC model to estimate K(LFL) is developed. Overall the ability to predict flammability-related properties including the corresponding uncertainty of the prediction can provide important information for a qualitative and quantitative safety-related risk assessment studies.
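The linear error propagation mentioned above rests on a standard identity: for a prediction y = f(θ) with estimated parameters θ and covariance Σ_θ, Var(y) ≈ J Σ_θ Jᵀ, where J is the gradient of f. A hedged sketch for a linear group-contribution model (the parameters, covariance matrix, and occurrence counts below are invented for illustration, not the paper's values):

```python
# Sketch of linear error propagation for a group-contribution (GC) model
# y = sum_k n_k * theta_k. Since the model is linear in theta, the Jacobian
# J equals the occurrence-count vector n, so Var(y) = n @ Cov(theta) @ n.
import math

def predict_with_ci(counts, theta, cov, z=1.96):
    """Return prediction and its ~95% confidence interval."""
    y = sum(n * t for n, t in zip(counts, theta))
    var = sum(counts[i] * cov[i][j] * counts[j]
              for i in range(len(theta)) for j in range(len(theta)))
    half = z * math.sqrt(var)
    return y, (y - half, y + half)

theta = [1.2, -0.4, 0.9]          # estimated group contributions (made up)
cov = [[0.010, 0.001, 0.000],     # parameter covariance matrix (made up)
       [0.001, 0.020, 0.002],
       [0.000, 0.002, 0.015]]
n = [2, 1, 3]                     # occurrences of each group in the molecule
y, ci = predict_with_ci(n, theta, cov)
```

This is how each predicted flammability property can be reported together with a 95%-confidence interval derived from the parameter covariance.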
Greiter, M B; Denk, J; Hoedlmoser, H
2016-09-01
The individual monitoring service at the Helmholtz Zentrum München has adopted the recommendations of the ISO 4037 and 6980 standards series as base of its dosimetric systems for X-ray, gamma and beta dosimetry. These standards define technical requirements for radiation spectra and measurement processes, but leave flexibility in the implementation of irradiations as well as in the resulting uncertainty in dose or dose rate. This article provides an example for their practical implementation in the Munich IAEA/WHO secondary standard dosimetry laboratory. It focusses on two aspects: automation issues and uncertainties in calibration.
HAN Yi-Wen; LIU Shou-Yu
2005-01-01
The new equation of state density is obtained by utilizing the generalized uncertainty relation. With the help of coordinates and the Wentzel-Kramers-Brillouin approximation, a direct calculation of the scalar field entropy of the nonstatic black hole with an internal global monopole is performed. The entropy obtained from the calculation is proportional to the horizon area. The calculation is free from divergence even without any cutoff, in contrast to the brick-wall method. However, the pertinent result is limited.
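The generalized uncertainty relation invoked in such cutoff-free entropy calculations is commonly written in the following form (the abstract does not give the exact convention used, so this is one frequently quoted version):

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\Bigl(1 + \beta\,(\Delta p)^2\Bigr),
\qquad
\Delta x_{\min} = \hbar\sqrt{\beta}
```

The implied minimal length Δx_min modifies the density of states near the horizon, which is what removes the ultraviolet divergence that the brick-wall method must regulate with a cutoff.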
Radon contents in groundwater and the uncertainty related to risk assessment
Fukui, Masami [Kyoto Univ. (Japan)]
1997-02-01
The United States has proposed 11 Bq/l (300 pCi/l) as the maximum contaminant level (MCL) for radon. Japan has not set up standards for drinking water. Problems concerning the evaluation of the effects of radon on organisms and the MCLs of radon in groundwater and drinking water in 12 countries were reported. Locally high radon concentrations occur, but generally low levels were observed in Nigeria, China and Mexico. Countries with high radon concentrations were Greece, Slovakia, Bornholm Island and Scotland. There are both high- and low-concentration areas in the US and Japan. An uncertainty scheme for assessing the risk from radon exposure is proposed. (S.Y.)
Sabouri, Sarah; Gerber, Markus; Lemola, Sakari; Becker, Stephen P; Shamsi, Mahin; Shakouri, Zeinab; Sadeghi Bahmani, Dena; Kalak, Nadeem; Holsboer-Trachsler, Edith; Brand, Serge
2016-07-01
The Dark Triad (DT) describes a set of three closely related personality traits: Machiavellianism, narcissism, and psychopathy. The aim of this study was to examine the associations between DT traits, sleep disturbances, anxiety sensitivity and intolerance of uncertainty. A total of 341 adults (M = 29 years) completed a series of questionnaires related to the DT traits, sleep disturbances, anxiety sensitivity, and intolerance of uncertainty. A higher DT total score was associated with increased sleep disturbances, and higher scores for anxiety sensitivity and intolerance of uncertainty. In regression analyses Machiavellianism and psychopathy were predictors of sleep disturbances, anxiety sensitivity, and intolerance of uncertainty. Results indicate that specific DT traits, namely Machiavellianism and psychopathy, are associated with sleep disturbances, anxiety sensitivity and intolerance of uncertainty in young adults. Copyright © 2016 Elsevier Inc. All rights reserved.
Yang, Ming; Zhu, X Ronald; Park, Peter C; Titt, Uwe; Mohan, Radhe; Virshup, Gary; Clayton, James E; Dong, Lei
2012-07-07
The purpose of this study was to analyze factors affecting proton stopping-power-ratio (SPR) estimations and range uncertainties in proton therapy planning using the standard stoichiometric calibration. The SPR uncertainties were grouped into five categories according to their origins and then estimated based on previously published reports or measurements. For the first time, the impact of tissue composition variations on SPR estimation was assessed and the uncertainty estimates of each category were determined for low-density (lung), soft, and high-density (bone) tissues. A composite, 95th percentile water-equivalent-thickness uncertainty was calculated from multiple beam directions in 15 patients with various types of cancer undergoing proton therapy. The SPR uncertainties (1σ) were quite different (ranging from 1.6% to 5.0%) in different tissue groups, although the final combined uncertainty (95th percentile) for different treatment sites was fairly consistent at 3.0-3.4%, primarily because soft tissue is the dominant tissue type in the human body. The dominant contributing factor for uncertainties in soft tissues was the degeneracy of Hounsfield numbers in the presence of tissue composition variations. To reduce the overall uncertainties in SPR estimation, the use of dual-energy computed tomography is suggested. The values recommended in this study based on typical treatment sites and a small group of patients roughly agree with the commonly referenced value (3.5%) used for margin design. By using tissue-specific range uncertainties, one could estimate the beam-specific range margin by accounting for different types and amounts of tissues along a beam, which may allow for customization of range uncertainty for each beam direction.
Marcela Brugnach
2008-12-01
Uncertainty of late has become an increasingly important and controversial topic in water resource management, and natural resources management in general. Diverse managing goals, changing environmental conditions, conflicting interests, and lack of predictability are some of the characteristics that decision makers have to face. This has resulted in the application and development of strategies such as adaptive management, which proposes flexibility and capability to adapt to unknown conditions as a way of dealing with uncertainties. However, this shift in ideas about managing has not always been accompanied by a general shift in the way uncertainties are understood and handled. To improve this situation, we believe it is necessary to recontextualize uncertainty in a broader way - relative to its role, meaning, and relationship with participants in decision making - because it is from this understanding that problems and solutions emerge. Under this view, solutions do not exclusively consist of eliminating or reducing uncertainty, but of reframing the problems as such so that they convey a different meaning. To this end, we propose a relational approach to uncertainty analysis. Here, we elaborate on this new conceptualization of uncertainty, and indicate some implications of this view for strategies for dealing with uncertainty in water management. We present an example as an illustration of these concepts.
Experimental Test of Heisenberg's Measurement Uncertainty Relation Based on Statistical Distances
Ma, Wenchao; Ma, Zhihao; Wang, Hengyan; Chen, Zhihua; Liu, Ying; Kong, Fei; Li, Zhaokai; Peng, Xinhua; Shi, Mingjun; Shi, Fazhan; Fei, Shao-Ming; Du, Jiangfeng
2016-04-01
Incompatible observables can be approximated by compatible observables in joint measurement or measured sequentially, with constrained accuracy as implied by Heisenberg's original formulation of the uncertainty principle. Recently, Busch, Lahti, and Werner proposed inaccuracy trade-off relations based on statistical distances between probability distributions of measurement outcomes [P. Busch et al., Phys. Rev. Lett. 111, 160405 (2013); P. Busch et al., Phys. Rev. A 89, 012129 (2014)]. Here we reformulate their theoretical framework, derive an improved relation for qubit measurement, and perform an experimental test on a spin system. The relation reveals that the worst-case inaccuracy is tightly bounded from below by the incompatibility of target observables, and is verified by the experiment employing joint measurement in which two compatible observables designed to approximate two incompatible observables on one qubit are measured simultaneously.
Uncertainties related to the representation of momentum transport in shallow convection
Schlemmer, Linda; Bechtold, Peter; Sandu, Irina; Ahlgrimm, Maike
2017-04-01
The vertical transport of horizontal momentum by convection has an important impact on the general circulation of the atmosphere as well as on the life cycle and track of cyclones. So far convective momentum transport (CMT) has mostly been studied for deep convection, whereas little is known about its characteristics and importance in shallow convection. In this study CMT by shallow convection is investigated by analyzing both data from large-eddy simulations (LES) and simulations performed with the Integrated Forecasting System (IFS) of the European Centre for Medium-Range Weather Forecasts (ECMWF). In addition, the central terms underlying the bulk mass-flux parametrization of CMT are evaluated offline. Further, the uncertainties related to the representation of CMT are explored by running the stochastically perturbed parametrizations (SPP) approach of the IFS. The analyzed cases exhibit shallow convective clouds developing within considerable low-level wind shear. Analysis of the momentum fluxes in the LES data reveals significant momentum transport by the convection in both cases, which is directed down-gradient despite substantial organization of the cloud field. A detailed inspection of the convection parametrization reveals a very good representation of the entrainment and detrainment rates and an appropriate representation of the convective mass and momentum fluxes. Determining the correct values of the mass flux and in-cloud momentum at the cloud base in the parametrization remains challenging, however. The spread in convection-related quantities generated by the SPP is reasonable and addresses many of the identified uncertainties.
On Uncertainties in Successive Measurements
Distler, Jacques
2012-01-01
When you measure an observable, A, in Quantum Mechanics, the state of the system changes. This, in turn, affects the quantum-mechanical uncertainty in some non-commuting observable, B. The standard Uncertainty Relation puts a lower bound on the uncertainty of B in the initial state. What is relevant for a subsequent measurement of B, however, is the uncertainty of B in the post-measurement state. We make some remarks on the latter problem, both in the case where A has a pure point spectrum and in the case where A has a continuous spectrum.
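The "standard Uncertainty Relation" referred to here is the Robertson bound, in which both standard deviations are taken in the same (initial) state:

```latex
\sigma_\psi(A)\,\sigma_\psi(B)
\;\ge\;
\tfrac{1}{2}\,\bigl|\langle \psi |\, [A,B] \,| \psi \rangle\bigr|
```

For a subsequent measurement of B, the relevant quantity is instead the deviation of B in the post-measurement state, which the Robertson bound says nothing about; that gap is the problem the abstract addresses.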
[No author listed]
2006-01-01
There are a number of sources of uncertainty in regional climate change scenarios. When statistical downscaling is used to obtain regional climate change scenarios, the uncertainty may originate from the uncertainties in the global climate models used, the skill of the statistical model, and the forcing scenarios applied to the global climate model. The uncertainty associated with global climate models can be evaluated by examining the differences in the predictors and in the downscaled climate change scenarios based on a set of different global climate models. When standardized global climate model simulations such as the second phase of the Coupled Model Intercomparison Project (CMIP2) are used, the difference in the downscaled variables mainly reflects differences in the climate models and the natural variability in the simulated climates. It is proposed that the spread of the estimates can be taken as a measure of the uncertainty associated with global climate models. The proposed method is applied to the estimation of global-climate-model-related uncertainty in regional precipitation change scenarios in Sweden. Results from statistical downscaling based on 17 global climate models show that there is an overall increase in annual precipitation all over Sweden although a considerable spread of the changes in the precipitation exists. The general increase can be attributed to the increased large-scale precipitation and the enhanced westerly wind. The estimated uncertainty is nearly independent of region. However, there is a seasonal dependence. The estimates for winter show the highest level of confidence, while the estimates for summer show the least.
Haleigh A. Boswell
2015-12-01
Analysis of blood alcohol concentration is a routine analysis performed in many forensic laboratories. This analysis commonly utilizes static headspace sampling, followed by gas chromatography combined with flame ionization detection (GC-FID). Studies have shown several "optimal" methods for instrumental operating conditions, which are intended to yield accurate and precise data. Given that different instruments, sampling methods, application-specific columns and parameters are often utilized, it is much less common to find information on the robustness of these reported conditions. A major problem can arise when these "optimal" conditions may not also be robust, thus producing data with higher than desired uncertainty or potentially inaccurate results. The goal of this research was to incorporate the principles of quality by design (QBD) in the adjustment and determination of BAC (blood alcohol concentration) instrumental headspace parameters, thereby ensuring that minor instrumental variations, which occur as a matter of normal work, do not appreciably affect the final results of this analysis. This study discusses both the QBD principles as well as the results of the experiments, which allow for determination of more favorable instrumental headspace conditions. Additionally, method detection limits will also be reported in order to determine a reporting threshold and the degree of uncertainty at the common threshold value of 0.08 g/dL. Furthermore, the comparison of two internal standards, n-propanol and t-butanol, will be investigated. The study showed that an altered parameter of 85 °C headspace oven temperature and 15 psi headspace vial pressurization produces the lowest percent relative standard deviation of 1.3% when t-butanol is implemented as an internal standard, at least for one very common platform. The study also showed that an altered parameter of 100 °C headspace oven temperature and 15 psi headspace vial pressurization
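The precision metric quoted above is the percent relative standard deviation (%RSD) of replicate peak-area ratios (analyte over internal standard). A quick sketch with invented replicate values:

```python
# Percent relative standard deviation (%RSD) of replicate measurements,
# the precision metric used to compare headspace conditions above.
# The replicate area ratios below are invented for illustration.
import statistics

def percent_rsd(values):
    return 100 * statistics.stdev(values) / statistics.mean(values)

ratios = [1.021, 1.008, 1.034, 1.015, 1.027]  # replicate analyte/IS area ratios
print(f"%RSD = {percent_rsd(ratios):.2f} %")
```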
Managing uncertainty in multiple-criteria decision making related to sustainability assessment
Dorini, Gianluca Fabio; Kapelan, Zoran; Azapagic, Adisa
2011-01-01
In real life, decisions are usually made by comparing different options with respect to several, often conflicting criteria. This requires subjective judgements on the importance of different criteria by decision makers and increases uncertainty in decision making. This article demonstrates how uncertainty can be handled in multi-criteria decision situations using Compromise Programming, one of the Multi-criteria Decision Analysis (MCDA) techniques. Uncertainty is characterised using a probabilistic approach and propagated using a Monte Carlo simulation technique. The methodological approach is illustrated... under three cases: (1) no uncertainty, (2) uncertainty in data/models and (3) uncertainty in models and decision-makers' preferences. The results show how characterising and propagating uncertainty can help increase the effectiveness of multi-criteria decision making processes and lead to more informed decisions.
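The combination of Compromise Programming with Monte Carlo propagation can be sketched as follows. This is an assumed illustration, not the paper's implementation: criteria scores are perturbed, each option's compromise distance to the ideal point is recomputed, and the output is the probability that each option ranks best.

```python
# Monte Carlo propagation of criteria-score uncertainty through a
# Compromise Programming distance. All options, scores, and noise levels
# below are hypothetical.
import random

def cp_distance(scores, ideal, anti, weights, p=2):
    """L_p distance to the ideal point, with criteria normalised by range."""
    return sum(w ** p * ((i - s) / (i - a)) ** p
               for s, i, a, w in zip(scores, ideal, anti, weights)) ** (1 / p)

def rank_under_uncertainty(options, ideal, anti, weights,
                           noise=0.05, n=2000, seed=1):
    """options: {name: mean criteria scores}. Returns P(option ranks best)."""
    rng = random.Random(seed)
    wins = {name: 0 for name in options}
    for _ in range(n):
        dists = {name: cp_distance([s + rng.gauss(0, noise) for s in scores],
                                   ideal, anti, weights)
                 for name, scores in options.items()}
        wins[min(dists, key=dists.get)] += 1   # smallest distance wins
    return {name: w / n for name, w in wins.items()}

opts = {"A": [0.8, 0.6], "B": [0.7, 0.75]}     # two options, two criteria
probs = rank_under_uncertainty(opts, ideal=[1.0, 1.0], anti=[0.0, 0.0],
                               weights=[0.5, 0.5])
```

Reporting a probability of being best, rather than a single ranking, is one simple way the propagated uncertainty can feed back into the decision.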
Chin, Brian; Nelson, Brady D; Jackson, Felicia; Hajcak, Greg
2016-01-01
Fear conditioning research on threat predictability has primarily examined the impact of temporal (i.e., timing) predictability on the startle reflex. However, there are other key features of threat that can vary in predictability. For example, the reinforcement rate (i.e., frequency) of threat is a crucial factor underlying fear learning. The present study examined the impact of threat reinforcement rate on the startle reflex and self-reported anxiety during a fear conditioning paradigm. Forty-five participants completed a fear learning task in which the conditioned stimulus was reinforced with an electric shock to the forearm on 50% of trials in one block and 75% of trials in a second block, in counter-balanced order. The present study also examined whether intolerance of uncertainty (IU), the tendency to perceive or experience uncertainty as stressful or unpleasant, was associated with the startle reflex during conditions of low (50%) vs. high (75%) reinforcement. Results indicated that, across all participants, startle was greater during the 75% relative to the 50% reinforcement condition. IU was positively correlated with startle potentiation (i.e., increased startle response to the CS+ relative to the CS-) during the 50%, but not the 75%, reinforcement condition. Thus, despite receiving fewer electric shocks during the 50% reinforcement condition, individuals with high IU uniquely demonstrated greater defense system activation when impending threat was more uncertain. The association between IU and startle was independent of state anxiety. The present study adds to a growing literature on threat predictability and aversive responding, and suggests IU is associated with abnormal responding in the context of uncertain threat.
Extreme Events in China under Climate Change: Uncertainty and related impacts (CSSP-FOREX)
Leckebusch, Gregor C.; Befort, Daniel J.; Hodges, Kevin I.
2016-04-01
Suitable adaptation strategies or the timely initiation of related mitigation efforts in East Asia will strongly depend on robust and comprehensive information about future near-term as well as long-term potential changes in the climate system. Therefore, understanding the driving mechanisms associated with the East Asian climate is of major importance. The FOREX project (Fostering Regional Decision Making by the Assessment of Uncertainties of Future Regional Extremes and their Linkage to Global Climate System Variability for China and East Asia) focuses on the investigation of extreme wind and rainfall related events over Eastern Asia and their possible future changes. Here, analyses focus on the link between local extreme events and their driving weather systems. This includes the coupling between local rainfall extremes and tropical cyclones, the Meiyu frontal system, extra-tropical teleconnections and monsoonal activity. Furthermore, the relation between these driving weather systems and large-scale variability modes, e.g. NAO, PDO, ENSO is analysed. Thus, beside analysing future changes of local extreme events, the temporal variability of their driving weather systems and related large-scale variability modes will be assessed in current CMIP5 global model simulations to obtain more robust results. Beyond an overview of FOREX itself, first results regarding the link between local extremes and their steering weather systems based on observational and reanalysis data are shown. Special focus is laid on the contribution of monsoonal activity, tropical cyclones and the Meiyu frontal system on the inter-annual variability of the East Asian summer rainfall.
Mezzasalma, Stefano A
2007-03-15
The theoretical basis of a recent theory of Brownian relativity for polymer solutions is deepened and reexamined. After the problem of relative diffusion in polymer solutions is addressed, its two postulates are formulated in all generality. The former builds a statistical equivalence between (uncorrelated) timelike and shapelike reference frames, that is, among dynamical trajectories of liquid molecules and static configurations of polymer chains. The latter defines the "diffusive horizon" as the invariant quantity to work with in the special version of the theory. Particularly, the concept of universality in polymer physics corresponds in Brownian relativity to that of covariance in the Einstein formulation. Here, a "universal" law consists of a privileged observation, performed from the laboratory rest frame and agreeing with any diffusive reference system. From the joint lack of covariance and simultaneity implied by the Brownian Lorentz-Poincaré transforms, a relative uncertainty arises, in a certain analogy with quantum mechanics. It is driven by the difference between local diffusion coefficients in the liquid solution. The same transformation class can be used to infer Fick's second law of diffusion, playing here the role of a gauge invariance preserving covariance of the spacetime increments. An overall, noteworthy conclusion emerging from this view concerns the statistics of (i) static macromolecular configurations and (ii) the motion of liquid molecules, which would be much more related than expected.
Marcos, Patricia; Lopez-Nicolas, Antonio; Pulido-Velazquez, Manuel
2017-04-01
Southern Mediterranean basins are prone to droughts, due to the high temporal and spatial rainfall variability. In addition, semiarid Mediterranean regions emerge as noticeable climate change hotspots, with high uncertainty about the impacts of climate change on future droughts. Standardized drought indices have been traditionally used to assess and identify drought events, because of their simplicity and flexibility to compare the departure from normal status across regions at different timescales. Nevertheless, the statistical foundation of these indices assumes stationarity for certain aspects of the climatic variables, which can no longer be assumed under climate change. Thus, in recent years several modifications have been proposed in order to cope with these limitations. This contribution provides a framework to analyze climate change impacts on meteorological and hydrological droughts, considering the predicted shifts in precipitation and temperature and the uncertainty of the assumed distribution parameters. To characterize drought in a climate change context, relative standardized indices instead of the traditional ones are applied: the Standardized Precipitation Index (rSPI), the Standardized Precipitation Evapotranspiration Index (rSPEI) and a Standardized Flow Index (rSFI). The behavior of the rSPI versus the multiscalar rSPEI is contrasted. A modification of the Thornthwaite scheme is presented to improve the representation of the intra-annual variation of the potential evapotranspiration (PET) in continental climate areas. The uncertainty due to the selected hydrological model is assessed through the comparison of the performance and outcome of three conceptual lumped-parameter models (Temez, GR2M, and HBV-light). The Temez model was selected to obtain the runoff for the rSFI, given that it showed the best fit in our case study. To address the uncertainty of the indices' distribution parameters, bootstrapping was combined with the computation of the
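The core mechanics of a standardized index can be sketched in a few lines. This is a simplified empirical (plotting-position) variant, not the gamma-fitted SPI used operationally: accumulated precipitation is ranked, mapped to non-exceedance probabilities, and transformed through the inverse standard normal CDF.

```python
# Simplified standardized drought index: empirical probabilities via the
# Weibull plotting position, then the inverse normal transform. Operational
# SPI instead fits a parametric (e.g. gamma) distribution to each series.
from statistics import NormalDist

def standardized_index(series):
    n = len(series)
    std_normal = NormalDist()
    ranks = {x: r for r, x in enumerate(sorted(series), start=1)}
    # p = r / (n + 1), then z = Phi^{-1}(p); assumes no tied values
    return [std_normal.inv_cdf(ranks[x] / (n + 1)) for x in series]

precip = [55, 80, 32, 61, 47, 90, 25, 70, 66, 41, 58, 73]  # accumulations (made up)
spi_like = standardized_index(precip)
# Values near 0 are normal conditions; below about -1 indicate drought
```

The "relative" indices in the abstract keep this transform but re-examine the reference distribution (and its parameter uncertainty, via bootstrapping) so the index remains meaningful under non-stationary climate.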
Lechner, Clemens M; Silbereisen, Rainer K; Tomasik, Martin J; Wasilewski, Jacek
2015-06-01
This study investigated how religiosity relates to goal engagement (i.e., investing time and effort; overcoming obstacles) and goal disengagement (i.e., protecting self-esteem and motivational resources against failure experiences; distancing from unattainable goals) in coping with perceived work-related uncertainties (e.g., growing risk of job loss) that arise from current social change. We hypothesised that religiosity not only expands individuals' capacities for both engagement and disengagement but also fosters an opportunity-congruent pattern of engagement and disengagement, promoting engagement especially under favourable opportunities for goal-striving in the social ecology and facilitating disengagement especially under unfavourable opportunities. Multilevel analyses in a sample of N = 2089 Polish adults aged 20-46 years partly supported these predictions. Religiosity was associated with higher goal engagement, especially under favourable economic opportunities for goal-striving in the social ecology (as measured by the regional net migration rate). For disengagement, the results were more mixed; religiosity was related to higher self-protection independently of the economic opportunity structure and predicted higher goal-distancing only under the most unfavourable opportunities. These results suggest that religiosity can promote different coping strategies under different conditions, fostering a pattern of opportunity-congruent engagement and, to some extent, disengagement that is likely to be adaptive. © 2014 International Union of Psychological Science.
Uncertainty Forecasts Improve Weather-Related Decisions and Attenuate the Effects of Forecast Error
Joslyn, Susan L.; LeClerc, Jared E.
2012-01-01
Although uncertainty is inherent in weather forecasts, explicit numeric uncertainty estimates are rarely included in public forecasts for fear that they will be misunderstood. Of particular concern are situations in which precautionary action is required at low probabilities, often the case with severe events. At present, a categorical weather…
Heydorn, Kaj; Anglov, Thomas
2002-01-01
Methods recommended by the International Organization for Standardization (ISO) and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration unce...
A Slippery Slope: Systematic Uncertainties in the Baryonic Tully-Fisher Relation
Bradford, Jeremy D; Bosch, Frank C van den
2016-01-01
The baryonic Tully-Fisher relation (BTFR) is both a valuable observational tool and a critical test of galaxy formation theory. We explore the systematic uncertainty in the slope and the scatter of the observed BTFR utilizing a homogeneously measured dataset of 930 isolated galaxies. We measure a fiducial relation of log_10 M_baryon = 3.24 log_10 V_rot + 3.21 with a scatter of 0.25 dex over the baryonic mass range of 10^7.4 to 10^11.3 M_sun. We then conservatively vary the definitions of M_baryon and V_rot, the sample definition and the linear fitting algorithm used to fit the BTFR. We obtain slopes ranging from 2.64 to 3.46 and scatter measurements ranging from 0.16 to 0.41 dex. We next compare our fiducial slope to literature measurements, where reported slopes range from 3.0 to 4.3 and scatter is either unmeasured, unmeasurable or as large as 0.4 dex. Measurements derived from unresolved HI line-widths tend to produce slopes of 3.2, while measurements derived strictly from resolved asymptotic rotation velo...
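The fiducial relation quoted in the abstract is straightforward to evaluate. The sketch below (the function name and the example velocity are illustrative) also shows how the reported systematic slope range of 2.64-3.46 propagates into the inferred baryonic mass, holding the fiducial intercept fixed purely to illustrate the sensitivity.

```python
import numpy as np

def log_mbaryon(v_rot, slope=3.24, intercept=3.21):
    """Fiducial BTFR from the abstract: log10(M_baryon) = slope*log10(V_rot) + intercept."""
    return slope * np.log10(v_rot) + intercept

v = 200.0  # illustrative rotation velocity in km/s
print(log_mbaryon(v))  # ~10.67 (log10 of baryonic mass in solar masses)
# Spread implied by the reported systematic slope range 2.64-3.46:
print(log_mbaryon(v, slope=2.64), log_mbaryon(v, slope=3.46))
```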
Li, Xi-Zeng; Su, Bao-Xia
1994-01-01
It is found that the two-mode output quantum electromagnetic field in two-mode squeezed states exhibits higher-order squeezing to all even orders, and the generalized uncertainty relations are presented for the first time. The concept of higher-order squeezing of the single-mode quantum electromagnetic field was first introduced and applied to several processes by Hong and Mandel in 1985. Later, Li Xi-Zeng and Shan Ying calculated the higher-order squeezing in the process of degenerate four-wave mixing and presented the higher-order uncertainty relations of the fields in single-mode squeezed states. In this paper we generalize that work to higher-order squeezing in two-mode squeezed states.
Carcioppolo, Nick; Yang, Fan; Yang, Qinghua
2016-09-01
Uncertainty is a central characteristic of many aspects of cancer prevention, screening, diagnosis, and treatment. Brashers's (2001) uncertainty management theory details the multifaceted nature of uncertainty and describes situations in which uncertainty can both positively and negatively affect health outcomes. The current study extends theory on uncertainty management by developing four scale measures of uncertainty preferences in the context of cancer. Two national surveys were conducted to validate the scales and assess convergent and concurrent validity. Results support the factor structure of each measure and provide general support across multiple validity assessments. These scales can advance research on uncertainty and cancer communication by providing researchers with measures that address multiple aspects of uncertainty management.
Nanoparticles: Uncertainty Risk Analysis
Grieger, Khara Deanne; Hansen, Steffen Foss; Baun, Anders
2012-01-01
Scientific uncertainty plays a major role in assessing the potential environmental risks of nanoparticles. Moreover, there is uncertainty within fundamental data and information regarding the potential environmental and health risks of nanoparticles, hampering risk assessments based on standard a...
韩冰; 贺青; 李正坤; 李辰
2011-01-01
Based on the specific structure of the exciting-coil system of the Joule balance, the uncertainty sources of the magnetic flux density at the geometrical center of the magnetic field were analysed and evaluated. The relative combined standard uncertainty of the magnetic field, ΔH/H, was 1.8 × 10^-3. It was also shown that the largest uncertainty component came from the uncertainty u(I) of the current loaded in the coils, which contributed 1.8 × 10^-3 to the relative combined standard uncertainty ΔH/H. A high-stability constant-current source is therefore recommended to reduce the uncertainty of the magnetic field of the Joule balance.
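The combination of uncertainty components mentioned here follows the usual root-sum-of-squares rule for uncorrelated terms. In the sketch below only the current term u(I) ≈ 1.8 × 10^-3 comes from the abstract; the other component values are hypothetical placeholders.

```python
import math

# Relative standard uncertainty components; only u(I) is from the abstract,
# the remaining entries are hypothetical placeholders for illustration.
components = {
    "current u(I)": 1.8e-3,
    "coil geometry": 1.0e-4,
    "alignment": 5.0e-5,
}

# Uncorrelated components combine in quadrature (root sum of squares):
u_c = math.sqrt(sum(u * u for u in components.values()))
print(u_c)  # dominated by u(I), so u_c is approximately 1.8e-3
```

Because the combination is quadratic, a component an order of magnitude smaller than the dominant one is negligible, which is why reducing u(I) is the recommendation.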
Introduction to International Ethical Standards Related to Genetics and Genomics
Seon-Hee Yim
2013-12-01
The rapid advances in genetic knowledge and technology raise various, sometimes unprecedented, ethical dilemmas in the scientific community as well as the public realm. To deal with these dilemmas, the international community has prepared and issued ethical standards in various formats. In this review, seven international standards regarding genetics and genomics will be briefly introduced in chronological order. Naturally, these standards have their own problems and shortcomings, but critical reflections on them will not be provided in this review. Instead, a common set of the principles expressed in them will be highlighted here, because they are still relevant, and many of them will be more relevant in the future. Some of the interesting contents will be selected and described. After that, the morality of one recent event related to whole-genome sequencing and person-identifiable genetic data will be explored based on those international standards.
Quantitative Relative Comparison of CFD Simulation Uncertainties for a Transonic Diffuser Problem
Hosder, Serhat; Grossman, Bernard; Haftka, Raphael T.; Mason, William H.; Watson, Layne T.
2004-01-01
Different sources of uncertainty in CFD simulations are illustrated by a detailed study of two-dimensional, turbulent, transonic flow in a converging-diverging channel. Runs were performed with the commercial CFD code GASP using different turbulence models, grid levels, and flux-limiters to see the effect of each on the CFD simulation uncertainties. Two flow conditions were studied by changing the exit pressure ratio: the first is a complex case with a strong shock and a separated flow region...
Relations between the technological standards and technological appropriation
Carlos Alberto PRADO GUERRERO
2010-06-01
The objective of this study is to analyze educational practices of using Blackboard in blended learning environments with higher education students, in order to understand the relationship between technological appropriation and educational technology standards. To achieve that goal, the following research question was raised: To what extent are educational technology standards related to the appropriation of technology in blended learning environments in higher education? The contextual framework of this work includes the following topics: the institution, teaching, teachers, and students. The study used a correlational design. Correlations were carried out to determine the frequency and level of both the technological standards and the appropriation of technology. In comparing the results obtained from the students, the teachers, and the platform, we found that the students showed a high degree of technological appropriation, matched by their performance on the technological standards. It was established that teachers play a key role in developing students' technological appropriation and their performance on technology standards.
Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul
2012-11-26
The aim of this work is to develop group-contribution+ (GC+) property models (combining the group-contribution (GC) method and the atom connectivity index (CI) method) to provide reliable estimates of environment-related properties of organic chemicals together with the uncertainties of the estimated property values. For this purpose, a systematic methodology for property modeling and uncertainty analysis is used. The methodology includes a parameter estimation step to determine the parameters of the property models and an uncertainty analysis step to establish statistical information about the quality of the parameter estimation, such as the parameter covariance, the standard errors in predicted properties, and the confidence intervals. For parameter estimation, large data sets of experimentally measured property values for a wide range of chemicals (hydrocarbons, oxygenated chemicals, nitrogenated chemicals, polyfunctional chemicals, etc.), taken from the database of the US Environmental Protection Agency (EPA) and from the USEtox database, are used. For property modeling and uncertainty analysis, the Marrero and Gani GC method and the atom connectivity index method have been considered. In total, 22 environment-related properties have been modeled and analyzed, including the fathead minnow 96-h LC50, Daphnia magna 48-h LC50, oral rat LD50, aqueous solubility, bioconcentration factor, permissible exposure limit (OSHA-TWA), photochemical oxidation potential, global warming potential, ozone depletion potential, acidification potential, and emissions to urban air, continental rural air, continental fresh water, continental seawater, continental natural soil, and continental agricultural soil (carcinogenic and noncarcinogenic in each case). The application
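The uncertainty-analysis step described here (parameter covariance, standard errors, and confidence intervals from a least-squares fit) can be sketched generically. The synthetic linear "group count" model below is an illustration under assumed data, not the Marrero-Gani model itself.

```python
import numpy as np
from scipy import stats

# Synthetic group-contribution-style model: property = X @ theta + noise,
# where X holds group occurrence counts per molecule (all values assumed).
rng = np.random.default_rng(1)
n, p = 200, 3
X = rng.integers(0, 5, size=(n, p)).astype(float)
theta_true = np.array([1.5, -0.7, 0.3])
y = X @ theta_true + rng.normal(0, 0.2, n)

theta, *_ = np.linalg.lstsq(X, y, rcond=None)        # parameter estimates
dof = n - p
sigma2 = np.sum((y - X @ theta) ** 2) / dof          # residual variance
cov = sigma2 * np.linalg.inv(X.T @ X)                # parameter covariance matrix
se = np.sqrt(np.diag(cov))                           # standard errors
t = stats.t.ppf(0.975, dof)
ci = np.column_stack([theta - t * se, theta + t * se])  # 95% confidence intervals
print(theta, se, ci, sep="\n")
```

Standard errors in a predicted property then follow from propagating `cov` through the group counts of the molecule being predicted.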
Li, Xi-Zeng; Su, Bao-Xia
1996-01-01
It is found that the field of the combined mode of the probe wave and the phase-conjugate wave in the process of non-degenerate four-wave mixing exhibits higher-order squeezing to all even orders. And the generalized uncertainty relations in this process are also presented.
An Ontology for Uncertainty in Climate Change Projections
King, A. W.
2011-12-01
Paraphrasing Albert Einstein's aphorism about scientific quantification: not all uncertainty that counts can be counted, and not all uncertainty that can be counted counts. The meaning of the term "uncertainty" in climate change science and assessment is itself uncertain. Different disciplines and perspectives bring different nuances, if not meanings, of the term to the conversation. For many scientists, uncertainty is somehow associated with statistical dispersion and standard error. For many users of climate change information, uncertainty is more related to their confidence, or lack thereof, in climate models. These "uncertainties" may be related, but they are not identical, and there is considerable room for confusion and misunderstanding. A knowledge framework, a system of concepts and vocabulary, for communicating uncertainty can add structure to the characterization and quantification of uncertainty and aid communication among scientists and users. I have developed an ontology for uncertainty in climate change projections derived largely from the report of the W3C Uncertainty Reasoning for the World Wide Web Incubator Group (URW3-XG), which dealt with the problem of uncertainty representation and reasoning on the World Wide Web. I have adapted this ontology for uncertainty about information to uncertainty about climate change. Elements of the ontology apply with little or no translation to the information of climate change projections, with climate change almost a use case. Other elements can be translated into language used in climate-change discussions; translating aleatory uncertainty in the UncertaintyNature class as irreducible uncertainty is an example. I have added classes for the source of uncertainty (UncertaintySource) (different model physics, for example) and metrics of uncertainty (UncertaintyMetric), at least, in the case of the latter, for those instances of uncertainty that can be quantified (i.e., counted). The statistical standard deviation is a member
Okubo, Sho; Nakayama, Hirotaka; Iwakuni, Kana; Inaba, Hajime; Sasada, Hiroyuki
2011-11-21
We determine the absolute frequencies of 56 rotation-vibration transitions of the ν3 band of CH4 from 88.2 to 90.5 THz with a typical uncertainty of 2 kHz, corresponding to a relative uncertainty of 2.2 × 10^-11 over an averaging time of a few hundred seconds. Saturated absorption lines are observed using a difference-frequency-generation source and a cavity-enhanced absorption cell, and the transition frequencies are measured with a fiber-laser-based optical frequency comb referenced to a rubidium atomic clock linked to international atomic time. The determined value of the P(7) F2(2) line is consistent with the International Committee for Weights and Measures recommendation within the uncertainty.
Casola, J. H.; Huber, D.
2013-12-01
Many media, academic, government, and advocacy organizations have achieved sophistication in developing effective messages based on scientific information, and can quickly translate salient aspects of emerging climate research and evolving observations. However, there are several ways in which valid messages can be misconstrued by decision makers, leading them to inaccurate conclusions about the risks associated with climate impacts. Three cases will be discussed: 1) Issues of spatial scale in interpreting climate observations: Local climate observations may contradict summary statements about the effects of climate change on larger regional or global spatial scales. Effectively addressing these differences often requires communicators to understand local and regional climate drivers, and the distinction between a 'signal' associated with climate change and local climate 'noise.' Hydrological statistics in Missouri and California are shown to illustrate this case. 2) Issues of complexity related to extreme events: Climate change is typically invoked following a wide range of damaging meteorological events (e.g., heat waves, landfalling hurricanes, tornadoes), regardless of the strength of the relationship between anthropogenic climate change and the frequency or severity of that type of event. Examples are drawn from media coverage of several recent events, contrasting useful and potentially confusing word choices and frames. 3) Issues revolving around climate sensitivity: The so-called 'pause' or 'hiatus' in global warming has reverberated strongly through political and business discussions of climate change. Addressing the recent slowdown in warming yields an important opportunity to raise climate literacy in these communities. Attempts to use recent observations as a wedge between climate 'believers' and 'deniers' are likely to be counterproductive. Examples are drawn from Congressional testimony and media stories. All three cases illustrate ways that decision
Relative subtest scatter in the WAIS-IV standardization sample.
Binder, Laurence M; Binder, Adrienne L
2011-01-01
The frequencies of differences between highest and lowest subtest scores as a function of highest subtest score (relative scatter) are reported for the standardization sample of the Wechsler Adult Intelligence Scale-IV (WAIS-IV). Large differences between highest and lowest subtest scores were common. The degree of relative scatter was related to the height of the highest subtest score. For the 10 core WAIS-IV subtests, the correlation between the level of the highest subtest score and the amount of scatter was r = .62; for all 15 subtests the correlation was .63. The level of the highest subtest score was more strongly related to scatter than was Full Scale IQ. Clinical implications for inferring cognitive impairment and estimating premorbid abilities are discussed. When considering the possibility of acquired cognitive impairment, we recommend caution in the interpretation of subtest score differences.
Relating Standardized Visual Perception Measures to Simulator Visual System Performance
Kaiser, Mary K.; Sweet, Barbara T.
2013-01-01
Human vision is quantified through the use of standardized clinical vision measurements. These measurements typically include visual acuity (near and far), contrast sensitivity, color vision, stereopsis (a.k.a. stereo acuity), and visual field periphery. Simulator visual system performance is specified in terms such as brightness, contrast, color depth, color gamut, gamma, resolution, and field-of-view. How do these simulator performance characteristics relate to the perceptual experience of the pilot in the simulator? In this paper, visual acuity and contrast sensitivity will be related to simulator visual system resolution, contrast, and dynamic range; similarly, color vision will be related to color depth/color gamut. Finally, we will consider how some characteristics of human vision not typically included in current clinical assessments could be used to better inform simulator requirements (e.g., relating dynamic characteristics of human vision to update rate and other temporal display characteristics).
Mapping Heat-related Risks for Community-based Adaptation Planning under Uncertainty
Bai, Yingjiu; Kaneko, Ikuyo; Kobayashi, Hikaru; Kurihara, Kazuo; Sasaki, Hidetaka; Murata, Akihiko; Takayabu, Izuru
2016-04-01
Climate change is leading to more frequent and intense heat waves. Recently, epidemiologic findings on heat-related health impacts have reinforced our understanding of the mortality impacts of extreme heat. This research has several aims: 1) to promote climate prediction services with spatial and temporal information on heat-related risks, using GIS (Geographical Information System) and digital mapping techniques; 2) to propose a visualization approach to articulating the evolution of local heat-health responses over time and the evaluation of new interventions, for the implementation of valid community-based adaptation strategies and reliable, actionable planning; and 3) to provide an appropriate and simple method of adjusting bias and quantifying the uncertainty in future outcomes, so that regional climate projections may be transcribed into useful forms for a wide variety of different users. Following the 2003 European heat wave, climatologists, medical specialists, and social scientists expedited efforts to revise and integrate risk governance frameworks for communities to take appropriate and effective actions themselves. Recently, the Coupled Model Intercomparison Project (CMIP) methodology has made projections openly accessible, allowing anyone to use state-of-the-art climate model outputs and climate data as the backbone for decisions. Furthermore, the latest high-resolution regional climate models (RCMs) have brought a huge increase in the volume of data available. In this study, we used high-quality hourly projections (5-km resolution) from the Non-Hydrostatic Regional Climate Model (NHRCM-5km), following the SRES-A1B scenario developed by the Meteorological Research Institute (MRI), and observational data from the Automated Meteorological Data Acquisition System, Japan Meteorological Agency (JMA). The NHRCM-5km is a dynamic downscaling of results from the MRI-AGCM3.2S (20-km resolution), an atmospheric general circulation model (AGCM) driven by the
Benkler, Erik; Sterr, Uwe
2015-01-01
The power spectral density in the Fourier frequency domain and the different variants of the Allan deviation (ADEV) as a function of averaging time are well-established tools to analyse the fluctuation properties and frequency instability of an oscillatory signal. It is often supposed that the statistical uncertainty of a measured average frequency is given by the ADEV at a well-chosen averaging time. However, this approach requires further mathematical justification and refinement, which has already been done regarding the original ADEV for certain noise types. Here we provide the necessary background to use the modified Allan deviation (modADEV) and other two-sample deviations to determine the uncertainty of weighted frequency averages. The type of two-sample deviation used to determine the uncertainty depends on the method used for determination of the average. We find that the modADEV, which is connected with Λ-weighted averaging, and the two-sample deviation associated to a linear phase regr...
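As context for the two-sample deviations discussed above, a minimal non-overlapping Allan deviation (the original ADEV, not the modADEV treated in the paper) might look like the sketch below on synthetic white frequency noise, for which ADEV falls as 1/sqrt(m) with the averaging factor m. The function name and the synthetic data are assumptions of this example.

```python
import numpy as np

def adev(y, m=1):
    """Non-overlapping Allan deviation of fractional-frequency samples y
    at averaging factor m (the textbook two-sample deviation)."""
    n = len(y) // m
    avg = y[: n * m].reshape(n, m).mean(axis=1)       # averages over m samples
    return np.sqrt(0.5 * np.mean(np.diff(avg) ** 2))  # two-sample statistic

rng = np.random.default_rng(2)
white = rng.normal(0, 1e-12, 100_000)                 # white frequency noise
for m in (1, 10, 100):
    print(m, adev(white, m=m))  # falls roughly as 1/sqrt(m) for white FM
```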
HSE management standards and stress-related work outcomes.
Kerr, Robert; McHugh, Marie; McCrory, Mark
2009-12-01
The UK Health and Safety Executive's (HSE) Management Standards (MS) approach has been developed to help organizations manage potential sources of work-related stress. Although there is general support for the assessment model adopted by this approach, to date, there has been no empirical investigation of the relationship between the actual MS (as measured by the final revised version of the HSE Indicator Tool) and stress-related work outcomes. To investigate the relationship between the HSE MS and the following stress-related work outcomes: 'job satisfaction', job-related anxiety and depression and errors/near misses. An anonymous cross-sectional questionnaire was distributed by either e-mail or post to all employees within a community-based Health and Social Services Trust. Respondents completed the HSE Indicator Tool, a job-related anxiety and depression scale, a job satisfaction scale and an aggregated measure of the number of errors/near misses witnessed. Associations between the HSE Indicator Tool responses and stress-related work outcomes were analysed with regression statistics. A total of 707 employees completed the questionnaire, representing a low response rate of 29%. Controlling for age, gender and contract type, the HSE MS (as measured by the HSE Indicator Tool) were positively associated with job satisfaction and negatively associated with 'job-related anxiety', 'job-related depression' and 'witnessed errors/near misses'. This study provides empirical evidence to support the use of the MS approach in tackling workplace stress.
Wei Song
As a large producer and consumer of wood building materials, China suffers from product formaldehyde emissions (PFE) but lacks systematic investigations and basic data on Chinese standard emission tests (CST), so this paper presents a first effort on this issue. The PFE of fiberboards, particleboards, blockboards, floorings, and parquets manufactured in the Beijing region were characterized by the perforator extraction method (PE), the 9-11 L and 40 L desiccator methods (D9, D40), and the environmental chamber method (EC) of the Chinese national standard GB 18580; based on statistics of the PFE data, measurement uncertainties in CST were evaluated by the Monte Carlo method; moreover, PFE data correlations between tests were established. Results showed: (1) Different tests may give slightly different evaluations of product quality. In PE and D9 tests, blockboards and parquets reached grade E1 for PFE, so they can be used directly in indoor environments; but in D40 and EC tests, floorings and parquets achieved E1. (2) In multiple tests, PFE data characterized by PE, D9, and D40 complied with Gaussian distributions, while those characterized by EC followed log-normal distributions. Uncertainties in CST were low overall, with uncertainties for 20 material-method combinations all below 7.5% and the average uncertainty for each method under 3.5%, and thus acceptable in engineering applications. A more complicated material structure and a larger test scale caused higher uncertainties. (3) Conventional linear models applied to correlating PFE values between PE, D9, and EC, with R2 all over 0.840, while novel logarithmic (exponential) models worked better for correlations involving D40, with R2 all beyond 0.901. This research preliminarily demonstrated the effectiveness of CST, where results for D40 presented greater similarities to EC, currently the most reliable test for PFE, thus highlighting the potential of the Chinese D40 as a more practical approach in production control and risk
The Publication and Distribution of Chinese Standards and Other Standards-related Products
Bai Demei
2005-01-01
Standards Press of China (SPC), founded in October 1963, is the only publication center in China licensed to publish national standards, trade standards, and books on standardization, quality, and other science and technology topics. Its main publications are as follows:
Suchak, Meghana
2014-01-01
The focus of the current study was on examining possible differences in college students' adjustment based on residency status (i.e., international Asian vs. domestic students) and illness status (i.e., having a family member with a chronic illness vs. not having a family member with a chronic illness). The study also examined the associations between overall college student adjustment and the family and illness-related factors of role conflict, uncertainty in illness, and illness-related com...
Leblanc, Thierry; Sica, Robert J.; van Gijsel, Joanna A. E.; Godin-Beekmann, Sophie; Haefele, Alexander; Trickl, Thomas; Payen, Guillaume; Gabarrot, Frank
2016-08-01
A standardized approach for the definition and reporting of vertical resolution of the ozone and temperature lidar profiles contributing to the Network for the Detection of Atmospheric Composition Change (NDACC) database is proposed. Two standardized definitions homogeneously and unequivocally describing the impact of vertical filtering are recommended. The first proposed definition is based on the width of the response to a finite-impulse-type perturbation. The response is computed by convolving the filter coefficients with an impulse function, namely, a Kronecker delta function for smoothing filters, and a Heaviside step function for derivative filters. Once the response has been computed, the proposed standardized definition of vertical resolution is given by Δz = δz × HFWHM, where δz is the lidar's sampling resolution and HFWHM is the full width at half maximum (FWHM) of the response, measured in sampling intervals. The second proposed definition relates to digital filtering theory. After applying a Laplace transform to a set of filter coefficients, the filter's gain characterizing the effect of the filter on the signal in the frequency domain is computed, from which the cut-off frequency fC, defined as the frequency at which the gain equals 0.5, is computed. Vertical resolution is then defined by Δz = δz/(2fC). Unlike common practice in the field of spectral analysis, a factor 2fC instead of fC is used here to yield vertical resolution values nearly equal to the values obtained with the impulse response definition using the same filter coefficients. When using either of the proposed definitions, unsmoothed signals yield the best possible vertical resolution Δz = δz (one sampling bin). Numerical tools were developed to support the implementation of these definitions across all NDACC lidar groups. The tools consist of ready-to-use "plug-in" routines written in several programming languages that can be inserted into any lidar data processing software and
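The first (impulse-response) definition can be illustrated with a toy sketch. The crude integer FWHM below is an assumption of this example; the actual NDACC tools interpolate the half-maximum crossings rather than counting whole samples.

```python
import numpy as np

def vertical_resolution(coeffs, dz):
    """NDACC-style impulse-response definition (sketch): Δz = δz × FWHM of
    the response to a Kronecker delta, measured in sampling intervals."""
    h = np.convolve(coeffs, [1.0])       # delta response = the coefficients themselves
    above = np.where(h >= h.max() / 2.0)[0]
    fwhm = above[-1] - above[0] + 1      # crude integer FWHM (no interpolation)
    return dz * fwhm

box = np.ones(5) / 5.0                   # 5-point boxcar smoothing filter
print(vertical_resolution(box, 100.0))   # 500.0 m on a 100 m grid
print(vertical_resolution([1.0], 100.0)) # unsmoothed signal: 100.0 m (one bin)
```

The unsmoothed case reproduces the best-possible resolution Δz = δz stated in the abstract.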
The Evolution of Classroom Physics Knowledge in Relation to Certainty and Uncertainty
Tiberghien, Andrée; Cross, David; Sensevy, Gérard
2014-01-01
This paper deals with the joint construction of knowledge by the teacher and the students in a physics classroom. It is focused on the status of epistemic certainty/uncertainty of knowledge. The same element of knowledge can be introduced as possible and thus uncertain and then evolve towards a status of epistemic certainty; the status of other…
The Classical Limit of Minimal Length Uncertainty Relation: Revisit with the Hamilton-Jacobi Method
Guo, Xiaobo; Yang, Haitang
2015-01-01
The existence of a minimum measurable length could deform not only the standard quantum mechanics but also classical physics. The effects of the minimal length on classical orbits of particles in a gravitation field have been investigated before, using the deformed Poisson bracket or Schwarzschild metric. In this paper, we use the Hamilton-Jacobi method to study motions of particles in the context of deformed Newtonian mechanics and general relativity. Specifically, the precession of planetary orbits, deflection of light, and time delay in radar propagation are considered in this paper. We also set limits on the deformation parameter by comparing our results with the observational measurements. Finally, comparison with results from previous papers is given at the end of this paper.
The classical limit of minimal length uncertainty relation: revisit with the Hamilton-Jacobi method
Guo, Xiaobo; Wang, Peng; Yang, Haitang
2016-05-01
The existence of a minimum measurable length could deform not only the standard quantum mechanics but also classical physics. The effects of the minimal length on classical orbits of particles in a gravitation field have been investigated before, using the deformed Poisson bracket or Schwarzschild metric. In this paper, we first use the Hamilton-Jacobi method to derive the deformed equations of motion in the context of Newtonian mechanics and general relativity. We then employ them to study the precession of planetary orbits, deflection of light, and time delay in radar propagation. We also set limits on the deformation parameter by comparing our results with the observational measurements. Finally, comparison with results from previous papers is given at the end of this paper.
Gauge-Independent Scales Related to the Standard Model Vacuum Instability
Espinosa, Jose R.; Konstandin, Thomas; Riotto, Antonio
2016-01-01
The measured (central) values of the Higgs and top quark masses indicate that the Standard Model (SM) effective potential develops an instability at high field values. The scale of this instability, determined as the Higgs field value at which the potential drops below the electroweak minimum, is about $10^{11}$ GeV. However, such a scale is unphysical as it is not gauge-invariant and suffers from a gauge-fixing uncertainty of up to two orders of magnitude. Subjecting our system, the SM, to several probes of the instability (adding higher order operators to the potential; letting the vacuum decay through critical bubbles; heating up the system to very high temperature; inflating it) and asking in each case physical questions, we are able to provide several gauge-invariant scales related to the Higgs potential instability.
Concept for an International Standard related to Space Weather Effects on Space Systems
Tobiska, W. Kent; Tomky, Alyssa
There is great interest in developing an international standard related to space weather in order to specify the tools and parameters needed for space systems operations. In particular, a standard is important for satellite operators who may not be familiar with space weather. In addition, there are others who participate in space systems operations that would also benefit from such a document. Examples include the developers of software systems that provide LEO satellite orbit determination, radio communication availability during scintillation events (GEO-to-ground L and UHF bands), GPS uncertainties, and the radiation environment from ground to space for commercial space tourism. These groups require recent historical data, current epoch specification, and forecasts of space weather events for their automated or manual systems. Other examples are national government agencies that rely on space weather data provided by their organizations, such as those represented in the International Space Environment Service (ISES) group of 14 national agencies. Designers, manufacturers, and launchers of space systems require real-time, operational space weather parameters that can be measured, monitored, or built into automated systems. Thus, a broad scope for the document will provide a useful international standard product to a variety of engineering and science domains. The structure of the document should contain a well-defined scope, consensus space weather terms and definitions, and internationally accepted descriptions of the main elements of space weather, its sources, and its effects upon space systems. Appendices will be useful for describing expanded material such as guidelines on how to use the standard, how to obtain specific space weather parameters, and short but detailed descriptions such as when best to use some parameters and not others; appendices provide a path for easily updating the standard since the domain of space weather is rapidly changing with new advances
George Maldonado
2009-09-01
In a follow-up study of mortality among North American synthetic rubber industry workers, cumulative exposure to 1,3-butadiene was positively associated with leukemia. Problems with historical exposure estimation, however, may have distorted the association. To evaluate the impact of potential inaccuracies in exposure estimation, we conducted uncertainty analyses of the relation between cumulative exposure to butadiene and leukemia. We created 1,000 sets of butadiene estimates using job-exposure matrices consisting of exposure values that corresponded to randomly selected percentiles of the approximate probability distribution of plant-, work area/job group-, and year-specific butadiene ppm. We then analyzed the relation between cumulative exposure to butadiene and leukemia for each of the 1,000 sets of butadiene estimates. In the uncertainty analysis, the point estimate of the RR for the first nonzero exposure category (>0–<37.5 ppm-years) was most likely to be about 1.5. The rate ratio for the second exposure category (37.5–<184.7 ppm-years) was most likely to range from 1.5 to 1.8. The RR for category 3 of exposure (184.7–<425.0 ppm-years) was most likely between 2.1 and 3.0. The RR for the highest exposure category (425.0+ ppm-years) was likely to be between 2.9 and 3.7. This range of RR point estimates can best be interpreted as a probability distribution that describes our uncertainty in RR point estimates due to uncertainty in exposure estimation. After considering the complete probability distributions of butadiene exposure estimates, the exposure-response association of butadiene and leukemia was maintained. This exercise was a unique example of how uncertainty analyses can be used to investigate and support an observed measure of effect when occupational exposure estimates are employed in the absence of direct exposure measurements.
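The resampling scheme described above can be sketched as follows; this is a simplified illustration with entirely hypothetical cohort data and error distributions (the worker counts, baseline risks and lognormal error scale are assumptions, not values from the study):

```python
import random

random.seed(42)

# Hypothetical cohort: cumulative butadiene exposure (ppm-years) and a
# leukemia indicator whose risk rises mildly with exposure.
workers = []
for _ in range(5000):
    exposure = random.lognormvariate(3.0, 1.2)
    case = random.random() < 0.005 + 0.0001 * min(exposure, 200.0)
    workers.append((exposure, case))

def rate_ratio(data, cutoff=37.5):
    """Crude risk ratio comparing workers above vs below an exposure cutoff."""
    hi = [case for exp, case in data if exp >= cutoff]
    lo = [case for exp, case in data if exp < cutoff]
    return (sum(hi) / len(hi)) / (sum(lo) / len(lo))

# Uncertainty analysis: perturb every exposure estimate with a draw from an
# assumed lognormal measurement-error distribution, then recompute the RR;
# the 1,000 recomputed RRs form the uncertainty distribution of the estimate.
rr_draws = sorted(
    rate_ratio([(e * random.lognormvariate(0.0, 0.3), c) for e, c in workers])
    for _ in range(1000)
)
print(f"RR uncertainty interval (2.5th-97.5th pct): "
      f"{rr_draws[25]:.2f} to {rr_draws[974]:.2f}")
```

The sorted draws play the role of the probability distribution of RR point estimates described in the abstract.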
Standard general relativity from Chern-Simons gravity
Izaurieta, F. [Departamento de Matematica y Fisica Aplicadas, Universidad, Catolica de la Santisima Concepcion, Alonso de Rivera 2850, Concepcion (Chile); Minning, P. [Departamento de Fisica, Universidad de Concepcion, Casilla 160-C, Concepcion (Chile); Perez, A. [Departamento de Fisica, Universidad de Concepcion, Casilla 160-C, Concepcion (Chile); Max Planck Institut fuer Gravitationsphysik, Albert Einstein, Institut. Am Muehlenberg1, D-14476 Golm bei Potsdam (Germany); Rodriguez, E. [Departamento de Matematica y Fisica Aplicadas, Universidad, Catolica de la Santisima Concepcion, Alonso de Rivera 2850, Concepcion (Chile); Salgado, P. [Departamento de Fisica, Universidad de Concepcion, Casilla 160-C, Concepcion (Chile)], E-mail: pasalgad@udec.cl
2009-07-13
Chern-Simons models for gravity are interesting because they provide a truly gauge-invariant action principle in the fiber-bundle sense. So far, their main drawback has largely been their perceived remoteness from standard General Relativity, based on the presence of higher powers of the curvature in the Lagrangian (except, remarkably, for three-dimensional spacetime). Here we report on a simple model that suggests a mechanism by which standard General Relativity in five-dimensional spacetime may indeed emerge at a special critical point in the space of couplings, where additional degrees of freedom and corresponding 'anomalous' Gauss-Bonnet constraints drop out from the Chern-Simons action. To achieve this goal, both the Lie algebra g and the symmetric g-invariant tensor that define the Chern-Simons Lagrangian are constructed by means of the Lie algebra S-expansion method with a suitable finite Abelian semigroup S. The results are generalized to arbitrary odd dimensions, and the possible extension to the case of eleven-dimensional supergravity is briefly discussed.
Optimal entropic uncertainty relation for successive measurements in quantum information theory
M D Srinivas
2003-06-01
We derive an optimal bound on the sum of entropic uncertainties of two or more observables when they are sequentially measured on the same ensemble of systems. This optimal bound is shown to be greater than or equal to the bounds derived in the literature on the sum of entropic uncertainties of two observables which are measured on distinct but identically prepared ensembles of systems. In the case of a two-dimensional Hilbert space, the optimal bound for successive measurements of two spin components is seen to be strictly greater than the optimal bound for the case when they are measured on distinct ensembles, except when the spin components are mutually parallel or perpendicular.
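For the qubit case mentioned at the end of this abstract, the successive-measurement entropies can be computed directly; the sketch below (standard quantum mechanics, not the paper's own derivation) measures sigma_z and then sigma_x on the collapsed state, and shows the entropy sum staying at or above the 1-bit distinct-ensemble (Maassen-Uffink) bound:

```python
import math

def shannon(probs):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def successive_entropies(theta):
    """State cos(theta)|0> + sin(theta)|1>; measure sigma_z first, then
    sigma_x on the collapsed state, and return H_z + H_x."""
    p0 = math.cos(theta) ** 2            # P(sigma_z = +1)
    h_z = shannon([p0, 1 - p0])
    # After a sigma_z measurement the state is |0> or |1>; either one gives
    # 50/50 outcomes for sigma_x, so the second entropy is exactly 1 bit.
    h_x = shannon([0.5, 0.5])
    return h_z + h_x

# Maassen-Uffink bound for sigma_z, sigma_x on distinct ensembles: 1 bit.
for theta in [0.0, 0.3, math.pi / 4]:
    total = successive_entropies(theta)
    print(f"theta={theta:.2f}: H_z + H_x = {total:.3f} bits (bound 1.0)")
```

The sum equals 1 bit only when the state is a sigma_z eigenstate, mirroring the abstract's remark about parallel or perpendicular spin components.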
Standard guide for corrosion-related failure analysis
American Society for Testing and Materials. Philadelphia
2000-01-01
1.1 This guide covers key issues to be considered when examining metallic failures when corrosion is suspected as either a major or minor causative factor. 1.2 Corrosion-related failures could include one or more of the following: change in surface appearance (for example, tarnish, rust, color change), pin hole leak, catastrophic structural failure (for example, collapse, explosive rupture, implosive rupture, cracking), weld failure, loss of electrical continuity, and loss of functionality (for example, seizure, galling, spalling, swelling). 1.3 Issues covered include overall failure site conditions, operating conditions at the time of failure, history of equipment and its operation, corrosion product sampling, environmental sampling, metallurgical and electrochemical factors, morphology (mode) of failure, and by considering the preceding, deducing the cause(s) of corrosion failure. This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibili...
Uncertainty Relations and Quantum Effects of Constraints in Chern-Simons Theory
Nakamura, M
2013-01-01
It is well known that Chern-Simons theories are constrained systems whose total Hamiltonians vanish identically because of their gauge invariance. When the constraints are treated quantum mechanically, quantum fluctuations due to the uncertainty principle are expected to remain. Using the projection operator method (POM) and the theory of dynamical constraints, such fluctuation terms are systematically derived in the case of the Abelian Chern-Simons theory. It is shown that these terms produce an effective mass in the complex scalar fields coupled to the CS fields.
Time-of-arrival distribution for arbitrary potentials and Wigner's time-energy uncertainty relation
Baute, A D; Palao, J P; Muga, J G; Egusquiza, I L
2000-01-01
A realization of the concept of "crossing state" invoked, but not implemented, by Wigner allows us to advance in two important aspects of the time of arrival in quantum mechanics: (i) For free motion, we find that the limitations described by Aharonov et al. in Phys. Rev. A 57, 4130 (1998) for the time-of-arrival uncertainty at low energies for certain measurement models are in fact already present in the intrinsic time-of-arrival distribution of Kijowski; (ii) we have also found a covariant generalization of this distribution for arbitrary potentials and positions.
Chart-Asa, Chidsanuphong; Gibson, Jacqueline MacDonald
2015-02-15
This paper develops and then demonstrates a new approach for quantifying health impacts of traffic-related particulate matter air pollution at the urban project scale that includes variability and uncertainty in the analysis. We focus on primary particulate matter having a diameter less than 2.5 μm (PM2.5). The new approach accounts for variability in vehicle emissions due to temperature, road grade, and traffic behavior variability; seasonal variability in concentration-response coefficients; demographic variability at a fine spatial scale; uncertainty in air quality model accuracy; and uncertainty in concentration-response coefficients. We demonstrate the approach for a case study roadway corridor with a population of 16,000, where a new extension of the University of North Carolina (UNC) at Chapel Hill campus is slated for construction. The results indicate that at this case study site, health impact estimates increased by factors of 4-9, depending on the health impact considered, compared to using a conventional health impact assessment approach that overlooks these variability and uncertainty sources. In addition, we demonstrate how the method can be used to assess health disparities. For example, in the case study corridor, our method demonstrates the existence of statistically significant racial disparities in exposure to traffic-related PM2.5 under present-day traffic conditions: the correlation between percent black and annual attributable deaths in each census block is 0.37 (t(114)=4.2, p<0.0001). Overall, our results show that the proposed new campus will cause only a small incremental increase in health risks (annual risk 6×10^-10; lifetime risk 4×10^-8), compared to if the campus is not built. Nonetheless, the approach we illustrate could be useful for improving the quality of information to support decision-making for other urban development projects.
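A stripped-down version of this kind of Monte Carlo health impact calculation might look as follows; the population, baseline rate, concentration increment and concentration-response coefficient are all hypothetical placeholders, not the study's values:

```python
import math
import random

random.seed(1)

# Illustrative health impact assessment with uncertainty propagation.
population = 16000
baseline_mortality = 0.008            # deaths per person-year (assumed)
delta_pm25 = 0.5                      # ug/m3 increment from the project (assumed)

# Concentration-response coefficient: ~0.6% mortality increase per ug/m3,
# with uncertainty expressed as a normal distribution (typical HIA practice).
beta_mean, beta_sd = 0.006, 0.002

draws = []
for _ in range(10000):
    beta = random.gauss(beta_mean, beta_sd)
    af = 1 - math.exp(-beta * delta_pm25)          # attributable fraction
    draws.append(population * baseline_mortality * af)

draws.sort()
print(f"median attributable deaths/yr: {draws[5000]:.3f}")
print(f"95% interval: ({draws[250]:.3f}, {draws[9750]:.3f})")
```

Reporting the full interval, rather than a single point estimate, is the essential difference from a conventional deterministic assessment.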
Hearing Aid–Related Standards and Test Systems
Ravn, Gert; Preves, David
2015-01-01
Many documents describe standardized methods and standard equipment requirements in the field of audiology and hearing aids. These standards ensure a uniform level and a high quality of both the methods and equipment used in audiological work. The standards create the basis for measuring performance in a reproducible manner, independently of how, when and by whom the parameters have been measured. This article explains, and focuses on, relevant acoustic and electromagnetic compatibility parameters and describes several test systems available. PMID:27516709
Post, W.M.; Dale, V.H.; DeAngelis, D.L.; Mann, L.K.; Mulholland, P.J.; O' Neill, R.V.; Peng, T.-H.; Farrell, M.P.
1990-01-01
The global carbon cycle is the dynamic interaction among the earth's carbon sources and sinks. Four reservoirs can be identified, including the atmosphere, terrestrial biosphere, oceans, and sediments. Atmospheric CO{sub 2} concentration is determined by characteristics of carbon fluxes among major reservoirs of the global carbon cycle. The objective of this paper is to document the knowns, and unknowns and uncertainties associated with key questions that if answered will increase the understanding of the portion of past, present, and future atmospheric CO{sub 2} attributable to fossil fuel burning. Documented atmospheric increases in CO{sub 2} levels are thought to result primarily from fossil fuel use and, perhaps, deforestation. However, the observed atmospheric CO{sub 2} increase is less than expected from current understanding of the global carbon cycle because of poorly understood interactions among the major carbon reservoirs. 87 refs.
Fox, Jesse; Anderegg, Courtney
2014-11-01
Due to their pervasiveness and unique affordances, social media play a distinct role in the development of modern romantic relationships. This study examines how a social networking site is used for information seeking about a potential or current romantic partner. In a survey, Facebook users (N=517) were presented with Facebook behaviors categorized as passive (e.g., reading a partner's profile), active (e.g., "friending" a common third party), or interactive (e.g., commenting on the partner's wall) uncertainty reduction strategies. Participants reported how normative they perceived these behaviors to be during four possible stages of relationship development (before meeting face-to-face, after meeting face-to-face, casual dating, and exclusive dating). Results indicated that as relationships progress, perceived norms for these behaviors change. Sex differences were also observed, as women perceived passive and interactive strategies as more normative than men during certain relationship stages.
Tanner, C.; Muehlebach, H. [Eidgenoessische Materialpruefungs- und Forschungsanstalt (EMPA), Abt. Bauphysik, Duebendorf (Switzerland)
2004-06-01
Meanwhile it is common knowledge that air tightness of a building envelope is a quality factor providing several advantages. Measurement, assessment and comparison of air permeability characteristics of buildings is far from being simple: on the one hand a variety of nationally defined non-compatible air permeability indices exist, on the other hand detailed regulations or instructions on the preparation of a building before measurement are missing in actual standards. For instance, the measuring zone is often not clearly defined, and the various options of tightening a mechanical ventilation system also yield large deviations in the results. Also the date of measurement has a substantial influence: a measurement with leakage detection in the construction phase is useful and desired for improvement. But it is not valid for approval since the building will be changed until completion. For those reasons questions arise about the value of the air permeability determination. In addition, regarding recent low energy labelling schemes demanding very low permeability rates for heat saving reasons, it is known nowadays that only a poor relation exists between the air exchange rate n{sub 50} and the actual ventilation heat loss of an occupied low energy building. Proposals for standardised procedures with regard to building preparation and data evaluation are being discussed these days. Objectives are to find simple, pragmatic solutions for the main problems and to make known more clearly the chances and limitations of the complex assessment method. (orig.) [German original:] It is undisputed that a building envelope that is as airtight as possible is a quality factor offering many advantages. When an air permeability measurement is carried out, assessing and comparing air change rates becomes difficult. On the one hand, different country-specific characteristic values are determined; on the other hand, the standards mostly lack detailed implementation rules for the …
2007-01-01
This paper addresses two major challenges in climate change impact analysis on water resources systems: (i) incorporation of a large range of potential climate change scenarios and (ii) quantification of related modelling uncertainties. The methodology of climate change impact modelling is developed and illustrated through application to a hydropower plant in the Swiss Alps that uses the discharge of a highly glacierised catchment. The potential climate change impacts are analysed in terms of system performance for the control period (1961–1990) and for the future period (2070–2099) under a range of climate change scenarios. The system performance is simulated through a set of four model types, including the production of regional climate change scenarios based on global-mean warming scenarios, the corresponding discharge model, the model of glacier surface evolution and the hydropower management model. The modelling uncertainties inherent in each model type are characterised and quantified separately. The overall modelling uncertainty is simulated through Monte Carlo simulations of the system behaviour for the control and the future period. The results obtained for both periods lead to the conclusion that potential climate change has a statistically significant negative impact on the system performance.
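The idea of propagating each model type's uncertainty through the chain by Monte Carlo can be sketched as below; the four stages and their error magnitudes are invented for illustration and do not correspond to the paper's calibrated values:

```python
import random
import statistics

random.seed(7)

# Each model type in the chain contributes an independent multiplicative
# error on the simulated system performance (relative std-devs assumed).
stages = {
    "climate_scenario": 0.15,
    "discharge_model": 0.10,
    "glacier_model": 0.08,
    "management_model": 0.05,
}

def simulate_performance(base):
    """One Monte Carlo realization of system performance through the chain."""
    perf = base
    for sd in stages.values():
        perf *= random.lognormvariate(0.0, sd)
    return perf

# Control vs future period; the assumed mean impact is a 20% performance loss.
control = [simulate_performance(1.0) for _ in range(20000)]
future = [simulate_performance(0.8) for _ in range(20000)]

diff = statistics.mean(future) - statistics.mean(control)
print(f"mean performance change: {diff:+.3f}")
```

Comparing the two Monte Carlo ensembles, rather than two single runs, is what allows the impact to be tested for statistical significance.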
Sun, Liang; Zheng, Zewei
2017-04-01
An adaptive relative pose control strategy is proposed for a pursuer spacecraft in proximity operations with a tumbling target. The relative position vector between the two spacecraft is required to point towards the docking port of the target, while their attitudes must be synchronized. Taking into account the thrust misalignment of the pursuer, an integrated controller for the relative translational and relative rotational dynamics is developed by using norm-wise adaptive estimations. Parametric uncertainties, unknown coupled dynamics, and bounded external disturbances are compensated online by adaptive update laws. It is proved via Lyapunov stability theory that the tracking errors of the relative pose converge to zero asymptotically. Numerical simulations including six degrees-of-freedom rigid body dynamics are performed to demonstrate the effectiveness of the proposed controller.
Matano, Francesca; Sambucini, Valeria
2016-11-01
In phase II single-arm studies, the response rate of the experimental treatment is typically compared with a fixed target value that should ideally represent the true response rate of the standard-of-care therapy. Generally, this target value is estimated from previous data, but the inherent variability in the historical response rate is not taken into account. In this paper, we present a Bayesian procedure to construct single-arm two-stage designs that allows uncertainty in the response rate of the standard treatment to be incorporated. In both stages, the sample size determination criterion is based on the concepts of conditional and predictive Bayesian power functions. Different kinds of prior distributions, which play different roles in the designs, are introduced, and some guidelines for their elicitation are described. Finally, some numerical results about the performance of the designs are provided and a real data example is illustrated. Copyright © 2016 John Wiley & Sons, Ltd.
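One way to express uncertainty in the standard-of-care response rate is a Beta prior combined with a beta-binomial predictive distribution; the sketch below is a generic illustration of that idea (the prior parameters, sample size and cutoff are hypothetical, and this is not the paper's actual design criterion):

```python
import math

def beta_binom_pmf(k, n, a, b):
    """Beta-binomial pmf: probability of k responses out of n when the
    response rate itself is drawn from a Beta(a, b) distribution."""
    return (math.comb(n, k)
            * math.exp(math.lgamma(a + k) + math.lgamma(b + n - k)
                       + math.lgamma(a + b)
                       - math.lgamma(a) - math.lgamma(b)
                       - math.lgamma(a + b + n)))

# Uncertainty about the standard-of-care response rate p0 is expressed as a
# Beta prior (assumed parameters) instead of a fixed target value.
a0, b0 = 6.0, 14.0          # prior for p0, mean 0.30
n, cutoff = 40, 17          # declare success if >= cutoff responses out of n

# Predictive probability that a treatment no better than the (uncertain)
# standard would still cross the cutoff: a Bayesian analogue of type I error.
false_go = sum(beta_binom_pmf(k, n, a0, b0) for k in range(cutoff, n + 1))
print(f"predictive false-go probability: {false_go:.4f}")
```

Averaging over the prior widens the tail relative to a fixed p0 = 0.30, which is precisely the historical-variability effect the design accounts for.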
Duffy, P.; Keller, M. M.; Morton, D. C.
2016-12-01
Carbon accounting for REDD+ requires knowledge of deforestation, degradation, and associated changes in forest carbon stocks. Degradation is more difficult to detect than deforestation, so SilvaCarbon, a US inter-agency effort, has made it a priority to better characterize forest degradation effects on carbon loss. By combining information from forest inventory and lidar data products, impacts of deforestation, degradation, and associated changes in forest carbon stocks can be more accurately characterized across space. Our approach employs a hierarchical Bayesian modeling (HBM) framework where the assimilation of information from multiple sources is accomplished using a change of support (COS) technique. The COS formulation allows data from multiple spatial resolutions to be assimilated into an intermediate resolution. This approach is being applied in Paragominas, a jurisdiction in the eastern Brazilian Amazon with a high proportion of logged and burned degraded forests where political change has opened the way for REDD+. We build on a long history of research including our extensive studies of logging damage. Our primary objective is to quantify above-ground carbon stocks and corresponding uncertainty in a spatially explicit manner. A secondary objective is to quantify the relative contribution of lower level data products to the overall uncertainty, allowing for more focused subsequent data collection in the context of uncertainty reduction. This approach provides a mechanism to assimilate information from multiple sources to produce spatially-explicit maps of carbon stocks and changes with corresponding spatially explicit maps of uncertainty. Importantly, this approach also provides a mechanism that can be used to assess the value of information from specific data products.
Levy, J.I.; Baxter, L.K.; Schwartz, J. [Harvard University, Cambridge, MA (United States). Harvard School of Public Health
2009-07-15
The health-related damages associated with emissions from coal-fired power plants can vary greatly across facilities as a function of plant, site, and population characteristics, but the degree of variability and the contributing factors have not been formally evaluated. In this study, we modeled the monetized damages associated with 407 coal-fired power plants in the United States, focusing on premature mortality from fine particulate matter (PM2.5). We applied a reduced-form chemistry-transport model accounting for primary PM2.5 emissions and the influence of sulfur dioxide (SO{sub 2}) and nitrogen oxide (NOx) emissions on secondary particulate formation. Outputs were linked with a concentration-response function for PM2.5-related mortality that incorporated nonlinearities and model uncertainty. We valued mortality with a value of statistical life approach, characterizing and propagating uncertainties in all model elements. At the median of the plant-specific uncertainty distributions, damages across plants ranged from $30,000 to $500,000 per ton of PM2.5, $6,000 to $50,000 per ton of SO{sub 2}, $500 to $15,000 per ton of NOx, and $0.02 to $1.57 per kilowatt-hour of electricity generated. Variability in damages per ton of emissions was almost entirely explained by population exposure per unit emissions (intake fraction), which itself was related to atmospheric conditions and the population size at various distances from the power plant. Variability in damages per kilowatt-hour was highly correlated with SO{sub 2} emissions, related to fuel and control technology characteristics, but was also correlated with atmospheric conditions and population size at various distances.
New extended standard model, dark matters and relativity theory
Hwang, Jae-Kwang
2016-03-01
Three-dimensional quantized space model is newly introduced as the extended standard model. Four three-dimensional quantized spaces with total 12 dimensions are used to explain the universes including ours. Electric (EC), lepton (LC) and color (CC) charges are defined to be the charges of the x1x2x3, x4x5x6 and x7x8x9 warped spaces, respectively. Then, the lepton is the xi(EC) - xj(LC) correlated state which makes 3x3 = 9 leptons and the quark is the xi(EC) - xj(LC) - xk(CC) correlated state which makes 3x3x3 = 27 quarks. The new three bastons with the xi(EC) state are proposed as the dark matters seen in the x1x2x3 space, too. The matter universe question, three generations of the leptons and quarks, dark matter and dark energy, hadronization, the big bang, quantum entanglement, quantum mechanics and general relativity are briefly discussed in terms of this new model. The details can be found in the article titled "Journey into the universe; three-dimensional quantized spaces, elementary particles and quantum mechanics" at https://www.researchgate.net/profile/J_Hwang2.
Ankrum, A.R.; Bohlander, K.L.; Gilbert, E.R.; Spiesman, J.B. [Pacific Northwest Lab., Richland, WA (United States)
1995-11-01
This report provides the results of comparisons of the cited and latest versions of ANS, ASME, AWS and NFPA standards cited in the NRC Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants (NUREG 0800) and related documents. The comparisons were performed by Battelle Pacific Northwest Laboratories in support of the NRC's Standard Review Plan Update and Development Program. Significant changes to the standards, from the cited version to the latest version, are described and discussed in a tabular format for each standard. Recommendations for updating each citation in the Standard Review Plan are presented. Technical considerations and suggested changes are included for related regulatory documents (i.e., Regulatory Guides and the Code of Federal Regulations) citing the standard. The results and recommendations presented in this document have not been subjected to NRC staff review.
Uslar, Mathias; Specht, Michael; Daenekas, Christian; Trefke, Joern; Rohjans, Sebastian; Gonzalez, Jose M.; Rosinger, Christine; Bleiker, Robert [OFFIS - Institut fuer Informatik, Oldenburg (Germany)
2013-03-01
This book offers an introduction to standardization for Smart Grids, presenting tutorials and best practices from Smart Grid prototype projects, written by leading experts in the field. Besides the regulatory and market aspects, the technical level, dealing with knowledge from multiple disciplines and with the technical system integration needed to achieve interoperability, has been a strong focus in the Smart Grid. This topic is typically covered by means of (technical) standards for processes, data models, functions and communication links. Standardization is a key issue for Smart Grids due to the involvement of many different sectors along the value chain from generation to appliances. The scope of the Smart Grid is broad; therefore, the standards landscape is unfortunately very large and complex. This is why the three European Standards Organizations ETSI, CEN and CENELEC created a so-called Joint Working Group (JWG). This was the first harmonized effort in Europe to bring together the needed disciplines and experts, delivering the final report in May 2011. After this approach proved useful, the Commission used Mandate M/490: Standardization Mandate to European Standardization Organizations (ESOs) to support European Smart Grid deployment. The focal point addressing the ESOs' response to M/490 will be the CEN, CENELEC and ETSI Smart Grids Coordination Group (SG-CG). Based on this mandate, meaningful standardization of architectures, use cases, communication technologies, data models and security standards takes place in the four existing working groups. This book provides an overview of the various building blocks and standards identified as the most prominent ones by the JWG report as well as by the first set of standards: IEC 61850 and CIM, IEC PAS 62559 for documenting Smart Grid use cases, security requirements from the SGIS group, and an introduction on how to apply the Smart Grid Architecture Model (SGAM) for utilities. In addition
Binary trading relations and the limits of EDI standards
Damsgaard, Jan; Truex, D.
2000-01-01
This paper provides a critical examination of electronic data interchange (EDI) standards and their application in different types of trading relationships. It argues that EDI standards are not directly comparable to more stable sets of technical standards in that they are dynamically tested...... and negotiated in use with each trading exchange. It takes the position that EDI standards are an emergent language form and must mean different things at the institutional and local levels. Using the lens of emergent linguistic analysis it shows how the institutional and local levels must always be distinct...
Cacais, F.L.; Delgado, J.U., E-mail: facacais@gmail.com [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Loayza, V.M. [Instituto Nacional de Metrologia (INMETRO), Rio de Janeiro, RJ (Brazil). Qualidade e Tecnologia
2016-07-01
In preparing solutions for the production of radionuclide metrology standards, it is necessary to measure the quantity activity per unit mass. The gravimetric method by elimination is applied to perform weighings with smaller uncertainties. In this work, the uncertainty calculation approach implemented by Lourenco and Bobin for the method by elimination, following the ISO GUM, is validated by the Monte Carlo method. The results obtained by the two uncertainty calculation methods were consistent, indicating that the conditions for applying the ISO GUM in the preparation of radioactive standards were fulfilled. (author)
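The comparison of the two uncertainty calculation routes can be illustrated on the core of the elimination method, the difference of two balance readings; the numbers below are invented, and the code simply checks that GUM propagation and a Monte Carlo simulation agree:

```python
import math
import random
import statistics

random.seed(3)

# Illustrative weighing by elimination: the dispensed solution mass is the
# difference of two balance readings (hypothetical values in grams).
m_before, m_after = 10.41234, 8.20567
u_reading = 2e-5                       # std uncertainty per reading, g

# ISO GUM propagation for m = m_before - m_after (sensitivity coefficients +/-1):
u_gum = math.sqrt(2) * u_reading

# Monte Carlo validation (JCGM 101 style): simulate both readings with their
# assumed normal errors and take the spread of the resulting differences.
draws = [
    (m_before + random.gauss(0, u_reading)) - (m_after + random.gauss(0, u_reading))
    for _ in range(200000)
]
u_mc = statistics.stdev(draws)

print(f"u_GUM = {u_gum:.2e} g, u_MC = {u_mc:.2e} g")
```

For this linear, Gaussian model the two routes must agree; the Monte Carlo route remains valid when the model is nonlinear or the inputs non-Gaussian, which is why it serves as the validation tool.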
Large uncertainty in soil carbon modelling related to carbon input calculation method
Keel, Sonja G.; Leifeld, Jens; Taghizadeh-Toosi, Arezoo; Oleson, Jørgen E.
2016-04-01
A model-based inventory for carbon (C) sinks and sources in agricultural soils is being established for Switzerland. As part of this project, five frequently used allometric equations that estimate soil C inputs based on measured yields are compared. To evaluate the different methods, we calculate soil C inputs for a long-term field trial in Switzerland. This DOK experiment (bio-Dynamic, bio-Organic, and conventional (German: Konventionell)) compares five different management systems applied to identical crop rotations. Average calculated soil C inputs vary largely between allometric equations and range from 1.6 t C ha-1 yr-1 to 2.6 t C ha-1 yr-1. Among the most important crops in Switzerland, the uncertainty is largest for barley (difference between highest and lowest estimate: 3.0 t C ha-1 yr-1). For the unfertilized control treatment, the estimated soil C inputs vary less between allometric equations than for the treatment that received mineral fertilizer and farmyard manure. Most likely this is because of the higher yields in the latter treatment, i.e. the difference between methods may be amplified because yields differ more. To evaluate the influence of these allometric equations on soil C dynamics, we simulate the DOK trial for the years 1977-2004 using the model C-TOOL (Taghizadeh-Toosi et al. 2014) and the five different soil C input calculation methods. Across all treatments, C-TOOL simulates a decrease in soil C, in line with the experimental data. This decline, however, varies between allometric equations (-2.4 t C ha-1 to -6.3 t C ha-1 for the years 1977-2004) and has the same order of magnitude as the difference between treatments. In summary, the method used to estimate soil C inputs is identified as a significant source of uncertainty in soil C modelling. Choosing an appropriate allometric equation to derive the input data is thus a critical step when setting up a model-based national soil C inventory. References Taghizadeh-Toosi A et al. (2014) C
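The sensitivity to the choice of allometric equation can be illustrated with two simplified schemes; the coefficients below are hypothetical and are not the five equations compared in the study:

```python
# Two simplified allometric schemes for estimating soil C inputs from yield.
# All coefficients are illustrative assumptions.

def inputs_fixed_ratio(yield_c):
    """Soil C input as a fixed proportion of harvested C."""
    return 0.45 * yield_c

def inputs_harvest_index(yield_c, hi=0.45, root_shoot=0.25, extra_root_c=0.4):
    """Bolinder-style scheme: above-ground residues derived from the harvest
    index, plus roots and an extra-root (exudate) fraction."""
    shoot_c = yield_c / hi               # total above-ground C
    residue_c = shoot_c - yield_c        # straw/stubble left on the field
    root_c = shoot_c * root_shoot * (1 + extra_root_c)
    return residue_c + root_c

yield_c = 2.0   # t C/ha/yr harvested; hypothetical barley-like yield
estimates = [inputs_fixed_ratio(yield_c), inputs_harvest_index(yield_c)]
spread = max(estimates) - min(estimates)
print(f"estimates: {[round(e, 2) for e in estimates]} t C/ha/yr, "
      f"spread: {spread:.2f} t C/ha/yr")
```

Note how the spread grows linearly with yield in this sketch, consistent with the abstract's observation that the fertilized treatment shows larger between-method differences.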
Nikulin, Grigory; Bosshard, Thomas; Wilcke, Renate; Yang, Wei; Kjellström, Erik; Bärring, Lars
2015-04-01
Bias adjustment has become an integral part of the pre-processing of climate simulations for use in impact modeling studies. Now considered a necessary step to deal with the inability of climate models to accurately simulate the present/recent climate, bias adjustment is a statistical approach that lacks physical justification. Even though bias adjustment is widely used nowadays, it is still a topic of debate and criticism. One of the main questions is: what level of uncertainty does bias adjustment introduce into future climate projections? In this study, using an ensemble of the CORDEX-Africa simulations, we investigate the potential impact of bias adjustment on the simulated rainy season in West Africa. A number of characteristics reflecting different aspects of the rainy season are used, namely: onset and cessation of the rainy season, mean intensity, total amount of precipitation and intra-seasonal variability within the rainy season. All these characteristics are first evaluated in the original and bias-adjusted CORDEX-Africa simulations for a reference period, and then the future climate projections of these characteristics are compared between the two ensembles. Additionally, we examine how bias adjustment may affect the selection of a smaller, more manageable ensemble of regional climate simulations from a grand ensemble.
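Empirical quantile mapping is one widely used bias-adjustment technique (shown here only as an example of the family of methods under discussion; the abstract does not state which method was applied). It replaces each model value with the observed value at the same quantile of the reference-period distribution, so a constant bias is removed exactly. A minimal numpy sketch:

```python
import numpy as np

def quantile_map(model_ref, obs_ref, x):
    """Map values x from model space to observation space via empirical CDFs."""
    model_sorted = np.sort(np.asarray(model_ref))
    # empirical non-exceedance probability of x in the model reference climate
    p = np.searchsorted(model_sorted, np.asarray(x), side="right") / model_sorted.size
    p = np.clip(p, 0.0, 1.0)
    # observed value at the same quantile
    return np.quantile(np.sort(obs_ref), p)

# Synthetic check: a model with a constant +2 mm/day wet bias
obs = np.linspace(0.0, 10.0, 101)
model = obs + 2.0
print(float(quantile_map(model, obs, 7.0)))  # ~5.0 (bias removed)
```

Applying such a mapping trained on the reference period to future projections is exactly the step whose downstream effect on rainy-season diagnostics the study evaluates.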
Sanchez-Vila, X.; de Barros, F.; Bolster, D.; Nowak, W.
2010-12-01
Assessing the potential risk that hydro(geo)logical supply systems pose to human populations is an interdisciplinary task. It relies on expertise in fields as distant as hydrogeology, medicine and anthropology, and needs powerful translation concepts to support decision making and policy. Reliable health risk estimates need to account for the uncertainties in hydrological, physiological and human behavioral parameters. We propose the use of fault trees to address the task of probabilistic risk analysis (PRA) and to support related management decisions. Fault trees allow the assessment of health risk to be decomposed into individual manageable modules, thus tackling a complex system by a structural “Divide and Conquer” approach. The complexity within each module can be chosen individually according to data availability, parsimony, relative importance and stage of analysis. The separation into modules allows for a truly inter- and multi-disciplinary approach. This presentation highlights the three novel features of our work: (1) we define failure in terms of risk being above a threshold value, whereas previous studies used auxiliary events such as exceedance of critical concentration levels; (2) we build an integrated fault tree that handles uncertainty in both the hydrological and health components in a unified way; and (3) we introduce a new form of stochastic fault tree that allows us to weaken the assumption of independent subsystems required by the classical fault tree approach. We illustrate our concept in a simple groundwater-related setting.
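For independent basic events, classical fault-tree gate probabilities combine as products (AND) and complements of products (OR). A toy sketch of a top event "health risk above threshold" built from modules like those described above (the structure and all probabilities are invented for illustration, not taken from the presentation):

```python
from math import prod

def p_and(*ps):
    """All sub-events must occur (independence assumed)."""
    return prod(ps)

def p_or(*ps):
    """At least one sub-event occurs (independence assumed)."""
    return 1.0 - prod(1.0 - p for p in ps)

# Hypothetical modules: contaminant reaches well, exposure occurs, dose-response
p_plume = 0.10                   # hydrogeological module
p_exposure = p_or(0.05, 0.02)    # drinking-water OR irrigation pathway
p_response = 0.30                # physiological module
p_top = p_and(p_plume, p_exposure, p_response)
print(round(p_top, 5))  # 0.00207
```

The stochastic fault tree proposed in the abstract relaxes exactly the independence assumption baked into `p_and`/`p_or` here.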
Daugman, J G
1985-07-01
Two-dimensional spatial linear filters are constrained by general uncertainty relations that limit their attainable information resolution for orientation, spatial frequency, and two-dimensional (2D) spatial position. The theoretical lower limit for the joint entropy, or uncertainty, of these variables is achieved by an optimal 2D filter family whose spatial weighting functions are generated by exponentiated bivariate second-order polynomials with complex coefficients, the elliptic generalization of the one-dimensional elementary functions proposed in Gabor's famous theory of communication [J. Inst. Electr. Eng. 93, 429 (1946)]. The set includes filters with various orientation bandwidths, spatial-frequency bandwidths, and spatial dimensions, favoring the extraction of various kinds of information from an image. Each such filter occupies an irreducible quantal volume (corresponding to an independent datum) in a four-dimensional information hyperspace whose axes are interpretable as 2D visual space, orientation, and spatial frequency, and thus such a filter set could subserve an optimally efficient sampling of these variables. Evidence is presented that the 2D receptive-field profiles of simple cells in mammalian visual cortex are well described by members of this optimal 2D filter family, and thus such visual neurons could be said to optimize the general uncertainty relations for joint 2D-spatial-2D-spectral information resolution. The variety of their receptive-field dimensions and orientation and spatial-frequency bandwidths, and the correlations among these, reveal several underlying constraints, particularly in width/length aspect ratio and principal axis organization, suggesting a polar division of labor in occupying the quantal volumes of information hyperspace.
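The optimal filters described here are 2D Gabor functions: a Gaussian envelope multiplied by a sinusoidal carrier. A minimal numpy sketch of the real (cosine) part with illustrative parameters (the parameter values are arbitrary, not fitted receptive-field data):

```python
import numpy as np

def gabor_2d(size, wavelength, theta, sigma, gamma=0.5):
    """Real part of a 2D Gabor filter (Gaussian envelope x cosine carrier)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    # rotate into the filter's principal-axis coordinates
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

g = gabor_2d(size=31, wavelength=8.0, theta=np.pi / 4, sigma=5.0)
print(g.shape, float(g[15, 15]))  # (31, 31) 1.0 at the center
```

Varying `wavelength`, `theta`, `sigma` and the aspect parameter `gamma` sweeps out exactly the family of orientation, spatial-frequency and aspect-ratio trade-offs the abstract discusses.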
Koerner, Naomi; Mejia, Teresa; Kusec, Andrea
2017-03-01
A number of studies have examined the association of intolerance of uncertainty (IU) to trait worry and generalized anxiety disorder (GAD). However, few studies have examined the extent of overlap between IU and other psychological constructs that bear conceptual resemblance to IU, despite the fact that IU-type constructs have been discussed and examined extensively within psychology and other disciplines. The present study investigated (1) the associations of IU, trait worry, and GAD status to a negative risk orientation, trait curiosity, indecisiveness, perceived constraints, self-oriented and socially prescribed perfectionism, intolerance of ambiguity, the need for predictability, and the need for order and structure and (2) whether IU is a unique correlate of trait worry and of the presence versus absence of Probable GAD when overlap with other uncertainty-relevant constructs is accounted for. N = 255 adults completed self-report measures of the aforementioned constructs. Each of the constructs was significantly associated with IU. Only IU and a subset of the other uncertainty-relevant constructs were correlated with trait worry or distinguished the Probable GAD group from the Non-GAD group. IU was the strongest unique correlate of trait worry and of the presence versus absence of Probable GAD. Indecisiveness, self-oriented perfectionism and the need for predictability were also unique correlates of trait worry or GAD status. Implications of the findings are discussed, in particular as they pertain to the definition, conceptualization, and cognitive-behavioral treatment of IU in GAD.
Setting Minimum Standards for Measuring Public Relations Effectiveness.
Lindenmann, Walter K.
1997-01-01
Reviews and discusses the new 28-page booklet defining "Guidelines and Standards for Measuring and Evaluating PR Effectiveness." States that it is the result of a nine-month project carried out by an eight-member task force. (PA)
梁丽军; 田琳琳; 薛锦锋; 沈磊
2016-01-01
Objective: To evaluate the uncertainty in the determination of ethanol in human blood by automated headspace gas chromatography (HS-GC) with an internal standard curve method. Methods: Each source of uncertainty arising from the testing procedure was analyzed and confirmed according to the guidelines on uncertainty in measurement. After each uncertainty component was quantified, the combined standard uncertainty and the expanded uncertainty of the result were calculated. Results: The relative uncertainties contributed by measurement repeatability, the ethanol standard solution, the blood sample, the tert-butyl alcohol internal standard solution, the calibration curve and the gas chromatograph were 3.4%, 0.71%, 0.61%, 0.41%, 1.1% and 1.3%, respectively; the relative expanded uncertainty of ethanol in blood was 3.9%. Conclusion: The measurement uncertainty of the ethanol concentration came primarily from measurement repeatability, the gas chromatograph and the ethanol calibration curve.
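The combined relative uncertainty is obtained by adding the relative components in quadrature. Taking the six components reported above at face value (the abstract does not state whether they are standard or expanded contributions, so this is shown as an arithmetic check only), their root-sum-square reproduces the reported 3.9% figure:

```python
from math import sqrt

# Relative uncertainty components (%) listed in the abstract
components = {
    "repeatability": 3.4,
    "ethanol standard solution": 0.71,
    "blood sample": 0.61,
    "internal standard (tert-butyl alcohol)": 0.41,
    "calibration curve": 1.1,
    "gas chromatograph": 1.3,
}
u_combined = sqrt(sum(v**2 for v in components.values()))
print(round(u_combined, 1))  # 3.9
```

The quadrature rule also makes the conclusion visible at a glance: the 3.4% repeatability term dominates, since squaring amplifies the largest component.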
Scott, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Daly, Don S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hathaway, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lansing, Carina S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Ying [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McJeon, Haewon C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Moss, Richard H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Patel, Pralit L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Peterson, Marty J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rice, Jennie S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhou, Yuyu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2014-12-06
This report presents data and assumptions employed in an application of PNNL’s Global Change Assessment Model with a newly-developed Monte Carlo analysis capability. The model is used to analyze the impacts of more aggressive U.S. residential and commercial building-energy codes and equipment standards on energy consumption and energy service costs at the state level, explicitly recognizing uncertainty in technology effectiveness and cost, socioeconomics, presence or absence of carbon prices, and climate impacts on energy demand. The report provides a summary of how residential and commercial buildings are modeled, together with assumptions made for the distributions of state–level population, Gross Domestic Product (GDP) per worker, efficiency and cost of residential and commercial energy equipment by end use, and efficiency and cost of residential and commercial building shells. The cost and performance of equipment and of building shells are reported separately for current building and equipment efficiency standards and for more aggressive standards. The report also details assumptions concerning future improvements brought about by projected trends in technology.
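Monte Carlo analysis of the kind described draws each uncertain input from an assumed distribution and recomputes the outcome many times, yielding a distribution rather than a point estimate. A minimal sketch with invented distributions standing in for the report's state-level inputs (the variable names, distributions and units are illustrative assumptions, not GCAM data):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Invented illustrative distributions (not the report's actual assumptions)
efficiency = rng.triangular(0.8, 1.0, 1.2, size=n)      # equipment efficiency index
floorspace = rng.normal(100.0, 10.0, size=n)            # building stock index
intensity = rng.lognormal(mean=0.0, sigma=0.1, size=n)  # energy-use intensity index

energy_demand = floorspace * intensity / efficiency
lo, hi = np.percentile(energy_demand, [5, 95])
print(f"90% interval: {lo:.1f} - {hi:.1f}")
```

Tightening the efficiency distribution (e.g. simulating a more aggressive equipment standard) and re-running shows directly how the uncertainty in the demand outcome shrinks or shifts.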
Barbara Mickowska
2013-02-01
The aim of this study was to assess the importance of validation and uncertainty estimation for the results of amino acid analysis using ion-exchange chromatography with post-column derivatization. The method was validated, and the components of standard uncertainty were identified and quantified to recognize the major contributions to the uncertainty of the analysis. The estimated relative expanded uncertainty (k = 2, P = 95%) varied from 9.03% to 12.68%. Quantification of the uncertainty components indicates that the contribution of the calibration concentration uncertainty is the largest and plays the most important role in the overall uncertainty of amino acid analysis. It is followed by the uncertainty of the chromatographic peak areas and of the sample weighing procedure. The uncertainty of sample volume and calibration peak area may be negligible. The comparison of CV% with the estimated relative uncertainty indicates that the interpretation of research results can be misleading without uncertainty estimation.
Al-Hashimi, M H
2012-01-01
We consider a 1-parameter family of self-adjoint extensions of the Hamiltonian for a particle confined to a finite interval with perfectly reflecting boundary conditions. In some cases one obtains negative energy states, which seem to violate the Heisenberg uncertainty relation. We use this as a motivation to derive a generalized uncertainty relation valid for an arbitrarily shaped quantum dot with general perfectly reflecting walls in $d$ dimensions. In addition, a general uncertainty relation for non-Hermitian operators is derived and applied to the non-Hermitian momentum operator in a quantum dot. We also consider minimal uncertainty wave packets in this situation, and we prove that the spectrum depends monotonically on the self-adjoint extension parameter. In addition, we construct the most general boundary conditions for semiconductor heterostructures such as quantum dots, quantum wires and quantum wells, which are characterized by a 4-parameter family of self-adjoint extensions. Finally, we consider p...
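For reference, the standard Robertson relation that such work generalizes, together with the natural way a variance is defined for a non-Hermitian operator $A$ (this is a hedged sketch of the standard construction; the paper's precise generalized relation may differ in form):

```latex
\Delta A \,\Delta B \;\ge\; \tfrac{1}{2}\,\bigl|\langle \psi | [A,B] | \psi \rangle\bigr|,
\qquad
(\Delta A)^2 \;=\; \langle \psi |\, (A - \langle A \rangle)^{\dagger} (A - \langle A \rangle) \,| \psi \rangle .
```

With the adjoint $A^{\dagger}$ appearing explicitly, $(\Delta A)^2$ remains real and non-negative even when $A$ is not Hermitian, which is what makes an uncertainty relation for the non-Hermitian momentum operator in a bounded domain meaningful.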
Investigative study of standards for Digital Repositories and related services
Foulonneau, Muriel; André, Francis
2007-01-01
This study is meant for institutional repository managers, service providers, repository software developers and, generally, all players taking an active part in the creation of the digital repository infrastructure for e-research and e-learning. It reviews the current standards, protocols and applications.
Zhu, T.; Vasiliev, A.; Ferroukhi, H.; Pautz, A.
2014-04-01
At the Paul Scherrer Institute (PSI), a methodology named PSI-NUSS is under development for the propagation of nuclear data uncertainties into Criticality Safety Evaluation (CSE) with the Monte Carlo code MCNPX. The primary purpose is to provide a complementary option for the assessment of nuclear-data-related uncertainty, versus the traditional approach, which relies on estimating biases/uncertainties from validation studies against representative critical benchmark experiments. In the present paper, the PSI-NUSS methodology is applied to quantify nuclear data uncertainties for the OECD/NEA UACSA Exercise Phase I benchmark. One underlying reason is that PSI's CSE methodology developed so far, and previously applied to this benchmark, was based on a more conventional approach involving engineering guesses to estimate uncertainties in the calculated effective multiplication factor (keff). Since the PSI-NUSS methodology aims precisely at integrating a more rigorous treatment of nuclear data uncertainties into CSE, it is applied to the UACSA benchmark here: the nuclear-data-related uncertainty component is estimated and compared to results obtained by other participants using different codes, libraries and methodologies.
On The Uncertainties Related To The Estimation Of Extreme Environmental Condition
Burcharth, Hans F.
1986-01-01
The calculation of the forces on the structural members of a structure in the sea is based on knowledge of the kinematics of the surrounding water and air. Therefore our goal is to establish some statistics for the related velocity and acceleration fields....
Lo Bianco, A.S.; Oliveira, H.P.S.; Peixoto, J.G.P., E-mail: abianco@ird.gov.b [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. Nacional de Metrologia das Radiacoes Ionizantes (LNMRI)
2009-07-01
To establish the primary standard of the quantity air kerma for X-rays between 10 and 50 keV, the National Metrology Laboratory of Ionizing Radiations (LNMRI) must evaluate all measurement uncertainties associated with the Victoreen chamber. The uncertainty in air kerma resulting from inaccuracy in the active volume of the chamber was therefore evaluated by Monte Carlo calculation using the PENELOPE code.
Ramounet-Le Gall, B; Rateau, G; Abram, M C; Grillon, G; Ansoborlo, E; Bérard, P; Delforge, J; Fritsch, P
2003-01-01
The aim of this study was to compare dissolution parameter values for Pu from industrial MOX with different Pu contents. For this purpose, preliminary results obtained after inhalation exposure of rats to MOX containing 2.5% Pu are reported and compared to those obtained previously with MOX containing 5% Pu. Dissolution parameter values appear to increase when the amount of Pu decreases. Rapid fractions, f_r, of 4 × 10⁻³ (s.d. = 2 × 10⁻³) and 1 × 10⁻³ (s.d. = 6 × 10⁻⁴), and slow dissolution rates, s_s, of 2 × 10⁻⁴ d⁻¹ (standard deviation, σ = 5 × 10⁻⁵) and 5 × 10⁻⁵ d⁻¹ (σ = 1 × 10⁻⁵), were derived for MOX containing 2.5% and 5% Pu, respectively. Simulations were performed to assess uncertainties on dose due to experimental errors. The relative standard deviations of the dose per unit intake (DPUI) due to f_r (4-8%) are far less than those due to s_s (about 20%), which is the main parameter affecting the dose. Although quite different dissolution parameter values were derived, similar DPUIs were obtained for MOX aerosols containing 2.5% and 5% Pu, which appear close to those for default Type S values.
The Rhetoric of Arrogance: The Public Relations Response of the Standard Oil Trust.
Boyd, Josh
2001-01-01
Illustrates one of the earliest American public relations debacles (ending in the dissolution of the Standard Oil Trust in 1911). Presents background on Standard Oil and offers an overview of Ida Tarbell's influential "History of the Standard Oil Company." Argues that Standard failed to respond to these accounts adequately, reinforcing…
Radożycki, Tomasz
2016-11-01
The probability density distributions for the ground states of certain model systems in quantum mechanics, and for their classical counterparts, are considered. It is shown that the classical distributions are remarkably improved by incorporating into them the Heisenberg uncertainty relation between position and momentum. Even a crude form of this incorporation makes the agreement between the classical and quantum distributions unexpectedly good, except for a small region where the classical momenta are large. It is demonstrated that a slight refinement of this form makes the classical distribution very similar to the quantum one over the whole space. The results obtained are much better than those from the WKB method. The paper is devoted to ground states, but the method applies to excited states too.
Investigative study of standards for digital repositories and related services
Foulonneau, Muriel; Badolato, Anne-Marie
2008-01-01
This study is meant for institutional repository managers, service providers, repository software developers and generally, all players taking an active part in the creation of the digital repository infrastructure for e-research and e-learning. It reviews the current standards, protocols and applications in the domain of digital repositories. Special attention is being paid to the interoperability of repositories to enhance the exchange of data in repositories. It aims to stimulate discussion about these topics and supports initiatives for the integration of and, where needed, development of
Hukkerikar, Amol; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M.; Sin, Gürkan; Gani, Rafiqul
2012-01-01
The aim of this work is to develop property models based on the group-contribution+ (GC+) method (a combined group-contribution (GC) and atom connectivity index (CI) method) to provide reliable estimates of 15 environment-related properties of organic chemicals, together with the uncertainties of the estimated property values. For this purpose, a systematic methodology for property modeling and uncertainty analysis is used. The methodology includes a parameter estimation step to determine parameters of pro...
Xing Chen
2016-01-01
Despite the importance of the adoption of mobile health services by organizations for the diffusion of mobile technology in the big data era, the topic has received minimal attention in the literature. This study investigates how relative advantage and perceived credibility affect an organization's adoption of mobile health services, and how environmental uncertainty changes the relationship of relative advantage and perceived credibility with adoption. A research model that integrates relative advantage, perceived credibility, environmental uncertainty and an organization's intention to use mobile health services is developed. Quantitative data were collected from senior managers and information systems managers in 320 Chinese healthcare organizations. The empirical findings show that while relative advantage and perceived credibility both have positive effects on an organization's intention to use mobile health services, relative advantage plays a more important role than perceived credibility. Moreover, environmental uncertainty positively moderates the effect of relative advantage on an organization's adoption of mobile health services. Thus, in environments characterized by high levels of uncertainty, mobile health services are more likely to be adopted because of relative advantage than in environments with low levels of uncertainty.
Wang, Gang; Zhang, Zhonghua; Li, Zhengkun; Xu, Jinxin; You, Qiang
2016-02-01
Measurement of the mutual inductance is one of the key techniques in the joule balance used to determine the Planck constant h; a standard-square-wave compensation method was previously proposed to accurately measure the dc value of the mutual inductance. With this method, analog switches are used to build an analog-switch signal generator that synthesizes the excitation and compensation voltages. However, the accuracy of the compensation voltage is influenced by the non-ideal behavior of the analog switches. In this paper, the effect of these non-ideal switches is analyzed in detail and evaluated with equivalent circuits. A programmable Josephson voltage standard (PJVS) is used to generate a reference compensation voltage to measure the time integration of the voltage waveform generated by the analog-switch signal generator. Moreover, the effect is also evaluated experimentally by comparing the mutual inductance measured with the analog-switch signal generator against the value determined with the PJVS and analog-switch generator used alternately in the same mutual inductance measurement system. The result shows that the impact of the analog switches is 1.97 × 10⁻⁷ with an uncertainty of 1.83 × 10⁻⁷ (k = 1), confirming that the analog-switch method can be used routinely instead of the PJVS in the mutual inductance measurement for the joule balance experiment.
A consolidated and standardized relational database for ER data
Zygmunt, B.C.
1995-12-01
The three US Department of Energy (DOE) installations on the Oak Ridge Reservation (ORR) (Oak Ridge National Laboratory, Y-12, and K-25) were established during World War II as part of the Manhattan Project that "built the bomb." That research, and work in more recent years, has resulted in the generation of radioactive materials and other toxic wastes. Lockheed Martin Energy Systems manages the three Oak Ridge installations (as well as the Environmental Restoration (ER) programs at the DOE plants in Portsmouth, Ohio, and Paducah, Kentucky). DOE Oak Ridge Operations has been mandated by federal and state agreements to provide a consolidated repository of environmental data and is tasked to support environmental data management activities at all five installations. The Oak Ridge Environmental Information System (OREIS) was initiated to fulfill these requirements. The primary use of OREIS data is to provide access to project results by regulators. A secondary use is to serve as background data for other projects. This paper discusses the benefits of a consolidated and standardized database; reasons for resistance to the consolidation of data; implementing a consolidated database, including attempts at standardization, deciding what to include in the consolidated database, establishing lists of valid values, and addressing quality control (QC) issues; and the evolution of a consolidated database, which includes developing and training a user community, dealing with configuration control issues, and incorporating historical data. OREIS is used to illustrate these topics.
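Enforcing lists of valid values, one of the standardization tasks mentioned above, can be sketched as a simple QC check. The field names and value lists below are invented for illustration; they are not OREIS schema elements:

```python
# Hypothetical valid-value lists for a consolidated environmental database
VALID_VALUES = {
    "installation": {"ORNL", "Y-12", "K-25", "Portsmouth", "Paducah"},
    "medium": {"soil", "groundwater", "surface water", "sediment"},
}

def qc_check(record):
    """Return (field, bad_value) pairs where the record violates a valid-value list."""
    return [(f, record.get(f)) for f, allowed in VALID_VALUES.items()
            if record.get(f) not in allowed]

good = {"installation": "ORNL", "medium": "soil"}
bad = {"installation": "Oak Ridge", "medium": "soil"}
print(qc_check(good))  # []
print(qc_check(bad))   # [('installation', 'Oak Ridge')]
```

Centralizing such lists is precisely what lets five installations' historical data be merged without each site's local spellings proliferating.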
Fontenot, Jonas D.; Bloch, Charles; Followill, David; Titt, Uwe; Newhauser, Wayne D.
2010-12-01
Theoretical calculations have shown that proton therapy can reduce the incidence of radiation-induced secondary malignant neoplasms (SMN) compared with photon therapy for patients with prostate cancer. However, the uncertainties associated with calculations of SMN risk had not been assessed. The objective of this study was to quantify the uncertainties in projected risks of secondary cancer following contemporary proton and photon radiotherapies for prostate cancer. We performed a rigorous propagation of errors and several sensitivity tests to estimate the uncertainty in the ratio of relative risk (RRR) due to the largest contributors to the uncertainty: the radiation weighting factor for neutrons, the dose-response model for radiation carcinogenesis and interpatient variations in absorbed dose. The interval of values for the radiation weighting factor for neutrons and the dose-response model were derived from the literature, while interpatient variations in absorbed dose were taken from actual patient data. The influence of each parameter on a baseline RRR value was quantified. Our analysis revealed that the calculated RRR was insensitive to the largest contributors to the uncertainty. Uncertainties in the radiation weighting factor for neutrons, the shape of the dose-risk model and interpatient variations in therapeutic and stray doses introduced a total uncertainty of 33% to the baseline RRR calculation.
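Propagating independent relative errors into a ratio such as the RRR follows the standard quadrature rule: the relative variance of a ratio is the sum of the relative variances of numerator and denominator. A small sketch (the numbers are illustrative only, not the study's values):

```python
from math import sqrt

def ratio_rel_uncertainty(rel_u_numerator, rel_u_denominator):
    """Relative uncertainty of a ratio, assuming independent inputs."""
    return sqrt(rel_u_numerator**2 + rel_u_denominator**2)

# Illustrative: 25% relative uncertainty on one risk estimate, 20% on the other
u_rrr = ratio_rel_uncertainty(0.25, 0.20)
print(f"{u_rrr:.0%}")  # 32%
```

The rule also explains the study's insensitivity finding: error sources that shift proton-arm and photon-arm risks in the same direction (such as the choice of dose-response model) largely cancel in the ratio and contribute little to the RRR uncertainty.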
Hong, Soo Jung; You, Kyung Han
2016-12-01
Using the 2013 HINTS 4 Cycle 2 data representing a general population sample, this study investigates the effects of patients' experiences of uncertainty about prostate cancer during doctor-patient communication, as well as patients' positive assessments of their cancer-related information-seeking experiences, on their fatalistic beliefs regarding cancer and their trust in physicians. Our tests show significant differences in trust in physicians among men who do and do not experience uncertainty about the prostate-specific antigen (PSA) test during doctor-patient communication. The analysis also indicates that individuals with experiences of uncertainty about the PSA test are more likely than those without such experiences of uncertainty to place their trust in doctors. However, no apparent difference or association exists when there are uncertainties relating to treatment choices regarding slow-growing cancer or treatment side effects. Nevertheless, as hypothesized, individuals who positively evaluate their cancer-related information-seeking experiences are less likely to have fatalistic beliefs about cancer. Furthermore, patients' positive assessments are highly predictive of their levels of trust in their physicians. Additionally, tests of interaction effects show that individuals' levels of education moderate the association between uncertainty experiences about the PSA test and both cancer fatalism and trust in physicians. Further implications and limitations of the study are discussed.
Evaluation of measurement uncertainty of glucose in clinical chemistry.
Berçik Inal, B; Koldas, M; Inal, H; Coskun, C; Gümüs, A; Döventas, Y
2007-04-01
The definition of measurement uncertainty in the International Vocabulary of Basic and General Terms in Metrology (VIM) is: a parameter, associated with the result of a measurement, that characterizes the dispersion of the values that could reasonably be attributed to the measurand. Uncertainty of measurement comprises many components. In addition to every reported parameter, a measurement uncertainty value should be given by every accredited institution; this value shows the reliability of the measurement. The GUM contains guidance on uncertainty evaluation, and Eurachem/CITAC Guide CG4 was published by the Eurachem/CITAC Working Group in the year 2000. Both offer a mathematical model with which uncertainty can be calculated. There are two types of uncertainty evaluation in measurement: Type A is the evaluation of uncertainty through statistical analysis, and Type B is the evaluation of uncertainty through other means, for example a certified reference material. The Eurachem Guide uses four types of distribution functions: (1) a rectangular distribution, where a certificate gives limits without specifying a level of confidence (u(x) = a/√3); (2) a triangular distribution, where values lie near the same point (u(x) = a/√6); (3) a normal distribution, in which an uncertainty is given in the form of a standard deviation s, a relative standard deviation s/√n, or a coefficient of variation CV% without specifying the distribution (a = certificate value, u = standard uncertainty); and (4) a confidence interval.
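The Type B conversions quoted above can be written directly: a half-width a of a rectangular distribution gives u = a/√3, a triangular one gives u = a/√6, and a standard deviation s over n replicates gives the Type A uncertainty of the mean s/√n. A minimal sketch:

```python
from math import sqrt

def u_rectangular(a):
    """Standard uncertainty from limits +/- a with no stated confidence level."""
    return a / sqrt(3)

def u_triangular(a):
    """Standard uncertainty from limits +/- a, values clustered near the centre."""
    return a / sqrt(6)

def u_mean(s, n):
    """Type A: standard uncertainty of the mean of n replicates."""
    return s / sqrt(n)

print(round(u_rectangular(0.5), 4))  # 0.2887
print(round(u_triangular(0.5), 4))   # 0.2041
print(round(u_mean(0.12, 10), 4))    # 0.0379
```

For the same half-width, the triangular assumption yields a smaller standard uncertainty than the rectangular one, reflecting the extra information that values cluster near the centre.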
Masahito, Hayashi [ERATO, Quantum Computation and Information Project, Japan Science and Technology Agency, Tokyo (Japan); Reynaud, S. [Universite Pierre et Marie Curie, Lab. Kastler Brossel, 75 - Paris (France); Jaekel, M.Th. [Ecole Nationale Superieure de Chimie, Lab. de Physique Theorique 75 - Paris (France); Fiuraaek, J. [Palacky Univ., Dept. of Optics (Czech Republic); Garcia-Patron, R.; Cerf, N.J. [QUIC, Ecole Polytechnique, Universite Libre de Bruxelles, Brussels (Belgium); Hage, B.; Chelkowski, S.; Franzen, A.; Lastzka, N.; Vahlbruch, N.; Danzmann, K.; Schnabel, R. [Hannover Univ., Institut für Atom- und Molekülphysik, Max-Planck-Institut, Gravitationsphysik (Albert-Einstein-Institut) (Germany); Hassan, S.S. [Bahrain Univ., Dept. of Mathematics, College of Science (Bahrain); Joshi, A. [Arkansas, Univ., Dept. of Physics, Fayetteville, AR (United States); Jakob, M. [ARC Seibersdorf Research GmbH (ARCS), Tech Gate Vienna, Vienna (Austria); Bergou, J.A. [New York City Univ., Dept. of Physics, Hunter College, NY (United States); Kozlovskii, A.V. [P.N. Lebedev Physical Institute, Moscow (Russian Federation); Prakash, H. [Allahabad Univ., Dept. of Physics (India)]; [Allahabad Univ., M. N. Saha Centre of Space Studies, Institute of Interdisciplinary Studies (India); Kumar, R. [Allahabad Univ., Dept. of Physics (India)]; [Udai Pratap Autonomous College (India)
2005-07-01
The purpose of the conference was to bring together people working in the field of quantum optics, with special emphasis on non-classical light sources and related areas, quantum computing, statistical mechanics and mathematical physics. As a novelty, this edition will include the topics of quantum imaging, quantum phase noise and number theory in quantum mechanics. This document gives the program of the conference and gathers the abstracts.
Bartley, David; Lidén, Göran
2008-08-01
The reporting of measurement uncertainty has recently undergone a major harmonization whereby characteristics of a measurement method obtained during establishment and application are combined component-wise; the sometimes-pesky systematic error, for example, is included. A bias component of uncertainty can often be easily established as the uncertainty in the bias. However, beyond simply arriving at a value for uncertainty, meaning can be given to this uncertainty, where needed, in terms of prediction confidence in uncertainty-based intervals covering what is to be measured. To this end, a link between the concepts of accuracy and uncertainty is established through a simple yet accurate approximation to a random variable known as the non-central Student's t-distribution. "Without a measureless and perpetual uncertainty, the drama of human life would be destroyed." (Winston Churchill)
Relating Admissibility Standards for Digital Evidence to Attack Scenario Reconstruction
Changwei Liu
2014-09-01
Attackers tend to use complex techniques, such as combining multi-step, multi-stage attacks with anti-forensic tools, to make it difficult to find incriminating evidence and to reconstruct attack scenarios that can stand up to the expected level of evidence admissibility in a court of law. As a solution, we propose to integrate the legal aspects of evidence correlation into a Prolog-based reasoner that addresses the admissibility requirements by creating the most probable attack scenarios satisfying admissibility standards for substantiating evidence. Using a prototype implementation, we show how evidence extracted with forensic tools can be integrated with legal reasoning to reconstruct network attack scenarios. Our experiment shows that the implemented reasoner can provide a pre-estimate of the admissibility of evidence about a digital crime against an attacked network.
2012-03-21
... HUMAN SERVICES Food and Drug Administration Standards for Private Laboratory Analytical Packages and Introduction to Laboratory Related Portions of the Food Safety Modernization Act for Private Laboratory... Administration (FDA) is announcing two meetings entitled ``Standards for Private Laboratory Analytical...
The relation between bone demineralization, physical activity and anthropometric standards
Milena Barbosa Camara
2017-03-01
This paper aimed to verify the correlation between bone mineral density and the level of physical activity, as well as food intake and anthropometric parameters. It analysed the bone mineral density (BMD) of menopausal women through bone densitometry (DO) in the lumbar region (L1 to L4), femoral neck and total femur, using Bouchard's self-recall of daily activities and employing the food record of Buker and Stuart to quantify the daily intake of calcium and vitamin D. The data were analysed via the Kolmogorov-Smirnov test, and a default value of α = 0.05 was set to compare the BMD averages. One hundred percent of the assessed individuals had a BMD level below the average fixed by the WHO: 14.4% with osteopenia and 85.6% with osteoporosis. The lowest BMD was in the femoral area (0.721 g), with the biggest loss among the sedentary participants (0.698 g). A correlation between physical activity and BMD was observed only when associated with anthropometric standards and the daily ingestion of vitamin D.
Thomas, R.E.
1982-03-01
An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
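As a hedged illustration of the Latin hypercube idea discussed in the abstract (not the paper's Cork and Bottle problem), the following sketch stratifies each input dimension into n equal-probability bins and draws exactly one sample per bin, which is what distinguishes the method from plain random sampling:

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n, d, rng):
    """n samples in d dimensions on [0, 1): each column is stratified
    into n equal-probability bins with exactly one sample per bin."""
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])  # decouple the columns
    return u

# Toy model: y = x1 + 2*x2 with both inputs uniform on [0, 1)
x = latin_hypercube(1000, 2, rng)
y = x[:, 0] + 2.0 * x[:, 1]
print(round(float(y.mean()), 2))  # close to the analytic mean of 1.5
```

In a full uncertainty analysis the sampled outputs would then be rank-transformed and fed to stepwise regression, as the abstract describes, to apportion output uncertainty among the individual input parameters.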
Lindley, Dennis V
2013-01-01
Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." - Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.
周涛; 王同兴
2005-01-01
The sources of uncertainty of relative atomic mass include measurement errors and the isotopic fractionation of terrestrial samples. Measurement errors comprise the measurements of atomic masses and of isotopic abundances; the latter includes the uncertainty of the correction factor K and of the isotopic ratios of natural samples. By differentiating with respect to the seven contributing factors to obtain their propagation coefficients, the uncertainty of the correction factor K can be calculated. With the same differential calculation, the uncertainty of the relative atomic mass is obtained.
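The differential propagation described can be illustrated for a two-isotope element. The numbers below are textbook values for chlorine used purely as an example, not data from the paper; because the abundances sum to one, only one abundance is an independent variable:

```python
import math

# Two-isotope element (illustrative values for chlorine)
M1, uM1 = 34.96885, 0.00001   # mass of 35Cl and its uncertainty u
M2, uM2 = 36.96590, 0.00001   # mass of 37Cl
x1, ux1 = 0.7576, 0.0010      # abundance of 35Cl; x2 = 1 - x1

A = x1 * M1 + (1.0 - x1) * M2
# Propagation: dA/dx1 = M1 - M2, dA/dM1 = x1, dA/dM2 = 1 - x1
uA = math.sqrt(((M1 - M2) * ux1) ** 2
               + (x1 * uM1) ** 2
               + ((1.0 - x1) * uM2) ** 2)
print(f"A_r = {A:.4f} +/- {uA:.4f}")
```

The abundance term dominates here, which mirrors the abstract's point that the isotopic-abundance measurement (and its correction factor) is the main contributor to the uncertainty of the relative atomic mass.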
Uncertainty Relation for Chaos
Yahalom, A; Levitan, J; Elgressy, G; Horwitz, L P; Ben-Zion, Y
2011-01-01
A necessary condition for the emergence of chaos is given. It is well known that the emergence of chaos requires a positive exponent, which entails diverging trajectories. Here we show that this is not enough. An additional necessary condition for the emergence of chaos in the region through which the trajectory of the system passes is that the product of the maximal positive exponent and the duration for which the system's configuration point stays in the unstable region should exceed unity. We give a theoretical analysis justifying this result and a few examples.
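The criterion (maximal positive exponent times residence time exceeding unity) can be checked numerically. The sketch below estimates the Lyapunov exponent of the fully chaotic logistic map, a standard example that is not from the paper, and the residence time τ is a made-up number:

```python
import math

def lyapunov_logistic(r, x0=0.2, n=100000, burn=1000):
    """Average of log|f'(x)| along an orbit of the logistic map
    f(x) = r*x*(1 - x); this estimates the Lyapunov exponent."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1.0 - x)
    s = 0.0
    for _ in range(n):
        s += math.log(abs(r * (1.0 - 2.0 * x)))  # |f'(x)| = |r(1-2x)|
        x = r * x * (1.0 - x)
    return s / n

lam = lyapunov_logistic(4.0)   # analytic value at r = 4 is ln 2 ~ 0.693
tau = 3.0                      # hypothetical time spent in the unstable region
print(lam, lam * tau > 1.0)    # the criterion: lambda * tau > 1
```

With λ ≈ 0.69 the criterion is met for any residence time longer than about 1.44 time units; for shorter residences the product stays below unity even though the exponent is positive, which is the paper's point.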
Standardized reporting for rapid relative effectiveness assessments of pharmaceuticals.
Kleijnen, Sarah; Pasternack, Iris; Van de Casteele, Marc; Rossi, Bernardette; Cangini, Agnese; Di Bidino, Rossella; Jelenc, Marjetka; Abrishami, Payam; Autti-Rämö, Ilona; Seyfried, Hans; Wildbacher, Ingrid; Goettsch, Wim G
2014-11-01
Many European countries perform rapid assessments of the relative effectiveness (RE) of pharmaceuticals as part of the reimbursement decision making process. Increased sharing of information on RE across countries may save costs and reduce duplication of work. The objective of this article is to describe the development of a tool for rapid assessment of RE of new pharmaceuticals that enter the market, the HTA Core Model® for Rapid Relative Effectiveness Assessment (REA) of Pharmaceuticals. Eighteen member organisations of the European Network of Health Technology Assessment (EUnetHTA) participated in the development of the model. Different versions of the model were developed and piloted in this collaboration and adjusted accordingly based on feedback on the content and feasibility of the model. The final model deviates from the traditional HTA Core Model® used for assessing other types of technologies. This is due to the limited scope (strong focus on RE), the timing of the assessment (just after market authorisation), and strict timelines (e.g. 90 days) required for performing the assessment. The number of domains and assessment elements was limited and it was decided that the primary information sources should preferably be a submission file provided by the marketing authorisation holder and the European Public Assessment Report. The HTA Core Model® for Rapid REA (version 3.0) was developed to produce standardised transparent RE information of pharmaceuticals. Further piloting can provide input for possible improvements, such as further refining the assessment elements and new methodological guidance on relevant areas.
Heisenberg's uncertainty principle
Busch, Paul; Heinonen, Teiko; Lahti, Pekka
2007-01-01
Heisenberg's uncertainty principle is usually taken to express a limitation of operational possibilities imposed by quantum mechanics. Here we demonstrate that the full content of this principle also includes its positive role as a condition ensuring that mutually exclusive experimental options can be reconciled if an appropriate trade-off is accepted. The uncertainty principle is shown to appear in three manifestations, in the form of uncertainty relations: for the widths of the position and...
European standardization for the management of space-related projects
Teresa IDZIKOWSKA
2017-06-01
A project is a temporary endeavour designed to produce a unique product, service or result, with a defined beginning and end (usually constrained by time, funding or deliverables), which is undertaken to meet unique goals and objectives, typically to bring about beneficial change or added value. Space projects involve a complex process and usually take many years to develop. The development of a complex project requires the cooperation of several organizations sharing a common goal: namely, to create a product that satisfies the consumer's needs (technical performance) within cost and schedule constraints. To reach this goal, the corresponding technical activities, as well as human and financial resources, need to be organized and coordinated in a well-organized manner. Project management is the discipline of initiating, planning, executing, controlling, and finalizing the work of a team in the achievement of specific goals and meeting specific success criteria. This involves the application of knowledge, skills, tools, and techniques to project activities in order to meet project requirements. The paper is a review of how issues related to space management requirements have been addressed in Europe.
Uncertainty analysis in MCNP5 calculations for brachytherapy treatment
Gerardy, I., E-mail: gerardy@isib.be [Institut Superieur Industriel de Bruxelles, 150, Rue Royale, B-1000 Brussels (Belgium); Rodenas, J.; Gallardo, S. [Departamento de Ingenieria Quimica y Nuclear, Universidad Politecnica de Valencia (Spain)
2011-08-15
The Monte Carlo (MC) method can be applied to simulate brachytherapy treatment planning. The MCNP5 code reports, together with its results, an associated statistical uncertainty. However, the latter is not the only uncertainty related to the simulation, and other uncertainties must be taken into account. A complete analysis of all sources of uncertainty having some influence on the results of the simulation of a brachytherapy treatment is presented in this paper. This analysis is based on the recommendations of the American Association of Physicists in Medicine (AAPM) and of the International Organization for Standardization (ISO).
A broadband chip-scale optical frequency synthesizer at 2.7 × 10⁻¹⁶ relative uncertainty.
Huang, Shu-Wei; Yang, Jinghui; Yu, Mingbin; McGuyer, Bart H; Kwong, Dim-Lee; Zelevinsky, Tanya; Wong, Chee Wei
2016-04-01
Optical frequency combs (coherent light sources that connect optical frequencies with microwave oscillations) have become the enabling tool for precision spectroscopy, optical clockwork, and attosecond physics over the past decades. Current benchmark systems are self-referenced femtosecond mode-locked lasers, but Kerr nonlinear dynamics in high-Q solid-state microresonators has recently demonstrated promising features as alternative platforms. The advance not only fosters studies of chip-scale frequency metrology but also extends the realm of optical frequency combs. We report the full stabilization of chip-scale optical frequency combs. The microcomb's two degrees of freedom, one of the comb lines and the native 18-GHz comb spacing, are simultaneously phase-locked to known optical and microwave references. Active comb spacing stabilization improves long-term stability by six orders of magnitude, reaching a record instrument-limited residual instability of 3.6 mHz/τ. Comparing 46 nitride frequency comb lines with a fiber laser frequency comb, we demonstrate the unprecedented microcomb tooth-to-tooth relative frequency uncertainty down to 50 mHz and 2.7 × 10⁻¹⁶, heralding novel solid-state applications in precision spectroscopy, coherent communications, and astronomical spectrography.
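A quick consistency check of the two quoted figures, assuming nothing beyond the abstract: dividing the 50 mHz tooth-to-tooth uncertainty by the 2.7 × 10⁻¹⁶ fractional uncertainty recovers the implied optical carrier frequency, which lands near 185 THz, i.e. the telecom band where silicon nitride microcombs operate:

```python
absolute_u = 0.050          # 50 mHz tooth-to-tooth uncertainty, in Hz
relative_u = 2.7e-16        # quoted fractional uncertainty
carrier = absolute_u / relative_u
print(f"{carrier:.3g} Hz")  # an optical frequency in the telecom band
```

This is just arithmetic on the abstract's own numbers, but it is a useful sanity check when comparing relative-uncertainty claims across platforms operating at different carrier frequencies.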
Existing and Past Methods of Test and Rating Standards Related to Integrated Heat Pump Technologies
Reedy, Wayne R. [Sentech, Inc.
2010-07-01
This report evaluates existing and past US methods of test and rating standards related to electrically operated air, water, and ground source air conditioners and heat pumps, 65,000 Btu/hr and under in capacity, that potentially incorporate a potable water heating function. Two AHRI (formerly ARI) standards and three DOE waivers were identified as directly related. Six other AHRI standards related to the test and rating of base units were identified as of interest, as they would form the basis of any new comprehensive test procedure. Numerous other AHRI and ASHRAE component test standards were also identified as perhaps being of help in developing a comprehensive test procedure.
77 FR 50757 - Charging Standard Administrative Fees for Nonprogram-Related Information
2012-08-22
... ADMINISTRATION Charging Standard Administrative Fees for Nonprogram-Related Information AGENCY: Social Security... FURTHER INFORMATION CONTACT: Karen Huelskamp, Social Security Administration, Office of Finance, 6401... request for information is for any purpose not directly related to the administration of the...
Uncertainty in hydrological signatures
McMillan, Hilary; Westerberg, Ida
2015-04-01
Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data-rich catchments, the 50 km² Mahurangi catchment in New Zealand and the 135 km² Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
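A minimal sketch of the Monte Carlo approach, with synthetic data and made-up error magnitudes (the catchments, signatures and uncertainty sources in the study are far richer): perturb the flow and rainfall records by plausible multiplicative errors, recompute the signature each time, and report a percentile interval.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic daily series (illustrative, not the Mahurangi/Brue data)
flow = rng.gamma(shape=2.0, scale=1.0, size=365)   # mm/day
rain = rng.gamma(shape=2.0, scale=2.0, size=365)   # mm/day

def runoff_ratio(q, p):
    """One commonly used signature: total flow over total rainfall."""
    return q.sum() / p.sum()

# Monte Carlo: multiplicative errors for the rating curve (10%) and the
# catchment-average rainfall (15%); both magnitudes are assumptions here
n = 5000
samples = np.empty(n)
for i in range(n):
    q = flow * rng.normal(1.0, 0.10)
    p = rain * rng.normal(1.0, 0.15)
    samples[i] = runoff_ratio(q, p)

lo, hi = np.percentile(samples, [5, 95])
print(f"runoff ratio 90% interval: [{lo:.2f}, {hi:.2f}]")
```

The same loop structure generalises to any signature: only the function computed inside the loop changes, which is why a single Monte Carlo framework can cover the whole signature set discussed in the study.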
Modelling of Transport Projects Uncertainties
Salling, Kim Bang; Leleur, Steen
2009-01-01
This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport-related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation, and making use of a set of exploratory scenarios. The analysis is carried out using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes......-based graphs which function as risk-related decision support for the appraised transport infrastructure project....
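The combination of Optimism Bias uplifts with Monte Carlo simulation can be sketched as follows; the appraisal figures and the distributions are hypothetical, not CBA-DK outputs:

```python
import numpy as np

rng = np.random.default_rng(1)

# Deterministic appraisal figures (hypothetical, in million DKK)
benefits, costs = 1200.0, 1000.0
print("deterministic B/C:", benefits / costs)  # 1.2

# Optimism Bias: historical cost overruns modelled as a right-skewed
# uplift on costs; benefit shortfalls as a symmetric error on benefits
n = 10000
cost_uplift = rng.lognormal(mean=np.log(1.2), sigma=0.25, size=n)
benefit_err = rng.normal(1.0, 0.15, size=n)
bcr = (benefits * benefit_err) / (costs * cost_uplift)

print("P(B/C < 1):", (bcr < 1.0).mean())  # risk of non-viability
```

The point of the exercise is visible in the two prints: a project that looks comfortably viable in the deterministic appraisal carries a substantial probability of a benefit-cost ratio below one once historical optimism is priced in.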
Hukkerikar, Amol; Kalakul, Sawitree; Sarup, Bent
2012-01-01
The aim of this work is to develop 15 property models based on the group-contribution+ (GC+) method (a combined group-contribution (GC) method and atom connectivity index (CI) approach) to provide reliable estimations of environment-related properties of organic chemicals together with uncertainties of estimated...
Xu, B.; Plante, A. F.; Johnson, A. H.; Pan, Y.
2014-12-01
Estimating forest soil carbon and nitrogen (CN) is critical to understanding ecosystem responses to changing climate, disturbance and forest management practices. Most of the uncertainty in soil CN cycling is associated with the difficulty in characterizing soil properties in field sampling because forest soils can be rocky, inaccessible and spatially heterogeneous. A composite coring technique is broadly applied as the standard FIA soil sampling protocol. However, the accuracy of this method might be limited by soil compaction, rock obstruction and plot selection problems during sampling. In contrast, the quantitative soil pit sampling method may avoid these problems and provides direct measurements of soil mass, rock volume and CN concentration representative of a larger ground surface area. In this study, the two sampling methods were applied in 60 forest plots, randomly located in three research areas in the Delaware River Basin in the U.S. Mid-Atlantic region. In each of the plots, one quantitative soil pit was excavated and three soil cores were collected. Our results show that average soil bulk density in the top 20 cm mineral soil measured from the soil cores was consistently lower than bulk density measured by soil pits. However, the volume percentage of coarse fragments measured by the core method was also significantly lower than the pit method. Conversely, CN concentrations were greater in core samples compared to pit samples. The resulting soil carbon content (0-20 cm) was estimated to be 4.1 ± 0.4 kg m⁻² in the core method compared to 4.5 ± 0.4 kg m⁻² in the pit method. Lower bulk density but higher CN concentration and lower coarse fragments content from the cores have offset each other, resulting in no significant differences in CN content from the soil pit method. Deeper soil (20-40 cm), which is not accessible in the core method, accounted for 29% of the total soil carbon stock (0-40 cm) in the pit method. Our results suggest that, although soil
Lau, William K. M. (Technical Monitor); Bell, Thomas L.; Steiner, Matthias; Zhang, Yu; Wood, Eric F.
2002-01-01
The uncertainty of rainfall estimated from averages of discrete samples collected by a satellite is assessed using a multi-year radar data set covering a large portion of the United States. The sampling-related uncertainty of rainfall estimates is evaluated for all combinations of 100 km, 200 km, and 500 km space domains, 1 day, 5 day, and 30 day rainfall accumulations, and regular sampling time intervals of 1 h, 3 h, 6 h, 8 h, and 12 h. These extensive analyses are combined to characterize the sampling uncertainty as a function of space and time domain, sampling frequency, and rainfall characteristics by means of a simple scaling law. Moreover, it is shown that both parametric and non-parametric statistical techniques of estimating the sampling uncertainty produce comparable results. Sampling uncertainty estimates, however, do depend on the choice of technique for obtaining them. They can also vary considerably from case to case, reflecting the great variability of natural rainfall, and should therefore be expressed in probabilistic terms. Rainfall calibration errors are shown to affect comparison of results obtained by studies based on data from different climate regions and/or observation platforms.
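The effect of the sampling interval on accumulation estimates can be reproduced in miniature; the synthetic series and intervals below are illustrative only. For each interval Δt, the spread of the estimates over the Δt possible sampling offsets measures the sampling-related uncertainty:

```python
import numpy as np

rng = np.random.default_rng(3)

# A synthetic 30-day "true" rain series at 1 h resolution (illustrative)
hourly = rng.gamma(0.3, 2.0, size=30 * 24)
true_total = hourly.sum()

# Estimate the 30-day total from regular samples every dt hours; the
# spread over the dt possible offsets measures the sampling error
for dt in (1, 3, 6, 12):
    estimates = [hourly[k::dt].sum() * dt for k in range(dt)]
    rel_err = np.std(estimates) / true_total
    print(dt, round(float(rel_err), 3))
```

With continuous sampling (Δt = 1 h) the error is zero by construction, and it grows with the sampling interval; characterising exactly how it grows with space domain, accumulation period and rainfall statistics is what the study's scaling law does.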
Moyers, M. F., E-mail: MFMoyers@roadrunner.com [Shanghai Proton and Heavy Ion Center, Shanghai, China 201321 (China)
2014-06-15
Purpose: Adequate evaluation of the results from multi-institutional trials involving light ion beam treatments requires consideration of the planning margins applied to both targets and organs at risk. A major uncertainty that affects the size of these margins is the conversion of x ray computed tomography numbers (XCTNs) to relative linear stopping powers (RLSPs). Various facilities engaged in multi-institutional clinical trials involving proton beams have been applying significantly different margins in their patient planning. This study was performed to determine the variance in the conversion functions used at proton facilities in the U.S.A. wishing to participate in National Cancer Institute sponsored clinical trials. Methods: A simplified method of determining the conversion function was developed using a standard phantom containing only water and aluminum. The new method was based on the premise that all scanners have their XCTNs for air and water calibrated daily to constant values but that the XCTNs for high density/high atomic number materials are variable with different scanning conditions. The standard phantom was taken to 10 different proton facilities and scanned with the local protocols resulting in 14 derived conversion functions which were compared to the conversion functions used at the local facilities. Results: For tissues within ±300 XCTN of water, all facility functions produced converted RLSP values within ±6% of the values produced by the standard function and within 8% of the values from any other facility's function. For XCTNs corresponding to lung tissue, converted RLSP values differed by as great as ±8% from the standard and up to 16% from the values of other facilities. For XCTNs corresponding to low-density immobilization foam, the maximum to minimum values differed by as much as 40%. Conclusions: The new method greatly simplifies determination of the conversion function, reduces ambiguity, and in the future could promote standardization between facilities.
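A conversion function of the kind discussed can be sketched as piecewise-linear interpolation through calibration anchors. The aluminum point below is hypothetical, since its XCTN is precisely what varies between scanners, and the RLSP values are round illustrative numbers rather than measured ones:

```python
import numpy as np

# Illustrative anchor points for a piecewise-linear conversion from
# x-ray CT number (XCTN, in HU) to relative linear stopping power.
# Air and water are fixed by daily calibration; the aluminum point is
# scanner-dependent and must be measured locally (hypothetical here).
xctn = np.array([-1000.0, 0.0, 2200.0])   # air, water, aluminum
rlsp = np.array([0.001, 1.0, 2.1])

def xctn_to_rlsp(h):
    """Piecewise-linear interpolation between the calibration anchors."""
    return np.interp(h, xctn, rlsp)

print(xctn_to_rlsp(0.0))                      # water: 1.0 by construction
print(round(float(xctn_to_rlsp(-700.0)), 3))  # a lung-like tissue value
```

The structure makes the paper's finding easy to see: between air and water the function is pinned by daily calibration, so inter-facility spread concentrates in the segments anchored by scanner-dependent high-density points.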
Khabarova, K. Yu.; Kudeyarov, K. S.; Kolachevsky, N. N.
2017-06-01
Research and development in the field of optical clocks based on ultracold atoms and ions have enabled the relative uncertainty in frequency to be reduced down to a few parts in 10¹⁸. The use of novel, precise frequency comparison methods opens up new possibilities for basic research (sensitive tests of general relativity, a search for a drift of fundamental constants and a search for ‘dark matter’) as well as for state-of-the-art navigation and gravimetry. We discuss the key methods that are used in creating precision clocks (including transportable clocks) based on ultracold atoms and ions and the feasibility of using them in resolving current relativistic gravimetry issues.
Traceability and uncertainty estimation in coordinate metrology
Hansen, Hans Nørgaard; Savio, Enrico; De Chiffre, Leonardo
2001-01-01
National and international standards have defined performance verification procedures for coordinate measuring machines (CMMs) that typically involve their ability to measure calibrated lengths and to a certain extent form. It is recognised that, without further analysis or testing, these results...... are insufficient to determine the task specific uncertainty of most measurements. Therefore, performance verification methods defined in current standards do not guarantee traceability of measurements performed with a CMM for all measurement tasks, and procedures for the assessment of task-related uncertainties...... are required. Depending on the requirements for uncertainty level, different approaches may be adopted to achieve traceability. Especially in the case of complex measurement situations and workpieces the procedures are not trivial. This paper discusses the establishment of traceability in coordinate metrology...
Martínez, K; Rivera-Austrui, J; Adrados, M A; Abalos, M; Llerena, J J; van Bavel, B; Rivera, J; Abad, E
2009-07-31
The analysis of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) and dioxin-like polychlorinated biphenyls (dl-PCBs) present in stack gas emissions and solid residues from incinerators will be mandatory in the foreseeable future. European standard EN-1948 is in the process of being updated through the addition of a new Part 4 related to the analysis of the 12 dl-PCBs. Therefore, either a comprehensive and reliable method capable of analyzing all of these 29 compounds (12 dl-PCBs and 17 2,3,7,8-PCDD/Fs) needs to be developed, or the existing PCDD/F analytical procedure must be adapted to include the dl-PCBs. This study has taken the latter approach of modifying PCDD/F methodology and in particular the fractionation step, by isolating dioxins and dl-PCBs into separate fractions ready for high resolution gas chromatography coupled to high resolution mass spectrometry (HRGC/HRMS) analysis. Results obtained from the analysis of Certified Reference Materials (CRM-490 and CRM-615) and fly ashes from the European Committee for Standardization (CEN) intercalibration study demonstrated that the proposed methodology is appropriate to determine the dl-PCBs in accordance with the impending European standard EN-1948. Uncertainty values obtained during the validation of the analytical methodology were 13% total I-TEQ (International Toxic Equivalent) for PCDD/Fs and 31% total WHO-TEQ (World Health Organization Toxic Equivalent) in the case of dl-PCBs. In addition, 'real' samples such as emissions and fly ashes were successfully analyzed following the proposed analytical method.
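Once congener concentrations are measured, the toxic equivalent quoted in the abstract is a weighted sum over toxic equivalency factors (TEFs). The sketch below uses WHO-2005 TEF values for three congeners with made-up concentrations; a real EN-1948 determination covers all 17 2,3,7,8-substituted PCDD/Fs and 12 dl-PCBs:

```python
# TEQ = sum(concentration_i * TEF_i). The TEFs below are the WHO-2005
# factors for three congeners; the concentrations are made up (pg/g).
tef = {
    "2,3,7,8-TCDD": 1.0,
    "OCDD": 0.0003,
    "PCB-126": 0.1,
}
conc = {"2,3,7,8-TCDD": 0.8, "OCDD": 120.0, "PCB-126": 15.0}

teq = sum(conc[c] * tef[c] for c in conc)
print(f"WHO-TEQ = {teq:.3f} pg TEQ/g")
```

Because the TEQ is a sum of scaled concentrations, the quoted uncertainty figures (13% for PCDD/Fs, 31% for dl-PCBs) propagate into the total TEQ weighted by each congener's contribution.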
Garretson, J L; Wiseman, H M; Pope, D T; Pegg, D T [Centre for Quantum Dynamics, School of Science, Griffith University, Brisbane 4111 (Australia)
2004-06-01
A which-way measurement destroys the twin-slit interference pattern. Bohr argued that this can be attributed to the Heisenberg uncertainty relation: distinguishing between two slits a distance s apart gives the particle a random momentum transfer p of order h/s. This was accepted for more than 60 years, until Scully, Englert and Walther (SEW) proposed a which-way scheme that, they claimed, entailed no momentum transfer. Storey, Tan, Collett and Walls (STCW), on the other hand, proved a theorem that, they claimed, showed that Bohr was right. This work reviews and extends a recent proposal (Wiseman 2003 Phys. Lett. A 311 285) to resolve the issue using a weak-valued probability distribution for momentum transfer, P_wv(p). We show that P_wv(p) must be nonzero for some p with |p| > h/6s. However, its moments can be identically zero, such as in the experiment proposed by SEW. This is possible because P_wv(p) is not necessarily positive definite. Nevertheless, it is measurable experimentally in a way understandable to a classical physicist. The new results in this paper include the following. We introduce a new measure of spread for P_wv(p): half the length of the unit-confidence interval. We conjecture that it is never less than h/4s, and find numerically that it is approximately h/1.59s for an idealized version of the SEW scheme with infinitely narrow slits. For this example, the moments of P_wv(p), and of the momentum distributions, are undefined unless a process of apodization is used. However, we show that by considering successively smoother initial wavefunctions, successively more moments of both P_wv(p) and the momentum distributions become defined. For this example the moments of P_wv(p) are zero, and these moments are equal to the changes in the moments of the momentum distribution. We prove that this relation also holds for schemes in which the moments of P_wv...
New Standards relative to water meters; La futura normative relativa a contadores de agua
Navarro Cabeza, I. M.
2003-07-01
New standards for water meters, which did not exist until now, are being developed in Europe. Specific European legislation exists today, but it has become obsolete due to the constant evolution of the technology. The future Measuring Instrument Directive will need harmonized European standards to complement it, and these standards will be incorporated into the national normative body of each Member State. (Author) 6 refs.
2011-03-18
... Supplemental Standards of Ethical Conduct for Employees of the Federal Labor Relations Authority AGENCY... Authority (FLRA), with the concurrence of the Office of Government Ethics (OGE), is adopting as final, without change, the interim FLRA rule that supplements the executive-branch-wide Standards of...
2012-05-17
... HUMAN SERVICES 45 CFR Part 153 RIN 0938-AR07 Patient Protection and Affordable Care Act; Standards... ] entitled, ``Patient Protection and Affordable Care Act; Standards Related to Reinsurance, Risk Corridors... section 553(b) of the Administrative Procedure Act (APA) (5 U.S.C. 553(b)). However, we can waive...
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
Liu, Baoding
2015-01-01
When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...
Shope, Christopher L.; Angeroth, Cory E.
2015-01-01
Effective management of surface waters requires a robust understanding of spatiotemporal constituent loadings from upstream sources and the uncertainty associated with these estimates. We compared the total dissolved solids (TDS) loading into the Great Salt Lake (GSL) for water year 2013 with estimates of previously sampled periods in the early 1960s. We also provide updated results on GSL loading, quantitatively bounded by sampling uncertainties, which are useful for current and future management efforts. Our statistical loading results were more accurate than those from simple regression models. Our results indicate that TDS loading to the GSL in water year 2013 was 14.6 million metric tons, with uncertainty ranging from 2.8 to 46.3 million metric tons, which varies greatly from previous regression estimates for water year 1964 of 2.7 million metric tons. Results also indicate that locations with increased sampling frequency are correlated with decreasing confidence intervals. Because time is incorporated into the LOADEST models, discrepancies are largely expected to be a function of temporally lagged salt storage delivery to the GSL associated with terrestrial and in-stream processes. By incorporating temporally variable estimates and statistically derived uncertainty of these estimates, we have provided quantifiable variability in the annual estimates of dissolved solids loading into the GSL. Further, our results support the need for increased monitoring of dissolved solids loading into saline lakes like the GSL by demonstrating the uncertainty associated with different levels of sampling frequency.
Vaginismus and dyspareunia : Relationship with general and sex-related moral standards
Borg, Charmaine; de Jong, Peter J.; Schultz, Willibrord Weijmar
2011-01-01
Introduction. Relatively strong adherence to conservative values and/or relatively strict sex-related moral standards logically restricts the sexual repertoire and will lower the threshold for experiencing negative emotions in a sexual context. In turn, this may generate withdrawal and avoidance beh
Sinner, K.; Teasley, R. L.
2016-12-01
Groundwater models serve as integral tools for understanding flow processes and informing stakeholders and policy makers in management decisions. Historically, these models tended towards a deterministic nature, relying on historical data to predict and inform future decisions based on model outputs. This research works towards developing a stochastic method of modeling recharge inputs from pipe main break predictions in an existing groundwater model, which subsequently generates desired outputs incorporating future uncertainty rather than deterministic data. The case study for this research is the Barton Springs segment of the Edwards Aquifer near Austin, Texas. Researchers and water resource professionals have modeled the Edwards Aquifer for decades due to its high water quality, fragile ecosystem, and stakeholder interest. The original case study and model that this research is built upon was developed as a co-design problem with regional stakeholders, and the model outcomes are generated specifically for communication with policy makers and managers. Recently, research in the Barton Springs segment demonstrated a significant contribution of urban, or anthropogenic, recharge to the aquifer, particularly during dry periods, using deterministic data sets. Due to the social and ecological importance of urban water loss to recharge, this study develops an evaluation method to help predict pipe breaks and their related recharge contribution within the Barton Springs segment of the Edwards Aquifer. To benefit groundwater management decision processes, the performance measures captured in the model results, such as springflow, head levels, storage, and others, were determined by previous work in elicitation of problem framing to determine stakeholder interests and concerns. The results of the previous deterministic model and the stochastic model are compared to determine gains to stakeholder knowledge through the additional modeling
Sneek, E.J. [Quality of Environmental Data - QUENDA, Arnheim (Netherlands)
2003-04-01
The new European directives for ambient air quality set data quality objectives for the measurement of air-polluting compounds. At the request of the European Commission, a working group of CEN 264 ''Air quality'' has developed standards with which these data quality objectives can be realised. In addition to the measurement method itself, the sampling procedure, and quality assurance and quality control, these standards deal with type-approval tests of instruments and uncertainty calculations for measurement data. Explanation is given of some of the considerations behind the choices made by the working group in developing the requirements for performance characteristics, the test methods to establish the values of these performance characteristics, and the uncertainty calculations. (orig.)
Uncertainties in repository modeling
Wilson, J.R.
1996-12-31
The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.
Arroyo Mina, José Santiago; Daniel A. Revollo Fernández; Aguilar Ibarra, Alonso; Georgantzis, Nikolaos
2016-01-01
This paper presents the results of economic experiments run among fishermen from the Mexican and Colombian Pacific. The experimental design aims at studying behavior under uncertainty concerning the possible effects of climate change on fisheries. We find that subjects’ risk-aversion diminishes the level of catches and changes fishing practices (e.g. adopting marine reserves), provided that fishermen have ex ante information on possible climatic consequences. Furthermore, social preferences (...
张泳; 周诚; 姚琼
2012-01-01
Participating in standards competition is increasingly a new feature of market competition, yet existing research on how standards competition influences the consumer market is very limited. Using an experimental method, this article examines how standards competition affects consumers' new-product purchasing behavior. The results show that consumers are influenced by different types of uncertainty in the purchase decision process; standards competition moderates this process, leading consumers to pay more attention to restrictive factors such as cost, so cost-related uncertainty has a stronger impact on new-product purchase decisions in markets with standards competition. These results have important practical significance for enterprises developing targeted marketing strategies.
Nguyen, Daniel Xuyen
This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models...... in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles...
Uncertainty in magnetic activity indices
XU WenYao
2008-01-01
Magnetic activity indices are widely used in theoretical studies of solar-terrestrial coupling and space weather prediction. However, the indices suffer from various uncertainties, which limit their application and can even lead to incorrect conclusions. In this paper we analyze the three most popular indices: Kp, AE and Dst. Three categories of uncertainties in magnetic indices are discussed: "data uncertainty" originating from inadequate data processing, "station uncertainty" caused by incomplete station coverage, and "physical uncertainty" stemming from unclear physical mechanisms. A comparison between magnetic disturbances and the related indices indicates that the residual Sq will cause an uncertainty of 1-2 in the K measurement, the uncertainty in saturated AE is as much as 50%, and the uncertainty in the Dst index caused by the partial ring currents is about half of the partial ring current.
Han, Seong Won
2016-01-01
Students' science-related career expectations are important for predicting their future science, technology, engineering, and mathematics (STEM)-related educational and occupational attainments. This study examines the degree to which standards-based external examinations are associated with a student's propensity for pursuing science-related…
Uncertainty in Forest Net Present Value Estimations
Ilona Pietilä
2010-09-01
Uncertainty related to inventory data, growth models and timber price fluctuation was investigated in the assessment of forest property net present value (NPV). The degree of uncertainty associated with inventory data was obtained from previous area-based airborne laser scanning (ALS) inventory studies. The study was performed, applying the Monte Carlo simulation, using stand-level growth and yield projection models and three alternative rates of interest (3, 4 and 5%). Timber price fluctuation was portrayed with geometric mean-reverting (GMR) price models. The analysis was conducted for four alternative forest properties having varying compartment structures: (A) a property having an even development class distribution, (B) sapling stands, (C) young thinning stands, and (D) mature stands. Simulations resulted in predicted yield value (predicted NPV) distributions at both stand and property levels. Our results showed that ALS inventory errors were the most prominent source of uncertainty, leading to a 5.1–7.5% relative deviation of property-level NPV when an interest rate of 3% was applied. Interestingly, ALS inventory led to significant biases at the property level, ranging from 8.9% to 14.1% (3% interest rate). ALS inventory-based bias was the most significant in mature stand properties. Errors related to the growth predictions led to a relative standard deviation in NPV varying from 1.5% to 4.1%. Growth model-related uncertainty was most significant in sapling stand properties. Timber price fluctuation caused relative standard deviations ranging from 3.4% to 6.4% (3% interest rate). The combined relative variation caused by inventory errors, growth model errors and timber price fluctuation varied, depending on the property type and applied rates of interest, from 6.4% to 12.6%. By applying the methodology described here, one may take into account the effects of various uncertainty factors in the prediction of forest yield value and to supply the
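The Monte Carlo approach described above can be sketched in a few lines. This is a minimal illustration, not the authors' model: the cash flows, error magnitudes, and the assumption of independent multiplicative normal errors for inventory and timber price are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_npv(cash_flows, years, rate, sd_inventory, sd_price, n_sims=10_000):
    """Monte Carlo NPV: perturb nominal cash flows with multiplicative errors
    representing inventory error and timber price fluctuation, then discount."""
    cf = np.asarray(cash_flows, dtype=float)
    t = np.asarray(years, dtype=float)
    inv_err = rng.normal(1.0, sd_inventory, size=(n_sims, cf.size))
    price_err = rng.normal(1.0, sd_price, size=(n_sims, cf.size))
    npv = (cf * inv_err * price_err / (1.0 + rate) ** t).sum(axis=1)
    return npv.mean(), npv.std(ddof=1)

# hypothetical thinning income at year 1 and final felling at year 20, 3% interest
mean_npv, sd_npv = simulate_npv([1000.0, 5000.0], [1.0, 20.0], 0.03, 0.06, 0.05)
```

The relative deviation sd_npv / mean_npv then plays the role of the property-level uncertainty percentages quoted in the abstract.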
赵学忠
2016-01-01
According to the test method for plane-strain fracture toughness of metallic materials, compact tension specimens were used to evaluate the combined standard uncertainty, and the uncertainty equations and calculated values are given. The analysis shows that the standard uncertainty is most affected by the conditional load PQ, with the specimen geometry factor f(α) having a smaller effect. The accuracy and precision of the measurement of average crack length are among the main factors influencing the stress intensity factor. Finally, the influence of specimen machining, clamp design, crack prefabrication, and crack measurement on crack flatness and measuring precision is analyzed and discussed.
WANG Yong
2016-05-01
As points of interest (POIs) on the internet widely exhibit incomplete addresses and inconsistent literal expressions, a fast standardization method for network POI address information based on spatial constraints is proposed. Based on an extensible address-expression model, the address information of a POI is first segmented and extracted, and address elements are updated by matching against the address tree layer by layer. Then, by defining four types of positional relations, corresponding sets are selected from the standard POI library as candidates for enriching and amending non-standard addresses. Finally, fast standardized processing of POI address information is achieved by backtracking address elements at minimum granularity. Experiments in this paper show that address standardization can be realized by this method with high accuracy, supporting the construction of an address database.
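The layer-by-layer matching against an address tree can be illustrated with a toy structure. The tree contents and function name below are invented for illustration; the paper's actual address model and backtracking step are richer.

```python
# Hypothetical address tree: each layer maps an address element to its sub-layers.
ADDRESS_TREE = {
    "Beijing": {
        "Haidian District": {"Zhongguancun Street": {}},
    },
}

def match_elements(elements, tree):
    """Match segmented address elements against the tree layer by layer,
    returning the longest standardized prefix that can be resolved."""
    resolved, node = [], tree
    for element in elements:
        if element not in node:
            break  # unresolved element: candidate enrichment starts here
        resolved.append(element)
        node = node[element]
    return resolved

prefix = match_elements(["Beijing", "Haidian District", "No. 5"], ADDRESS_TREE)
```

Elements beyond the resolved prefix are the ones that would be amended using candidates selected by the positional relations.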
Generalized uncertainty principles
Machluf, Ronny
2008-01-01
The phenomenon in the essence of classical uncertainty principles has been well known since the thirties of the last century. We introduce a new phenomenon which is in the essence of a new notion that we introduce: "Generalized Uncertainty Principles". We show the relation between classical uncertainty principles and generalized uncertainty principles. We generalize the "Landau-Pollak-Slepian" uncertainty principle. Our generalization relates the following two quantities and two scaling parameters: 1) the weighted time spreading $\int_{-\infty}^\infty |f(x)|^2w_1(x)dx$ ($w_1(x)$ is a non-negative function); 2) the weighted frequency spreading $\int_{-\infty}^\infty |\hat{f}(\omega)|^2w_2(\omega)d\omega$; 3) the time weight scale $a$, ${w_1}_a(x)=w_1(xa^{-1})$; and 4) the frequency weight scale $b$, ${w_2}_b(\omega)=w_2(\omega b^{-1})$. A "Generalized Uncertainty Principle" is an inequality that summarizes the constraints on the relations between the two spreading quantities and two scaling parameters. For any two reason...
Conover, David R.
2014-09-11
The purpose of this document is to identify laws, rules, model codes, codes, standards, regulations, and specifications (CSR) related to safety that could apply to stationary energy storage systems (ESS), and experiences to date in securing approval of ESS in relation to CSR. This information is intended to assist in securing approval of ESS under current CSR and in the identification of new CSR, or revisions to existing CSR, and the necessary supporting research and documentation that can foster the deployment of safe ESS.
Development of a Dynamic Lidar Uncertainty Framework
Newman, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clifton, Andrew [WindForS; Bonin, Timothy [CIRES/NOAA ESRL; Choukulkar, Aditya [CIRES/NOAA ESRL; Brewer, W. Alan [NOAA ESRL; Delgado, Ruben [University of Maryland Baltimore County
2017-08-07
As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote-sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote-sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote-sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards for quantifying remote sensing device uncertainty for power performance testing consider uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device and are generally fixed, leading to climatic uncertainty values that apply to the entire measurement campaign. However, real-world experience and a consideration of the fundamentals of the measurement process have shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we describe the development of a new dynamic lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from a field measurement site to assess the ability of the framework to predict
A standard curve based method for relative real time PCR data processing
Krause Andreas
2005-03-01
Background: Currently real-time PCR is the most precise method by which to measure gene expression. The method generates a large amount of raw numerical data, and processing may notably influence final results. The data processing is based either on standard curves or on PCR efficiency assessment. At the moment, the PCR efficiency approach is preferred in relative PCR, whilst the standard curve is often used for absolute PCR. However, there are no barriers to employing standard curves for relative PCR. This article provides an implementation of the standard curve method and discusses its advantages and limitations in relative real-time PCR. Results: We designed a procedure for data processing in relative real-time PCR. The procedure completely avoids PCR efficiency assessment, minimizes operator involvement and provides a statistical assessment of intra-assay variation. The procedure includes the following steps. (I) Noise is filtered from raw fluorescence readings by smoothing, baseline subtraction and amplitude normalization. (II) The optimal threshold is selected automatically from regression parameters of the standard curve. (III) Crossing points (CPs) are derived directly from coordinates of points where the threshold line crosses fluorescence plots obtained after the noise filtering. (IV) The means and their variances are calculated for CPs in PCR replicas. (V) The final results are derived from the CPs' means. The CPs' variances are traced to results by the law of error propagation. A detailed description and analysis of this data processing is provided. The limitations associated with the use of parametric statistical methods and amplitude normalization are specifically analyzed and found fit to routine laboratory practice. Different options are discussed for aggregation of data obtained from multiple reference genes. Conclusion: A standard curve based procedure for PCR data processing has been compiled and validated. It illustrates that
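The crossing-point step and the standard-curve conversion can be sketched as follows. The linear interpolation and the curve parameters are illustrative assumptions, not the paper's exact procedure; a slope of -3.32 corresponds to the textbook case of 100% PCR efficiency.

```python
import numpy as np

def crossing_point(cycles, fluorescence, threshold):
    """Interpolate the fractional cycle at which fluorescence crosses the threshold."""
    c = np.asarray(cycles, dtype=float)
    f = np.asarray(fluorescence, dtype=float)
    i = int(np.argmax(f >= threshold))  # first reading at or above the threshold
    if i == 0:
        return c[0]
    # linear interpolation between the two bracketing readings
    return c[i - 1] + (threshold - f[i - 1]) * (c[i] - c[i - 1]) / (f[i] - f[i - 1])

def quantity_from_standard_curve(cp, slope, intercept):
    """Invert the standard curve CP = slope * log10(Q) + intercept."""
    return 10.0 ** ((cp - intercept) / slope)

cp = crossing_point([1, 2, 3], [0.1, 0.4, 0.9], threshold=0.65)  # 2.5
```

In the procedure above these CPs would be averaged over replicas before conversion, with their variances carried through by error propagation.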
韩春红
2016-01-01
To satisfy the requirement of compulsory verification of online vibrating-tube liquid density meters, an online vibrating-tube density standard device should be established. By analysing the sources of measurement uncertainty for the density standard device, the measurement uncertainty is evaluated for each input component, such as repeatability, a first-class standard density meter, frequency-meter resolution, and temperature measurement. The expanded uncertainty of the density standard device is 0.32 kg/m3, which meets the requirements of the national metrological verification system, so verification/calibration of liquid density meters with an accuracy class of 0.5 can be carried out.
Technical Review of Law Enforcement Standards and Guides Relative to Incident Management
Stenner, Robert D.; Salter, R.; Stanton, J. R.; Fisher, D.
2009-03-24
In an effort to locate potential law enforcement-related standards that support incident management, a team from the Pacific Northwest National Laboratory (PNNL) contacted representatives from the National Institute of Standards-Office of Law Enforcement Standards (NIST-OLES), National Institute of Justice (NIJ), Federal Bureau of Investigation (FBI), Secret Service, ASTM International committees that have a law enforcement focus, and a variety of individuals from local and regional law enforcement organizations. Discussions were held with various state and local law enforcement organizations. The NIJ has published several specific equipment-related law enforcement standards that were included in the review, but it appears that law enforcement program and process-type standards are developed principally by organizations that operate at the state and local level. Input is provided from state regulations and codes and from external non-government organizations (NGOs) that provide national standards. The standards that are adopted from external organizations or developed independently by state authorities are available for use by local law enforcement agencies on a voluntary basis. The extent to which they are used depends on the respective jurisdictions involved. In some instances, use of state and local disseminated standards is mandatory, but in most cases, use is voluntary. Usually, the extent to which these standards are used appears to depend on whether or not jurisdictions receive certification from a “governing” entity due to their use and compliance with the standards. In some cases, these certification-based standards are used in principal but without certification or other compliance monitoring. In general, these standards appear to be routinely used for qualification, selection for employment, and training. In these standards, the term “Peace Officer” is frequently used to refer to law enforcement personnel. This technical review of national law
Model Uncertainty for Bilinear Hysteric Systems
Sørensen, John Dalsgaard; Thoft-Christensen, Palle
In structural reliability analysis at least three types of uncertainty must be considered, namely physical uncertainty, statistical uncertainty, and model uncertainty (see e.g. Thoft-Christensen & Baker [1]). The physical uncertainty is usually modelled by a number of basic variables by predictive...... density functions, Veneziano [2]. In general, model uncertainty is the uncertainty connected with mathematical modelling of the physical reality. When structural reliability analysis is related to the concept of a failure surface (or limit state surface) in the n-dimension basic variable space then model...... uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used....
Budiman, Harry; Mulyana, Muhammad Rizky; Zuas, Oman
2017-01-01
Uncertainty estimation for the gravimetric dilution of four calibration gas mixtures [carbon dioxide (CO2), carbon monoxide (CO), and methane (CH4) in helium (He) balance] has been carried out according to the International Organization for Standardization (ISO) "Guide to the Expression of Uncertainty in Measurement". The uncertainty of the composition of the gas mixtures was evaluated to measure the quality, reliability, and comparability of the prepared calibration gas mixtures. The analytical process for the uncertainty estimation comprises four main stages: specification of the measurand, identification of the relevant uncertainty sources, quantification of those sources, and combination of the individual uncertainty sources. In this study, important uncertainty sources including weighing, the gas cylinder, the component gas, the certified calibration gas mixture (CCGM) added, and the purity of the He balance were examined to estimate the final uncertainty of the composition of the diluted calibration gas mixtures. The results show that the uncertainties of the gravimetric dilution of the four calibration gas mixtures (CO2, CO, and CH4 in He balance) were in the range of 5.974%-7.256%, expressed as % relative expanded uncertainty at the 95% confidence level (k=2). The major contribution to the final uncertainty arose from the certified calibration gas mixture (CCGM), i.e. the uncertainty value stated in the CCGM certificate. Verification of the calibration gas mixture compositions shows that the gravimetric values were consistent with the results of measurement using a gas chromatograph with a flame ionization detector equipped with a methanizer.
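The final combination stage follows the standard GUM root-sum-of-squares rule for independent relative standard uncertainties, with a coverage factor k = 2 for roughly 95% confidence. The sketch below uses invented component values; only the combination rule itself comes from the GUM.

```python
import math

def combined_relative_uncertainty(rel_components):
    """GUM combination: root sum of squares of independent relative standard uncertainties."""
    return math.sqrt(sum(u * u for u in rel_components))

def expanded_uncertainty(u_combined, k=2.0):
    """Expanded uncertainty at approximately 95% confidence (coverage factor k = 2)."""
    return k * u_combined

# hypothetical relative standard uncertainties (%): weighing, cylinder, CCGM, He purity
u_c = combined_relative_uncertainty([0.5, 0.3, 3.0, 0.2])
U = expanded_uncertainty(u_c)  # dominated by the CCGM term, as in the abstract
```

Because the components add in quadrature, the largest term (here the CCGM certificate uncertainty) dominates the result, matching the abstract's conclusion.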
12 CFR 335.121 - Listing standards related to audit committees.
2010-01-01
... 12 Banks and Banking 4 (2010-01-01) Listing standards related to audit committees. 335.121 Section 335.121 Banks and Banking FEDERAL DEPOSIT INSURANCE CORPORATION REGULATIONS AND... audit committees. The provisions of the applicable SEC regulation under section 10(A)(m) of the...
2011-07-15
... Affordable Care Act; Standards Related to Reinsurance, Risk Corridors and Risk Adjustment; Proposed Rule... OF HEALTH AND HUMAN SERVICES 45 CFR Part 153 RIN 0938-AR07 Patient Protection and Affordable Care Act... corridors, and risk adjustment consistent with title I of the Patient Protection and Affordable Care Act...
Evidence of Shifting Standards in Judgments of Male and Female Parents' Job-Related Ability
Fuegen, Kathleen; Endicott, Nicole F.
2010-01-01
We tested the hypothesis, derived from the shifting standards model of stereotyping, that parenthood would polarize judgments of men's and women's job-related ability. One hundred thirty-five attorneys evaluated the resume of a recent law school graduate. The resume depicted the graduate as male or female and as either single or married with two…
45 CFR 148.170 - Standards relating to benefits for mothers and newborns.
2010-10-01
... to Benefits § 148.170 Standards relating to benefits for mothers and newborns. (a) Hospital length of... hospital, the hospital length of stay begins at the time the mother or newborn is admitted as a hospital...) Discharge of newborn. If a decision to discharge a newborn child earlier than the period specified in...
Review of international standards for dosemeters.
Behrens, R; Ambrosi, P
2008-01-01
International standards for radiation protection dosemeters are published by the International Electrotechnical Commission and the International Organization for Standardization. Several standards exist side by side that treat the same measuring task but specify different requirements, so that dosemeters of different quality result. In this paper, the quality of dosemeters is compared by calculating the uncertainty of dose measurements for dosemeters which just barely fulfil the respective standard. The results are related to general yardsticks on uncertainty laid down by international organisations. Furthermore, technical differences between the standards are addressed, and a method to make them conform is presented.
Smith-Nelson, Mark A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Cutler, Theresa Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hutchinson, Jesson D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-06-02
Momentum is a neutron multiplicity analysis software package that calculates a variety of parameters associated with Feynman histograms. While most of these parameters are documented in Cifarelli and Smith-Nelson, there are some parameters which are not. Most prominent of these are the uncertainties in the standard moments, and this paper will explicitly document these parameters. This paper will also document the higher-order Y_{n} parameters and their associated ω_{n} functions because they may be useful for future applications. The generation of what is referred to as Feynman histograms is not explained here.
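As a rough illustration of the kind of quantity such histogram analyses compute (a hypothetical sketch, not the Momentum package's implementation), the lowest-order Feynman excess-variance statistic can be estimated directly from gated neutron counts; for a purely Poisson source it is zero:

```python
import numpy as np

def feynman_y2(counts):
    """Feynman-Y excess variance: variance-to-mean ratio minus one.
    Zero for Poisson-distributed counts; positive when fission chains
    correlate neutrons within a gate."""
    c = np.asarray(counts, dtype=float)
    return c.var(ddof=1) / c.mean() - 1.0

# Sanity check: Poisson counts should give Y2 near zero
rng = np.random.default_rng(0)
poisson_counts = rng.poisson(lam=5.0, size=100_000)
y2 = feynman_y2(poisson_counts)
```

The higher-order Y_n parameters generalize this to higher factorial moments of the count distribution, but the same variance-over-mean building block appears throughout.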
Nguyen, Daniel Xuyen
This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models...... the high rate of exit seen in the first years of exporting. Finally, when faced with multiple countries in which to export, some firms will choose to export sequentially in order to slowly learn more about their chances for success in untested markets....
Jang, Misuk; Jeon, Jong Seon; Kang, Hyun Sik; Kim, Seoung Rae [NESS, Daejeon (Korea, Republic of)
2016-10-15
In this paper, we introduce and review technical standards related to sodium fire and plutonium criticality safety. Feasibility studies and conceptual designs are being examined for related facilities in Korea, for example the TRU Fuel Fabrication Facility (TFFF), the Korea Advanced Pyro-process Facility (KAPF), and the Sodium-cooled Fast Reactor (SFR). However, the safety of these facilities has been controversial, in part because of sodium fire accidents and plutonium-related radiation safety hazards arising from transport and handling accidents. Many studies have therefore been performed to ensure safety, and various documents, including safety requirements, have been developed. By separating and reducing the long-lived radioactive transuranics (TRU) in spent nuclear fuel, reusing the potential energy of uranium fuel resources, and reducing high-level wastes, the TFFF is attracting wide attention, and the question arises whether it can comply with the technical standards that ensure safety. For a new facility design, one of the important tasks is the review of technical standards, especially for sodium and plutonium, because of sodium's highly reactive behaviour with water and plutonium's criticality hazard, respectively. We introduce and review two important technical standards for the TFFF, covering sodium fire and plutonium criticality safety. This paper may be helpful in identifying considerations in the development of equipment, standards, etc., to meet the safety requirements in the design, construction, and operation of the TFFF, KAPF, and SFR, and it provides brief guidance, about how to start and what is important, to those responsible for the TFFF from initial design to operation.
A New Framework for Quantifying Lidar Uncertainty
Newman, Jennifer, F.; Clifton, Andrew; Bonin, Timothy A.; Churchfield, Matthew J.
2017-03-24
As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards discuss uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device. However, real-world experience has shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we propose the development of a new lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from an operational wind farm to assess the ability of the framework to predict errors in lidar-measured wind speed.
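As a minimal sketch of how per-source terms enter such an uncertainty budget (all values below are invented for illustration and are not IEC-prescribed numbers), independent standard uncertainties combine in quadrature:

```python
import numpy as np

# Hypothetical per-source standard uncertainties for a lidar-measured
# wind speed, in m/s (values invented for illustration only).
u_calibration    = 0.10
u_mounting       = 0.05
u_classification = 0.12
u_reconstruction = 0.08  # 3D wind-field reconstruction term

# GUM-style combination of independent components in quadrature; the
# proposed framework would make several of these terms functions of
# atmospheric conditions (shear, turbulence, aerosol content) rather
# than constants indexed only by mean wind speed.
u_combined = float(np.sqrt(u_calibration**2 + u_mounting**2
                           + u_classification**2 + u_reconstruction**2))
```

The framework described above would effectively replace the constant inputs with condition-dependent functions evaluated per measurement period.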
Saviano, Alessandro Morais; Francisco, Fabiane Lacerda; Ostronoff, Celina Silva; Lourenço, Felipe Rebello
2015-01-01
The aim of this study was to develop, optimize, and validate a microplate bioassay for relative potency determination of linezolid in pharmaceutical samples using quality-by-design and design space approaches. In addition, a procedure is described for estimating relative potency uncertainty based on microbiological response variability. The influence of culture media composition was studied using a factorial design and a central composite design was adopted to study the influence of inoculum proportion and triphenyltetrazolium chloride in microbial growth. The microplate bioassay was optimized regarding the responses of low, medium, and high doses of linezolid, negative and positive controls, and the slope, intercept, and correlation coefficient of dose-response curves. According to optimization results, design space ranges were established using: (a) low (1.0 μg/mL), medium (2.0 μg/mL), and high (4.0 μg/mL) doses of pharmaceutical samples and linezolid chemical reference substance; (b) Staphylococcus aureus ATCC 653 in an inoculum proportion of 10%; (c) antibiotic No. 3 culture medium pH 7.0±0.1; (d) 6 h incubation at 37.0±0.1°C; and (e) addition of 50 μL of 0.5% (w/v) triphenyltetrazolium chloride solution. The microplate bioassay was linear (r2=0.992), specific, precise (repeatability RSD=2.3% and intermediate precision RSD=4.3%), accurate (mean recovery=101.4%), and robust. The overall measurement uncertainty was reasonable considering the increased variability inherent in microbiological response. Final uncertainty was comparable with those obtained with other microbiological assays, as well as chemical methods.
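A common scheme for computing relative potency in such dose-response assays is the parallel-line model on log dose; the sketch below uses made-up responses and is not the validated procedure from this study:

```python
import numpy as np

# Parallel-line potency on log dose (responses below are made up).
log_doses = np.log10([1.0, 2.0, 4.0])           # low, mid, high (ug/mL)
resp_ref    = np.array([30.0, 45.0, 60.0])      # reference substance
resp_sample = np.array([27.0, 42.0, 57.0])      # test sample

# Common slope from the averaged dose-response data, then separate
# intercepts for the two (assumed parallel) lines.
slope = np.polyfit(log_doses, (resp_ref + resp_sample) / 2, 1)[0]
a_ref = resp_ref.mean()    - slope * log_doses.mean()
a_smp = resp_sample.mean() - slope * log_doses.mean()

# Relative potency = antilog of the horizontal shift between the lines.
rel_potency = 10 ** ((a_smp - a_ref) / slope)
```

The variability of the microbial responses propagates into the slope and intercepts and hence into the potency estimate, which is the route by which response variability drives the reported measurement uncertainty.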
Modelling of Transport Projects Uncertainties
Salling, Kim Bang; Leleur, Steen
2012-01-01
This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation and to make use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes......-based graphs which functions as risk-related decision support for the appraised transport infrastructure project. The presentation of RSF is demonstrated by using an appraisal case concerning a new airfield in the capital of Greenland, Nuuk....
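A minimal sketch of combining an Optimism Bias-style adjustment with Monte Carlo simulation, assuming hypothetical appraisal figures and illustrative spreads (not CBA-DK's actual distributions):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo draws

# Hypothetical appraisal point estimates (e.g. million DKK).
benefit_est, cost_est = 120.0, 100.0

# Optimism-bias-style adjustment: benefits tend to be overestimated and
# costs underestimated, so shift the medians before sampling the spread.
benefits = rng.lognormal(mean=np.log(benefit_est * 0.9), sigma=0.15, size=N)
costs    = rng.lognormal(mean=np.log(cost_est * 1.2),  sigma=0.20, size=N)

bcr = benefits / costs                # benefit-cost ratio per draw
p_viable = float(np.mean(bcr > 1.0))  # share of draws with BCR above 1
```

The resulting distribution of benefit-cost ratios, rather than a single deterministic ratio, is what supports risk-related decision graphs of the kind described above.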
MACROECONOMIC UNCERTAINTY AND PRIVATE INVESTMENT IN GHANA: AN EMPIRICAL INVESTIGATION
William Bekoe
2013-01-01
In spite of the progress made in economic performance over the years, the Ghanaian economy continues to be bedevilled by a host of constraints. Among these are low levels of savings and investment, which have raised serious concerns among economists and policy makers with respect to the sustainability of the achievements attained so far. This study attempts to investigate empirically the link between investment and uncertainty using a dataset from Ghana covering the period 1975 to 2008. In the empirical analysis, the paper aims at separating ordinary variability from uncertainty by constructing measures of uncertainty for some key macroeconomic indicators and using them to assess their impact on investment behaviour within an econometric framework including other accepted determinants of investment. The Phillips-Hansen cointegration test confirms the existence of a long-run equilibrium relationship between private investment, standard determinants of investment, and macroeconomic uncertainty. Results from the study show that, on the whole, the investment-uncertainty link reveals a significant negative effect of all macroeconomic uncertainty indicator variables on private investment, with the exception of real exchange rate volatility. The values for price-of-capital uncertainty, real GDP growth uncertainty, and terms-of-trade uncertainty are large in absolute terms. The regression results further reveal that private investment displays important inertia and shows a slow adjustment process towards long-run equilibrium. Lastly, the summary measure of macroeconomic uncertainty, which encompasses the first principal components of the conditional variances of the five macroeconomic variables, shows a consistent indirect effect on private investment. Generally, we found macroeconomic uncertainties to be more detrimental to private investment growth in the long run relative to the short run.
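As a crude stand-in for the conditional-variance uncertainty measures used in such studies (a hypothetical sketch on synthetic data, not the paper's construction):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic annual macro series standing in for, e.g., inflation 1975-2008.
T = 34
series = 15 + np.cumsum(rng.normal(0, 2, size=T))

# Crude uncertainty proxy: rolling standard deviation of the series'
# innovations. Conditional-variance models such as GARCH are the usual,
# more principled choice for separating variability from uncertainty.
window = 5
innov = np.diff(series)
uncertainty = np.array([innov[i - window:i].std(ddof=1)
                        for i in range(window, len(innov) + 1)])
```

The resulting uncertainty series would then enter the investment equation alongside the standard determinants, as in the econometric framework described above.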
Importance of hydrological uncertainty assessment methods in climate change impact studies
Honti, M.; Scheidegger, A.; Stamm, C.
2014-01-01
Climate change impact assessments have become more and more popular in hydrology since the middle 1980's with a recent boost after the publication of the IPCC AR4 report. During hundreds of impact studies a quasi-standard methodology emerged, which is mainly shaped by the growing public demand for predicting how water resources management or flood protection should change in the following decades. The "standard" workflow relies on a model cascade from global circulation model (GCM) predictions for selected IPCC scenarios to future catchment hydrology. Uncertainty is present at each level and propagates through the model cascade. There is an emerging consensus between many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. Our hypothesis was that the relative importance of climatic and hydrologic uncertainty is (among other factors) heavily influenced by the uncertainty assessment method. To test this we carried out a climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on two small catchments in the Swiss Plateau with a lumped conceptual rainfall runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty but in hydrology we used formal Bayesian uncertainty assessment with two different likelihood functions. One was a time-series error model that was able to deal with the complicated statistical properties of hydrological model residuals. The second was an approximate likelihood function for the flow quantiles. The results showed that the expected climatic impact on flow quantiles was small compared to prediction uncertainty. The source, structure and composition of uncertainty depended strongly on the uncertainty assessment method. This demonstrated that one could arrive to rather different conclusions about predictive uncertainty for the same
The importance of hydrological uncertainty assessment methods in climate change impact studies
Honti, M.; Scheidegger, A.; Stamm, C.
2014-08-01
Climate change impact assessments have become more and more popular in hydrology since the middle 1980s with a recent boost after the publication of the IPCC AR4 report. From hundreds of impact studies a quasi-standard methodology has emerged, to a large extent shaped by the growing public demand for predicting how water resources management or flood protection should change in the coming decades. The "standard" workflow relies on a model cascade from global circulation model (GCM) predictions for selected IPCC scenarios to future catchment hydrology. Uncertainty is present at each level and propagates through the model cascade. There is an emerging consensus between many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. Our hypothesis was that the relative importance of climatic and hydrologic uncertainty is (among other factors) heavily influenced by the uncertainty assessment method. To test this we carried out a climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on two small catchments in the Swiss Plateau with a lumped conceptual rainfall runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty but in hydrology we used formal Bayesian uncertainty assessment with two different likelihood functions. One was a time series error model that was able to deal with the complicated statistical properties of hydrological model residuals. The second was an approximate likelihood function for the flow quantiles. The results showed that the expected climatic impact on flow quantiles was small compared to prediction uncertainty. The choice of uncertainty assessment method actually determined what sources of uncertainty could be identified at all. This demonstrated that one could arrive at rather different conclusions about the causes behind
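A toy version of formal Bayesian uncertainty assessment for a hydrological parameter, assuming a one-parameter runoff model and an iid Gaussian likelihood (a deliberate simplification; the study's point is precisely that hydrological residuals are not iid, motivating richer error models):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "hydrological model": runoff = k * rainfall, one parameter k.
rain = rng.uniform(0.0, 10.0, size=50)
k_true, sigma = 0.4, 0.3
obs = k_true * rain + rng.normal(0.0, sigma, size=50)

def log_post(k):
    """Flat prior on k plus an iid Gaussian likelihood with known sigma."""
    resid = obs - k * rain
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis sampler for the posterior of k.
k, lp, samples = 1.0, log_post(1.0), []
for _ in range(20_000):
    prop = k + rng.normal(0.0, 0.02)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        k, lp = prop, lp_prop
    samples.append(k)

post = np.array(samples[5_000:])   # discard burn-in
```

Swapping the likelihood function in `log_post` (e.g. for a time-series error model or an approximate likelihood on flow quantiles) is exactly the methodological choice the study shows can change which uncertainty sources appear dominant.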
范梦璇
2015-01-01
Employee change-related uncertainty is a condition in which, under the current continually changing business environment, organizations also have to change; the changes include strategic direction, structure, and staffing levels to help the company keep competitive (Armenakis & Bedeian, 1999). However, these
Larsman, P; Thorn, S; Søgaard, K
2009-01-01
The current study investigated the associations between work-related perceived stress and surface electromyographic (sEMG) parameters (muscle activity and muscle rest) during standardized simulated computer work (typing, editing, precision, and Stroop tasks). It was part of the European case......-control study, NEW (Neuromuscular assessment in the Elderly Worker). The present cross-sectional study was based on a questionnaire survey and sEMG measurements among Danish and Swedish female computer users aged 45 or older (n=49). The results show associations between work-related perceived stress...... and trapezius muscle activity and rest during standardized simulated computer work, and provide partial empirical support for the hypothesized pathway of stress induced muscle activity in the association between an adverse psychosocial work environment and musculoskeletal symptoms in the neck and shoulder....
Uncertainty in artificial intelligence
Levitt, TS; Lemmer, JF; Shachter, RD
1990-01-01
Clearly illustrated in this volume is the current relationship between Uncertainty and AI. It has been said that research in AI revolves around five basic questions asked relative to some particular domain: What knowledge is required? How can this knowledge be acquired? How can it be represented in a system? How should this knowledge be manipulated in order to provide intelligent behavior? How can the behavior be explained? In this volume, all of these questions are addressed. From the perspective of the relationship of uncertainty to the basic questions of AI, the book divides naturally i
A new standard nomenclature for proteins related to Apx and Shroom
Staub Olivier
2006-04-01
Shroom is a recently-described regulator of cell shape changes in the developing nervous system. This protein is a member of a small family of related proteins that are defined by sequence similarity and in most cases by some link to the actin cytoskeleton. At present these proteins are named Shroom, APX, APXL, and KIAA1202. In light of the growing interest in this family of proteins, we propose here a new standard nomenclature.
Gamal G.L.Nashed
2012-01-01
A perfect fluid with self-similarity of the second kind is studied within the framework of the teleparallel equivalent of general relativity (TEGR). A spacetime which is not asymptotically flat is derived. The energy conditions of this spacetime are studied. It is shown that after some time the strong energy condition is no longer satisfied, indicating a transition from standard matter to dark energy. The singularities of this solution are discussed.
Interpreting uncertainty terms.
Holtgraves, Thomas
2014-08-01
Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on.
Fergus, Thomas A; Wu, Kevin D
2013-10-01
Although it is understood that assessment tools require evaluation using diverse samples, such evaluations are relatively rare. There are obstacles to such work, but it remains important to pursue psychometric data in broad samples. As such, we evaluated measurement invariance and population heterogeneity of two versions of a widely used measure in the anxiety literature--the Intolerance of Uncertainty Scale (IUS)--among self-identifying White (N = 1,185) and Black (N = 301) students. Data from multiple-groups confirmatory factor analysis supported the equivalence of the equal form and factor loadings of both IUS versions in White and Black respondents. However, specific IUS items functioned differently in the two groups, with more IUS items appearing biased in the full-length relative to the short-form version. Correlations between IUS factors and worry were equivalent among White and Black respondents. We discuss the implications of these results for future research.
Renormalization group: New relations between the parameters of the Standard Model
Juárez W., S. Rebeca; Kielanowski, Piotr; Mora, Gerardo; Bohm, Arno
2017-09-01
We analyze the renormalization group equations for the Standard Model at the one- and two-loop levels. At the one-loop level we find an exact constant of evolution built from the product of the quark masses and the gauge couplings g1 and g3 of the U(1) and SU(3) groups. For leptons at the one-loop level we find that the ratio of the charged lepton mass and a power of g1 varies by ≃ 4 × 10⁻⁵ over the whole energy range. At the two-loop level we have found two relations between the quark masses and the gauge couplings that vary by ≃ 4% and ≃ 1%, respectively. For leptons at the two-loop level we have derived a relation between the charged lepton mass and the gauge couplings g1 and g2 that varies by ≃ 0.1%. This analysis significantly simplifies the picture of the renormalization group evolution of the Standard Model and establishes new important relations between its parameters. A discussion of the gauge invariance of our relations and their possible connection to the reduction-of-couplings method is also included.
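The mechanism behind such a constant of evolution can be sketched generically (a hedged illustration with unspecified coefficients b_i and c_i, neglecting Yukawa contributions; it is not the paper's exact result):

```latex
% One-loop running of the gauge couplings g_i and a quark mass m_q:
\frac{d\ln g_i}{dt} = \frac{b_i}{16\pi^2}\, g_i^2,
\qquad
\frac{d\ln m_q}{dt} = \frac{1}{16\pi^2}\sum_i c_i\, g_i^2 .
% Eliminating the g_i^2 terms between the two equations gives
\frac{d}{dt}\Big(\ln m_q - \sum_i \frac{c_i}{b_i}\,\ln g_i\Big) = 0,
% i.e. the combination  m_q \prod_i g_i^{-c_i/b_i}  is an exact
% one-loop invariant of the renormalization-group flow.
```

A product of quark masses and powers of g1 and g3 of the kind the paper reports is precisely such an invariant, with the exponents fixed by the one-loop beta-function and anomalous-dimension coefficients.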
Characterizing Epistemic Uncertainty for Launch Vehicle Designs
Novack, Steven D.; Rogers, Jim; Al Hassan, Mohammad; Hark, Frank
2016-01-01
NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss-of-mission and loss-of-crew risk, and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design would be greater than that of a design of well-understood heritage equipment. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods, such as uncertainty-importance analyses used to identify components that are significant contributors to uncertainty, are rendered obsolete, since sensitivity to changes in uncertainty is not reflected in propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation seen in complex systems and especially, due to nuances of launch vehicle logic, in launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper describes how to implement an uncertainty-importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.
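One nuance that can drive such underestimation is treating epistemic uncertainty as independent across like components instead of shared; a hedged sketch (all numbers invented) of the effect on a three-component series system:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Epistemic state of knowledge about a component failure probability,
# expressed as a lognormal distribution (numbers invented).
def sample_p(n):
    return rng.lognormal(mean=np.log(1e-3), sigma=1.0, size=n)

# Series system of three like components: it fails if any component fails.

# (a) Shared epistemic draw: one value of p applies to all three parts,
#     as knowledge about "the same kind of part" arguably should.
p = sample_p(N)
sys_shared = 1 - (1 - p) ** 3

# (b) Independent epistemic draws per component, which is effectively
#     what a flattened Monte Carlo propagation does: shared uncertainty
#     averages out and the upper tail of system risk shrinks.
p1, p2, p3 = sample_p(N), sample_p(N), sample_p(N)
sys_indep = 1 - (1 - p1) * (1 - p2) * (1 - p3)

q95_shared = float(np.percentile(sys_shared, 95))
q95_indep  = float(np.percentile(sys_indep, 95))
```

The independent treatment reports a narrower risk distribution for the same state of knowledge, which is one concrete way standard propagation can understate epistemic uncertainty.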
Rami Ahmad El-Nabulsi
2015-08-01
Recently, non-standard Lagrangians have gained a growing importance in theoretical physics and in the theory of non-linear differential equations. However, their formulations and implications in general relativity are still in their infancy despite some advances in contemporary cosmology. The main aim of this paper is to fill the gap. Though non-standard Lagrangians may be defined in a multitude of forms, in this paper we consider the exponential type. One basic feature of exponential non-standard Lagrangians concerns the modified Euler-Lagrange equation obtained from the standard variational analysis. Accordingly, when applied to spacetime geometries, one unsurprisingly expects modified geodesic equations. However, when taking into account the time-like path parameterization constraint, remarkably, it was observed that mutually discrete gravity and discrete spacetime emerge in the theory. Two different independent cases were obtained: a geometrical manifold with new spacetime coordinates augmented by a metric signature change, and a geometrical manifold characterized by a discretized spacetime metric. Both cases give rise to Einstein's field equations, yet the gravity is discretized and originates from "spacetime discreteness". A number of mathematical and physical implications of these results are discussed throughout this paper and perspectives are given accordingly.
Spin, localization and uncertainty of relativistic fermions
Céleri, Lucas C; Terno, Daniel R
2016-01-01
We describe relations between several relativistic spin observables and derive a Lorentz-invariant characteristic of a reduced spin density matrix. A relativistic position operator that satisfies all the properties of its non-relativistic analogue does not exist. Instead we propose two causality-preserving positive operator-valued measures (POVM) that are based on projections onto one-particle and antiparticle spaces, and on the normalized energy density. They predict identical expectation values for position. The variances differ by less than a quarter of the squared de Broglie wavelength and coincide in the non-relativistic limit. Since the resulting statistical moment operators are not canonical conjugates of momentum, the Heisenberg uncertainty relations need not hold. Indeed, the energy density POVM leads to a lower uncertainty. We reformulate the standard equations of the spin dynamics by explicitly considering the charge-independent acceleration, allowing a consistent treatment of backreaction and incl...
Network planning under uncertainties
Ho, Kwok Shing; Cheung, Kwok Wai
2008-11-01
One of the main focuses of network planning is the optimization of the network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can cause uncertainties in the decision models. Failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, a network planning problem with traffic requirements varying over time had been studied. This kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty that has been studied actively in the past decade addresses fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network, under the uncertainties brought by fluctuations in topology, to meet the requirement that the network remain intact up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called Generalized Survivable Networks that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework under more general uncertainty conditions that allows a more systematic way of solving the problems. With a unified framework, the seemingly diverse models and algorithms can be intimately related, and possibly more insights and improvements can be brought out for solving the problem. This motivates us to seek a
An earlier paper (Hattis et al., 2003) developed a quantitative likelihood-based statistical analysis of the differences in apparent sensitivity of rodents to mutagenic carcinogens across three life stages (fetal, birth-weaning, and weaning-60 days) relative to exposures in adult...
Modelling of Transport Projects Uncertainties
Salling, Kim Bang; Leleur, Steen
2009-01-01
This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation and to make use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes...... to supplement Optimism Bias and the associated Reference Class Forecasting (RCF) technique with a new technique that makes use of a scenario-grid. We tentatively introduce and refer to this as Reference Scenario Forecasting (RSF). The final RSF output from the CBA-DK model consists of a set of scenario...
Michael V. Levin
2015-01-01
The article considers the main provisions of the international standard ISO/IEC 24763, which defines the requirements for the competency framework and for the methods of competency information management, description, evaluation, and merging in different IT systems. The necessity of developing IT systems for competency evaluation is considered in terms of advanced staff training, the creation of e-portfolios, and career development.
Valentina Fabi
2016-02-01
The interactions between building occupants and control systems have a strong influence on energy consumption and on indoor environmental quality. In the perspective of a future of “nearly-zero” energy buildings, it is crucial to analyse these energy-related interactions in depth in order to predict realistic energy use during the design stage. Since the reaction to thermal, acoustic, or visual stimuli is not the same for every human being, monitoring behaviour inside buildings is an essential step in ascertaining differences in energy consumption related to different interactions. Reliable information concerning occupants’ behaviour in a building could contribute to a better evaluation of building energy performance and design robustness, as well as supporting the development of occupants’ education towards energy awareness. The present literature survey enlarges our understanding of which environmental conditions influence occupants’ manual control of systems in offices and, by consequence, energy consumption. The purpose of this study was to investigate the possible drivers for light-switching in order to model occupant behaviour in office buildings. The probability of switching lighting systems on or off was related to occupancy and differentiated for arrival, intermediate, and departure periods. The switching probability has been reported to be higher at arrival or departure times, in relation to contextual variables. In the analysis of switch-on actions, users were often clustered between those who take the daylight level into account and switch on lights only if necessary, and people who totally disregard natural lighting. This underlines how individuality lies at the base of the definition of the different types of users.
Benchmarking observational uncertainties for hydrology (Invited)
McMillan, H. K.; Krueger, T.; Freer, J. E.; Westerberg, I.
2013-12-01
There is a pressing need for authoritative and concise information on the expected error distributions and magnitudes in hydrological data, to understand its information content. Many studies have discussed how to incorporate uncertainty information into model calibration and implementation, and shown how model results can be biased if uncertainty is not appropriately characterised. However, it is not always possible (for example due to financial or time constraints) to make detailed studies of uncertainty for every research study. Instead, we propose that the hydrological community could benefit greatly from sharing information on likely uncertainty characteristics and the main factors that control the resulting magnitude. In this presentation, we review the current knowledge of uncertainty for a number of key hydrological variables: rainfall, flow and water quality (suspended solids, nitrogen, phosphorus). We collated information on the specifics of the data measurement (data type, temporal and spatial resolution), error characteristics measured (e.g. standard error, confidence bounds) and error magnitude. Our results were primarily split by data type. Rainfall uncertainty was controlled most strongly by spatial scale, flow uncertainty was controlled by flow state (low, high) and gauging method. Water quality presented a more complex picture with many component errors. For all variables, it was easy to find examples where relative error magnitude exceeded 40%. We discuss some of the recent developments in hydrology which increase the need for guidance on typical error magnitudes, in particular when doing comparative/regionalisation and multi-objective analysis. Increased sharing of data, comparisons between multiple catchments, and storage in national/international databases can mean that data-users are far removed from data collection, but require good uncertainty information to reduce bias in comparisons or catchment regionalisation studies. Recently it has
DPRESS: Localizing estimates of predictive uncertainty
Clark Robert D
2009-07-01
Background: The need to have a quantitative estimate of the uncertainty of prediction for QSAR models is steadily increasing, in part because such predictions are being widely distributed as tabulated values disconnected from the models used to generate them. Classical statistical theory assumes that the error in the population being modeled is independent and identically distributed (IID), but this is often not actually the case. Such inhomogeneous error (heteroskedasticity) can be addressed by providing an individualized estimate of predictive uncertainty for each particular new object u: the standard error of prediction su can be estimated as the non-cross-validated error st* for the closest object t* in the training set, adjusted for its separation d from u in the descriptor space relative to the size of the training set. The predictive uncertainty factor γt* is obtained by distributing the internal predictive error sum of squares across objects in the training set based on the distances between them, hence the acronym: Distributed PRedictive Error Sum of Squares (DPRESS). Note that st* and γt* are characteristic of each training set compound contributing to the model of interest. Results: The method was applied to partial least-squares models built using 2D (molecular hologram) or 3D (molecular field) descriptors applied to mid-sized training sets (N = 75) drawn from a large (N = 304), well-characterized pool of cyclooxygenase inhibitors. The observed variation in predictive error for the external 229-compound test sets was compared with the uncertainty estimates from DPRESS. Good qualitative and quantitative agreement was seen between the distributions of predictive error observed and those predicted using DPRESS. Inclusion of the distance-dependent term was essential to getting good agreement between the estimated uncertainties and the observed distributions of predictive error. The uncertainty estimates derived by DPRESS were
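The distance-adjusted local uncertainty estimate described above can be sketched roughly as follows. The function name and the exact way the distance term enters the estimate are illustrative assumptions, not the published DPRESS formula:

```python
import numpy as np

def dpress_uncertainty(x_new, X_train, s_train, gamma_train):
    """Sketch of a DPRESS-style local predictive uncertainty estimate.

    s_train[i]     : non-cross-validated error of training object i
    gamma_train[i] : predictive uncertainty factor of training object i
    The uncertainty for x_new is taken from the closest training object t*,
    inflated by the descriptor-space distance d between t* and x_new.
    (The functional form here is illustrative, not the published one.)
    """
    d = np.linalg.norm(X_train - x_new, axis=1)  # descriptor-space distances
    t_star = int(np.argmin(d))                   # closest training object t*
    s_u2 = s_train[t_star] ** 2 + gamma_train[t_star] * d[t_star] ** 2
    return np.sqrt(s_u2)
```

When the query coincides with a training object (d = 0), the estimate reduces to that object's own error, which matches the idea that st* and γt* are characteristic of each training compound.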
2012-08-27
... AGENCY First Draft Documents Related to the Review of the National Ambient Air Quality Standards for... of the Ozone National Ambient Air Quality Standards: First External Review Draft. The Agency is...-001; July 2012), please contact Ms. Karen Wesson, Office of Air Quality Planning and Standards...
Quark-Lepton Mass Relation in a Realistic A4 Extension of the Standard Model
King, S F; Peinado, E; Valle, J W F
2013-01-01
We propose a realistic A4 extension of the Standard Model involving a particular quark-lepton mass relation, namely that the ratio of the third family mass to the geometric mean of the first and second family masses is equal for down-type quarks and charged leptons. This relation, which is approximately renormalization group invariant, is usually regarded as arising from the Georgi-Jarlskog relations, but in the present model there is no unification group or supersymmetry. In the neutrino sector we propose a simple modification of the so-called Zee-Wolfenstein mass matrix pattern which allows an acceptable reactor angle along with a deviation of the atmospheric and solar angles from their bi-maximal values. Quark masses, mixing angles and CP violation are well described by a numerical fit.
Uncertainties and Solutions Related to Use of WRB (2007) in the Boreo-nemoral zone, Case of Latvia
Kasparinskis, Raimonds; Nikodemus, Olgerts; Rolavs, Nauris
2014-05-01
Relatively high diversity of soil groups according to the WRB (2007) classification is observed in forest ecosystems in the boreo-nemoral zone in Latvia. This is due to the geological genesis of the area and environmental conditions (Kasparinskis, Nikodemus, 2012), as well as historical land use and management (Nikodemus et al., 2013). Due to the relatively young soils, Albic, Spodic and Cambic horizons are relatively weakly expressed in many cases. Relatively well developed Albic horizons occur in sandy forest soils, but unusually well expressed Spodic features are observed. In some cases there is a Cambic horizon; however, the location of Cambisols in the WRB (2007) soil classification sequence does not provide an opportunity to classify these soils as Cambisols, and they are instead classified as Arenosols. This sequence does not reflect the logical scheme of soil development, and therefore raises the question of the location of Podzols, Arenosols and Cambisols in the sequence of the WRB (2007) soil classification. Soils with two parent materials (abrupt textural change) are relatively common in Latvia, where conceptually small-scale mapping results in classification as the soil group Planosols, but in many cases there is occurrence of Fluvic material, as the parent material in the upper part of the soil profile is formed by Baltic Ice Lake sandy sediments - this leads to the question of the location of Fluvisols and Planosols in the sequence of the WRB (2007) soil classification. Soil research has found cases where a relatively well developed Spodic horizon was established as the result of ground water table depth in areas of abrupt textural change. In this case the profile corresponds to the soil group of Podzols, however in some cases - Gleysols, not Planosols, due to a high ground water table. Therefore there is a need for discussion also about the location of Podzols and Planosols in the sequence of the WRB (2007) soil classification. The above mentioned questions raise
Lupu, Sergiu; Gazit, Doron
2015-01-01
The large nucleon-nucleon scattering length, and the isospin approximate symmetry, are low energy properties of quantum chromodynamics (QCD). These entail correlations in the binding energies of light nuclei, e.g., the A=3 iso-multiplet, and Tjon's correlation between the binding energy of three and four body nuclei. Using a new representation of these, we establish that they translate into a correlation between different short-range contributions to three body forces in chiral effective field theory of low-energy nuclear physics. We demonstrate that these correlations should be taken into account in order to avoid fine-tuning in the calibration of three body forces. We relate this to the role of correlations in uncertainty quantification of non-renormalizable effective field theories of the nuclear regime. In addition, we show that correlations can be useful in assessing the importance of forces induced by renormalization group (RG) transformations. We give numerical evidence that such RG transformations can...
Uncertainty in Air Quality Modeling.
Fox, Douglas G.
1984-01-01
Under the direction of the AMS Steering Committee for the EPA Cooperative Agreement on Air Quality Modeling, a small group of scientists convened to consider the question of uncertainty in air quality modeling. Because the group was particularly concerned with the regulatory use of models, its discussion focused on modeling tall stack, point source emissions. The group agreed that air quality model results should be viewed as containing both reducible error and inherent uncertainty. Reducible error results from improper or inadequate meteorological and air quality data inputs, and from inadequacies in the models. Inherent uncertainty results from the basic stochastic nature of the turbulent atmospheric motions that are responsible for transport and diffusion of released materials. Modelers should acknowledge that all their predictions to date contain some associated uncertainty and strive also to quantify that uncertainty. How can the uncertainty be quantified? There was no consensus from the group as to precisely how uncertainty should be calculated. One subgroup, which addressed statistical procedures, suggested that uncertainty information could be obtained from comparisons of observations and predictions. Following recommendations from a previous AMS workshop on performance evaluation (Fox, 1981), the subgroup suggested construction of probability distribution functions from the differences between observations and predictions. Further, they recommended that relatively new computer-intensive statistical procedures be considered to improve the quality of uncertainty estimates for the extreme value statistics of interest in regulatory applications. A second subgroup, which addressed the basic nature of uncertainty in a stochastic system, also recommended that uncertainty be quantified by consideration of the differences between observations and predictions. They suggested that the average of the difference squared was appropriate to isolate the inherent uncertainty that
M. C. Peel
2014-05-01
Based on 100 stochastic replicates of each GCM run at each catchment, within-GCM uncertainty was assessed in relative form as the standard deviation expressed as a percentage of the mean of the 100 replicate values of each variable. The average relative within-GCM uncertainties from the 17 catchments and 5 GCMs for 2015–2044 (A1B) were: MAP 4.2%, SDP 14.2%, MAT 0.7%, MAR 10.1% and SDR 17.6%. The Gould–Dincer Gamma procedure was applied to each annual runoff time-series for hypothetical reservoir capacities of 1× MAR and 3× MAR, and the average uncertainties in reservoir yield due to within-GCM uncertainty from the 17 catchments and 5 GCMs were: 25.1% (1× MAR) and 11.9% (3× MAR). Our approximation of within-GCM uncertainty is expected to be an underestimate due to not replicating the GCM trend. However, our results indicate that within-GCM uncertainty is important when interpreting climate change impact assessments. Approximately 95% of values of MAP, SDP, MAT, MAR, SDR and reservoir yield from 1× MAR or 3× MAR capacity reservoirs are expected to fall within twice their respective relative uncertainty (standard deviation/mean). Within-GCM uncertainty has significant implications for interpreting climate change impact assessments that report future changes within our range of uncertainty for a given variable – these projected changes may be due solely to within-GCM uncertainty. Since within-GCM variability is amplified from precipitation to runoff and then to reservoir yield, climate change impact assessments that do not take into account within-GCM uncertainty risk providing water resources management decision makers with a sense of certainty that is unjustified.
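The relative-form uncertainty measure used above (standard deviation as a percentage of the mean of 100 replicates) can be illustrated with a minimal sketch; the replicate values here are hypothetical, not from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 100 stochastic replicates of mean annual precipitation (mm),
# standing in for the 100 replicates of one variable at one catchment.
replicates = rng.normal(loc=800.0, scale=34.0, size=100)

# Within-GCM uncertainty in relative form: the standard deviation
# expressed as a percentage of the mean of the replicate values.
rel_uncertainty = 100.0 * replicates.std(ddof=1) / replicates.mean()
print(f"relative within-GCM uncertainty: {rel_uncertainty:.1f}%")
```

Under the paper's rule of thumb, roughly 95% of replicate values would then fall within twice this relative uncertainty of the mean.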
Farrance, Ian; Frenkel, Robert
2014-02-01
The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more 'constants', each of which has an empirically derived numerical value. Such empirically derived 'constants' must also have associated uncertainties which propagate through the functional relationship
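A minimal Python equivalent of the spreadsheet MCS procedure described above is sketched below. The functional relationship y = a·b/c and the input distributions are hypothetical stand-ins; in practice the means and standard uncertainties would come from IQC data:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200_000  # number of Monte Carlo trials

# Hypothetical input quantities, each normally distributed with a mean
# and a standard uncertainty (as would be estimated from IQC data).
a = rng.normal(10.0, 0.2, N)
b = rng.normal(5.0, 0.1, N)
c = rng.normal(2.0, 0.05, N)

# Propagate the simulated variations through the functional relationship.
y = a * b / c
y_mean = y.mean()
u_y = y.std(ddof=1)                      # standard uncertainty of the output
lo, hi = np.percentile(y, [2.5, 97.5])   # 95% coverage interval
print(f"y = {y_mean:.2f}, u(y) = {u_y:.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")
```

The output distribution directly yields the uncertainty estimate without the partial derivatives required by the GUM modelling approach, which is the point the article makes about spreadsheet MCS.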
Marcatto, Francesco; D'Errico, Giuseppe; Di Blas, Lisa; Ferrante, Donatella
2011-01-01
The aim of this paper is to present a preliminary validation of an Italian adaptation of the HSE Management Standards Work-Related Stress Indicator Tool (IT), an instrument for assessing work-related stress at the organizational level, originally developed in Britain by the Health and Safety Executive. A scale that assesses the physical work environment has been added to the original version of the IT. 190 employees of the University of Trieste have been enrolled in the study. A confirmatory analysis showed a satisfactory fit of the eight-factors structure of the instrument. Further psychometric analysis showed adequate internal consistency of the IT scales and good criterion validity, as evidenced by the correlations with self-perception of stress, work satisfaction and motivation. In conclusion, the Indicator Tool proved to be a valid and reliable instrument for the assessment of work-related stress at the organizational level, and it is also compatible with the instructions provided by the Ministry of Labour and Social Policy (Circular letter 18/11/2010).
MOMENTS OF UNCERTAINTY: ETHICAL CONSIDERATIONS AND EMERGING CONTAMINANTS.
Cordner, Alissa; Brown, Phil
2013-09-01
Science on emerging environmental health threats involves numerous ethical concerns related to scientific uncertainty about conducting, interpreting, communicating, and acting upon research findings, but the connections between ethical decision making and scientific uncertainty are under-studied in sociology. Under conditions of scientific uncertainty, researcher conduct is not fully prescribed by formal ethical codes of conduct, increasing the importance of ethical reflection by researchers, conflicts over research conduct, and reliance on informal ethical standards. This paper draws on in-depth interviews with scientists, regulators, activists, industry representatives, and fire safety experts to explore ethical considerations of moments of uncertainty using a case study of flame retardants, chemicals widely used in consumer products with potential negative health and environmental impacts. We focus on the uncertainty that arises in measuring people's exposure to these chemicals through testing of their personal environments or bodies. We identify four sources of ethical concerns relevant to scientific uncertainty: 1) choosing research questions or methods, 2) interpreting scientific results, 3) communicating results to multiple publics, and 4) applying results for policy-making. This research offers lessons about professional conduct under conditions of uncertainty, ethical research practice, democratization of scientific knowledge, and science's impact on policy.
Han, Eunyoung; Yang, Wonkyung; Lee, Sooyeun; Kim, Eunmi; In, Sangwhan; Choi, Hwakyung; Lee, Sangki; Chung, Heesun; Song, Joon Myong
2011-03-20
The quantitative analysis of 11-nor-Δ9-tetrahydrocannabinol-9-carboxylic acid (THCCOOH) in hair requires a sensitive method to detect low-pg levels. Before applying the method to real hair samples, the method was validated; in this study, we examined the uncertainty obtained around the cut-off level of THCCOOH in hair. We calculated the measurement uncertainty (MU) of THCCOOH in hair as follows: specification of the measurand, identification of parameters using "cause and effect" diagrams, and quantification of the uncertainty contributions from three factors (the uncertainty of weighing the hair sample, the uncertainty from calibrators and the calibration curve, and the uncertainty of the method precision). Finally, we calculated the degrees of freedom and the expanded uncertainty (EU). The concentration of THCCOOH in the hair sample with its EU was (0.60 ± 0.1) × 10⁻⁴ ng/mg. The relative uncertainty percent for the measurand 0.60 × 10⁻⁴ ng was 9.13%. In this study, we also selected different concentrations of THCCOOH in real hair samples and then calculated the EU, the relative standard uncertainty (RSU) of the concentration of THCCOOH in the test sample [u(r)(c0)], the relative uncertainty percent, and the effective degrees of freedom (v(eff)). When the concentrations of THCCOOH approached the cut-off level, u(r)(c0) and the relative uncertainty percent increased but the absolute EU and v(eff) decreased.
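The combination and expansion steps in such a measurement uncertainty budget can be sketched as follows; the three component values below are hypothetical illustrations, not those of the study:

```python
import math

# Hypothetical relative standard uncertainties (%) for the three factors
# named above: sample weighing, calibration, and method precision.
u_weighing = 1.0
u_calibration = 5.0
u_precision = 7.5

# Combined relative standard uncertainty: root sum of squares (quadrature).
u_combined = math.sqrt(u_weighing**2 + u_calibration**2 + u_precision**2)

# Expanded uncertainty with coverage factor k = 2 (approx. 95% confidence).
k = 2
U = k * u_combined
print(f"u_c = {u_combined:.2f}%, U = {U:.2f}% (k = {k})")
```

In the study itself the coverage factor was derived from the effective degrees of freedom (Welch-Satterthwaite) rather than fixed at k = 2, which is why v(eff) matters near the cut-off.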
Fabi, Valentina; Andersen, Rune Korsholm; Corgnati, Stefano
2016-01-01
energy use during the design stage. Since the reaction to thermal, acoustic, or visual stimuli is not the same for every human being, monitoring the behaviour inside buildings is an essential step to assess differences in energy consumption related to different interactions. Reliable information...... conditions influence occupants' manual control of the system in offices and by consequence the energy consumption. The purpose of this study was to investigate the possible drivers for light-switching to model occupant behaviour in office buildings. The probability of switching lighting systems on or off...... who take daylight level into account and switch on lights only if necessary and people who totally disregard the natural lighting. This underlines the importance of how individuality is at the base of the definition of the different types of users....
Roberts, Delia; Gebhardt, Deborah L; Gaskill, Steven E; Roy, Tanja C; Sharp, Marilyn A
2016-06-01
The use of physical employment standards (PES) has helped ensure that workers have the physical attributes necessary to complete their jobs in a safe and efficient manner. However, PES used in the selection processes have not always reflected the critical physical requirements of the job tasks. Women generally have smaller anthropometric stature than men, less muscle mass, and therefore less strength, power, and endurance, particularly in the upper body. Nonetheless, these attributes in themselves are not valid grounds for exclusion from employment in physically demanding occupations. Selection standards based upon size or strength, irrespective of the job requirements, have resulted in the barring of capable women from physically demanding jobs, claims of gender bias, and costly litigations. To ensure all individuals are provided with equal access to employment, accurate characterization of the critical physical requirements of the job is paramount. This paper summarizes the existing research related to disparities between the sexes that contribute to sex differences in job performance in physically demanding occupations including physical and legal factors. Strategies for mitigating these differences in the setting of PES and the meeting of minimum employment standards are discussed. Where available, injury rates for women and men in physically demanding occupations are presented and the etiology considered. Finally, areas for further research are identified.
Impact of discharge data uncertainty on nutrient load uncertainty
Westerberg, Ida; Gustavsson, Hanna; Sonesten, Lars
2016-04-01
Uncertainty in the rating-curve model of the stage-discharge relationship leads to uncertainty in discharge time series. These uncertainties in turn affect many other analyses based on discharge data, such as nutrient load estimations. It is important to understand how large the impact of discharge data uncertainty is on such analyses, since they are often used as the basis to take important environmental management decisions. In the Baltic Sea basin, nutrient load estimates from river mouths are a central information basis for managing and reducing eutrophication in the Baltic Sea. In this study we investigated rating curve uncertainty and its propagation to discharge data uncertainty and thereafter to uncertainty in the load of phosphorus and nitrogen for twelve Swedish river mouths. We estimated rating curve uncertainty using the Voting Point method, which accounts for random and epistemic errors in the stage-discharge relation and allows drawing multiple rating-curve realisations consistent with the total uncertainty. We sampled 40,000 rating curves, and for each sampled curve we calculated a discharge time series from 15-minute water level data for the period 2005-2014. Each discharge time series was then aggregated to daily scale and used to calculate the load of phosphorus and nitrogen from linearly interpolated monthly water samples, following the currently used methodology for load estimation. Finally the yearly load estimates were calculated and we thus obtained distributions with 40,000 load realisations per year - one for each rating curve. We analysed how the rating curve uncertainty propagated to the discharge time series at different temporal resolutions, and its impact on the yearly load estimates. Two shorter periods of daily water quality sampling around the spring flood peak allowed a comparison of load uncertainty magnitudes resulting from discharge data with those resulting from the monthly water quality sampling.
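The propagation scheme can be sketched as follows. The power-law rating curve, its parameter distributions, and the stage and concentration series are hypothetical stand-ins for the Voting Point realisations and the monitoring data used in the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Each plausible rating curve is represented by a sampled (a, b) pair in a
# power-law stage-discharge relation Q = a * h**b (a stand-in for the
# Voting Point realisations; h0 is taken as zero for simplicity).
n_curves = 5000
a = rng.normal(12.0, 0.8, n_curves)
b = rng.normal(1.8, 0.05, n_curves)

h = rng.uniform(0.5, 1.5, 365)       # hypothetical daily stage series (m)
conc = rng.uniform(0.02, 0.08, 365)  # interpolated nutrient conc. (g/m3)

# Discharge for every curve and day, shape (n_curves, 365).
Q = a[:, None] * h[None, :] ** b[:, None]          # m3/s

# Daily load Q*C is in g/s; 86.4 converts g/s to kg/day; summing the
# 365 daily loads gives one yearly load (kg) per sampled rating curve.
loads = (Q * conc[None, :] * 86.4).sum(axis=1)
lo, hi = np.percentile(loads, [2.5, 97.5])
```

The spread of `loads` is then the load uncertainty attributable to the rating curve alone, which can be compared against the uncertainty from sparse water quality sampling.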
Systematic review: work-related stress and the HSE management standards.
Brookes, K; Limbert, C; Deacy, C; O'Reilly, A; Scott, S; Thirlaway, K
2013-10-01
The Health and Safety Executive (HSE) has defined six management standards representing aspects of work that, if poorly managed, are associated with lower levels of employee health and productivity, and increased sickness absence. The HSE indicator tool aims to measure organizations' performance in managing the primary stressors identified by the HSE management standards. The aims of the study are to explore how the HSE indicator tool has been implemented within organizations and to identify contexts in which the tool has been used, its psychometric properties and relationships with alternative measures of well-being and stress. Studies that matched specific criteria were included in the review. Abstracts were considered by two researchers to ensure a reliable process. Full texts were obtained when abstracts met the inclusion criteria. Thirteen papers were included in the review. Using factor analysis and measures of reliability, the studies suggest that the HSE indicator tool is a psychometrically sound measure. The tool has been used to measure work-related stress across different occupational groups, with a clear relationship between the HSE tool and alternative measures of well-being. Limitations of the tool and recommendations for future research are discussed. The HSE indicator tool is a psychometrically sound measure of organizational performance against the HSE management standards. As such it can provide a broad overview of sources of work-related stress within organizations. More research is required to explore the use of the tool in the design of interventions to reduce stress, and its use in different contexts and with different cultural and gender groups.
Ruwaard, Jeroen; Lange, Alfred; Bouwman, Manon; Broeksteeg, Janneke; Schrieken, Bart
2007-01-01
The aim of this study was to assess the effects of a 7-week standardized cognitive behavioural treatment of work-related stress conducted via e-mail. A total of 342 people applied for treatment in reaction to a newspaper article. Initial screening reduced the sample to a heterogeneous (sub)clinical group of 239 participants. Participants were assigned randomly to a waiting-list condition (n = 62), or to immediate treatment (n = 177). A follow-up was conducted 3 years after inception of the treatment. The outcome measures used were the Depression Anxiety Stress Scales (DASS-42) and the Emotional Exhaustion scale of the Maslach Burnout Inventory - General Survey (MBI-GS). Fifty participants (21%) dropped out. Both groups showed statistically significant improvements. Intention-to-treat analyses of covariance (ANCOVAs) revealed that participants in the treatment condition improved significantly more than those in the waiting control condition (0.001 ≤ p; d ≥ 0.5 (anxiety)). The between-group effects ranged from d = 0.6 (stress) to d = 0.1 (anxiety). At follow-up, the effects were more pronounced, but this result requires replication in view of high attrition at follow-up. The results warrant further research on Internet-driven standardized cognitive behavioural therapy for work-related stress. Such research should include the direct comparison of this treatment with face-to-face treatment, and should address the optimal level of therapist contact in Internet-driven treatment.
Sklerov, Jason H; Couper, Fiona J
2011-09-01
An estimate was made of the measurement uncertainty for blood ethanol testing by headspace gas chromatography. While uncertainty often focuses on compliance to a single threshold level (0.08 g/100 mL), the existence of multiple thresholds, related to enhanced sentencing, subject age, or commercial vehicle licensure, necessitate the use of an estimate with validity across multiple specification levels. The uncertainty sources, in order of decreasing magnitude, were method reproducibility, linear calibration, recovery, calibrator preparation, reference material, and sample preparation. A large set of reproducibility data was evaluated (n = 15,433) in order to encompass measurement variability across multiple conditions, operators, instruments, concentrations and timeframes. The relative, combined standard uncertainty was calculated as ±2.7%, with an expanded uncertainty of ±8.2% (99.7% level of confidence, k = 3). Bias was separately evaluated through a recovery study using standard reference material from a national metrology institute. The uncertainty estimate was verified through the use of proficiency test (PT) results. Assigned values for PT results and their associated uncertainties were calculated as robust means (x*) and standard deviations (s*) of participant values. Performance scores demonstrated that the uncertainty estimate was appropriate across the full range of PT concentrations (0.010-0.370 g/100 mL). The use of PT data as an empirical estimate of uncertainty was not examined. Until providers of blood ethanol PT samples include details on how an assigned value is obtained along with its uncertainty and traceability, the use of PT data should be restricted to the role of verification of uncertainty estimates.
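The robust assigned value (x*) and standard deviation (s*) used to score the proficiency test results can be sketched along the lines of an Algorithm A-style estimator; the participant results and the single laboratory value below are hypothetical:

```python
import numpy as np

def robust_mean_std(values, tol=1e-9, max_iter=100):
    """Algorithm A-style robust mean (x*) and standard deviation (s*),
    as used to assign proficiency-test values (a sketch, not a verbatim
    implementation of the ISO 13528 procedure)."""
    x = np.median(values)
    s = 1.483 * np.median(np.abs(values - x))   # robust initial spread
    for _ in range(max_iter):
        delta = 1.5 * s
        clipped = np.clip(values, x - delta, x + delta)  # winsorize outliers
        x_new = clipped.mean()
        s_new = 1.134 * clipped.std(ddof=1)
        if abs(x_new - x) < tol and abs(s_new - s) < tol:
            break
        x, s = x_new, s_new
    return x, s

# Hypothetical participant results for a nominal 0.100 g/100 mL PT sample,
# including one outlying value that the robust estimator down-weights.
results = np.array([0.098, 0.101, 0.100, 0.097, 0.103, 0.099, 0.120, 0.100])
x_star, s_star = robust_mean_std(results)

# Performance (z) score for one hypothetical laboratory's result.
z = (0.102 - x_star) / s_star
```

A |z| below about 2 would indicate the laboratory's result, and hence its uncertainty estimate, is consistent with the robust consensus value.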
Climate model uncertainty vs. conceptual geological uncertainty in hydrological modeling
T. O. Sonnenborg
2015-04-01
Projections of climate change impact are associated with a cascade of uncertainties including CO2 emission scenario, climate model, downscaling and impact model. The relative importance of the individual uncertainty sources is expected to depend on several factors, including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is a result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991–2010) to the future period (2081–2100) in projected hydrological variables are evaluated and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context dependent. While the geological conceptualization is the dominating uncertainty source for projection of travel time and capture zones, the uncertainty on the climate models is more important for groundwater hydraulic heads and stream flow.
Heitmann, Carina Yvonne; Feldker, Katharina; Neumeister, Paula; Zepp, Britta Maria; Peterburs, Jutta; Zwitserlood, Pienie; Straube, Thomas
2016-04-01
Our understanding of altered emotional processing in social anxiety disorder (SAD) is hampered by a heterogeneity of findings, which is probably due to the vastly different methods and materials used so far. This is why the present functional magnetic resonance imaging (fMRI) study investigated immediate disorder-related threat processing in 30 SAD patients and 30 healthy controls (HC) with a novel, standardized set of highly ecologically valid, disorder-related complex visual scenes. SAD patients rated disorder-related as compared with neutral scenes as more unpleasant, arousing and anxiety-inducing than HC. On the neural level, disorder-related as compared with neutral scenes evoked differential responses in SAD patients in a widespread emotion processing network including (para-)limbic structures (e.g. amygdala, insula, thalamus, globus pallidus) and cortical regions (e.g. dorsomedial prefrontal cortex (dmPFC), posterior cingulate cortex (PCC), and precuneus). Functional connectivity analysis yielded an altered interplay between PCC/precuneus and paralimbic (insula) as well as cortical regions (dmPFC, precuneus) in SAD patients, which emphasizes a central role for PCC/precuneus in disorder-related scene processing. Hyperconnectivity of globus pallidus with amygdala, anterior cingulate cortex (ACC) and medial prefrontal cortex (mPFC) additionally underlines the relevance of this region in socially anxious threat processing. Our findings stress the importance of specific disorder-related stimuli for the investigation of altered emotion processing in SAD. Disorder-related threat processing in SAD reveals anomalies at multiple stages of emotion processing which may be linked to increased anxiety and to dysfunctionally elevated levels of self-referential processing reported in previous studies. © 2016 Wiley Periodicals, Inc.
Coalition Formation under Uncertainty
2010-03-01
Unfortunately, many current approaches to coalition formation lack provisions for uncertainty. This prevents application of coalition formation techniques ... should also include mechanisms and processing techniques that provide stability, scalability, and, at a minimum, optimality relative to agent beliefs ... relocate a piano. For the sake of simplicity, assume payment is divided evenly among the participants in the move (i.e., each mover has the same utility or
Optimizing production under uncertainty
Rasmussen, Svend
This Working Paper derives criteria for optimal production under uncertainty based on the state-contingent approach (Chambers and Quiggin, 2000), and discusses potential problems involved in applying the state-contingent approach in a normative context. The analytical approach uses the concept o...... the relative benefits of using the state-contingent approach in a normative context, compared to the EV model....
Uncertainty in artificial intelligence
Shachter, RD; Henrion, M; Lemmer, JF
1990-01-01
This volume, like its predecessors, reflects the cutting edge of research on the automation of reasoning under uncertainty. A more pragmatic emphasis is evident, for although some papers address fundamental issues, the majority address practical issues. Topics include the relations between alternative formalisms (including possibilistic reasoning), Dempster-Shafer belief functions, non-monotonic reasoning, Bayesian and decision theoretic schemes, and new inference techniques for belief nets. New techniques are applied to important problems in medicine, vision, robotics, and natural language understanding.
Meteorological uncertainty and rainfall downscaling
J. von Hardenberg
2007-05-01
We explore the sources of forecast uncertainty in a mixed dynamical-stochastic ensemble prediction chain for small-scale precipitation, suitable for hydrological applications. To this end, we apply the stochastic downscaling method RainFARM to each member of ensemble limited-area forecasts provided by the COSMO-LEPS system. The aim of the work is to quantitatively compare the relative weights of the meteorological uncertainty associated with large-scale synoptic conditions (represented by the ensemble of dynamical forecasts) and of the uncertainty due to small-scale processes (represented by the set of fields generated by stochastic downscaling). We show that, in current operational configurations, small- and large-scale uncertainties have roughly the same weight. These results can be used to pinpoint the specific components of the prediction chain where a better estimate of forecast uncertainty is needed.
Stereo-particle image velocimetry uncertainty quantification
Bhattacharya, Sayantan; Charonko, John J.; Vlachos, Pavlos P.
2017-01-01
Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV Challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment of the subject and potentially lays foundations applicable to volumetric
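The combination step described above relies on propagating independent error sources into a single estimate. The paper's actual propagation equation involves the calibration mapping-function gradients; as a minimal sketch, assuming independent components and unit sensitivities (illustrative values only, not the authors' method), the root-sum-square combination looks like:

```python
import math

def combine_in_quadrature(components):
    """Root-sum-square combination of independent uncertainty components."""
    return math.sqrt(sum(c ** 2 for c in components))

# Hypothetical per-camera planar uncertainties (pixels) and a calibration
# (registration) term, combined for one velocity component:
u_planar_cam1 = 0.06
u_planar_cam2 = 0.05
u_calibration = 0.03
u_total = combine_in_quadrature([u_planar_cam1, u_planar_cam2, u_calibration])
print(round(u_total, 4))
```

In the full framework each component would additionally be weighted by the corresponding sensitivity coefficient before summing.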
汤妍雯; 温金萍; 曹帅英; 李淑萍; 刘俊保
2014-01-01
Value assignment is a key step in the preparation of a standard reference material, and it reflects the level of the development work; accurate value assignment and sound evaluation of uncertainty strongly affect the development and application of a standard reference material. An EPDM 4050 (ethylene-propylene-diene rubber) sample that passed homogeneity and stability testing was certified jointly by several laboratories. At the 0.01 significance level, the laboratories' data were normally distributed, contained no outliers, and were of equal precision, so their mean was taken as the certified Mooney viscosity of the EPDM 4050 reference material: 42.4 ± 0.5 at ML(1+4) 100 °C and 40.4 ± 0.4 at ML(1+8) 100 °C.
Glysson, G. Douglas; Skinner, John V.
1991-01-01
In the late 1950s, intense demands for water and growing concerns about declines in the quality of water generated the need for more water-resources data. About thirty Federal agencies, hundreds of State, county and local agencies, and many private organizations had been collecting water data. However, because of differences in procedures and equipment, many of the data bases were incompatible. In 1964, as a step toward establishing more uniformity, the Bureau of the Budget (now the Office of Management and Budget, OMB) issued 'Circular A-67', which presented guidelines for collecting water data and also served as a catalyst for creating the Office of Water Data Coordination (OWDC) within the U.S. Geological Survey. This paper discusses past, present, and future aspects of the relation between methods in the National Handbook and standards published by ASTM (American Society for Testing and Materials) Committee D-19 on Water, Subcommittee D-19.07 on Sediment, Geomorphology, and Open Channel Flow. The discussion also covers historical aspects of standards-development work jointly conducted by OWDC and ASTM.
Realising the Uncertainty Enabled Model Web
Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.
2012-12-01
The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address
Physical and Model Uncertainty for Fatigue Design of Composite Material
Toft, Henrik Stensgaard; Sørensen, John Dalsgaard
The main aim of the present report is to establish stochastic models for the uncertainties related to fatigue design of composite materials. The uncertainties considered are the physical uncertainty related to the static and fatigue strength and the model uncertainty related to Miner's rule...
Statistical characterization of roughness uncertainty and impact on wind resource estimation
Kelly, Mark C.; Ejsing Jørgensen, Hans
2017-01-01
In this work we relate uncertainty in background roughness length (z0) to uncertainty in wind speeds, where the latter are predicted at a wind farm location based on wind statistics observed at a different site. Sensitivity of predicted winds to roughness is derived analytically for the industry-standard European Wind Atlas method, which is based on the geostrophic drag law. We statistically consider roughness and its corresponding uncertainty, in terms of both z0 derived from measured wind speeds as well as that chosen in practice by wind engineers. We show the combined effect of roughness uncertainty...... arising from differing wind-observation and turbine-prediction sites; this is done for the case of roughness bias as well as for the general case. For estimation of uncertainty in annual energy production (AEP), we also develop a generalized analytical turbine power curve, from which we derive a relation...
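The paper derives roughness sensitivity through the full geostrophic drag law chain. A much simpler illustration of the same idea, assuming a neutral logarithmic wind profile with fixed friction velocity (a sketch, not the authors' method; all numbers are illustrative):

```python
import math

KAPPA = 0.4  # von Karman constant

def logprofile_speed(u_star, z, z0):
    """Neutral surface-layer log profile: U(z) = (u*/kappa) * ln(z/z0)."""
    return (u_star / KAPPA) * math.log(z / z0)

# Illustrative values: friction velocity (m/s), hub height (m),
# roughness length (m), and a 50 % roughness uncertainty.
u_star, z, z0, dz0 = 0.3, 80.0, 0.03, 0.015
U = logprofile_speed(u_star, z, z0)
# First-order sensitivity of U to z0: dU/dz0 = -(u*/kappa) / z0
dU = abs(-(u_star / KAPPA) / z0 * dz0)
print(round(100 * dU / U, 1))  # relative wind-speed uncertainty in percent
```

Even this toy version shows why roughness uncertainty matters: a 50 % error in z0 maps to a several-percent wind-speed error, which is amplified further when cubed into power.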
Vámos, Tibor
The gist of the paper is the fundamentally uncertain nature of all kinds of uncertainty, and consequently a critical epistemic review of historical and recent approaches, computational methods, and algorithms. The review follows the development of the notion from the beginnings of thinking, via the Aristotelian and Skeptic views, medieval nominalism, and the influential pioneering metaphors of ancient India and Persia, to the birth of modern mathematical disciplinary reasoning. Discussing models of uncertainty, e.g. the statistical and other physical and psychological backgrounds, we reach a pragmatic, model-related estimation perspective: a balanced application orientation for different problem areas. Data mining, game theories and recent advances in approximation algorithms are discussed in this spirit of modest reasoning.
Ma, Weina; Yang, Liu; Lv, Yanni; Fu, Jia; Zhang, Yanmin; He, Langchong
2017-06-23
The equilibrium dissociation constant (KD) of drug-membrane receptor affinity is the basic parameter that reflects the strength of interaction. The cell membrane chromatography (CMC) method is an effective technique to study the characteristics of drug-membrane receptor affinity. In this study, the KD value of CMC relative standard method for the determination of drug-membrane receptor affinity was established to analyze the relative KD values of drugs binding to the membrane receptors (Epidermal growth factor receptor and angiotensin II receptor). The KD values obtained by the CMC relative standard method had a strong correlation with those obtained by the frontal analysis method. Additionally, the KD values obtained by CMC relative standard method correlated with pharmacological activity of the drug being evaluated. The CMC relative standard method is a convenient and effective method to evaluate drug-membrane receptor affinity. Copyright © 2017 Elsevier B.V. All rights reserved.
Uncertainty principle in larmor clock
QIAO Chuan; REN Zhong-Zhou
2011-01-01
It is well known that the spin operators of a quantum particle must obey uncertainty relations. We use the uncertainty principle to study the Larmor clock. To avoid breaking the uncertainty principle, the Larmor time can be defined as the ratio of the phase difference between a spin-up particle and a spin-down particle to the corresponding Larmor frequency. The connection between the dwell time and the Larmor time has also been confirmed. Moreover, the results show that the behavior of the Larmor time depends on the height and width of the barrier.
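The definition above (Larmor time as phase difference over Larmor frequency) can be evaluated directly. A minimal sketch, with an approximate electron gyromagnetic ratio and a hypothetical field and phase difference (none of these values are from the paper):

```python
# Larmor time as the ratio of the spin-up/spin-down phase difference to the
# Larmor frequency omega_L = gamma * B.
gamma = 1.761e11   # electron gyromagnetic ratio, rad s^-1 T^-1 (approximate)
B = 1.0e-3         # magnetic field confined to the barrier region, T
omega_L = gamma * B

delta_phi = 0.02   # hypothetical accumulated phase difference, rad
tau_larmor = delta_phi / omega_L  # traversal (Larmor) time, s
print(tau_larmor)
```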
The Precautionary Principle and statistical approaches to uncertainty
Keiding, Niels; Budtz-Jørgensen, Esben
2003-01-01
Bayesian model averaging; benchmark approach to safety standards in toxicology; dose-response relationship; environmental standards; exposure measurement uncertainty; Popper falsification
The Effect of the International Accounting Standards on the Related Party Transactions Disclosure
Stauropoulos Antonios
2011-01-01
Problem statement: Several recent North American corporate scandals have brought attention to the potential for accounting manipulations associated with Related Party Transactions (RPTs), which have led to a decline in perceived earnings quality. We examine the value relevance of disclosed RPTs in Greek corporations. Approach: We focus on two types of RPTs: sales of goods and sales of assets, using a value relevance approach. Results: From 2002-2007, we find that the reported earnings of firms selling goods or assets to related parties exhibit a lower valuation coefficient than those of firms in Greece without such transactions. This result is not observed during 2005-2007, after a new fair value measurement rule for RPTs came into effect. Conclusion: Our evidence suggests that the new RPT regulation in Greece is perceived to be effective at reducing the potential misuse of RPTs for earnings management purposes. Since RPTs have been the subject of numerous scandals in North America, our evidence from the Greek stock markets suggests that new RPT accounting standards could prove an efficient solution to this issue.
Moore, C; Pure, K; Furrow, D
1990-06-01
2 experiments examined children's understanding of the expression of speaker certainty and uncertainty and its relation to their developing theory of mind. In the first experiment, 80 children between 3 and 6 years of age were presented with a task in which they had to guess the location of an object hidden in 1 of 2 boxes. As clues to location, the children were presented with contrasting pairs of statements by 2 puppets. Different trials contained all of the possible pairwise combinations of either the modal verbs must, might, and could or the modal adjuncts probably, possibly, and maybe. Results showed that while 3-year-olds did not differentiate between any of the modal contrasts presented, 4-year-olds and older children were able to find the hidden object on the basis of what they heard. Performance was best for contrasts involving a highly certain term (either must or probably) paired with a less certain term (might, could, possibly, and maybe). Experiment 2 was designed to determine whether competence with modal terms was related to competence with mental terms in the same task, and whether performance on the certainty task was related to other aspects of the child's understanding of the nature of beliefs. 26 4-year-olds were presented with the certainty task, involving both modal and mental terms, and with tasks assessing their understanding of false beliefs, representational change, and the appearance-reality distinction. Results showed that all of these tasks were intercorrelated, implying that what may develop at 4 years of age may be a general understanding of the representational nature of belief.
Non-standard employment relations and wages among school leavers in the Netherlands
Vries, M.R. de; Wolbers, M.H.J.
2005-01-01
Non-standard (alternatively, flexible) employment has become common in the Netherlands, and is viewed as an important weapon for combating youth unemployment. However, if such jobs are 'bad', non-standard employment becomes a matter of concern. In addition, non-standard employment may hit the least qualified...
Evaluation of uncertainty in the Norwegian emission inventory
Rypdal, Kristin
1999-10-01
The uncertainty in estimating discharges is systematically examined for all source categories in the IPCC standard report, and the uncertainty in the values is estimated quantitatively. This indicates an uncertainty in the yearly discharge of greenhouse gases in Norway of ±10-20%. Methane from waste deposits, nitrous oxide from agriculture and perfluorocarbons from aluminium production contribute the major uncertainties in the greenhouse gas account. The uncertainty in the trend (percentage change from a base year to a final year) is estimated by means of sensitivity analysis. The analysis indicates that a percentage reduction or increase in the discharge of greenhouse gases (expressed in CO2 equivalents) compared to a base year is relatively unaffected by errors in level and trend for the individual gases. Exceptions exist for cases where the discharge of a gas, or from a single source, shows a trend substantially different from that of the total discharges. An overall evaluation indicates that the uncertainty in the trend is more than ±1 percentage point for the period 1990 to 2010. The main routines for avoiding errors in the account are comparison with earlier estimates, comparison with corresponding estimates from other countries, and comparison of different calculation methods. 5 figs., 52 tabs., 12 refs.
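The quantitative combination of source-category uncertainties described above can be sketched with the standard IPCC-style error propagation (Approach 1), which combines per-category percentage uncertainties weighted by emissions. The emission figures below are illustrative, not the Norwegian inventory values:

```python
import math

def inventory_uncertainty_pct(emissions, u_pct):
    """IPCC Approach 1 propagation: combine per-source uncertainties
    (in percent) into a total-inventory uncertainty, assuming the
    source categories are independent."""
    total = sum(emissions)
    return math.sqrt(sum((e * u) ** 2 for e, u in zip(emissions, u_pct))) / total

# Illustrative categories (Mt CO2-eq) and their uncertainties in percent:
emissions = [30.0, 5.0, 3.0]    # e.g. energy CO2, agricultural N2O, waste CH4
u_pct     = [3.0, 100.0, 50.0]  # small sources with large % uncertainty dominate
print(round(inventory_uncertainty_pct(emissions, u_pct), 1))
```

This illustrates the report's point: a few poorly known categories (waste methane, agricultural nitrous oxide, perfluorocarbons) can dominate the total uncertainty even when their emissions are small.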
Effects of standard humic materials on relative bioavailability of NDL-PCBs in juvenile swine.
Matthieu Delannoy
Young children, with their hand-to-mouth activity, may be exposed to contaminated soils, yet few studies have assessed exposure to organic compounds sequestered in soil. The present study explores the impact of different organic matters on the retention of NDL-PCBs during digestion, using commercial humic substances in a digestive model close to that of children: the piglet. Six artificial soils were used: one standard soil devoid of organic matter, and five versions of this standard soil amended with either fulvic acid, humic acid, Sphagnum peat, activated carbon, or a 95:5 mix of Sphagnum peat and activated carbon (SPAC). Spiked oil and negative-control animals were used for comparison. Forty male piglets were randomly distributed into 7 contaminated groups and one control group (n = 5 each). For 10 days, the piglets were fed artificial soil or corn oil spiked with 19,200 ng of Aroclor 1254 per g of dry matter (6,000 ng·g⁻¹ of NDL-PCBs) to achieve an exposure dose of 1,200 ng NDL-PCBs·kg⁻¹ of body weight per day. NDL-PCBs in adipose tissue were analyzed by GC-MS. Fulvic acid reduced the bioavailability of NDL-PCBs slightly compared to oil; humic acid and Sphagnum peat reduced it more markedly, and activated carbon reduced it the most. Piglets exposed to soil containing both activated carbon and Sphagnum peat exhibited a smaller reduction than with activated carbon alone, suggesting competition between Sphagnum peat and activated carbon. Treatment groups are therefore ordered by decreasing relative bioavailability as follows: oil ≥ fulvic acid > Sphagnum peat ≥ Sphagnum peat and activated carbon ≥ humic acid >> activated carbon. The present study highlights that the quality of organic matter has a significant effect on the bioavailability of sequestered organic compounds.
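Relative bioavailability in studies of this kind is typically computed as a dose-corrected ratio of tissue responses between the test matrix and the reference vehicle. A minimal sketch with hypothetical adipose concentrations (the abstract reports only the ordering of treatments, not these numbers):

```python
def relative_bioavailability(conc_test, conc_ref, dose_test=1.0, dose_ref=1.0):
    """Relative bioavailability as a dose-corrected tissue-concentration
    ratio (test matrix vs. reference vehicle)."""
    return (conc_test / dose_test) / (conc_ref / dose_ref)

# Hypothetical adipose NDL-PCB concentrations (ng/g lipid) at equal doses:
conc_oil = 120.0      # corn-oil reference vehicle
conc_ac_soil = 18.0   # soil amended with activated carbon
rba = relative_bioavailability(conc_ac_soil, conc_oil)
print(round(rba, 2))  # fraction of the reference bioavailability
```

A value well below 1 would correspond to the strong sequestration by activated carbon that the study reports.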
National Oceanic and Atmospheric Administration, Department of Commerce — This dataset represents sediment size prediction uncertainty from a sediment spatial model developed for the New York offshore spatial planning area. The model also...
Tian, Kaiyu
2016-03-01
The development of the Science of Acupuncture and Moxibustion should accord with the trend toward standardization and internationalization of acupuncture and moxibustion. Based on the arrangement of chapters and sections in the textbook, 29 national standards, 6 standards or guidelines issued by the World Health Organization (WHO) and 1 standard of the International Organization for Standardization (ISO) are classified and introduced. It is suggested that the above contents be considered as evidence when the textbook is re-edited. It is also proposed that humanization be supplemented and the newest research findings be traced.
Analysis of Chinese Accounting Standards for the Oil and Gas Industry and Related Enterprises
Anonymous
2006-01-01
Accounting standards are tools for the distribution of revenues, and their development is shaped by their stakeholders. The evolution of American oil and gas accounting standards has been shaped by the profit-maximizing efforts of American oil and gas company shareholders, whose outside lobbying relied on their huge capital and organization. The development and perfection of China's new oil and gas accounting standards should consider not only the soundness of the standards themselves but also the political realities of the Chinese oil and gas industry. Research on oil and gas accounting standards is thus an academic study as well as a political analysis.
Verification of uncertainty budgets
Heydorn, Kaj; Madsen, B.S.
2005-01-01
The quality of analytical results is expressed by their uncertainty, as it is estimated on the basis of an uncertainty budget; little effort is, however, often spent on ascertaining the quality of the uncertainty budget. The uncertainty budget is based on circumstantial or historical data, and th...
赵秀芬; 侯玮; 郭宇飞; 关桂云
2014-01-01
Objective: To establish a density-bottle method and evaluate the uncertainty in the determination of the relative density of a smoothing toner. Method: Following JJF 1059.1-2012 (Evaluation and Expression of Uncertainty in Measurement) and GB/T 13531.4-2013 (General Methods on Determination of Cosmetics - Determination of Relative Density), a measurement model was established, a cause-effect diagram of the sources of uncertainty was drawn up, and each component of uncertainty was quantified and classified; the expanded uncertainty was U = 0.0002. Result: The relative density of the smoothing toner was 1.0097 ± 0.0002. Conclusion: The study provides a reference basis for uncertainty evaluation in the determination of relative density.
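The expanded uncertainty U reported above follows the JJF 1059.1 / GUM convention U = k·u_c, where the combined standard uncertainty u_c sums independent components in quadrature and k is a coverage factor. A sketch with a hypothetical component budget (not the paper's actual components):

```python
import math

def expanded_uncertainty(components, k=2):
    """Combined standard uncertainty (root sum of squares of independent
    components) multiplied by a coverage factor k (k = 2 gives roughly
    95 % coverage for a normal distribution)."""
    u_c = math.sqrt(sum(u ** 2 for u in components))
    return k * u_c

# Hypothetical budget for a density-bottle measurement, in g/cm^3:
# repeatability, balance calibration, temperature correction.
components = [0.00006, 0.00005, 0.00004]
U = expanded_uncertainty(components)
print(f"{U:.4f}")  # comparable in magnitude to the reported U = 0.0002
```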
Development of standards and a cost model for coal agglomeration and related studies
Nelson, S.G.; Kuby, O.A.; Korosi, F.A.; Paulin, M.O.
1982-02-26
Several topics concerning coal agglomeration and fixed-bed coal gasification, as they relate to an agglomeration-process development program being performed for the Department of Energy, are discussed in this report. Specific topics include an examination of the performance of coals in fixed-bed gasifiers, the development of property standards by which agglomerates produced in the program may be compared, the development of a cost model to judge the economic feasibility of coal agglomeration for potential users and the maximum binder levels to be considered in the program, the definition of a suitable briquette size for coal gasification, and a study of upgrading methods at the mines to improve agglomeration. Extensive property data and the results of a number of special tests on six coals (Pittsburgh No. 8 bituminous coal, Illinois No. 6 bituminous coal, Wyoming Bighorn subbituminous coal, Montana Rosebud No. 14 subbituminous coal, North Dakota Indian Head lignite and Pennsylvania Nanoth anthracite coal) and on FMC formcoke and Simplex briquettes are reported.
United States Government Regulations and International Standards Related to Food Analysis
Nielsen, S. Suzanne
Knowledge of government regulations relevant to the chemical analysis of foods is extremely important to persons working in the food industry. Federal laws and regulations reinforce the efforts of the food industry to provide wholesome foods, to inform consumers about the nutritional composition of foods, and to eliminate economic frauds. In some cases, they dictate what ingredients a food must contain, what must be tested, and the procedures used to analyze foods for safety factors and quality attributes. This chapter describes the US federal regulations related to the composition of foods. The reader is referred to references (1-4) for comprehensive coverage of US food laws and regulations. Many of the regulations referred to in this chapter are published in the various titles of the Code of Federal Regulations (CFR) (5). This chapter also includes information about food standards and safety practices established by international organizations. Internet addresses are given at the end of this chapter for many of the government agencies, organizations, and documents discussed.
A protocol for assessment of uncertainty and strength of emissions data
Risbey, James S.; Sluijs, J.P. van der; Ravetz, Jerome R.
2006-01-01
This method is intended to assist in characterizing uncertainties in emissions data for the Milieubalans and to identify critical issues related to uncertainty. The method assesses both quantitative and qualitative dimensions of uncertainty. Quantitative uncertainties are expressed by assigning proba...
Collective Uncertainty Entanglement Test
Rudnicki, Łukasz; Życzkowski, Karol
2011-01-01
For a given pure state of a composite quantum system we analyze the product of its projections onto a set of locally orthogonal separable pure states. We derive a bound for this product analogous to the entropic uncertainty relations. For bipartite systems the bound is saturated for maximally entangled states, and it allows us to construct a family of entanglement measures, which we shall call collectibility. As these quantities are experimentally accessible, the approach advocated contributes to the task of experimental quantification of quantum entanglement, while for a three-qubit system it is capable of identifying genuine three-party entanglement.
On uncertainty of measurement
孙建文
2012-01-01
The paper introduces the concept of measurement uncertainty and, drawing on the relevant codes, sets out specific requirements for it. It explains how to determine measurement uncertainty, including identifying the sources of uncertainty, building a model of the measurement process, and evaluating each standard uncertainty component item by item, so as to guide practice.
马志敏; 孙耀; 张波; 唐启升
2004-01-01
Standard metabolic rates of Schlegel's black rockfish with different body weights were determined in the laboratory using a flow-through respirometer at 11.2 °C, 14.7 °C, 18.0 °C and 23.6 °C. The results indicate that the standard metabolic rate increases with body weight at each temperature; the relationship can be described as Rs = a lnW + b. The mean standard metabolic rate differs significantly among groups, but the b values do not. The standard metabolic rates at amended standard body weights decrease with increasing temperature, and the mean standard metabolic rate also differs significantly among groups at standard body weights of 48.6 g, 147.9 g, and 243.1 g; the relationship can be described as Rsw = m·e^(-bT). The relations of standard metabolic rate (Rs) and relative metabolic rate (Rs') to body weight and temperature yield the following equations: Rs = 1.160 W^0.752 e^(-9.494/T) and Rs' = 1.160 W^0.254 e^(-9.494/T).
Wildfire Decision Making Under Uncertainty
Thompson, M.
2013-12-01
Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.
2012-03-01
Certify to: ISO 9001 (QMS), ISO 14001 (EMS), TS 16949 (US automotive), etc. DoD QSM 4.2 standard; ISO/IEC 17025:2005 - each has uncertainty... Accreditation by body type: laboratories - ISO/IEC 17025; inspection bodies - ISO/IEC 17020; RMPs - ISO Guide 34 (reference materials); PT providers - ISO 17043; product certifiers - ISO Guide 65. Government programs: DoD ELAP, EPA Energy Star, CPSC Toy Safety, NRC, NIST.
Tu, Xiao-Ming; Zhang, Zuo-Heng; Wan, Cheng; Zheng, Yu; Xu, Jin-Mei; Zhang, Yuan-Yuan; Luo, Jian-Ping; Wu, Hai-Wei
2012-12-01
To develop software for standardizing optical density, normalizing the procedures and results of standardization so as to effectively solve several problems arising during the standardization of indirect ELISA results. The software was designed based on the I-STOD method, with operation settings to address the problems that one might encounter during standardization; Matlab GUI was used as the development tool. The software was tested with results from the detection of sera of persons from schistosomiasis japonica endemic areas. I-STOD V1.0 (Windows XP/Win 7, 0.5 GB) was successfully developed to standardize optical density. A series of serum samples from schistosomiasis japonica endemic areas was used to examine the operational performance of the I-STOD V1.0 software. The results indicated that the software successfully overcame several problems, including the reliability of the standard curve, the applicable scope of samples, and the determination of dilution for samples outside that scope, so that I-STOD was performed more conveniently and the results of standardization were more consistent. I-STOD V1.0 is professional software based on I-STOD; it can be easily operated and can effectively standardize the test results of indirect ELISA.
Characterizing Sources of Uncertainty in IRT Scale Scores
Yang, Ji Seung; Hansen, Mark; Cai, Li
2011-01-01
Traditional estimators of item response theory (IRT) scale scores ignore uncertainty carried over from the item calibration process, which can lead to incorrect estimates of standard errors of measurement (SEM). Here, we review a variety of approaches that have been applied to this problem and compare them on the basis of their statistical methods and goals. We then elaborate on the particular flexibility and usefulness of a Multiple Imputation (MI) based approach, which can be easily applied to tests with mixed item types and multiple underlying dimensions. This proposed method obtains corrected estimates of individual scale scores, as well as their SEM. Furthermore, this approach enables a more complete characterization of the impact of parameter uncertainty by generating confidence envelopes (intervals) for item tracelines, test information functions, conditional SEM curves, and the marginal reliability coefficient. The MI based approach is illustrated through the analysis of an artificial data set, then applied to data from a large educational assessment. A simulation study was also conducted to examine the relative contribution of item parameter uncertainty to the variability in score estimates under various conditions. We found that the impact of item parameter uncertainty is generally quite small, though there are some conditions under which the uncertainty carried over from item calibration contributes substantially to variability in the scores. This may be the case when the calibration sample is small relative to the number of item parameters to be estimated, or when the IRT model fit to the data is multidimensional. PMID:23049139
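The MI-based approach described above pools results across imputed sets of item parameters using Rubin's rules: the total variance of a scale score is the mean within-imputation variance plus an inflated between-imputation variance. A minimal sketch with hypothetical scores and squared SEMs (not data from the paper):

```python
def rubin_pool(estimates, variances):
    """Pool point estimates and variances across m imputations (Rubin's
    rules): total variance = mean within-imputation variance
    + (1 + 1/m) * between-imputation variance."""
    m = len(estimates)
    qbar = sum(estimates) / m
    w = sum(variances) / m                                  # within
    b = sum((q - qbar) ** 2 for q in estimates) / (m - 1)   # between
    t = w + (1 + 1 / m) * b                                 # total
    return qbar, t

# Hypothetical scale scores for one examinee under m = 5 draws of
# plausible item parameters, with their squared SEMs:
scores = [0.52, 0.49, 0.55, 0.50, 0.54]
sems2 = [0.09, 0.10, 0.09, 0.11, 0.10]
qbar, total_var = rubin_pool(scores, sems2)
print(round(qbar, 3), round(total_var, 4))
```

The between-imputation term is exactly the contribution of item-parameter uncertainty; the paper's finding that this contribution is usually small corresponds to b being small relative to w.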
Car, Nicholas; Cox, Simon; Fitch, Peter
2015-04-01
With earth-science datasets increasingly being published to enable re-use in projects disassociated from the original data acquisition or generation, there is an urgent need for associated metadata to be connected, in order to guide their application. In particular, provenance traces should support the evaluation of data quality and reliability. However, while standards for describing provenance are emerging (e.g. PROV-O), these do not include the necessary statistical descriptors and confidence assessments. UncertML has a mature conceptual model that may be used to record uncertainty metadata. However, by itself UncertML does not support the representation of uncertainty of multi-part datasets, and provides no direct way of associating the uncertainty information - metadata in relation to a dataset - with dataset objects. We present a method to address both these issues by combining UncertML with PROV-O, and delivering the resulting uncertainty-enriched provenance traces through the Linked Data API. UncertProv extends the PROV-O provenance ontology with an RDF formulation of the UncertML conceptual model elements, adds further elements to support uncertainty representation without a conceptual model, and integrates UncertML through links to documents. The Linked Data API provides a systematic way of navigating from dataset objects to their UncertProv metadata and back again, and its 'views' capability enables access to UncertML and non-UncertML uncertainty metadata representations for a dataset. With this approach, it is possible to access and navigate the uncertainty metadata associated with a published dataset using standard semantic web tools, such as SPARQL queries. Where the uncertainty data follow the UncertML model, they can be automatically interpreted and may also support automatic uncertainty propagation. Repositories wishing to enable uncertainty propagation for all datasets must ensure that all elements that are associated with uncertainty
Uncertainty in tsunami sediment transport modeling
Jaffe, Bruce E.; Goto, Kazuhisa; Sugawara, Daisuke; Gelfenbaum, Guy R.; La Selle, SeanPaul M.
2016-01-01
Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. We explore sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami, study site, available input data, sediment grain size, and model. Although uncertainty has the potential to be large, published case studies indicate that both forward and inverse tsunami sediment transport models perform well enough to be useful for deciphering tsunami characteristics, including size, from deposits. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and development of hybrid modeling approaches to exploit the strengths of forward and inverse models.
Uncertainty estimation of continuous in-situ greenhouse gas observation
Karion, A.; Verhulst, K. R.; Kim, J.; Sloop, C.; Salameh, P.; Ghosh, S.
2016-12-01
Global trends in urbanization have focused community interest in urban greenhouse gas (GHG) emissions, leading to many recent studies on GHG emissions from cities. Many efforts to quantify urban GHG emissions have focused on establishing long-term, relatively dense networks of tower or roof-based continuous in-situ GHG observations. Here we introduce in-situ measurements from a network of tower sites in the Washington DC and Baltimore urban regions (NorthEast Corridor), designed specifically for use in atmospheric inversions to determine fossil-fuel emissions of carbon dioxide (Lopez-Coto et al, in review). Such flux estimation techniques rely on an understanding of the uncertainty associated with each observation, however, and how this uncertainty changes with concentration or site conditions. We have developed an uncertainty estimation method for continuous measurements made using the Earth Networks, Inc. GHG observing system, based on Picarro CRDS analyzers and GCWerks processing software. We find that the largest uncertainty component is due to the extrapolation of the calibration based on one reference gas standard, and that this uncertainty component is linearly dependent on the measured mole fraction. The uncertainty estimation has been developed for and applied to the LA Megacities project (Verhulst et al., in prep) and the NorthEast Corridor measurements, but can also be applied at other sites across the US. Establishing robust uncertainty estimates for these GHG observations relative to the WMO scales will allow these data to be incorporated in atmospheric inversion models along with other continental and global observations. *Certain commercial equipment is identified in this work in order to specify the experimental procedure adequately. Such identification is not intended to imply recommendation or endorsement by NIST, nor is it intended to imply that the materials or equipment identified are necessarily the best available for the purpose.
1977-08-01
Documents relevant to the development and implementation of the California energy insulation standards for new residential buildings were evaluated and a survey was conducted to determine problems encountered in the implementation, enforcement, and design aspects of the standards. The impact of the standards on enforcement agencies, designers, builders and developers, manufacturers and suppliers, consumers, and the building process in general is summarized. The impact on construction costs and energy savings varies considerably because of the wide variation in prior insulation practices and climatic conditions in California. The report concludes with a series of recommendations covering all levels of government and the building process. (MCW)
Effect of Streamflow Forecast Uncertainty on Real-Time Reservoir Operation
Zhao, T.; Cai, X.; Yang, D.
2010-12-01
Various hydrological forecast products have been applied to real-time reservoir operation, including deterministic streamflow forecast (DSF), DSF-based probabilistic streamflow forecast (DPSF), and ensemble streamflow forecast (ESF), which represent forecast uncertainty in the form of deterministic forecast error, deterministic forecast error-based uncertainty distribution, and ensemble forecast errors, respectively. Compared to previous studies that treat these forecast products as ad hoc inputs for reservoir operation models, this paper attempts to model the uncertainties involved in the various forecast products and explores their effect on real-time reservoir operation decisions. In hydrology, there are various indices reflecting the magnitude of streamflow forecast uncertainty; meanwhile, few models illustrate the forecast uncertainty evolution process. This research introduces Martingale Model of Forecast Evolution (MMFE) from supply chain management and justifies its assumptions for quantifying the evolution of uncertainty in streamflow forecast as time progresses. Based on MMFE, this research simulates the evolution of forecast uncertainty in DSF, DPSF, and ESF, and applies the reservoir operation models (dynamic programming, DP; stochastic dynamic programming, SDP; and standard operation policy, SOP) to assess the effect of different forms of forecast uncertainty on real-time reservoir operation. Through a hypothetical single-objective real-time reservoir operation model, the results illustrate that forecast uncertainty exerts significant effects. Reservoir operation efficiency, as measured by a utility function, decreases as the forecast uncertainty increases. Meanwhile, these effects also depend on the type of forecast product being used. In general, the utility of reservoir operation with ESF is nearly as high as the utility obtained with a perfect forecast; the utilities of DSF and DPSF are similar to each other but not as efficient as ESF. Moreover
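The Martingale Model of Forecast Evolution (MMFE) introduced above treats a forecast as being revised each period by independent, zero-mean updates, so the realized flow's total variance is the sum of the update variances. A minimal sketch, with wholly hypothetical numbers (lead time, update standard deviations, initial forecast):

```python
import random
import statistics

random.seed(7)

# Hypothetical MMFE setup: a streamflow forecast issued 5 periods ahead
# is revised once per period; each revision is an independent, zero-mean
# Gaussian update whose spread shrinks as delivery approaches.
sigmas = [4.0, 3.0, 2.5, 2.0, 1.5]  # update std devs per period (illustrative)

def realized_flow(initial_forecast=100.0):
    f = initial_forecast
    for s in sigmas:
        f += random.gauss(0.0, s)  # martingale property: E[update] = 0
    return f

flows = [realized_flow() for _ in range(20000)]
mean_flow = statistics.fmean(flows)
total_sd = statistics.stdev(flows)  # approaches sqrt(sum(s**2 for s in sigmas))
```

The simulated spread recovers the analytic total uncertainty, which is how MMFE allows DSF-, DPSF- and ESF-type forecast uncertainty to be generated consistently for the reservoir operation models.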
MECHANICAL CHARACTERIZATION OF HEAT-TREATED ASH WOOD IN RELATION WITH STRUCTURAL TIMBER STANDARDS
Simon HANNOUZ
2015-06-01
Heat treatment is an attractive method to enhance wood durability and to valorize local hardwood species with low natural durability. Yet no standard allows the certification of such products. This study first aims to observe the influence of heat treatment on the different mechanical properties. The standard mechanical tests (bending, tension parallel and perpendicular to grain, compression parallel and perpendicular to grain, and shear) were performed on native and heat-treated wood samples. The measurements are then compared to the values of the EN 338 standard. Results reveal that shear strength is the property most affected by heat treatment and that the modulus of elasticity perpendicular to grain is increased. The values given by the EN 338 standard are generally safe, with the exception of shear strength, which is underestimated by current relationships. It is suggested that new relationships be provided for heat-treated wood, taking into account the loss of shear resistance.
翟晚枫; 张春水; 高利生
2012-01-01
Following JJF 1059-1999, Evaluation and Expression of Uncertainty in Measurement, the uncertainty of determining tetrahydrocannabinol content in cannabis resin by the HPLC external-standard working-curve method was evaluated. A mathematical model for evaluating the uncertainty of quantitative results obtained with this method was established, the sources of uncertainty were analyzed and each component quantified, and the combined expanded uncertainty was 0.15% (k = 2).
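As a sketch of the external-standard working-curve quantification the abstract evaluates, the following fits an ordinary least-squares line through standard injections and interpolates an unknown from its peak area; all concentrations and areas are invented for illustration.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Hypothetical working curve: standard concentrations (mg/mL) vs. peak areas
conc = [0.1, 0.2, 0.4, 0.8, 1.6]
area = [10.2, 20.1, 40.5, 79.8, 160.3]

a, b = fit_line(conc, area)
sample_area = 55.0
sample_conc = (sample_area - a) / b  # interpolate the unknown from the curve
```

In a full JJF 1059-style budget, uncertainty components from weighing, dilution, the curve fit and repeatability would then be combined into the expanded uncertainty.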
Uncertainty in geological and hydrogeological data
B. Nilsson
2007-09-01
Uncertainty in conceptual model structure and in environmental data is of essential interest when dealing with uncertainty in water resources management. To make quantification of uncertainty possible, it is necessary to identify and characterise the uncertainty in geological and hydrogeological data. This paper discusses a range of available techniques to describe the uncertainty related to geological model structure and scale of support. Literature examples of uncertainty in hydrogeological variables such as saturated hydraulic conductivity, specific yield, specific storage, effective porosity and dispersivity are given. Field data usually have a spatial and temporal scale of support that is different from the one on which numerical models for water resources management operate. Uncertainty in hydrogeological data variables is characterised and assessed within the methodological framework of the HarmoniRiB classification.
Review on Generalized Uncertainty Principle
Tawfik, Abdel Nasser
2015-01-01
Based on string theory, black hole physics, doubly special relativity and some "thought" experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in understanding recent PLANCK observations on the cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta.
Gürhan UYSAL
2008-01-01
This study explores the impact of resource uncertainty on relational exchange between customer and supplier. Resource uncertainty involves factors such as resource concentration, resource availability uncertainty and resource interconnectedness. The necessary data were collected from 134 companies in the Marmara Region through a questionnaire. The study adopts factor, correlation and regression analyses to test the impact of resource uncertainty on relational exchange. Data analysis reveals that resource concentration and resource availability uncertainty do not have an impact on relational exchange between customer and supplier, while resource interconnectedness influences relational exchange. Furthermore, one-way ANOVA tests demonstrate that resource concentration, resource availability uncertainty and resource interconnectedness do not differentiate significantly on control variables such as industry, foundation year, revenues and number of employees.
RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY
Salaymeh, S.; Ashley, W.; Jeffcoat, R.
2010-06-17
It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing an understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.
Dishwashing water recycling system and related water quality standards for military use.
Church, Jared; Verbyla, Matthew E; Lee, Woo Hyoung; Randall, Andrew A; Amundsen, Ted J; Zastrow, Dustin J
2015-10-01
As the demand for reliable and safe water supplies increases, both water quality and available quantity are being challenged by population growth and climate change. Greywater reuse is becoming a common practice worldwide; however, in remote locations of limited water supply, such as those encountered in military installations, it is desirable to expand its classification to include dishwashing water to maximize the conservation of fresh water. Given that no standards for dishwashing greywater reuse by the military are currently available, the current study determined a specific set of water quality standards for dishwater recycling systems for U.S. military field operations. A tentative water reuse standard for dishwashing water was developed based on federal and state regulations and guidelines for non-potable water, and the developed standard was cross-evaluated by monitoring water quality data from a full-scale dishwashing water recycling system using an innovative electrocoagulation and ultrafiltration process. Quantitative microbial risk assessment (QMRA) was also performed based on exposure scenarios derived from literature data. As a result, a specific set of dishwashing water reuse standards for field analysis (simple, but accurate) was finalized. The selected parameters, such as turbidity, are expected to ensure that water quality is safe for field operations, but are not so stringent that design complexity, cost, and operational and maintenance requirements become infeasible for field use. In addition, the parameters can be monitored using simple equipment in a field setting with only modest training requirements and real-time or rapid sample turn-around. This standard may prove useful in the future development of civilian guidelines.
Worry, Intolerance of Uncertainty, and Statistics Anxiety
Williams, Amanda S.
2013-01-01
Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…
Non-scalar uncertainty: Uncertainty in dynamic systems
Martinez, Salvador Gutierrez
1992-01-01
The following point is stated throughout the paper: dynamic systems are usually subject to uncertainty. This may be the unavoidable quantum uncertainty when working at sufficiently small scales; at large scales, uncertainty may be allowed by the researcher in order to simplify the problem, or it may be introduced by nonlinear interactions. Even though non-quantum uncertainty can generally be dealt with by using ordinary probability formalisms, it can also be studied with the proposed non-scalar formalism. Thus, non-scalar uncertainty is a more general theoretical framework giving insight into the nature of uncertainty and providing a practical tool in those cases in which scalar uncertainty is not enough, such as when studying highly nonlinear dynamic systems. This paper's specific contribution is the general concept of non-scalar uncertainty and a first proposal for a methodology; applications should be based upon this methodology. The advantage of this approach is to provide simpler mathematical models for prediction of the system states. Present conventional tools for dealing with uncertainty prove insufficient for an effective description of some dynamic systems. The main limitations are overcome by abandoning ordinary scalar algebra in the real interval (0, 1) in favor of a tensor field with a much richer structure and generality. This approach gives insight into the interpretation of quantum mechanics and will have its most profound consequences in the fields of elementary particle physics and nonlinear dynamic systems. Concepts like 'interfering alternatives' and 'discrete states' have an elegant explanation in this framework in terms of properties of dynamic systems such as strange attractors and chaos. The tensor formalism proves especially useful for describing the mechanics of representing dynamic systems with models that are closer to reality and have relatively much simpler solutions. It was found to be wise to get an approximate solution to an
Ramanjaneyulu, P. S.; Sayi, Y. S.; Ramakumar, K. L.
2008-08-01
Quantification of boron in diverse materials of relevance in nuclear technology is essential in view of its high thermal neutron absorption cross section. A simple and sensitive method has been developed for the determination of boron in uranium-aluminum-silicon alloy, based on leaching of boron with 6 M HCl and H2O2, its selective separation by solvent extraction with 2-ethylhexane-1,3-diol, and quantification by spectrophotometry using curcumin. The method has been evaluated by the standard addition method and validated by inductively coupled plasma-atomic emission spectroscopy. The relative standard deviation and absolute detection limit of the method are 3.0% (at the 1σ level) and 12 ng, respectively. All possible sources of uncertainty in the methodology have been individually assessed, following the International Organization for Standardization guidelines. The combined uncertainty is calculated employing uncertainty propagation formulae. The expanded uncertainty in the measurement at the 95% confidence level (coverage factor 2) is 8.840%.
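The root-sum-of-squares combination and coverage factor mentioned above follow the usual GUM recipe; a minimal sketch, with invented relative standard uncertainties for each step (not the study's actual budget):

```python
import math

# Hypothetical relative standard uncertainties (%) of independent steps
components = {
    "leaching": 1.5,
    "solvent extraction": 2.0,
    "spectrophotometry": 3.0,
    "calibration": 1.0,
}

# Combined standard uncertainty: root sum of squares of independent components
u_c = math.sqrt(sum(u ** 2 for u in components.values()))

# Expanded uncertainty at ~95 % confidence, coverage factor k = 2
U = 2 * u_c
```

The quadrature sum assumes the components are uncorrelated; correlated contributions would need covariance terms as set out in the ISO/GUM guidelines the abstract cites.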
Olesen, Bjarne W.; de Carli, Michele
2011-01-01
According to the Energy Performance of Buildings Directive (EPBD) all new European buildings (residential, commercial, industrial, etc.) must since 2006 have an energy declaration based on the calculated energy performance of the building, including heating, ventilating, cooling and lighting...... systems. This energy declaration must refer to the primary energy or CO2 emissions. The European Organization for Standardization (CEN) has prepared a series of standards for energy performance calculations for buildings and systems. This paper presents related standards for heating systems. The relevant...... CEN-standards are presented and a sample calculation of energy performance is made for a small single family house, an office building and an industrial building in three different geographical locations: Stockholm, Brussels, and Venice. The additional heat losses from heating systems can be 10...
Parton Distribution Function Uncertainties
Giele, Walter T.; Kosower, David A.; Giele, Walter T.; Keller, Stephane A.; Kosower, David A.
2001-01-01
We present parton distribution functions which include a quantitative estimate of their uncertainties. The parton distribution functions are optimized with respect to deep inelastic proton data, expressing the uncertainties as a density measure over the functional space of parton distribution functions. This leads to a convenient method of propagating the parton distribution function uncertainties to new observables, now expressing the uncertainty as a density in the prediction of the observable. New measurements can easily be included in the optimized sets as added weight functions to the density measure. With this optimized method, no compromises have to be made anywhere in the analysis with regard to the treatment of the uncertainties.
Oppong C
1993-01-01
Examines the ILO's constitution, mandates and International Labour Standards in the areas of female worker protection, and promotion of equality and population. Highlights some current labour problems specific to women, including 'invisible' labour and its lack of documentation, the lack of equality, and the need for protection. Also highlights nine gender issues, both of labour and population. Bibliography, statistical tables and abstract in French.
1991-12-01
Extremely high standards may also cause individuals to perceive the feedback system as unfair and not credible (Dornbusch & Scott, 1975). Further...system fairness (Dornbusch & Scott, 1975; Jacobs, Jacobs, Feldman, & Cavior, 1973), and increases performance by allowing people to make accurate...to internalize them (Erez & Kanfer, 1983). Third, reactions against the system may be less likely (Dornbusch & Scott, 1975). The framing of
76 FR 50117 - Commission Rules and Forms Related to the FASB's Accounting Standards Codification
2011-08-12
... paragraph 305(a) by removing ``FASB, Statement of Financial Accounting Standards No. 52, `Foreign Currency... paragraph 20 (December 1981)'' and adding in its place ``FASB ASC paragraph 830-20-35-3 (Foreign Currency... place ``FASB ASC Topic 830, Foreign Currency Matters''. 0 g. Amend Instruction 4.B. of the...
Comparing Teachers' Literacy-Related Knowledge to Their State's Standards for Reading
McCombes-Tolis, Jule; Feinn, Richard
2008-01-01
This study compared elementary and special education teachers' knowledge of when K-3 students develop key reading competencies, their knowledge of who is responsible for teaching K-3 students key reading competencies, and teachers' perceptions of their own instructionally relevant competencies to those standards articulated within their state's…
Permissible limits for uncertainty of measurement in laboratory medicine.
Haeckel, Rainer; Wosniok, Werner; Gurr, Ebrhard; Peil, Burkhard
2015-07-01
The international standard ISO 15189 requires that medical laboratories estimate the uncertainty of their quantitative test results obtained from patients' specimens. The standard does not provide details of how, and within which limits, the measurement uncertainty should be determined. The most common concept for establishing permissible uncertainty limits is to relate them to biological variation, defining the rate of false positive results, or to base the limits on the state of the art. The state of the art is usually derived from data provided by a group of selected medical laboratories. The approach based on biological variation should be preferred because of its transparency and scientific basis. Hitherto, all recommendations have been based on a linear relationship between biological and analytical variation, leading to limits which are sometimes too stringent or too permissive for routine testing in laboratory medicine. In contrast, the present proposal is based on a non-linear relationship between biological and analytical variation, leading to more realistic limits. The proposed algorithms can be applied to all measurands and consider any quantity to be assured. The suggested approach tries to provide the above-mentioned details and is a compromise between the biological variation concept, the GUM uncertainty model and the technical state of the art.
Andres, T.H
2002-05-01
This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
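One of the propagation options listed above, the Monte Carlo method, can be sketched in a few lines: sample the uncertain inputs, run the model, and take statistics of the outputs. The toy model and input distributions below are assumptions for illustration only.

```python
import random
import statistics

random.seed(42)

def model(x, y):
    # Toy stand-in for a scientific/design computer program
    return x ** 2 + 0.5 * y

# Input uncertainties represented as normal distributions (mean, std dev)
N = 20000
outputs = [model(random.gauss(2.0, 0.1), random.gauss(5.0, 0.5))
           for _ in range(N)]

mean_out = statistics.fmean(outputs)   # best estimate of the model output
u_random = statistics.stdev(outputs)   # random uncertainty of the output
```

Systematic uncertainties from model simplifications would still have to be estimated separately and combined with `u_random`, as the report describes.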
Lorentz Invariance Violation and Generalized Uncertainty Principle
Tawfik, A; Ali, A Farag
2016-01-01
Recent approaches for quantum gravity are conjectured to give predictions for a minimum measurable length, a maximum observable momentum and an essential generalization of the Heisenberg uncertainty principle (GUP). The latter is based on a momentum-dependent modification of the standard dispersion relation and leads to Lorentz invariance violation (LIV). The main features of the controversial OPERA measurements of the faster-than-light muon neutrino anomaly are used to calculate the time-of-flight delays $\Delta t$ and the relative change $\Delta v$ in the speed of the neutrino in dependence on the redshift $z$. The results are compared with the OPERA measurements. We find that the measurements are too large to be interpreted as LIV. Depending on the rest mass, the propagation of high-energy muon neutrinos can be superluminal. The comparison with ultra-high-energy cosmic rays seems to reveal an essential ingredient of the approach combining string theory, loop quantum gravity, black hole physics and doubly ...
Kolker, Eugene; Hogan, Jason M; Higdon, Roger; Kolker, Natali; Landorf, Elizabeth; Yakunin, Alexander F; Collart, Frank R; van Belle, Gerald
2007-10-01
Mixtures of known proteins have been very useful in the assessment and validation of methods for high-throughput (HTP) MS (MS/MS) proteomics experiments. However, these test mixtures have generally consisted of few proteins at near equal concentration or of a single protein at varied concentrations. Such mixtures are too simple to effectively assess the validity of error rates for protein identification and differential expression in HTP MS/MS studies. This work aimed at overcoming these limitations and simulating studies of complex biological samples. We introduced a pair of 54-protein standard mixtures of variable concentrations with up to a 1000-fold dynamic range in concentration and up to ten-fold expression ratios with additional negative controls (infinite expression ratios). These test mixtures comprised 16 off-the-shelf Sigma-Aldrich proteins and 38 Shewanella oneidensis proteins produced in-house. The standard proteins were systematically distributed into three main concentration groups (high, medium, and low) and then the concentrations were varied differently for each mixture within the groups to generate different expression ratios. The mixtures were analyzed with both low mass accuracy LCQ and high mass accuracy FT-LTQ instruments. In addition, these 54 standard proteins closely follow the molecular weight distributions of both bacterial and human proteomes. As a result, these new standard mixtures allow for a much more realistic assessment of approaches for protein identification and label-free differential expression than previous mixtures. Finally, methodology and experimental design developed in this work can be readily applied in future to development of more complex standard mixtures for HTP proteomics studies.
Force calibration using errors-in-variables regression and Monte Carlo uncertainty evaluation
Bartel, Thomas; Stoudt, Sara; Possolo, Antonio
2016-06-01
An errors-in-variables regression method is presented as an alternative to the ordinary least-squares regression computation currently employed for determining the calibration function for force measuring instruments from data acquired during calibration. A Monte Carlo uncertainty evaluation for the errors-in-variables regression is also presented. The corresponding function (which we call measurement function, often called analysis function in gas metrology) necessary for the subsequent use of the calibrated device to measure force, and the associated uncertainty evaluation, are also derived from the calibration results. Comparisons are made, using real force calibration data, between the results from the errors-in-variables and ordinary least-squares analyses, as well as between the Monte Carlo uncertainty assessment and the conventional uncertainty propagation employed at the National Institute of Standards and Technology (NIST). The results show that the errors-in-variables analysis properly accounts for the uncertainty in the applied calibrated forces, and that the Monte Carlo method, owing to its intrinsic ability to model uncertainty contributions accurately, yields a better representation of the calibration uncertainty throughout the transducer’s force range than the methods currently in use. These improvements notwithstanding, the differences between the results produced by the current and by the proposed new methods generally are small because the relative uncertainties of the inputs are small and most contemporary load cells respond approximately linearly to such inputs. For this reason, there will be no compelling need to revise any of the force calibration reports previously issued by NIST.
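To make the contrast with ordinary least squares concrete, here is a minimal errors-in-variables fit in the Deming form, which assumes a known ratio of the error variances in y and x; the calibration data are invented, and this is a sketch of the technique, not NIST's actual computation.

```python
import math

def deming_fit(xs, ys, delta=1.0):
    """Errors-in-variables (Deming) fit y = a + b*x.
    delta is the assumed ratio of y-error variance to x-error variance."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    b = ((syy - delta * sxx
          + math.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2))
         / (2 * sxy))
    return my - b * mx, b

# Hypothetical calibration: applied force vs. transducer reading, both
# subject to measurement error (ordinary least squares would instead
# treat the applied forces as exact).
force   = [10.0, 20.0, 30.0, 40.0, 50.0]
reading = [10.1, 19.8, 30.2, 39.9, 50.1]
a, b = deming_fit(force, reading)
```

Because both variables carry error, the fitted slope accounts for uncertainty in the applied calibrated forces, which is the property the abstract attributes to the errors-in-variables analysis.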
Uncertainty and Cognitive Control
Faisal eMushtaq
2011-10-01
A growing body of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.
Uncertainty of measurement: an immunology laboratory perspective.
Beck, Sarah C; Lock, Robert J
2015-01-01
'Measurement uncertainty of measured quantity values' (ISO 15189) requires that the laboratory shall determine the measurement uncertainty for procedures used to report measured quantity values on patients' samples. Where we have numeric data, measurement uncertainty can be expressed as the standard deviation or as the coefficient of variation. However, in immunology many of the assays are reported either as semi-quantitative (i.e. an antibody titre) or qualitative (positive or negative) results. In the latter context, measuring uncertainty is considerably more difficult. There are, however, strategies which can allow us to minimise uncertainty. A number of parameters can contribute to making measurements uncertain. These include bias, precision, standard uncertainty (expressed as standard deviation or coefficient of variation), sensitivity, specificity, repeatability, reproducibility and verification. Closely linked to these are traceability and standardisation. In this article we explore the challenges presented to immunology with regard to measurement uncertainty. Many of these challenges apply equally to other disciplines working with qualitative or semi-quantitative data.
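For the quantitative case the abstract mentions, standard uncertainty and coefficient of variation come straight from replicate results; a minimal sketch with invented quality-control replicates:

```python
import statistics

# Hypothetical replicate QC results (e.g. an antibody concentration, kU/L)
replicates = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3]

mean = statistics.fmean(replicates)
sd = statistics.stdev(replicates)  # standard uncertainty (standard deviation)
cv = 100 * sd / mean               # coefficient of variation, %
```

For semi-quantitative titres or qualitative positive/negative results, no such direct calculation exists, which is exactly the difficulty the article explores.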
Carlos Guillermo Carreno-Bodensiek
2016-12-01
This work presents the results of a research process applied to a sample of companies in the steel and metalworking sector in Boyacá, Colombia. Active workers are evaluated against the Occupational Competency Standards related to their daily activities. The study also aims to highlight priorities for training human talent in business, with a view to building competitiveness, and seeks to address the need to train and develop skills and competencies in the workforce, taking into account expert opinion on training and on developing proposals for management. This research is consistent with global trends in education and the requirements of standardized training, which is why the diagnoses and designs focus on the functions of the companies related to the Standards of Competency.
McDuffie, Amy Roth; Drake, Corey; Choppin, Jeffrey; Davis, Jon D.; Magaña, Margarita V.; Carson, Cynthia
2017-01-01
In this study, U.S. middle school teachers' perceptions of Common Core State Standards for Mathematics (CCSSM), CCSSM-related assessments, teacher evaluation processes, and resources for implementing CCSSM were investigated. Using a mixed methods design, a national sample of 366 teachers was surveyed, and 24 teachers were interviewed. Findings…
Sanderson, P.; Johnson, I.T.; Mahters, J.C.; Powers, H.J.; Downes, C.S.; McGlynn, A.P.; Dare, R.; Kampman, E.
2004-01-01
The UK Food Standards Agency convened a group of expert scientists to review current research investigating emerging diet-related surrogate end points for colorectal cancer (CRC). The workshop aimed to overview current research and establish priorities for future research. The workshop considered th
Information, uncertainty and holographic action
Dikken, Robbert-Jan
2016-01-01
In this short note we show through simple derivation the explicit relation between information flow and the theories of the emergence of space-time and gravity, specifically for Newton's second law of motion. Next, in a rather straightforward derivation the Heisenberg uncertainty relation is uncovered from the universal bound on information flow. A relation between the universal bound on information flow and the change in bulk action is also shown to exist.
Uncertainties in Site Amplification Estimation
Cramer, C. H.; Bonilla, F.; Hartzell, S.
2004-12-01
Typically, geophysical profiles (layer thickness, velocity, density, Q) and dynamic soil properties (modulus and damping versus strain curves) are used with appropriate input ground motions in a soil response computer code to estimate site amplification. Uncertainties in observations can be used to generate a distribution of possible site amplifications. The biggest sources of uncertainty in site amplification estimates are the uncertainties in (1) input ground motions, (2) shear-wave velocities (Vs), (3) dynamic soil properties, (4) the soil response code used, and (5) dynamic pore pressure effects. A study of site amplification was conducted for the 1 km thick Mississippi embayment sediments beneath Memphis, Tennessee (see USGS OFR 04-1294 on the web). In this study, the first three sources of uncertainty resulted in a combined coefficient of variation of 10 to 60 percent. The choice of soil response computer program can lead to uncertainties in median estimates of +/- 50 percent. Dynamic pore pressure effects due to the passing of seismic waves in saturated soft sediments are normally not considered in site-amplification studies and can contribute further large uncertainties in site amplification estimates. The effects may range from dilatancy and high-frequency amplification (such as observed at some sites during the 1993 Kushiro-Oki, Japan and 2001 Nisqually, Washington earthquakes) to general soil failure and deamplification of ground motions (such as observed at Treasure Island during the 1989 Loma Prieta, California earthquake). Two case studies using geotechnical data for downhole arrays in Kushiro, Japan and the Wildlife Refuge, California with one dynamic code, NOAH, will be presented as examples of modeling uncertainties associated with these effects. Additionally, an example of inversion for estimates of in-situ dilatancy-related geotechnical modeling parameters will be presented for the Kushiro, Japan site.
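When the individual error sources are treated as independent multiplicative factors, their relative uncertainties combine approximately in quadrature; a minimal sketch, with the three source values purely illustrative rather than the study's actual figures:

```python
import math

def combined_cov(covs):
    """Combine independent relative uncertainties (coefficients of
    variation, as fractions) in quadrature -- the usual first-order
    approximation for multiplicative error sources."""
    return math.sqrt(sum(c * c for c in covs))

# Hypothetical relative uncertainties from input motions, the Vs
# profile, and dynamic soil properties.
total = combined_cov([0.30, 0.25, 0.20])
```

This illustrates why the combined coefficient of variation is dominated by the largest single source rather than by the count of sources.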
Vayenas, Constantinos G; Grigoriou, Dimitrios P
2016-01-01
We discuss the common features between the Standard Model taxonomy of particles, based on electric charge, strangeness and isospin, and the taxonomy emerging from the key structural elements of the rotating neutrino model, which describes baryons as bound states formed by three highly relativistic electrically polarized neutrinos forming a symmetric ring rotating around a central electrically charged or polarized lepton. It is shown that the two taxonomies are fully compatible with each other.
Panescu, Dorin; Nerheim, Max; Kroll, Mark
2013-01-01
TASER(®) conducted electrical weapons (CEW) deliver electrical pulses that can inhibit a person's neuromuscular control or temporarily incapacitate. TASER X26, X26P, and X2 are among CEW models most frequently deployed by law enforcement agencies. The X2 CEW uses two cartridge bays while the X26 and X26P CEWs have only one. The TASER X26P CEW electronic output circuit design is equivalent to that of any one of the two TASER X2 outputs. The goal of this paper was to analyze the nominal electrical outputs of TASER X26, X26P, and X2 CEWs in reference to provisions of several international standards that specify safety requirements for electrical medical devices and electrical fences. Although these standards do not specifically mention CEWs, they are the closest electrical safety standards and hence give very relevant guidance. The outputs of two TASER X26 and two TASER X2 CEWs were measured and confirmed against manufacturer and other published specifications. The TASER X26, X26P, and X2 CEWs electrical output parameters were reviewed against relevant safety requirements of UL 69, IEC 60335-2-76 Ed 2.1, IEC 60479-1, IEC 60479-2, AS/NZS 60479.1, AS/NZS 60479.2 and IEC 60601-1. Prior reports on similar topics were reviewed as well. Our measurements and analyses confirmed that the nominal electrical outputs of TASER X26, X26P and X2 CEWs lie within safety bounds specified by relevant requirements of the above standards.
Blood Pressure Standards for Shiraz (Southern Iran) School Children in Relation to Height
Ayatollahi, Seyyed Mohammad-Taghi; Zare, Marzie
2012-01-01
Objective This study aims at providing local reference values for blood pressure by height and determining the distribution pattern of systolic and diastolic blood pressure in elementary school children aged 6.5-11.5 years, for the first time in Shiraz (Southern Iran). Methods Height, systolic blood pressure (SBP) and diastolic blood pressure (DBP) were measured with standard methods in 2270 healthy school children (1174 boys, 1096 girls) who were selected by multi-stage random sampling in the 2003-2004 academic...
Marco Fedrizzi
2015-12-01
This paper describes the methods used in the monitoring carried out on the farms of the MO.NA.CO. project to calculate the economic competitiveness gap faced by agricultural holdings that accede to the commitments imposed by the standards included in the project. The monitoring was performed in agricultural holdings in relation to the particular reference condition of each standard. Processing the information acquired allowed us to define the working times of each cultivation operation by means of the recommendations of the Associazione Italiana di Genio Rurale (Italian Rural Engineering Association), which follows the official methodology of the International Commission of the Organisation Scientifique du Travail en Agriculture (C.I.O.S.T.A.). The overall costs and revenues in case of compliance or non-compliance with the commitments of the standard were calculated using Biondi's methodology and other norms that indicate the technical and economic coefficients to be used in the calculations (ASAE standards EP 496.2 and D 497.4). With the data on the unit cost of ploughing, a Partial Least Squares (PLS) model was built and validated, which makes it possible to predict the unit cost of this agricultural operation. Finally, the values of the variation of the economic competitiveness gap are reported for each standard.
Uncertainty under quantum measures and quantum memory
Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing
2017-04-01
The uncertainty principle restricts potential information one gains about physical properties of the measured particle. However, if the particle is prepared in entanglement with a quantum memory, the corresponding entropic uncertainty relation will vary. Based on the knowledge of correlations between the measured particle and quantum memory, we have investigated the entropic uncertainty relations for two and multiple measurements and generalized the lower bounds on the sum of Shannon entropies without quantum side information to those that allow quantum memory. In particular, we have obtained generalization of Kaniewski-Tomamichel-Wehner's bound for effective measures and majorization bounds for noneffective measures to allow quantum side information. Furthermore, we have derived several strong bounds for the entropic uncertainty relations in the presence of quantum memory for two and multiple measurements. Finally, potential applications of our results to entanglement witnesses are discussed via the entropic uncertainty relation in the absence of quantum memory.
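The baseline relation the abstract generalizes is presumably the memory-assisted entropic uncertainty relation of Berta et al. (an assumption on our part; the paper's bounds refine such relations). For measurements $Q$ and $R$ on a particle $A$ entangled with a quantum memory $B$:

```latex
S(Q|B) + S(R|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad
c = \max_{i,j} \bigl| \langle \psi_i | \phi_j \rangle \bigr|^2 ,
```

where $|\psi_i\rangle$ and $|\phi_j\rangle$ are the eigenvectors of $Q$ and $R$. A negative conditional entropy $S(A|B)$, possible under entanglement, lowers the bound below the memoryless Maassen-Uffink value $\log_2(1/c)$, which is how quantum side information weakens the uncertainty trade-off.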
Uncertainty in in-place filter test results
Scripsick, R.C.; Beckman, R.J.; Mokler, B.V.
1996-12-31
Some benefits of accounting for uncertainty in in-place filter test results are explored. Information the test results provide relative to system performance acceptance limits is evaluated in terms of test result uncertainty. An expression for test result uncertainty is used to estimate uncertainty in in-place filter tests on an example air cleaning system. Modifications to the system test geometry are evaluated in terms of effects on test result uncertainty.
Inflation and Inflation Uncertainty Revisited: Evidence from Egypt
Mesbah Fathy Sharaf
2015-07-01
The welfare costs of inflation and inflation uncertainty are well documented in the literature, yet empirical evidence on the link between the two is sparse in the case of Egypt. This paper investigates the causal relationship between inflation and inflation uncertainty in Egypt using monthly time series data covering the period January 1974-April 2015. To endogenously control for any potential structural breaks in the inflation time series, Zivot and Andrews (2002) and Clemente-Montanes-Reyes (1998) unit root tests are used. The inflation-inflation uncertainty relation is modeled by the standard two-step approach as well as simultaneously using various versions of the GARCH-M model to control for any potential feedback effects. The analyses explicitly control for the effect of the Economic Reform and Structural Adjustment Program (ERSAP) undertaken by the Egyptian government in the early 1990s, which affected the inflation rate and its associated volatility. Results show a high degree of inflation-volatility persistence in the response to inflationary shocks. A Granger-causality test along with symmetric and asymmetric GARCH-M models indicates a statistically significant bi-directional positive relationship between inflation and inflation uncertainty, supporting both the Friedman-Ball and the Cukierman-Meltzer hypotheses. The findings are robust to the various estimation methods and model specifications. The findings of this paper support the view of adopting an inflation-targeting policy in Egypt, after fulfilling its preconditions, to reduce the welfare cost of inflation and its related uncertainties. Monetary authorities in Egypt should enhance the credibility of monetary policy and attempt to reduce inflation uncertainty, which will help lower inflation rates.
Davis, C B
1987-08-01
The uncertainties of calculations of loss-of-feedwater transients at Davis-Besse Unit 1 were determined to address concerns of the US Nuclear Regulatory Commission relative to the effectiveness of feed and bleed cooling. Davis-Besse Unit 1 is a pressurized water reactor of the raised-loop Babcock and Wilcox design. A detailed, quality-assured RELAP5/MOD2 model of Davis-Besse was developed at the Idaho National Engineering Laboratory. The model was used to perform an analysis of the loss-of-feedwater transient that occurred at Davis-Besse on June 9, 1985. A loss-of-feedwater transient followed by feed and bleed cooling was also calculated. The evaluation of uncertainty was based on the comparisons of calculations and data, comparisons of different calculations of the same transient, sensitivity calculations, and the propagation of the estimated uncertainty in initial and boundary conditions to the final calculated results.
Improving the uncertainty of photomask linewidth measurements
Pedulla, J. M.; Potzick, James; Silver, Richard M.
2004-05-01
The National Institute of Standards and Technology (NIST) is currently developing a photomask linewidth standard (SRM 2059) with a lower expected uncertainty of calibration than the previous NIST standards (SRMs 473, 475, 476). In calibrating these standards, optical simulation modeling has been used to predict the microscope image intensity profiles, which are then compared to the experimental profiles to determine the certified linewidths. Consequently, the total uncertainty in the linewidth calibration is a result of uncertainty components from the optical simulation modeling and uncertainty due to experimental errors or approximations (e.g., tool imaging errors and material characterization errors). Errors of approximation in the simulation model and uncertainty in the parameters used in the model can contribute a large component to the total linewidth uncertainty. We have studied the effects of model parameter variation on measurement uncertainty using several different optical simulation programs that utilize different mathematical techniques. We have also evaluated the effects of chrome edge runout and varying indices of refraction on the linewidth images. There are several experimental parameters that are not ordinarily included in the modeling simulation. For example, the modeling programs assume a uniform illuminating field (e.g., Koehler illumination), ideal optics and perfect optical alignment. In practice, determining whether Koehler illumination has been achieved is difficult, and the optical components and their alignments are never ideal. We will present some techniques for evaluating Koehler illumination and methods to compensate for scattered (flare) light. Any such experimental elements that are assumed ideal in the modeling may actually contribute significant components to the uncertainty and need to be quantitatively estimated. The present state of metrology does not permit the absolute calibration of linewidth standards to the level of
Unscented transform-based uncertainty analysis of rotating coil transducers for field mapping
Arpaia, P.; De Matteis, E.; Schiano Lo Moriello, R.
2016-03-01
The uncertainty of a rotating coil transducer for magnetic field mapping is analyzed. The unscented transform and statistical design of experiments are combined to determine the magnetic field expectation, the standard uncertainty, and the separate contributions of the uncertainty sources. For nonlinear measurement models, the unscented transform-based approach is more error-proof than the linearization underlying the "Guide to the expression of Uncertainty in Measurement" (GUM), owing to the absence of model approximations and derivative computation. When GUM assumptions are not met, the deterministic sampling strategy strongly reduces the computational burden with respect to the Monte Carlo-based methods proposed by Supplement 1 of the GUM. Furthermore, the design of experiments and the associated statistical analysis allow the uncertainty sources domain to be explored efficiently, as well as their significance and single contributions to be assessed for an effective setup configuration. A straightforward experimental case study highlights that a one-order-of-magnitude reduction in the relative uncertainty of the coil area produces a decrease in uncertainty of the field mapping transducer by a factor of 25 with respect to the worst condition. Moreover, about 700 trials and the related processing achieve results corresponding to 5 × 10⁶ brute-force Monte Carlo simulations.
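The sigma-point idea behind the unscented transform can be illustrated in one dimension; this is a generic textbook sketch with the classic parameter choice (kappa = 3 - n), not the authors' coil-transducer model.

```python
import math

def unscented_1d(mean, var, f, kappa=2.0):
    """Propagate a 1-D Gaussian (mean, var) through a nonlinear f using
    the classic unscented transform (n=1, so kappa = 3 - n = 2):
    deterministic sigma points instead of linearization or Monte Carlo."""
    n, lam = 1, kappa
    spread = math.sqrt((n + lam) * var)
    pts = [mean, mean + spread, mean - spread]
    w = [lam / (n + lam)] + [1.0 / (2 * (n + lam))] * 2
    ys = [f(x) for x in pts]
    y_mean = sum(wi * yi for wi, yi in zip(w, ys))
    y_var = sum(wi * (yi - y_mean) ** 2 for wi, yi in zip(w, ys))
    return y_mean, y_var

# x ~ N(0, 1) pushed through f(x) = x^2: for this quadratic case the
# transform recovers the exact moments E[x^2] = 1 and Var[x^2] = 2.
m, v = unscented_1d(0.0, 1.0, lambda x: x * x)
```

Three function evaluations here stand in for the thousands of samples a Monte Carlo run would need, which is the computational saving the abstract reports.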
Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Lozano-Rubí, Raimundo; Serrano-Balazote, Pablo; Castro, Antonio L; Moreno, Oscar; Pascual, Mario
2017-08-18
The objective of this research is to compare the relational and non-relational (NoSQL) database systems approaches in order to store, recover, query and persist standardized medical information in the form of ISO/EN 13606 normalized Electronic Health Record XML extracts, both in isolation and concurrently. NoSQL database systems have recently attracted much attention, but few studies in the literature address their direct comparison with relational databases when applied to build the persistence layer of a standardized medical information system. One relational and two NoSQL databases (one document-based and one native XML database) of three different sizes have been created in order to evaluate and compare the response times (algorithmic complexity) of six different complexity growing queries, which have been performed on them. Similar appropriate results available in the literature have also been considered. Relational and non-relational NoSQL database systems show almost linear algorithmic complexity query execution. However, they show very different linear slopes, the former being much steeper than the two latter. Document-based NoSQL databases perform better in concurrency than in isolation, and also better than relational databases in concurrency. Non-relational NoSQL databases seem to be more appropriate than standard relational SQL databases when database size is extremely high (secondary use, research applications). Document-based NoSQL databases perform in general better than native XML NoSQL databases. EHR extracts visualization and edition are also document-based tasks more appropriate to NoSQL database systems. However, the appropriate database solution much depends on each particular situation and specific problem.
Experimental uncertainty estimation and statistics for data having interval uncertainty.
Kreinovich, Vladik; Oberkampf, William Louis; Ginzburg, Lev; Ferson, Scott; Hajagos, Janos (Applied Biomathematics, Setauket, New York)
2007-05-01
This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
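For the simplest of the statistics discussed, the sample mean of interval data, the bounds are exact and cheap to compute because the mean is monotone in each observation; a minimal sketch with hypothetical data:

```python
def interval_mean(data):
    """Bounds on the sample mean when each observation is an interval
    (lo, hi): the mean itself becomes an interval, obtained exactly by
    averaging the endpoints separately."""
    n = len(data)
    lo = sum(a for a, _ in data) / n
    hi = sum(b for _, b in data) / n
    return lo, hi

# Three interval-valued measurements (illustrative values).
lo, hi = interval_mean([(1.0, 2.0), (2.5, 3.5), (0.5, 1.5)])
```

Other statistics are harder: tight bounds on the variance of interval data can be computationally expensive in general, which is why the report ties computability to interval overlap and width regularity.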
Mullen, Carol A.; Kealy, William A.; Sullivan, Ashley
2004-01-01
This article addresses an important need--the dissemination of information relating to technology as a public relations tool--and the associated exigency for administrator and teacher technology training. Specifically, we identify the increased expectations for the performance of school leaders and teachers, as well as unresolved issues in public…
Morales, J; Fonseca, F; Morales, John; Quimbay, Carlos; Fonseca, Frank
1999-01-01
We calculate the fermionic dispersion relations in the minimal standard model at finite temperature in the presence of non-vanishing chemical potentials due to the CP-asymmetric fermionic background. The dispersion relations are calculated for a vacuum expectation value of the Higgs field equal to zero (unbroken electroweak symmetry). The calculation is performed in the real time formalism of thermal field theory at one-loop order in a general $\xi$ gauge. The fermionic self-energy is calculated at leading order in temperature and chemical potential, which permits us to obtain gauge-invariant analytical expressions for the dispersion relations.
Visual Semiotics & Uncertainty Visualization: An Empirical Study.
MacEachren, A M; Roth, R E; O'Brien, J; Li, B; Swingley, D; Gahegan, M
2012-12-01
This paper presents two linked empirical studies focused on uncertainty visualization. The experiments are framed from two conceptual perspectives. First, a typology of uncertainty is used to delineate kinds of uncertainty matched with space, time, and attribute components of data. Second, concepts from visual semiotics are applied to characterize the kind of visual signification that is appropriate for representing those different categories of uncertainty. This framework guided the two experiments reported here. The first addresses representation intuitiveness, considering both visual variables and iconicity of representation. The second addresses relative performance of the most intuitive abstract and iconic representations of uncertainty on a map reading task. Combined results suggest initial guidelines for representing uncertainty and discussion focuses on practical applicability of results.
STUDY OF STANDARD OPERATING PROCEDURE OF NAAG BHASMA IN RELATION TO ITS PHYSICO-CHEMICAL PROPERTIES
Lagad C. E.
2012-03-01
Standard operating procedures (SOPs) should be designed, implemented and set for all Ayurvedic drugs, one by one, for the globalization of Ayurveda. In this study, an attempt has been made to introduce an SOP for the preparation of Naag Bhasma [NB], together with its analytical study. The study was conducted in the Department of Rasa Shastra under the postgraduate research programme. The pharmaceutical processing of NB was performed by following Samanya Shodhana, Jarana and Marana of Naag [Pb], together with its analytical study. Naga Bhasma was prepared in two batches, namely Batch A and Batch B. In this method, purified Haratala (Orpiment) was taken as the media. The percentage loss in the Naga Bhasma was 63% in Batch A and 60.5% in Batch B. Raw drugs, in-process materials and the final products were analyzed physico-chemically, and a comparison was drawn to lay down pharmacopoeial standards. The average percentage purity of Naga decreased from 99.46% to 86.57% after Shodhana. The percentage of lead (Pb) in Naga Bhasma was 58.4% and 57.89% in Batch A and B, respectively. Both Bhasmas were chemically in PbS form, with other elements such as Ca, Si, Fe, Al, K, As, Mg, Ni, Mn, Cd and Zn in trace amounts.
Estimating uncertainties in complex joint inverse problems
Afonso, Juan Carlos
2016-04-01
Sources of uncertainty affecting geophysical inversions can be classified either as reflective (i.e. the practitioner is aware of her/his ignorance) or non-reflective (i.e. the practitioner does not know that she/he does not know!). Although we should be always conscious of the latter, the former are the ones that, in principle, can be estimated either empirically (by making measurements or collecting data) or subjectively (based on the experience of the researchers). For complex parameter estimation problems in geophysics, subjective estimation of uncertainty is the most common type. In this context, probabilistic (aka Bayesian) methods are commonly claimed to offer a natural and realistic platform from which to estimate model uncertainties. This is because in the Bayesian approach, errors (whatever their nature) can be naturally included as part of the global statistical model, the solution of which represents the actual solution to the inverse problem. However, although we agree that probabilistic inversion methods are the most powerful tool for uncertainty estimation, the common claim that they produce "realistic" or "representative" uncertainties is not always justified. Typically, ALL UNCERTAINTY ESTIMATES ARE MODEL DEPENDENT, and therefore, besides a thorough characterization of experimental uncertainties, particular care must be paid to the uncertainty arising from model errors and input uncertainties. We recall here two quotes by G. Box and M. Gunzburger, respectively, of special significance for inversion practitioners and for this session: "…all models are wrong, but some are useful" and "computational results are believed by no one, except the person who wrote the code". In this presentation I will discuss and present examples of some problems associated with the estimation and quantification of uncertainties in complex multi-observable probabilistic inversions, and how to address them. Although the emphasis will be on sources of uncertainty related
Uncertainty in artificial intelligence
Kanal, LN
1986-01-01
How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy. Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.
[Ethics, empiricism and uncertainty].
Porz, R; Zimmermann, H; Exadaktylos, A K
2011-01-01
Accidents can lead to difficult boundary situations. Such situations often take place in the emergency units. The medical team thus often and inevitably faces professional uncertainty in their decision-making. It is essential to communicate these uncertainties within the medical team, instead of downplaying or overriding existential hurdles in decision-making. Acknowledging uncertainties might lead to alert and prudent decisions. Thus uncertainty can have ethical value in treatment or withdrawal of treatment. It does not need to be covered in evidence-based arguments, especially as some singular situations of individual tragedies cannot be grasped in terms of evidence-based medicine. © Georg Thieme Verlag KG Stuttgart · New York.
Fuzzy Uncertainty Evaluation for Fault Tree Analysis
Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)
2015-05-15
The traditional probabilistic approach can calculate relatively accurate results. However, it requires a long time because of the repetitive computation demanded by the MC method. In addition, when informative data for statistical analysis are not sufficient, or when some events are mainly caused by human error, the probabilistic approach may not be possible because the uncertainties of these events are difficult to express as probabilistic distributions. In order to reduce the computation time and to quantify the uncertainties of top events when some basic events have uncertainties that are difficult to express as probabilistic distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident following the large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code is implemented and tested on the fault tree of the radiation release accident. We apply this code to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results obtained by the fuzzy uncertainty propagation can be calculated in a relatively short time and cover the results obtained by the probabilistic uncertainty propagation.
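A common way to implement fuzzy propagation through a fault tree is alpha-cut interval arithmetic: each fuzzy basic-event probability is sliced into intervals, and the (monotone) gate formulas are applied to the endpoints. This is a generic sketch of that technique with a hypothetical three-event tree, not the authors' LLOCA model.

```python
def alpha_cut(tri, alpha):
    """Alpha-cut interval of a triangular fuzzy number (a, m, b)."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def and_gate(intervals):
    """AND gate: product of independent probabilities (monotone, so
    interval endpoints propagate directly)."""
    lo = hi = 1.0
    for l, h in intervals:
        lo *= l
        hi *= h
    return (lo, hi)

def or_gate(intervals):
    """OR gate: 1 - prod(1 - p), also monotone in each input."""
    q_lo = q_hi = 1.0
    for l, h in intervals:
        q_lo *= 1.0 - l
        q_hi *= 1.0 - h
    return (1.0 - q_lo, 1.0 - q_hi)

# Hypothetical tree: top = OR(e1, AND(e2, e3)), triangular fuzzy inputs.
e1, e2, e3 = (0.01, 0.02, 0.03), (0.1, 0.2, 0.3), (0.05, 0.1, 0.15)
for alpha in (0.0, 0.5, 1.0):
    c1, c2, c3 = (alpha_cut(e, alpha) for e in (e1, e2, e3))
    top = or_gate([c1, and_gate([c2, c3])])  # interval at this alpha
```

At alpha = 1 the cuts collapse to the modal values and the top-event interval collapses to the point estimate, which is the sanity check such codes are usually tested against.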
Evaluating a Sex Related Ability: Social Comparison with Similar Others and Standard
Zanna, Mark P.; And Others
1975-01-01
The purpose of the present study is to reevaluate Festinger's similarity hypothesis and to investigate the relative strengths of the desire to compare with similar others and the desire to compare with those who are best off. (Author)
Mueller, David S.
2017-01-01
This paper presents a method using Monte Carlo simulations for assessing uncertainty of moving-boat acoustic Doppler current profiler (ADCP) discharge measurements using a software tool known as QUant, which was developed for this purpose. Analysis was performed on 10 data sets from four Water Survey of Canada gauging stations in order to evaluate the relative contribution of a range of error sources to the total estimated uncertainty. The factors that differed among data sets included the fraction of unmeasured discharge relative to the total discharge, flow nonuniformity, and operator decisions about instrument programming and measurement cross section. As anticipated, it was found that the estimated uncertainty is dominated by uncertainty of the discharge in the unmeasured areas, highlighting the importance of appropriate selection of the site, the instrument, and the user inputs required to estimate the unmeasured discharge. The main contributor to uncertainty was invalid data, but spatial inhomogeneity in water velocity and bottom-track velocity also contributed, as did variation in the edge velocity, uncertainty in the edge distances, edge coefficients, and the top and bottom extrapolation methods. To a lesser extent, spatial inhomogeneity in the bottom depth also contributed to the total uncertainty, as did uncertainty in the ADCP draft at shallow sites. The estimated uncertainties from QUant can be used to assess the adequacy of standard operating procedures. They also provide quantitative feedback to the ADCP operators about the quality of their measurements, indicating which parameters are contributing most to uncertainty, and perhaps even highlighting ways in which uncertainty can be reduced. Additionally, QUant can be used to account for self-dependent error sources such as heading errors, which are a function of heading. The results demonstrate the importance of a Monte Carlo method tool such as QUant for quantifying random and bias errors when
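The core Monte Carlo idea, perturbing the measured and unmeasured portions of the discharge by their own error models and summarizing the spread, can be sketched generically; the split between measured and unmeasured discharge and the noise magnitudes below are purely illustrative, not QUant's actual error models.

```python
import random
import statistics

def simulate_discharge(n=20000, seed=1):
    """Toy Monte Carlo after the QUant idea: total discharge = measured
    portion + estimated unmeasured (edge/top/bottom) portion, each
    perturbed by an assumed relative error. Values are illustrative."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        measured = 100.0 * (1.0 + rng.gauss(0.0, 0.02))    # 2% noise
        unmeasured = 15.0 * (1.0 + rng.gauss(0.0, 0.15))   # 15% noise
        totals.append(measured + unmeasured)
    mean = statistics.fmean(totals)
    rel_u = statistics.stdev(totals) / mean
    return mean, rel_u

mean, rel_u = simulate_discharge()
```

Even in this toy version, the relatively small unmeasured fraction with its large relative error contributes a share of the total uncertainty comparable to the much larger measured portion, mirroring the paper's finding that the unmeasured areas dominate.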
Platts-Mills, T.A.E.; Chapman, M.D.; Pollart, S.M.; Heymann, P.W.; Luczynska, C.M. (Univ. of Virginia, Charlottesville (United States))
1990-01-01
There is no doubt that a large number of individuals become allergic to foreign proteins that are predominantly or exclusively present indoors. In each case this immune response can be demonstrated either by immediate skin test responses or by measuring serum IgE antibodies. It has also been obvious for some time that patients presenting with asthma, perennial rhinitis and atopic dermatitis have an increased prevalence of IgE antibodies to these indoor allergens. More recently several epidemiological surveys have suggested that both mite exposure and IgE antibodies are important risk factors for asthma. The present situation is that assays have been developed capable of measuring the presence of mite, cockroach and cat allergens in house dust. Further clinical studies will be necessary to test the proposed standards for mite allergens and to define risk levels for other allergens.
Chang, S. J.; Graham, W. D.; Geurink, J. S.
2016-12-01
Climate change can alter the magnitude and temporal characteristics of hydrologic responses, which could impact the risk and resilience of public water supply facilities. Sustainable water resource planning requires reliable projections of potential future spatiotemporal changes in regional hydrologic responses. Public water suppliers need a better understanding of the hydrologic response uncertainties associated with the choice of General Circulation Model (GCM), reference evapotranspiration (ET0) estimation method, and future water demand estimates in order to understand risk and increase resilience through adaptive management strategies. The objective of this study is to quantify uncertainties in hydrologic responses in west-central Florida associated with GCM selection, ET0 estimation method, and future water demand scenarios. Nine GCMs, eight ET0 estimation methods and water demand scenarios were used to develop inputs to the calibrated Integrated Northern Tampa Bay (INTB) model, which was used to simulate hydrologic responses for 30 years for each of 520 simulation scenarios. Streamflow and groundwater levels simulated by the INTB model for all scenarios were used to assess future risks for the public water supply facilities of Tampa Bay Water.
Taha, Sheena Aislinn; Matheson, Kimberly; Anisman, Hymie
2014-04-01
H1N1 reached pandemic proportions in 2009, yet considerable ambivalence was apparent concerning the threat it presented and the inclination to be vaccinated. The present investigation assessed several factors, notably appraisals of the threat, intolerance of uncertainty, and familiarity with the virus, that might contribute to reactions to a potential future viral threat. Canadian adults (N = 316), presented with several scenarios regarding viral threats, reported moderate feelings of anxiety, irrespective of whether the viral threat was familiar or entirely unfamiliar to them (H1N1 recurrence, H5N1, or a fictitious virus, D3N4). Participants appraised the stressfulness of the threats as moderate and believed that they would have control in this situation. However, among individuals with high intolerance of uncertainty, the viral threat was accompanied by high levels of anxiety, which was mediated by aspects of appraisals, particularly control and stressfulness. In addition, among those individuals who generally appraised ambiguous life events as stressful, the viral threat appraisals were accompanied by still greater anxiety. Given the limited response to potential viral threats, these results raise concerns that the public may be hesitant to heed recommendations should another pandemic occur.
Uncertainty evaluation of the thermal expansion of simulated fuel
Park, Chang Je; Kang, Kweon Ho; Song, Kee Chan [Korea Atomic Energy Research Institute, 150 Dukjin-dong, Yuseung-gu, Daejon 305-353 (Korea, Republic of)
2006-08-15
Thermal expansions of simulated fuel (SS1) were measured using a dilatometer (DIL402C) from room temperature to 1900 K. The uncertainty evaluation followed the procedure previously used for UO2 fuel. Uncertainties exist in the measurement and should be quantified based on statistics. Following the ISO (International Organization for Standardization) guide, the uncertainties of the thermal expansion are quantified in three parts: the initial length, the length variation, and the system calibration factor. Each part is divided into two types. The Type A uncertainty is derived from the statistics of repeated measurements, while the Type B uncertainty comes from non-statistical sources, including calibration and test reports. For the uncertainty evaluation, the digital calipers had been calibrated by KOLAS (Korea Laboratory Accreditation Scheme) to obtain not only the calibration values but also the Type B uncertainty. The whole system, the dilatometer (DIL402C), is composed of many complex sub-systems, and in practice it is difficult to account for the uncertainties of all sub-systems. Thus, a calibration of the system was performed with a standard material (Al2O3) provided by NETZSCH. From the above standard uncertainties, the combined standard uncertainty was calculated using the law of propagation of uncertainty. Finally, the expanded uncertainty was calculated using the effective degrees of freedom and the t-distribution for a given confidence level. The uncertainty of the thermal expansion of the simulated fuel was also compared with that of UO2 fuel. (author)
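The chain of steps in this abstract (Type A uncertainty from repeated readings, Type B terms from calibration certificates, combination by the law of propagation of uncertainty, then expansion via the Welch-Satterthwaite effective degrees of freedom and a t-distribution coverage factor) can be sketched as follows. All readings and Type B values are invented for illustration, and unit sensitivity coefficients are assumed.

```python
import numpy as np

# Hypothetical repeated length-variation readings (Type A), micrometres
readings = np.array([12.31, 12.28, 12.35, 12.30, 12.29, 12.33])
n = len(readings)
u_A = readings.std(ddof=1) / np.sqrt(n)   # standard uncertainty of the mean
nu_A = n - 1                              # degrees of freedom of the Type A term

# Type B components, e.g. from calibration certificates (assumed values)
u_calip = 0.02    # digital calipers, from a KOLAS-style certificate
u_cal_sys = 0.03  # system calibration against an Al2O3 standard
nu_B = 50         # large dof assumed for well-characterized Type B terms

# Combined standard uncertainty (law of propagation, unit sensitivity coefficients)
u_c = np.sqrt(u_A**2 + u_calip**2 + u_cal_sys**2)

# Effective degrees of freedom (Welch-Satterthwaite formula)
nu_eff = u_c**4 / (u_A**4 / nu_A + u_calip**4 / nu_B + u_cal_sys**4 / nu_B)

# Coverage factor from an abbreviated 95% two-sided t-table
t95 = {1: 12.71, 2: 4.30, 3: 3.18, 4: 2.78, 5: 2.57, 10: 2.23, 20: 2.09, 50: 2.01}
k = t95[max(nu for nu in t95 if nu <= nu_eff)]
U = k * u_c  # expanded uncertainty at ~95% confidence
print(f"u_c = {u_c:.3f}, nu_eff = {nu_eff:.0f}, k = {k}, U = {U:.3f}")
```

The same pattern repeats for each of the three measured parts (initial length, length variation, calibration factor); their combined uncertainties then propagate into the thermal expansion itself.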
The Precautionary Principle and Statistical Approaches to Uncertainty
Keiding, Niels; Budtz-Jørgensen, Esben
2005-01-01
Bayesian model averaging; benchmark approach to safety standards in toxicology; dose-response relationships; environmental standards; exposure measurement uncertainty; Popper falsification.
Nonlinear Schrödinger equation from generalized exact uncertainty principle
Rudnicki, Łukasz
2016-09-01
Inspired by the generalized uncertainty principle, which adds gravitational effects to the standard description of quantum uncertainty, we extend the exact uncertainty principle approach of Hall and Reginatto (2002 J. Phys. A: Math. Gen. 35 3289) and obtain a (quasi)nonlinear Schrödinger equation. This quantum evolution equation of unusual form enjoys several desirable properties, such as separation of non-interacting subsystems and plane-wave solutions for free particles. Starting with the harmonic oscillator example, we show that every solution of this equation respects the gravitationally induced minimal position uncertainty proportional to the Planck length. Quite surprisingly, our result successfully merges the core of classical physics with non-relativistic quantum mechanics in its extremal form. We predict that the commonly accepted phenomenon, namely a modification of the free-particle dispersion relation due to quantum gravity, might not occur in reality.
New Generation of Parton Distributions with Uncertainties from Global QCD Analysis
Pumplin, Jon; Huston, J; Lai, H L; Nadolsky, P M; Tung, W K
2002-01-01
A new generation of parton distribution functions with increased precision and quantitative estimates of uncertainties is presented. This work significantly extends previous CTEQ and other global analyses on two fronts: (i) a full treatment of available experimental correlated systematic errors for both new and old data sets; (ii) a systematic and pragmatic treatment of uncertainties of the parton distributions and their physical predictions, using a recently developed eigenvector-basis approach to the Hessian method. The new gluon distribution is considerably harder than that of previous standard fits. A number of physics issues, particularly relating to the behavior of the gluon distribution, are addressed in more quantitative terms than before. Extensive results on the uncertainties of parton distributions at various scales, and on parton luminosity functions at the Tevatron Run II and the LHC, are presented. The latter provide the means to quickly estimate the uncertainties of a wide range of physical processes.
Fonseca Diaz, Nestor [Universidad Tecnologica de Pereira, Facultad de Ingenieria Mecanica, Pereira (Colombia); University of Liege, Campus du Sart Tilman, Bat: B49, P33, B-4000 Liege (Belgium)
2009-09-15
This article presents a general procedure for calculating the uncertainty of the net total cooling effect estimated when rating room air conditioners and packaged terminal air conditioners, based on measurements carried out in a test bench specially designed for this purpose. The uncertainty analysis presented in this work seeks to establish a degree of confidence in the experimental results. This is particularly important considering that international standards related to this type of analysis are ambiguous on the subject. Uncertainty analysis is, moreover, an indispensable requirement of international standard ISO 17025 [ISO, 2005. International Standard 17025. General Requirements for the Competence of Testing and Calibration Laboratories. International Organization for Standardization, Geneva.], which must be applied to obtain the quality levels required by the World Trade Organization (WTO). (author)
Crothers S. J.
2005-10-01
Full Text Available Relativistic motion in the gravitational field of a massive body is governed by the external metric of a spherically symmetric extended object. Consequently, any solution for the point-mass is inadequate for the treatment of such motions since it pertains to a fictitious object. I therefore develop herein the physics of the standard tests of General Relativity by means of the generalised solution for the field external to a sphere of incompressible homogeneous fluid.
Lorentz invariance violation and generalized uncertainty principle
Tawfik, Abdel Nasser; Magdy, H.; Ali, A. Farag
2016-01-01
There are several theoretical indications that quantum gravity approaches may predict a minimal measurable length and a maximal observable momentum, and hence a generalization of the Heisenberg uncertainty principle. The generalized uncertainty principle (GUP) is based on a momentum-dependent modification of the standard dispersion relation which is conjectured to violate the principle of Lorentz invariance. From the resulting Hamiltonian, the velocity and time of flight of relativistic distant particles at Planck energy can be derived. A first comparison is made with recent observations of the Hubble parameter in dependence on redshift in early-type galaxies. We find that Lorentz invariance violation (LIV) makes two types of contributions to the time-of-flight delay Δt comparable with those observations. Although the OPERA measurement of a faster-than-light muon-neutrino anomaly turned out to be wrong, we utilize its main features to estimate the relative change Δv in the speed of the muon neutrino in dependence on redshift z. Accordingly, the results could not be interpreted as LIV. A third comparison is made with ultra-high-energy cosmic rays (UHECR). An essential ingredient is found to be the approach combining string theory, loop quantum gravity, black hole physics and doubly special relativity, together with the one assuming a perturbative departure from exact Lorentz invariance. Fixing the sensitivity factor and its energy dependence are essential inputs for reliably confronting our calculations with UHECR. The sensitivity factor is related to the special time-of-flight delay and the time structure of the signal. Furthermore, the upper and lower bounds on the parameter a that characterizes the generalized uncertainty principle have to be fixed in related physical systems such as gamma-ray bursts.
The Relations between Teasing and Bullying and Middle School Standardized Exam Performance
Lacey, Anna; Cornell, Dewey; Konold, Timothy
2017-01-01
This study examined the relations between the schoolwide prevalence of teasing and bullying (PTB) and schoolwide academic performance in a sample of 271 Virginia middle schools. In addition, the study examined the mediating effects of student engagement. A three-step sequence of path models investigated associations between schoolwide PTB and…
2010-10-01
... example, states that the audit of the report required by Rule 17a-5(d) ``* * * shall be made in accordance... Exchange Act requires that the audit of certain over-the-counter derivative dealers ``* * * shall be made... Commission is considering a rulemaking project to update the audit and related attestation requirements under...
2012-03-23
... individuals and small businesses the same purchasing power as big businesses. The Departments of Health and... part of letter campaigns related to women's and mental health services, or were general comments on the... make fair enrollee risk comparison between excepted benefit plans and major medical plans difficult. We...
Ammon, G.; Schoenfelder, C.
2014-07-01
In recent years AREVA has undertaken several measures to enhance the effectiveness of safety I&C-related verification and validation activities within nuclear power plant (NPP) new-build as well as modernization projects, thereby further strengthening its commitment to achieving the highest level of safety in nuclear facilities. (Author)
Hildebrandt, V.H.
2001-01-01
Work-related musculoskeletal disorders constitute a major problem for modern society. They are a major cause of work absenteeism and disability, and thus one of the most expensive disease categories. There is a great need for effective ways to prevent or reduce musculoskeletal problems.
Sin, Gürkan; Gernaey, Krist V; Lantz, Anna Eliasson
2009-01-01
Uncertainty and sensitivity analysis are evaluated for their usefulness as part of model-building within Process Analytical Technology (PAT) applications. A mechanistic model describing a batch cultivation of Streptomyces coelicolor for antibiotic production was used as a case study. The input uncertainty resulting from assumptions of the model was propagated using the Monte Carlo procedure to estimate the output uncertainty. The results showed that significant uncertainty exists in the model outputs. Moreover, the uncertainty in the biomass, glucose, ammonium and base-consumption predictions was found to be low compared to the large uncertainty observed in the antibiotic and off-gas CO2 predictions. The output uncertainty was observed to be lower during the exponential growth phase and higher in the stationary and death phases, meaning the model describes some periods better than others. To understand which input parameters are responsible for the output uncertainty, three sensitivity methods (Standardized Regression Coefficients, Morris screening and differential analysis) were evaluated and compared. The results from these methods were mostly in agreement with each other and revealed that only a few parameters (about 10) out of a total of 56 were mainly responsible for the output uncertainty. Among these significant parameters are parameters related to fermentation characteristics such as biomass metabolism, chemical equilibria and mass transfer. Overall, uncertainty and sensitivity analysis are found promising for helping to build reliable mechanistic models and to interpret model outputs properly. These tools are part of good modeling practice, which can contribute to successful PAT applications for increased process understanding, operation and control purposes.
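Monte Carlo propagation combined with Standardized Regression Coefficients (SRC), the first of the three sensitivity methods compared above, can be illustrated on a toy batch-fermentation model. The parameter names, ranges, and the output expression are invented stand-ins, not the actual S. coelicolor model.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5000  # Monte Carlo samples

# Hypothetical uncertain model parameters (illustrative ranges)
mu_max = rng.uniform(0.08, 0.12, N)    # max specific growth rate, 1/h
Y_xs   = rng.uniform(0.4, 0.6, N)      # biomass yield on glucose
k_prod = rng.uniform(0.005, 0.015, N)  # antibiotic production rate constant

# Toy output: final antibiotic titre after a fixed batch time
t_batch, S0 = 48.0, 20.0
X_final = Y_xs * S0 * (1 - np.exp(-mu_max * t_batch))
titre = k_prod * X_final * t_batch

# Standardized Regression Coefficients: regress the standardized output on
# standardized inputs; the coefficients rank each input's contribution.
Z = np.column_stack([mu_max, Y_xs, k_prod])
Zs = (Z - Z.mean(axis=0)) / Z.std(axis=0)
ys = (titre - titre.mean()) / titre.std()
src, *_ = np.linalg.lstsq(Zs, ys, rcond=None)
for name, s in zip(["mu_max", "Y_xs", "k_prod"], src):
    print(f"SRC({name}) = {s:+.3f}")
```

When the model is close to linear in its inputs, the squared SRCs approximately partition the output variance; here k_prod dominates, mirroring the paper's finding that a handful of parameters drive most of the output uncertainty.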
Van Nooyen, R.R.P.; Hrachowitz, M.; Kolechkina, A.G.
2014-01-01
Even without uncertainty about the model structure or parameters, the output of a hydrological model run still contains several sources of uncertainty. These are: measurement errors affecting the input, and the transition from continuous time and space to discrete time and space, which causes loss of information.
Capel, H.W.; Cramer, J.S.; Estevez-Uscanga, O.
1995-01-01
'Uncertainty and chance' is a subject with a broad span, in that there is no academic discipline or walk of life that is not beset by uncertainty and chance. In this book a range of approaches is represented by authors from varied disciplines: natural sciences, mathematics, social sciences and medicine.
Guide for Uncertainty Communication
Wardekker, J.A.|info:eu-repo/dai/nl/306644398; Kloprogge, P.|info:eu-repo/dai/nl/306644312; Petersen, A.C.; Janssen, P.H.M.; van der Sluijs, J.P.|info:eu-repo/dai/nl/073427489
2013-01-01
Dealing with uncertainty, in terms of analysis and communication, is an important and distinct topic for PBL Netherlands Environmental Assessment Agency. Without paying adequate attention to the role and implications of uncertainty, research and assessment results may be of limited value.
Computing with Epistemic Uncertainty
2015-01-01
modified the input uncertainties in any way. And by avoiding the need for simulation, various assumptions and the selection of specific sampling strategies that may affect results are also avoided. In accordance with the Principle of Maximum Uncertainty, epistemic intervals represent the highest input...
Kennedy, Theodore A.
2013-01-01
Identifying areas of scientific uncertainty is a critical step in the adaptive management process (Walters, 1986; Runge, Converse, and Lyons, 2011). To identify key areas of scientific uncertainty regarding biologic resources of importance to the Glen Canyon Dam Adaptive Management Program, the Grand Canyon Monitoring and Research Center (GCMRC) convened Knowledge Assessment Workshops in May and July 2005. One of the products of these workshops was a set of strategic science questions that highlighted key areas of scientific uncertainty. These questions were intended to frame and guide the research and monitoring activities conducted by the GCMRC in subsequent years. Questions were developed collaboratively by scientists and managers. The questions were not all of equal importance or merit—some questions were large scale and others were small scale. Nevertheless, these questions were adopted and have guided the research and monitoring efforts conducted by the GCMRC since 2005. A new round of Knowledge Assessment Workshops was convened by the GCMRC in June and October 2011 and January 2012 to determine whether the research and monitoring activities conducted since 2005 had successfully answered some of the strategic science questions. Oral presentations by scientists highlighting research findings were a centerpiece of all three of the 2011–12 workshops. Each presenter was also asked to provide an answer to the strategic science questions that were specific to the presenter’s research area. One limitation of this approach is that these answers represented the views of the handful of scientists who developed the presentations, and, as such, they did not incorporate other perspectives. Thus, the answers provided by presenters at the Knowledge Assessment Workshops may not have accurately captured the sentiments of the broader group of scientists involved in research and monitoring of the Colorado River in Glen and Grand Canyons. Yet a fundamental ingredient of
Hearing loss from gun and railroad noise--relations with ISO standard 1999.
Kryter, K D
1991-12-01
Pure-tone hearing thresholds and anamnestic data pertaining to nosocusis and exposure to gun noise were analyzed for 9778 male railroad train-crew workers. A major portion of losses in hearing sensitivity due to railroad noise are obscured in comparisons of hearing levels of trainmen with the hearing levels of the unscreened samples of United States males given in Annex B, ISO 1999 [ISO 1999 (1990), "Acoustics--Determination of occupational noise exposure and estimation of noise-induced hearing impairment" (International Organization for Standardization, Geneva)]. Comparisons of the hearing levels, adjusted for nosocusis, of trainmen who had used no guns, with the hearing levels of otologically and noise screened males (Annex A, ISO 1999) reveal significant losses due to railroad noise. Additional losses were found at high frequencies in trainmen who had used guns. It appears that the effective Leq8h exposure level of trainmen to railroad noise is about 92 dBA, and 87-89 dBA to gun noise. These results are in general agreement with those of study of railway workers by Prosser et al. [Br. J. Audiol. 22, 85-91 (1988)]. Asymmetries in losses between the two ears, effects of ear protection, losses from nosocusis, and losses from sport, as compared to military, gun noise exposures, are examined.
Uncertainties in Cup Anemometer Calibrations. Type A and Type B uncertainties
Eecen, P.J. Eecen; De Noord, M. [ECN Wind Energy, Petten (Netherlands)
2005-06-01
ECN Wind Energy is accredited according to ISO 17025 to perform power performance measurements of wind turbines following IEC 61400-12 and Measnet. The typical results of these measurements are measured power curves of the wind turbines, showing electrical power versus wind speed. Part of the power performance analysis is the uncertainty analysis. In power performance measurements according to IEC 61400-12 or Measnet, the wind speed is measured using cup anemometers. The uncertainty of the wind speed measurements depends, among other factors, on the calibration uncertainty. The cup-anemometer calibration uncertainty is divided into Type A and Type B uncertainty. The Type A uncertainty is the statistical uncertainty of the wind speed measurement and can be calculated from the wind speed measurements. The Type B uncertainty includes all uncertainties that do not have a statistical background, for example uncertainties due to temperature changes, air pressure changes, transducer gain influences, digital conversion influences, etc. This document presents an overview of the Type B uncertainties of a cup anemometer calibration following the Measnet 'Cup Anemometer Calibration Procedure, Version 1, September 1997' and the proposal for IEC 61400-12 that is currently under vote. For the calculations, the numbers applying to the DEWI wind tunnel are used throughout the report. It is shown that the cup anemometer exhibits unstable behaviour in wind tunnels due to the turbulent wakes of the cups. An analysis of uncertainties due to regression in cup anemometer calibrations is presented and compared to the standard uncertainty.
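The Type A / Type B split described here is mechanical: Type A comes from the scatter of repeated wind-speed readings, Type B from certificates and instrument specifications, and the two are combined in quadrature. The sample values and Type B magnitudes below are assumed for illustration; they are not DEWI wind-tunnel figures.

```python
import statistics

# Hypothetical repeated wind-tunnel wind-speed samples at one calibration point (m/s)
samples = [10.02, 9.98, 10.05, 10.01, 9.97, 10.03, 10.00, 9.99]

mean = statistics.fmean(samples)
s = statistics.stdev(samples)      # sample standard deviation
u_A = s / len(samples) ** 0.5      # Type A: standard uncertainty of the mean

# Type B terms come from certificates/specs, not from these data (assumed values,
# already expressed as equivalent wind-speed uncertainties in m/s):
u_pressure = 0.010     # barometer specification
u_temperature = 0.008  # temperature sensor specification
u_daq = 0.005          # digital conversion

# Combined standard uncertainty: quadrature sum of Type A and Type B terms
u_c = (u_A**2 + u_pressure**2 + u_temperature**2 + u_daq**2) ** 0.5
print(f"mean = {mean:.3f} m/s, u_A = {u_A:.4f}, combined u_c = {u_c:.4f}")
```

Repeating this at each tunnel speed yields the per-point calibration uncertainties that then feed the power-curve uncertainty analysis of IEC 61400-12.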
Kowalska, Justyna D; Mocroft, Amanda; Ledergerber, Bruno;
2011-01-01
are a natural consequence of an increased awareness and knowledge in the field. To monitor and analyze changes in mortality over time, we have explored this issue within the EuroSIDA study and propose a standardized protocol unifying the data collected and allowing for classification of all deaths as AIDS- or non-AIDS-related, including events with missing cause of death. Methods: Several classifications of the underlying cause of death as AIDS- or non-AIDS-related within the EuroSIDA study were compared: central classification (CC, reference group) based on an externally standardised method (the CoDe procedures), local...
Liu Baoding [Tsinghua Univ., Beijing (China). Uncertainty Theory Lab.
2007-07-01
Uncertainty theory is a branch of mathematics based on normality, monotonicity, self-duality, and countable subadditivity axioms. The goal of uncertainty theory is to study the behavior of uncertain phenomena such as fuzziness and randomness. The main topics include probability theory, credibility theory, and chance theory. For this new edition the entire text has been totally rewritten. More importantly, the chapters on chance theory and uncertainty theory are completely new. This book provides a self-contained, comprehensive and up-to-date presentation of uncertainty theory. The purpose is to equip the readers with an axiomatic approach to deal with uncertainty. Mathematicians, researchers, engineers, designers, and students in the field of mathematics, information science, operations research, industrial engineering, computer science, artificial intelligence, and management science will find this work a stimulating and useful reference. (orig.)
Economic uncertainty and econophysics
Schinckus, Christophe
2009-10-01
The objective of this paper is to provide a methodological link between econophysics and economics. I will study a key notion of both fields, uncertainty, and the ways of thinking about it developed by the two disciplines. After presenting the main economic theories of uncertainty (provided by Knight, Keynes and Hayek), I show how this notion is paradoxically excluded from the economic field. In economics, uncertainty is totally reduced by an a priori Gaussian framework, in contrast to econophysics, which does not use a priori models because it works directly on data. Uncertainty is then not shaped by a specific model, and is partially and temporally reduced as models improve. This way of thinking about uncertainty has echoes in the economic literature. By presenting econophysics as a Knightian method, and a complementary approach to a Hayekian framework, this paper shows that econophysics can be methodologically justified from an economic point of view.
Physical Uncertainty Bounds (PUB)
Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-03-19
This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.
Muench, Fred
2016-01-01
In recent years, the number of available eHealth interventions aimed at treating behavioral and mental health challenges has been growing. From the perspective of health care providers, there is a need for eHealth interventions to be evaluated prior to clinical trials and for the limited resources allocated to empirical research to be invested in the most promising products. Following a literature review, a gap was found in the availability of eHealth interventions evaluation principles related to the patient experience of the therapeutic process. This paper introduces principles and concepts for the evaluation of eHealth interventions developed as a first step in a process to outline general evaluation guidelines that relate to the clinical context from health care providers’ perspective. Our approach was to conduct a review of literature that relates to the examination of eHealth interventions. We identified the literature that was most relevant to our study and used it to define guidelines that relate to the clinical context. We then compiled a list of heuristics we found to be useful for the evaluation of eHealth intervention products’ suitability for empirical examination. Four heuristics were identified with respect to the therapeutic process: (1) the product’s ease of use (ie, usability), (2) the eHealth intervention’s compatibility with the clinical setting, (3) the presence of tools that make it easier for the user to engage in therapeutic activities, and (4) the provision of a feasible therapeutic pathway to growth. We then used this set of heuristics to conduct a detailed examination of MyFitnessPal. This line of work could help to set the bar higher for product developers and to inform health care providers about preferred eHealth intervention designs. PMID:26764209
Minimum uncertainty and squeezing in diffusion processes and stochastic quantization
Demartino, S.; Desiena, S.; Illuminati, Fabrizo; Vitiello, Giuseppe
1994-01-01
We show that uncertainty relations, as well as minimum uncertainty coherent and squeezed states, are structural properties for diffusion processes. Through Nelson stochastic quantization we derive the stochastic image of the quantum mechanical coherent and squeezed states.
Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.
2008-01-01
This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of structural parameters. The probability distributions of structural parameters were further updated through the Bayesian approach with the Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.
Sciacchitano, Andrea; Wieneke, Bernhard
2016-08-01
This paper discusses the propagation of the instantaneous uncertainty of PIV measurements to statistical and instantaneous quantities of interest derived from the velocity field. The expression of the uncertainty of vorticity, velocity divergence, mean value and Reynolds stresses is derived. It is shown that the uncertainty of vorticity and velocity divergence requires the knowledge of the spatial correlation between the error of the x and y particle image displacement, which depends upon the measurement spatial resolution. The uncertainty of statistical quantities is often dominated by the random uncertainty due to the finite sample size and decreases with the square root of the effective number of independent samples. Monte Carlo simulations are conducted to assess the accuracy of the uncertainty propagation formulae. Furthermore, three experimental assessments are carried out. In the first experiment, a turntable is used to simulate a rigid rotation flow field. The estimated uncertainty of the vorticity is compared with the actual vorticity error root-mean-square, with differences between the two quantities within 5-10% for different interrogation window sizes and overlap factors. A turbulent jet flow is investigated in the second experimental assessment. The reference velocity, which is used to compute the reference value of the instantaneous flow properties of interest, is obtained with an auxiliary PIV system, which features a higher dynamic range than the measurement system. Finally, the uncertainty quantification of statistical quantities is assessed via PIV measurements in a cavity flow. The comparison between estimated uncertainty and actual error demonstrates the accuracy of the proposed uncertainty propagation methodology.
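The key point about statistical quantities, that the random uncertainty of a mean decreases with the square root of the effective number of independent samples, can be demonstrated on a synthetic correlated series. The AR(1) model and the N_eff = N(1-ρ)/(1+ρ) approximation below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic correlated velocity time series (AR(1)), standing in for PIV samples
N, rho, sigma = 4000, 0.6, 0.5
u = np.empty(N)
u[0] = rng.normal(0, sigma)
for i in range(1, N):
    # innovation variance chosen so the stationary standard deviation stays sigma
    u[i] = rho * u[i - 1] + rng.normal(0, sigma * np.sqrt(1 - rho**2))

# Naive uncertainty of the mean, assuming all samples are independent
u_naive = u.std(ddof=1) / np.sqrt(N)

# With sample-to-sample correlation r, the effective number of independent
# samples is reduced; for an AR(1) process: N_eff = N * (1 - r) / (1 + r)
r = np.corrcoef(u[:-1], u[1:])[0, 1]
N_eff = N * (1 - r) / (1 + r)
u_mean = u.std(ddof=1) / np.sqrt(N_eff)
print(f"N_eff = {N_eff:.0f}, naive u = {u_naive:.4f}, corrected u = {u_mean:.4f}")
```

Ignoring the correlation (the naive estimate) understates the uncertainty of the mean; the same correction applies to Reynolds-stress estimates built from correlated PIV snapshots.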
The importance of expression of uncertainty of acoustical parameters of ultrasonic phantoms
Maggi, L E; Souza, A B B; Ichinose, R M; Pereira, W C A; Kruger, M A von [Programa de Engenharia Biomedica/COPPE - UFRJ, Rio de Janeiro (Brazil); Costa-Felix, R P B, E-mail: luis.maggi@gmail.com [Ultrasound Laboratory, Diavi/Dimci/Inmetro, Duque de Caxias, RJ (Brazil)
2011-02-01
The measurement of uncertainties in scientific experiments greatly improves the quality and reliability of the results. However, in many cases experimental results are expressed only by their average value and standard deviation. The longitudinal velocity and attenuation coefficient are acoustic parameters commonly used to characterize biological tissues and materials. In this work the uncertainty is studied in experiments designed to evaluate these parameters in two different materials (silicone rubber and PVCP). The uncertainties were evaluated following the Guide to the Expression of Uncertainty in Measurement and calculated by a program written in LabVIEW 8.6. A setup was developed to measure the acoustic parameters by a transmission/reception technique. Five signals from each medium (water and the materials) were collected. The attenuation coefficient was calculated from the ratio between the amplitude spectrum peak of the water signal and the corresponding point in the spectrum of the material signal. The longitudinal velocity was calculated from the time delay between the signal peaks (from water and from the material). The individual uncertainties of each part of the setup were estimated, and these values made it possible to identify which sources of uncertainty contributed most to the associated uncertainty. This made it possible to improve the experiment's quality and reliability.
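The two parameter calculations described above, velocity from the time delay and attenuation from the spectral amplitude ratio in a water-substitution setup, reduce to a few lines. The sample thickness, times of flight, and amplitudes below are invented illustrative values, not data from this experiment.

```python
import math

# Hypothetical through-transmission (substitution) measurement
d = 0.010            # sample thickness, m
c_water = 1480.0     # speed of sound in water, m/s
t_water = 40.00e-6   # time of flight through the water path alone, s
t_sample = 38.50e-6  # time of flight with the sample inserted, s

# Longitudinal velocity from the time delay between the two signal peaks:
# the sample replaces a water path of equal thickness d
dt = t_water - t_sample
c_sample = d / (d / c_water - dt)
print(f"c = {c_sample:.0f} m/s")

# Attenuation coefficient from the spectral amplitude ratio at one frequency
A_water, A_sample = 1.00, 0.45   # assumed spectral peak amplitudes
alpha = (20.0 / (d * 100)) * math.log10(A_water / A_sample)  # dB/cm
print(f"alpha = {alpha:.2f} dB/cm")
```

In the GUM framework, each input here (d, the two times of flight, the two amplitudes) carries its own standard uncertainty, and these propagate through the two formulas to give the uncertainties of c and alpha.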
Do Orthopaedic Surgeons Acknowledge Uncertainty?
Teunis, Teun; Janssen, Stein; Guitton, Thierry G; Ring, David; Parisien, Robert
2016-06-01
of experience. Two hundred forty-two (34%) members completed the survey. We found no differences between responders and nonresponders. Each survey item measured its own trait better than any of the other traits. Recognition of uncertainty (0.70) and confidence bias (0.75) had relatively high Cronbach alpha levels, meaning that the questions making up these traits are closely related and probably measure the same construct. This was lower for statistical understanding (0.48) and trust in the orthopaedic evidence base (0.37). Subsequently, combining each trait's individual questions, we calculated a 0 to 10 score for each trait. The mean recognition of uncertainty score was 3.2 ± 1.4. Recognition of uncertainty in daily practice did not vary by years in practice (0-5 years, 3.2 ± 1.3; 6-10 years, 2.9 ± 1.3; 11-20 years, 3.2 ± 1.4; 21-30 years, 3.3 ± 1.6 years; p = 0.51), but overconfidence bias did correlate with years in practice (0-5 years, 6.2 ± 1.4; 6-10 years, 7.1 ± 1.3; 11-20 years, 7.4 ± 1.4; 21-30 years, 7.1 ± 1.2 years; p coefficient, -0.53; 95% confidence interval [CI], -1.0 to -0.055; partial R(2), 0.021; p = 0.029), belief in God or any other deity/deities (β, -0.57; 95% CI, -1.0 to -0.11; partial R(2), 0.026; p = 0.015), greater confidence bias (β, -0.26; 95% CI, -0.37 to -0.14; partial R(2), 0.084; p definitive evidence. If patients want to be informed of the areas of uncertainty and surgeon-to-surgeon variation relevant to their care, it seems possible that a low recognition of uncertainty and surgeon confidence bias might hinder adequately informing patients, informed decisions, and consent. Moreover, limited recognition of uncertainty is associated with modifiable factors such as confidence bias, trust in orthopaedic evidence base, and statistical understanding. Perhaps improved statistical teaching in residency, journal clubs to improve the critique of evidence and awareness of bias, and acknowledgment of knowledge gaps at courses and
Lidar arc scan uncertainty reduction through scanning geometry optimization
H. Wang
2015-10-01
Doppler lidars are frequently operated in a mode referred to as arc scans, wherein the lidar beam scans across a sector with a fixed elevation angle and the resulting measurements are used to derive an estimate of the n-minute horizontal mean wind velocity (speed and direction). Previous studies have shown that the uncertainty in the measured wind speed originates from turbulent wind fluctuations and depends on the scan geometry (the arc span and the arc orientation). This paper is designed to provide guidance on optimal scan geometries for two key applications in the wind energy industry: wind turbine power performance analysis and annual energy production estimation. We present a quantitative analysis of the retrieved wind speed uncertainty derived using a theoretical model with the assumption of isotropic and frozen turbulence, and observations from three sites that are onshore with flat terrain, onshore with complex terrain and offshore, respectively. The results from both the theoretical model and the observations show that the uncertainty scales with the turbulence intensity such that the relative standard error on the 10 min mean wind speed is about 30 % of the turbulence intensity. The uncertainty in both retrieved wind speeds and derived wind energy production estimates can be reduced by aligning lidar beams with the dominant wind direction, increasing the arc span and lowering the number of beams per arc scan. Large arc spans should be used at sites with high turbulence intensity and/or large wind direction variation when arc scans are used for wind resource assessment.
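The reported scaling, a relative standard error of roughly 30 % of the turbulence intensity for the 10 min mean wind speed, can be expressed as a one-line estimate (the 0.3 factor is the paper's site-averaged result and will vary with scan geometry):

```python
def wind_speed_std_error(mean_speed, turbulence_intensity, factor=0.3):
    """Absolute standard error (m/s) of the retrieved 10 min mean wind speed.

    factor=0.3 is the site-averaged scaling reported above; it depends on
    the arc span, arc orientation and number of beams per scan.
    """
    return factor * turbulence_intensity * mean_speed

print(wind_speed_std_error(8.0, 0.10))  # ~0.24 m/s at 8 m/s mean and 10 % TI
```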
Assessing measurement uncertainty in meteorology in urban environments
Curci, S.; Lavecchia, C.; Frustaci, G.; Paolini, R.; Pilati, S.; Paganelli, C.
2017-10-01
Measurement uncertainty in meteorology has been addressed in a number of recent projects. In urban environments, uncertainty is also affected by local effects which are more difficult to deal with than for synoptic stations. In Italy, beginning in 2010, an urban meteorological network (Climate Network®) was designed, set up and managed at national level according to high metrological standards and homogeneity criteria to support energy applications. The availability of such a high-quality operative automatic weather station network represents an opportunity to investigate the effects of station siting and sensor exposure and to estimate the related measurement uncertainty. An extended metadata set was established for the stations in Milan, including siting and exposure details. Statistical analysis on an almost 3-year-long operational period assessed network homogeneity, quality and reliability. Deviations from reference mean values were then evaluated in selected low-gradient local weather situations in order to investigate siting and exposure effects. In this paper the methodology is depicted and preliminary results of its application to air temperature discussed; this allowed the setting of an upper limit of 1 °C for the added measurement uncertainty at the top of the urban canopy layer.
Gillund, Frøydis; Kjølberg, Kamilla A; von Krauss, Martin Krayer; Myhr, Anne I
2008-12-15
The Walker and Harremoës (W&H) uncertainty framework is a tool to systematically identify scientific uncertainty. We applied the W&H uncertainty framework to elicit scientists' judgements of potential sources of uncertainty associated with the use of DNA vaccination in aquaculture. DNA vaccination is considered a promising solution to combat pathological fish diseases. There is, however, a lack of knowledge regarding its ecological and social implications. Our findings indicate that scientists are open about and aware of a number of uncertainties associated with DNA vaccination, e.g. with regard to immune response and the degradation and distribution of the DNA plasmid after injection and environmental release, and that they consider most of these uncertainties to be adequately reduced through more research. We then discuss our experience of using the W&H uncertainty framework. Some challenges related to the application of the framework were recognised, especially the respondents' unfamiliarity with the concepts used and their lack of experience in discussing qualitative aspects of uncertainties. As we see it, the W&H framework should be considered a useful tool to stimulate reflection on uncertainty and an important first step in a more extensive process of including and properly dealing with uncertainties in science and policymaking.
Translation of questionnaires measuring health related quality of life is not standardized
Danielsen, Anne Kjaergaard; Pommergaard, Hans-Christian; Burcharth, Jakob
2015-01-01
INTRODUCTION: There is growing awareness of the need to explore patient-reported outcomes in clinical trials. In the Scandinavian Surgical Outcomes Research Group we are conducting several clinical trials in cooperation between Danish and Swedish surgical researchers, and we use questionnaires aimed at patients from both countries. In relation to this and similar international cooperation, the validity and reliability of translated questionnaires are central aspects. MAIN OBJECTIVES: The purpose of this study was to explore which methodological measures were used in studies reporting translation of questionnaires. Furthermore, we wanted to make some methodological suggestions for clinical researchers who are faced with having to translate a questionnaire. MATERIAL AND METHODS: We designed a research study based on a survey of the literature and extracted data from published studies...
A note on the mutual relation between thermodynamics, energy definitions and standard cosmology
Moradpour, H
2016-01-01
In this paper, by solving the Friedman and thermodynamic pressure equations simultaneously, we investigate the relation between them. Our investigation shows that a perfect fluid, as a suitable solution of the Friedman equations, cannot simultaneously satisfy the thermodynamic pressure equation and the Friedman equations. Moreover, we consider various common energy definitions, such as the Komar mass, and try to simultaneously solve the Friedman and thermodynamic pressure equations for various fluids. Our investigation shows that the Komar mass leads to a solution which may unify the primary inflationary and the current accelerating eras into one model. Thereafter, we consider a general form for the energy of the cosmic fluid and combine it with the Friedman and thermodynamic pressure equations to obtain some new solutions. The cosmological consequences of the found solutions are also addressed. In addition, by taking into account a cosmic fluid with the known equation of stat...
International Organization for Standardization. Geneva
1994-01-01
Information technology - Telecommunications and information exchange between systems - Elements of management information related to OSI transport layer standards; Amendment 1: NCMS Management; Technical Corrigendum 1
Effects of ethanol on CYP2E1 levels and related oxidative stress using a standard balanced diet.
Azzalis, Ligia A; Fonseca, Fernando L A; Simon, Karin A; Schindler, Fernanda; Giavarotti, Leandro; Monteiro, Hugo P; Videla, Luis A; Junqueira, Virgínia B C
2012-07-01
Expression of cytochrome P4502E1 (CYP2E1) is very much influenced by nutritional factors, especially carbohydrate consumption, and various results concerning the expression of CYP2E1 were obtained with a low-carbohydrate diet. This study describes the effects of ethanol treatment on CYP2E1 levels and its relationship with oxidative stress using a balanced standard diet to avoid low or high carbohydrate consumption. Rats were fed for 1, 2, 3, or 4 weeks a commercial diet plus an ethanol-sucrose solution. The results have shown that ethanol administration was associated with CYP2E1 induction and stabilization without related oxidative stress. Our findings suggest that experimental models with a low-carbohydrate/high-fat diet produce some undesirable CYP2E1 changes that are not present when a balanced standard diet is given.
Anthias, Chloe; O'Donnell, Paul V; Kiefer, Deidre M; Yared, Jean; Norkin, Maxim; Anderlini, Paolo; Savani, Bipin N; Diaz, Miguel A; Bitan, Menachem; Halter, Joerg P; Logan, Brent R; Switzer, Galen E; Pulsipher, Michael A; Confer, Dennis L; Shaw, Bronwen E
2016-01-01
Previous studies have identified healthcare practices that may place undue pressure on related donors (RDs) of hematopoietic cell products, and an increase in serious adverse events associated with morbidities in this population. As a result, specific requirements to safeguard RD health have been introduced to FACT-JACIE Standards, but the impact of accreditation on RD care has not previously been evaluated. A survey of transplant program directors of EBMT member centers was conducted by the Donor Health and Safety Working Committee of the Center for International Blood and Marrow Transplant Research (CIBMTR) to test the hypothesis that RD care in FACT-JACIE accredited centers is more closely aligned with international consensus donor care recommendations than RD care delivered in centers without accreditation. Responses were received from 39% of 304 centers. Our results show that practice in accredited centers was much closer to recommended standards as compared to non-accredited centers. Specifically, a higher percentage of accredited centers use eligibility criteria to assess RDs (93% versus 78%; P=0.02) and a lower percentage have a single physician simultaneously responsible for a RD and their recipient (14% versus 35%; P=0.008). In contrast, where regulatory standards do not exist, both accredited and non-accredited centers fell short of accepted best practice. These results raise concerns that despite improvements in care, current practice can place undue pressure on donors, and may increase the risk of donation-associated adverse events. We recommend measures to address these issues through enhancement of regulatory standards as well as national initiatives to standardize RD care. PMID:26597079
Hill, R.F., E-mail: robin.hill@email.cs.nsw.gov.a [Institute of Medical Physics, School of Physics, University of Sydney, Sydney NSW 2006 (Australia); Department of Radiation Oncology, Royal Prince Alfred Hospital, Camperdown NSW 2050 (Australia); Tofts, P.S. [Brighton and Sussex Medical School, Brighton BN1 9RR (United Kingdom); Institute of Neurology, University College London, London WC1N 3BG (United Kingdom); Baldock, C. [Institute of Medical Physics, School of Physics, University of Sydney, Sydney NSW 2006 (Australia)
2010-08-15
Bland-Altman analysis is used to compare two different methods of measurement and to determine whether a new method of measurement may replace an existing accepted 'gold standard' method. In this work, Bland-Altman analysis has been applied to radiation dosimetry to compare the PTW Markus and Roos parallel plate ionisation chambers and a PTW PinPoint chamber against a Farmer type ionisation chamber, which is accepted as the gold standard for radiation dosimetry in the clinic. Depth doses for low-energy x-ray beams with energies of 50, 75 and 100 kVp were measured using each of the ionisation chambers. Depth doses were also calculated by interpolation of the data in the British Journal of Radiology (BJR) Report 25. From the Bland-Altman analysis, the mean dose difference between the two parallel plate chambers and the Farmer chamber was 1% over the range of depths measured. The PinPoint chamber gave significant dose differences compared to the Farmer chamber. There were also differences of up to 12% between the BJR Report 25 depth doses and the measured data. For the Bland-Altman plots, the lines representing the limits of agreement were selected to be a particular percentage agreement, e.g. 1% or 2%, instead of being based on the standard deviation (σ) of the differences. Bland-Altman statistical analysis is a powerful tool for comparing ionisation chambers against an ionisation chamber that has been accepted as a 'gold standard'. We therefore conclude that Bland-Altman analysis does have a role in assessing radiation dosimeter performance relative to an established standard.
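A minimal Bland-Altman sketch, assuming the conventional ±1.96σ limits of agreement (the paper instead draws fixed percentage lines, e.g. 1% or 2%); the dose values below are illustrative, not measured data:

```python
import numpy as np

def bland_altman(test, reference):
    """Mean difference (bias) and 1.96-sigma limits of agreement."""
    test = np.asarray(test, dtype=float)
    reference = np.asarray(reference, dtype=float)
    diff = test - reference
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Illustrative depth-dose readings: Farmer chamber as the gold standard
# versus a parallel plate chamber at the same depths.
farmer = [100.0, 85.2, 72.6, 61.8, 52.7]
markus = [101.1, 85.9, 73.0, 62.4, 53.4]
bias, (lo, hi) = bland_altman(markus, farmer)
print(round(bias, 2), round(lo, 2), round(hi, 2))  # bias ~0.7
```

In dosimetry the differences are often expressed in percent of the reference dose before plotting, which maps directly onto the fixed-percentage agreement lines used in the paper.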
Uncertainty Analysis of Thermal Comfort Parameters
Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages
2015-08-01
International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish the measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and on tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
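Because the PPD formula of ISO 7730 depends only on PMV, an uncertainty in PMV propagates directly into PPD. The Monte Carlo run below is a simple illustration of such propagation, not the paper's exact method; the PMV value and its standard uncertainty are assumptions:

```python
import math
import random

def ppd(pmv):
    """Predicted percentage dissatisfied, ISO 7730."""
    return 100.0 - 95.0 * math.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

# Assumed measured PMV and its standard uncertainty (illustrative values)
pmv_mean, pmv_u = 0.5, 0.1

random.seed(1)
samples = [ppd(random.gauss(pmv_mean, pmv_u)) for _ in range(100_000)]
mean_ppd = sum(samples) / len(samples)
sd_ppd = (sum((s - mean_ppd) ** 2 for s in samples) / (len(samples) - 1)) ** 0.5
print(round(ppd(pmv_mean), 1))  # ~10.2 % dissatisfied at PMV = 0.5
print(round(sd_ppd, 1))         # standard uncertainty of PPD from the Monte Carlo run
```

Note that PPD is a nonlinear function of PMV with a minimum of 5 % at PMV = 0, so the propagated distribution is skewed near thermal neutrality.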
Mika Augustyn
2016-12-01
Plum trees of 'Elena', designed for mechanical harvesting with a straddle self-propelled harvester, were planted in 2008 in the experimental orchard at Dąbrowice at distances of 4 × 1.5 and 4 × 2.0 m. The trees were trained to a central leader to a height of 2.7 m and a spread of 1.5 or 2.0 m. Plum trees designed for mechanical harvesting with a small tractor-driven harvester were spaced at 4 × 1.0 or 4 × 1.5 m and were trellised horizontally on wires stretched along the rows 0.8 m above the ground. Fruits were harvested in 2012-2015. The cumulative yield from the trellised trees was only half of that from the trees trained to a central leader, whereas the fruit load index (weight of fruits per m3 of canopy) was highest at 4 × 1.0 m. To explain this phenomenon, studies were conducted in 2015 on light relations in the two training systems. The studies revealed that light transmission has different patterns in the two training systems, but the level of light interception was nearly the same. Light distribution was more beneficial for photosynthesis in the central leader trees. The trees trained to a horizontal canopy had poor illumination at the canopy base. The main reason for the low productivity of the horizontal canopy was its low canopy volume.
Calibration procedure for a laser triangulation scanner with uncertainty evaluation
Genta, Gianfranco; Minetola, Paolo; Barbato, Giulio
2016-11-01
Most low-cost 3D scanning devices available on the market today are sold without a user calibration procedure to correct measurement errors related to changes in environmental conditions. In addition, there is no specific international standard defining a procedure to check the performance of a 3D scanner over time. This paper details a thorough methodology to calibrate a 3D scanner and assess its measurement uncertainty. The proposed procedure is based on the use of a reference ball plate and is applied to a triangulation laser scanner. Experimental results show that the metrological performance of the instrument can be greatly improved by the application of the calibration procedure, which corrects systematic errors and reduces the device's measurement uncertainty.
Generalized Uncertainty Principle and Analogue of Quantum Gravity in Optics
Braidotti, Maria Chiara; Conti, Claudio
2016-01-01
The design of optical systems capable of processing and manipulating ultra-short pulses and ultra-focused beams is highly challenging, with far-reaching fundamental and technological applications. One key obstacle routinely encountered while implementing sub-wavelength optical schemes is how to overcome the limitations set by standard Fourier optics. A strategy to overcome these difficulties is to utilize the concept of the generalized uncertainty principle (G-UP), which was originally developed to study quantum gravity. In this paper we propose to use the concept of the G-UP within the framework of optics to show that the generalized Schrödinger equation describing short pulses and ultra-focused beams predicts the existence of a minimal spatial or temporal scale, which in turn implies the existence of maximally localized states. Using a Gaussian wavepacket with complex phase, we derive the corresponding generalized uncertainty relation and its maximally localized states. We numerically show that the presence of nonlin...
To Calculate the Lifetime of the Quark-Gluon Plasma with the Uncertainty Relation
王栋; 傅永平
2013-01-01
In this paper, the uncertainty relation is used to calculate the lifetime of the quark-gluon plasma. This method avoids complicated theoretical derivation while keeping the physical meaning and the physical process very clear.
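The estimate rests on the energy-time uncertainty relation, Δt ≈ ħ/ΔE. The sketch below applies it with an illustrative QCD-scale energy spread (the specific value is an assumption, not taken from the paper):

```python
HBAR_MEV_S = 6.582e-22  # reduced Planck constant in MeV*s

def lifetime_s(delta_e_mev):
    """Order-of-magnitude lifetime from the energy-time uncertainty relation."""
    return HBAR_MEV_S / delta_e_mev

# Illustrative energy spread of a few hundred MeV (the QCD scale)
print(lifetime_s(200.0))  # ~3.3e-24 s, i.e. a few fm/c
```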
Uncertainty of dose measurement in radiation processing
Miller, A.
1996-01-01
The major standard organizations of the world have addressed the issue of reporting uncertainties in measurement reports and certificates. There is, however, still some ambiguity in the minds of many people who try to implement the recommendations in real life. This paper is a contribution to the running debate and presents the author's view, which is based upon experience in radiation processing dosimetry. The origin of all uncertainty components must be identified and can be classified according to Type A and Type B, but it is equally important to separate the uncertainty components into those that contribute to the observable uncertainty of repeated measurements and those that do not. Examples of the use of these principles are presented in the paper.
Measurement Errors and Uncertainties Theory and Practice
Rabinovich, Semyon G
2006-01-01
Measurement Errors and Uncertainties addresses the most important problems that physicists and engineers encounter when estimating errors and uncertainty. Building from the fundamentals of measurement theory, the author develops the theory of accuracy of measurements and offers a wealth of practical recommendations and examples of applications. This new edition covers a wide range of subjects, including: - Basic concepts of metrology - Measuring instruments characterization, standardization and calibration - Estimation of errors and uncertainty of single and multiple measurements - Modern probability-based methods of estimating measurement uncertainty. With this new edition, the author completes the development of the new theory of indirect measurements. This theory provides more accurate and efficient methods for processing indirect measurement data. It eliminates the need to calculate the correlation coefficient - a stumbling block in measurement data processing - and offers for the first time a way to obtain...
Uncertainty Analysis for Photovoltaic Degradation Rates (Poster)
Jordan, D.; Kurtz, S.; Hansen, C.
2014-04-01
Dependable and predictable energy production is the key to the long-term success of the PV industry. Over their lifetime of exposure, PV systems show a gradual decline in performance that depends on many different factors such as module technology, module type, mounting configuration, climate, etc. When degradation rates are determined from continuous data, the statistical uncertainty is easily calculated from the regression coefficients. However, the total uncertainty, which includes measurement uncertainty and instrumentation drift, is far more difficult to determine. A Monte Carlo simulation approach was chosen to perform a comprehensive uncertainty analysis. The most important factor for degradation rates is to avoid instrumentation that changes over time in the field. For instance, a drifting irradiance sensor, which can be avoided through regular calibration, can lead to substantially erroneous degradation rates. However, the accuracy of the irradiance sensor has negligible impact on degradation rate uncertainty, emphasizing that precision (relative accuracy) is more important than absolute accuracy.
Parameter and Uncertainty Estimation in Groundwater Modelling
Jensen, Jacob Birk
The data basis on which groundwater models are constructed is in general very incomplete, and this leads to uncertainty in model outcome. Groundwater models form the basis for many, often costly, decisions, and if these are to be made on solid grounds, the uncertainty attached to model results must be quantified. This study was motivated by the need to estimate the uncertainty involved in groundwater models. Chapter 2 presents an integrated surface/subsurface unstructured finite difference model that was developed and applied to a synthetic case study. The following two chapters concern calibration and uncertainty estimation. Essential issues relating to calibration are discussed. The classical regression methods are described; however, the main focus is on the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. The next two chapters describe case studies in which the GLUE methodology...