DEFF Research Database (Denmark)
Garde, Anne Helene; Hansen, Ase Marie; Kristiansen, Jesper
2004-01-01
When measuring biomarkers in urine, volume (and time) or concentration of creatinine are both accepted methods of standardization for diuresis. Both types of standardization contribute uncertainty to the final result. The aim of the present paper was to compare the uncertainty introduced when using the two types of standardization on 24 h samples from healthy individuals. Estimates of uncertainties were based on results from the literature, supplemented with data from our own studies. Only the difference in uncertainty related to the two standardization methods was evaluated. It was found that the uncertainty associated with creatinine standardization (19-35%) was higher than the uncertainty related to volume standardization (up to 10%, when not correcting for deviations from 24 h) for 24 h urine samples. However, volume standardization introduced an average bias of 4% due to missed volumes… …increase in convenience for the participants, when collecting small volumes rather than complete 24 h samples…
Simplified propagation of standard uncertainties
International Nuclear Information System (INIS)
Shull, A.H.
1997-01-01
An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine whether a standard is adequate for its intended use, or calculate the total uncertainty of the measurement process. Purchased standards usually carry estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software, and access to statistical expertise. As a result, the uncertainty of prepared standards is often not determined, or is determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. A system that simplifies the calculations by dividing them into subgroups of absolute and relative uncertainties is utilized. These methods also incorporate the International Organization for Standardization (ISO) concepts for combining systematic and random uncertainties as published in the Guide to the Expression of Uncertainty in Measurement. Details of the simplified methods and examples of their use are included in the paper.
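The subgroup idea can be sketched in a few lines (all numbers below are hypothetical, not from the paper): additive terms combine absolute standard uncertainties in quadrature, multiplicative factors combine relative ones, so no partial derivatives are needed.

```python
import math

def combine_absolute(*u):
    """Additive model y = x1 + x2 + ...: combine absolute uncertainties."""
    return math.sqrt(sum(ui ** 2 for ui in u))

def combine_relative(*ur):
    """Multiplicative model y = x1 * x2 / x3: combine relative uncertainties."""
    return math.sqrt(sum(ui ** 2 for ui in ur))

# Hypothetical standard prepared as y = (a + b) * c
a, u_a = 5.0, 0.10
b, u_b = 3.0, 0.05
c, u_c_rel = 2.0, 0.01                       # c known to 1 % (relative)

u_sum = combine_absolute(u_a, u_b)           # absolute subgroup first
u_rel = combine_relative(u_sum / (a + b), u_c_rel)  # then the relative subgroup
y = (a + b) * c
U = 2 * y * u_rel                            # expanded uncertainty, k = 2
```

The same two helper functions cover any model that can be split into additive and multiplicative subgroups, which is the stated assumption of the shortcut method.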
International Nuclear Information System (INIS)
Khalil, M. Y.
2006-01-01
Full text: The Instrumental Neutron Activation Analysis (INAA) Laboratory of Egypt Second Training and Research Reactor (ETRR-2) is increasingly requested to perform multi-element analysis to large number of samples from different origins. The INAA laboratory has to demonstrate competence by conforming to appropriate internationally and nationally accepted standards. The objective of this work is to determine the uncertainty budget and sensitivity of the INAA laboratory measurements. Concentrations of 9 elements; Mn, Na, K, Ca, Co, Cr, Fe, Rb, and Cs, were measured against a certified test sample. Relative, absolute, and Ko-IAEA standardization methods were employed and results compared. The flux was monitored using cadmium covered gold method, and multifoil (gold, nickel and zirconium) method. The combined and expanded uncertainties were estimated. Uncertainty of concentrations ranged between 2-21% depending on the standardization method used. The relative method, giving the lowest uncertainty, produced uncertainty budget between 2 and 11%. The minimum detectable concentration was the lowest for Cs ranging between 0.36 and 0.59 ppb and the highest being for K in the range of 0.32 to 8.64 ppm
Uncertainty relation and probability. Numerical illustration
International Nuclear Information System (INIS)
Fujikawa, Kazuo; Umetsu, Koichiro
2011-01-01
The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)
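The standard-deviation side of this connection is easy to probe numerically. The sketch below (an illustration only, not the authors' cyclic-measurement procedure) puts a minimum-uncertainty Gaussian wavepacket on a grid, with arbitrary width and grid parameters, and checks that the product of spreads approaches ħ/2, using an FFT for the momentum distribution.

```python
import numpy as np

hbar = 1.0
N = 4096
x = np.linspace(-20.0, 20.0, N, endpoint=False)
dxg = x[1] - x[0]
sigma = 1.3                                   # arbitrary packet width

# Minimum-uncertainty Gaussian state centred at x = 0 with <p> = 0
psi = (np.pi * sigma ** 2) ** (-0.25) * np.exp(-x ** 2 / (2 * sigma ** 2))

# Position spread from |psi|^2 (mean is zero by symmetry)
prob_x = np.abs(psi) ** 2
dx_std = np.sqrt(np.sum(x ** 2 * prob_x) * dxg)

# Momentum distribution via FFT; |phi|^2 is insensitive to the grid phase
phi = np.fft.fft(psi) * dxg / np.sqrt(2 * np.pi * hbar)
p = 2 * np.pi * hbar * np.fft.fftfreq(N, d=dxg)
dpg = p[1] - p[0]                             # spacing 2*pi/(N*dxg)
prob_p = np.abs(phi) ** 2
prob_p /= np.sum(prob_p) * dpg                # renormalise on the p grid
dp_std = np.sqrt(np.sum(p ** 2 * prob_p) * dpg)

product = dx_std * dp_std                     # approaches hbar/2
```

Selecting sub-ensembles of measured data, as discussed in the abstract, changes the effective distributions and can push such a sampled product below this bound at the price of a small probability.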
Additivity of entropic uncertainty relations
Directory of Open Access Journals (Sweden)
René Schwonnek
2018-03-01
Full Text Available We consider the uncertainty between two pairs of local projective measurements performed on a multipartite system. We show that the optimal bound in any linear uncertainty relation, formulated in terms of the Shannon entropy, is additive. This directly implies, against naive intuition, that the minimal entropic uncertainty can always be realized by fully separable states. Hence, in contradiction to proposals by other authors, no entanglement witness can be constructed solely by comparing the attainable uncertainties of entangled and separable states. However, our result gives rise to a huge simplification for computing global uncertainty bounds as they now can be deduced from local ones. Furthermore, we provide the natural generalization of the Maassen and Uffink inequality for linear uncertainty relations with arbitrary positive coefficients.
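For intuition about the quantities involved, the two-measurement Maassen-Uffink bound (the starting point that the paper generalizes, not its multipartite additivity result) can be verified numerically: for the mutually unbiased σ_z and σ_x bases of a qubit, H(Z) + H(X) ≥ 1 bit for every pure state.

```python
import numpy as np

def shannon_bits(p):
    p = p[p > 1e-12]                  # drop zero outcomes before the log
    return float(-np.sum(p * np.log2(p)))

z_basis = np.eye(2, dtype=complex)                                  # sigma_z
x_basis = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # sigma_x

# Maassen-Uffink bound: -2*log2(max overlap) = 1 bit for these bases
c = np.max(np.abs(z_basis.conj().T @ x_basis))
bound = -2 * np.log2(c)

rng = np.random.default_rng(0)
worst = np.inf
for _ in range(500):                  # random pure qubit states
    v = rng.normal(size=2) + 1j * rng.normal(size=2)
    v /= np.linalg.norm(v)
    pz = np.abs(z_basis.conj().T @ v) ** 2
    px = np.abs(x_basis.conj().T @ v) ** 2
    worst = min(worst, shannon_bits(pz) + shannon_bits(px))
# the sampled entropy sum never drops below the bound
```

The additivity result of the paper says that for linear combinations of such local bounds on a multipartite system, nothing is gained by entangled states: fully separable states already attain the minimum.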
Relational uncertainty in service dyads
DEFF Research Database (Denmark)
Kreye, Melanie
2017-01-01
…in service dyads and how they resolve it through suitable organisational responses to increase the level of service quality. Design/methodology/approach: We apply the overall logic of Organisational Information-Processing Theory (OIPT) and present empirical insights from two industrial case studies… Resolving the relational uncertainty increased the functional quality, while resolving the partner's organisational uncertainty increased the technical quality of the delivered service. Originality: We make two contributions. First, we introduce relational uncertainty to the OM literature as the inability to predict and explain the actions of a partnering organisation due to a lack of knowledge about their abilities and intentions. Second, we present suitable organisational responses to relational uncertainty and their effect on service quality.
STANDARDIZATION OF UNCERTAINTY SITUATIONS IN TRAINING MODULES
Directory of Open Access Journals (Sweden)
Sergey A. Safontsev
2015-01-01
Full Text Available The aim of this study is to describe the modular structure of an academic discipline in accordance with the requirements of the Federal State Educational Standards. Methods. The authors use methods of standardization of the educational system based on the theory of educational quality measurement. As the process of learning does not depend on the perspective of the diagnostician, objectification of the results was achieved by using relative units, allowing the authors to compare the effectiveness of different stages of education quality assessment with quantitative methods. Furthermore, sampling, correlation, and comparative analysis of the statistical significance of the obtained distributions were used to exclude from the analytical data those results that were not confirmed experimentally. Results. The statistical methods are presented as a complex, allowing the authors to obtain experimental results at the level of statistical significance required of psychological and pedagogical research. Following the ideas of competence-based education, the general theory of systems, and educational qualimetry, it is shown that the constructs of vocational training are problem, test, and detailed designs included in the structure of educational modules. Funds of assessment tools were used to measure the level of trainees' competences: problematic situations of uncertainty orientation for current control; situations of test and project orientation for boundary control at the end of each module; intermediate control in the form of an exam held at the end of the semester; and state certification control at the end of study at the university. Validity as the degree of interest in the learning process, reliability as the coincidence of students' constructive reflection of their own achievements with independent project performance, and efficiency as the ratio of the result obtained to the costs of implementing the target function of the educational…
Uncertainty in relative energy resolution measurements
International Nuclear Information System (INIS)
Volkovitsky, P.; Yen, J.; Cumberland, L.
2007-01-01
We suggest a new method for determining the detector relative energy resolution and its uncertainty, based on a spline approximation of experimental spectra and a statistical bootstrapping procedure. The proposed method is applied to spectra obtained with NaI(Tl) scintillating detectors and 137Cs sources. The spectrum histogram, with background subtracted channel-by-channel, is modeled by a cubic spline approximation. The relative energy resolution (also known as pulse height resolution or energy resolution), defined as the full-width at half-maximum (FWHM) divided by the value of the peak centroid, is calculated using the intercepts of the spline curve with the line at half the peak height. The peak position is determined as the point where the derivative goes to zero. The residuals, normalized by the square root of the counts in a given bin (y-coordinate), obey the standard Gaussian distribution. The values of these residuals are randomly re-assigned to a different set of y-coordinates, and a new 'pseudo-experimental' data set is obtained after 'de-normalization' of the old values. For this new data set a new spline approximation is found, and the whole procedure is repeated several hundred times, until the standard deviation of the relative energy resolution stabilizes. The standard deviation of the relative energy resolutions calculated for each 'pseudo-experimental' data set (bootstrap uncertainty) is considered an estimate of the relative energy resolution uncertainty. It is also shown that the relative bootstrap uncertainty is proportional to, and generally only two to three times bigger than, 1/√(N_tot), the relative statistical count uncertainty (N_tot is the total number of counts under the peak). The newly suggested method is also applicable to other radiation and particle detectors, not only for relative energy resolution but also for any other parameter of a measured spectrum, like…
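A condensed sketch of the resampling idea on synthetic data follows. The Gaussian photopeak, Poisson noise, and smoothing-spline settings are illustrative assumptions, not the authors' choices; the shuffle-and-denormalize step mirrors the procedure described above.

```python
import numpy as np
from scipy.interpolate import CubicSpline, UnivariateSpline

rng = np.random.default_rng(1)
ch = np.arange(100, dtype=float)                       # channel numbers
true_peak = 2000.0 * np.exp(-((ch - 50.0) ** 2) / (2 * 8.0 ** 2))
counts = rng.poisson(true_peak).astype(float)          # synthetic spectrum

def resolution(y):
    """FWHM divided by peak centroid, from a spline through the points."""
    fine = np.linspace(ch[0], ch[-1], 5000)
    v = CubicSpline(ch, y)(fine)
    imax = np.argmax(v)
    above = fine[v >= v[imax] / 2]                     # half-maximum crossings
    return (above[-1] - above[0]) / fine[imax]

# Smoothing-spline fit; weights 1/sqrt(N) make the chi^2 target ~ #points
w = 1.0 / np.sqrt(np.maximum(counts, 1.0))
fit = UnivariateSpline(ch, counts, w=w, s=len(ch))(ch)
res0 = resolution(fit)

# Bootstrap: shuffle normalized residuals, rebuild pseudo-spectra, refit
resid = (counts - fit) / np.sqrt(np.maximum(fit, 1.0))
reps = []
for _ in range(200):
    pseudo = fit + rng.permutation(resid) * np.sqrt(np.maximum(fit, 1.0))
    refit = UnivariateSpline(ch, pseudo, w=w, s=len(ch))(ch)
    reps.append(resolution(refit))
res_unc = float(np.std(reps))          # bootstrap uncertainty estimate
```

For this synthetic peak the true FWHM/centroid ratio is about 2.355 x 8 / 50 ≈ 0.38, and the bootstrap spread should come out a small multiple of the 1/√(N_tot) count-statistics floor.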
Two multi-dimensional uncertainty relations
International Nuclear Information System (INIS)
Skala, L; Kapsa, V
2008-01-01
Two multi-dimensional uncertainty relations, one related to the probability density and the other related to the probability density current, are derived and discussed. Both relations are stronger than the usual uncertainty relations for the coordinates and momenta.
Uncertainty Relations and Possible Experience
Directory of Open Access Journals (Sweden)
Gregg Jaeger
2016-06-01
Full Text Available The uncertainty principle can be understood as a condition of joint indeterminacy of classes of properties in quantum theory. The mathematical expressions most closely associated with this principle have been the uncertainty relations, various inequalities exemplified by the well-known expression regarding position and momentum introduced by Heisenberg. Here, recent work involving a new sort of “logical” indeterminacy principle and associated relations introduced by Pitowsky, expressible directly in terms of probabilities of outcomes of measurements of sharp quantum observables, is reviewed and its quantum nature is discussed. These novel relations are derivable from Boolean “conditions of possible experience” of the quantum realm and have been considered both as fundamentally logical and as fundamentally geometrical. This work focuses on the relationship of indeterminacy to the propositions regarding the values of discrete, sharp observables of quantum systems. Here, reasons for favoring each of these two positions are considered. Finally, with an eye toward future research related to indeterminacy relations, further novel approaches grounded in category theory and intended to capture and reconceptualize the complementarity characteristics of quantum propositions are discussed in relation to the former.
Quantum uncertainty relation based on the mean deviation
Sharma, Gautam; Mukhopadhyay, Chiranjib; Sazim, Sk; Pati, Arun Kumar
2018-01-01
Traditional forms of quantum uncertainty relations are invariably based on the standard deviation. This can be understood in the historical context of the simultaneous development of quantum theory and mathematical statistics. Here, we present alternative forms of uncertainty relations, in both state-dependent and state-independent forms, based on the mean deviation. We illustrate the robustness of this formulation in situations where the standard-deviation-based uncertainty relation is inapplicable…
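As a toy illustration of the quantity involved (a hypothetical example, not the authors' relations): for a qubit measured in the σ_z basis, the mean (absolute) deviation never exceeds the standard deviation, which is why mean-deviation relations need their own, separately derived bounds.

```python
import numpy as np

theta = 0.7                                  # arbitrary Bloch polar angle
state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
probs = np.abs(state) ** 2                   # probabilities of outcomes +1/-1
outcomes = np.array([1.0, -1.0])             # sigma_z eigenvalues

mean = np.sum(probs * outcomes)
std_dev = np.sqrt(np.sum(probs * (outcomes - mean) ** 2))
mean_dev = np.sum(probs * np.abs(outcomes - mean))
# mean_dev <= std_dev for any distribution (Jensen's inequality)
```

Because the mean deviation weights outliers linearly rather than quadratically, bounds built from it behave differently from the Robertson-type bounds built from variances.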
Measurement uncertainties for vacuum standards at Korea Research Institute of Standards and Science
International Nuclear Information System (INIS)
Hong, S. S.; Shin, Y. H.; Chung, K. H.
2006-01-01
The Korea Research Institute of Standards and Science has three major vacuum systems: an ultrasonic interferometer manometer (UIM) (Sec. II, Figs. 1 and 2) for low vacuum, a static expansion system (SES) (Sec. III, Figs. 3 and 4) for medium vacuum, and an orifice-type dynamic expansion system (DES) (Sec. IV, Figs. 5 and 6) for high and ultrahigh vacuum. Explicit measurement model equations with multiple variables are given for each system. According to ISO standards, all these system variable errors were used to calculate the expanded uncertainty (U). For each system the expanded uncertainties (k=1, confidence level=95%) and relative expanded uncertainties (expanded uncertainty/generated pressure) are summarized in Table IV and are estimated to be as follows. For the UIM, at 2.5-300 Pa generated pressure, the expanded uncertainty is of order 10⁻² Pa and the relative expanded uncertainty of order 10⁻²; at 1-100 kPa generated pressure, the relative expanded uncertainty is of order 10⁻⁵. For the SES, at 3-100 Pa generated pressure, the expanded uncertainty is of order 10⁻¹ Pa and the relative expanded uncertainty of order 10⁻³. For the DES, at 4.6×10⁻³-1.3×10⁻² Pa generated pressure, the expanded uncertainty is of order 10⁻⁴ Pa and the relative expanded uncertainty of order 10⁻³; at 3.0×10⁻⁶-9.0×10⁻⁴ Pa generated pressure, the expanded uncertainty is of order 10⁻⁶ Pa and the relative expanded uncertainty of order 10⁻². Within uncertainty limits, our bilateral and key comparisons [CCM.P-K4 (10 Pa-1 kPa)] are extensive and in good agreement with those of other nations (Fig. 8 and Table V).
Improvement of uncertainty relations for mixed states
International Nuclear Information System (INIS)
Park, Yong Moon
2005-01-01
We study a possible improvement of uncertainty relations. The Heisenberg uncertainty relation employs the commutator of a pair of conjugate observables to set the limit of quantum measurement of the observables. The Schrödinger uncertainty relation improves on the Heisenberg relation by adding a correlation term expressed through the anti-commutator. However, both relations are insensitive to whether the state used is pure or mixed. We improve the uncertainty relations by introducing additional terms which measure the mixedness of the state. For the momentum and position operators as conjugate observables, and for the thermal state of the quantum harmonic oscillator, it turns out that the equalities in the improved uncertainty relations hold.
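The Schrödinger strengthening mentioned here is easy to verify numerically. The sketch below (an illustration, not the paper's mixed-state improvement) checks it for σ_x and σ_y on random pure qubit states, where the relation in fact holds with equality.

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])

def ev(op, v):                       # expectation value on a pure state
    return v.conj() @ op @ v

rng = np.random.default_rng(3)
margins = []
for _ in range(200):                 # random pure qubit states
    v = rng.normal(size=2) + 1j * rng.normal(size=2)
    v /= np.linalg.norm(v)
    var_x = (ev(sx @ sx, v) - ev(sx, v) ** 2).real
    var_y = (ev(sy @ sy, v) - ev(sy, v) ** 2).real
    anti = 0.5 * ev(sx @ sy + sy @ sx, v) - ev(sx, v) * ev(sy, v)
    comm = 0.5 * ev(sx @ sy - sy @ sx, v)
    rhs = abs(anti) ** 2 + abs(comm) ** 2   # Schroedinger lower bound
    margins.append(var_x * var_y - rhs)
worst_margin = min(margins)          # zero up to rounding for pure qubits
```

For mixed states the left-hand side exceeds this bound by a gap, which is exactly the slack the paper's additional mixedness terms are designed to capture.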
Uncertainty Estimates: A New Editorial Standard
International Nuclear Information System (INIS)
Drake, Gordon W.F.
2014-01-01
Full text: The objective of achieving higher standards for uncertainty estimates in the publication of theoretical data for atoms and molecules requires a concerted effort by both the authors of papers and the editors who send them out for peer review. In April 2011, the editors of Physical Review A published an Editorial announcing a new standard that uncertainty estimates would be required whenever practicable, and in particular in the following circumstances: 1. If the authors claim high accuracy, or improvements on the accuracy of previous work. 2. If the primary motivation for the paper is to make comparisons with present or future high-precision experimental measurements. 3. If the primary motivation is to provide interpolations or extrapolations of known experimental measurements. The new policy means that papers that do not meet these standards are not sent out for peer review until they have been suitably revised, and the authors are notified immediately upon receipt. The policy has now been in effect for three years. (author)
Entropic uncertainty relations-a survey
International Nuclear Information System (INIS)
Wehner, Stephanie; Winter, Andreas
2010-01-01
Uncertainty relations play a central role in quantum mechanics. Entropic uncertainty relations in particular have gained significant importance within quantum information, providing the foundation for the security of many quantum cryptographic protocols. Yet, little is known about entropic uncertainty relations with more than two measurement settings. In the present survey, we review known results and open questions.
Uncertainty relations for approximation and estimation
Energy Technology Data Exchange (ETDEWEB)
Lee, Jaeha, E-mail: jlee@post.kek.jp [Department of Physics, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Tsutsui, Izumi, E-mail: izumi.tsutsui@kek.jp [Department of Physics, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Theory Center, Institute of Particle and Nuclear Studies, High Energy Accelerator Research Organization (KEK), 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan)
2016-05-27
We present a versatile inequality of uncertainty relations which are useful when one approximates an observable and/or estimates a physical parameter based on the measurement of another observable. It is shown that the optimal choice for proxy functions used for the approximation is given by Aharonov's weak value, which also determines the classical Fisher information in parameter estimation, turning our inequality into the genuine Cramér–Rao inequality. Since the standard form of the uncertainty relation arises as a special case of our inequality, and since the parameter estimation is available as well, our inequality can treat both the position–momentum and the time–energy relations in one framework albeit handled differently. - Highlights: • Several inequalities interpreted as uncertainty relations for approximation/estimation are derived from a single ‘versatile inequality’. • The ‘versatile inequality’ sets a limit on the approximation of an observable and/or the estimation of a parameter by another observable. • The ‘versatile inequality’ turns into an elaboration of the Robertson–Kennard (Schrödinger) inequality and the Cramér–Rao inequality. • Both the position–momentum and the time–energy relation are treated in one framework. • In every case, Aharonov's weak value arises as a key geometrical ingredient, deciding the optimal choice for the proxy functions.
Energy and Uncertainty in General Relativity
Cooperstock, F. I.; Dupre, M. J.
2018-03-01
The issue of energy and its potential localizability in general relativity has challenged physicists for more than a century. Many non-invariant measures were proposed over the years, but an invariant measure was never found. We discovered the invariant localized energy measure by expanding the domain of investigation from space to spacetime. We note from relativity that the finiteness of the velocity of propagation of interactions necessarily induces indefiniteness in measurements. This is because the elements of actual physical systems being measured, as well as their detectors, are characterized by entire four-velocity fields, which necessarily leads to information from a measured system being processed by the detector over a spread of time. General relativity adds further indefiniteness because of the variation in proper time between elements. The uncertainty is encapsulated in a generalized uncertainty principle, in parallel with that of Heisenberg, which incorporates the localized contribution of gravity to energy. This naturally leads to a generalized uncertainty principle for momentum as well. These generalized forms and the gravitational contribution to localized energy would be expected to be of particular importance in the regimes of ultra-strong gravitational fields. We contrast our invariant spacetime energy measure with the standard 3-space energy measure familiar from special relativity, explaining why general relativity demands a measure in spacetime as opposed to 3-space. We also address misconceptions of our approach by certain authors.
Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma
2009-10-01
Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty, and which of those factors have been considered in the vendor's assignment of uncertainty, is critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of the uncertainty associated with the neat-material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described, along with the uncertainty calculations.
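The combination step can be sketched in a few lines (all numbers hypothetical, for illustration only): for a concentration C = purity x mass / volume, the relative standard uncertainties of the three inputs add in quadrature, and a coverage factor expands the result.

```python
import math

# Hypothetical inputs: purity (mass fraction), weighed mass, final volume
purity, u_purity = 0.987, 0.004      # from neat-material characterization
mass_mg, u_mass = 10.00, 0.02        # balance repeatability + calibration
vol_ml, u_vol = 100.0, 0.08          # flask tolerance, density/temperature

conc = purity * mass_mg / vol_ml     # certified concentration, mg/mL
u_rel = math.sqrt((u_purity / purity) ** 2 +
                  (u_mass / mass_mg) ** 2 +
                  (u_vol / vol_ml) ** 2)
U = 2 * conc * u_rel                 # expanded uncertainty, k = 2 (~95 %)
```

With these illustrative numbers the purity term dominates, which matches the abstract's emphasis on residual water, residual solvent, and inorganic content in the neat material.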
Nonclassicality in phase-number uncertainty relations
International Nuclear Information System (INIS)
Matia-Hernando, Paloma; Luis, Alfredo
2011-01-01
We show that there are nonclassical states with lesser joint fluctuations of phase and number than any classical state. This is rather paradoxical since one would expect classical coherent states to be always of minimum uncertainty. The same result is obtained when we replace phase by a phase-dependent field quadrature. Number and phase uncertainties are assessed using variance and Holevo relation.
Optimal entropic uncertainty relation for successive measurements in quantum information theory
Indian Academy of Sciences (India)
M D SRINIVAS
… derived by Robertson in 1929 [2] from the first principles of quantum theory, does not … systems and may hence be referred to as 'uncertainty relations for distinct measurements'.
Ascertaining the uncertainty relations via quantum correlations
International Nuclear Information System (INIS)
Li, Jun-Li; Du, Kun; Qiao, Cong-Feng
2014-01-01
We propose a new scheme to express the uncertainty principle in the form of an inequality of bipartite correlation functions for a given multipartite state, which provides an experimentally feasible and model-independent way to verify various uncertainty and measurement-disturbance relations. By virtue of this scheme, experimental measurement of the measurement-disturbance relation becomes practical for a variety of physical systems. The inequality, in turn, also imposes a constraint on the strength of correlation; i.e., it determines the maximum value of the correlation function for a two-body system and a monogamy relation for the bipartite correlation functions of a multipartite system. (paper)
Majorization uncertainty relations for mixed quantum states
Puchała, Zbigniew; Rudnicki, Łukasz; Krawiec, Aleksandra; Życzkowski, Karol
2018-04-01
Majorization uncertainty relations are generalized for an arbitrary mixed quantum state ρ of a finite size N. In particular, a lower bound for the sum of two entropies characterizing the probability distributions corresponding to measurements with respect to two arbitrary orthogonal bases is derived in terms of the spectrum of ρ and the entries of a unitary matrix U relating both bases. The results obtained can also be formulated for two measurements performed on a single subsystem of a bipartite system described by a pure state, and consequently expressed as an uncertainty relation for the sum of conditional entropies.
International Nuclear Information System (INIS)
Jocelyn, Sabrina; Baudoin, James; Chinniah, Yuvin; Charpentier, Philippe
2014-01-01
In industry, machine users and people who modify or integrate equipment often have to evaluate the safety level of a safety-related control circuit that they have not necessarily designed. The modifications or integrations may involve work to make safe an existing machine that does not comply with normative or regulatory specifications. However, how can a circuit performing a safety function be validated a posteriori? Is the validation exercise feasible? What are the difficulties and limitations of such a procedure? The aim of this article is to answer these questions by presenting a validation study of a safety function of an existing machine. A plastic injection molding machine is used for this study, together with standard ISO 13849-1:2006. Validation consists of performing an a posteriori (post-design) estimation of the performance level of the safety function. The procedure is studied for two contexts of use of the machine: in industry and in the laboratory. The calculations required by the ISO standard were done using Excel and then the SISTEMA software. It is shown that, depending on the context of use, the estimated performance level was different for the same safety-related circuit. The variability in the results is explained by the assumptions made by the person undertaking the validation without the involvement of the machine designer. - Highlights: • Validation of the performance level of a safety function is undertaken. • An injection molding machine and the ISO 13849-1:2006 standard are used for the procedure. • The procedure is undertaken for two contexts of use of the machine. • In this study, the performance level depends on the context of use. • The assumptions made throughout the study partially explain this difference.
Generalized Landau-Pollak uncertainty relation
International Nuclear Information System (INIS)
Miyadera, Takayuki; Imai, Hideki
2007-01-01
The Landau-Pollak uncertainty relation treats a pair of rank-one projection-valued measures and imposes a restriction on their probability distributions. It gives a nontrivial bound for the sum of their maximum values. We give a generalization of this bound (a weak version of the Landau-Pollak uncertainty relation). Our generalization covers a pair of positive-operator-valued measures. A nontrivial but slightly weaker inequality that can treat an arbitrary number of positive-operator-valued measures is also presented. A possible application to the problem of separability criteria is also suggested.
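The "sum of maximum probabilities" form can be checked numerically in the rank-one projective case. The sketch below (an illustration of the original bound, not of the paper's POVM generalization) samples random qubit pure states against the σ_z and σ_x eigenbases, where the bound max_i p_i + max_j q_j ≤ 1 + max_{i,j} |⟨a_i|b_j⟩| follows from the Landau-Pollak relation.

```python
import numpy as np

z_basis = np.eye(2, dtype=complex)                                  # sigma_z
x_basis = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # sigma_x
c = np.max(np.abs(z_basis.conj().T @ x_basis))  # max overlap = 1/sqrt(2)

rng = np.random.default_rng(2)
worst_sum = 0.0
for _ in range(1000):                 # random pure qubit states
    v = rng.normal(size=2) + 1j * rng.normal(size=2)
    v /= np.linalg.norm(v)
    s = (np.max(np.abs(z_basis.conj().T @ v) ** 2) +
         np.max(np.abs(x_basis.conj().T @ v) ** 2))
    worst_sum = max(worst_sum, s)
# worst_sum stays below 1 + c; states 'between' the two bases saturate it
```

The bound is tight: the Bloch vector midway between the z and x axes gives max p + max q = 1 + 1/√2 exactly.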
Heisenberg's principle of uncertainty and the uncertainty relations
International Nuclear Information System (INIS)
Redei, Miklos
1987-01-01
The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous, and different interpretations are used in the literature. Recently, renewed interest has appeared in reinterpreting and reformulating the precise meaning of Heisenberg's principle and in finding an adequate mathematical form. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs.
Entanglement detection via tighter local uncertainty relations
International Nuclear Information System (INIS)
Zhang Chengjie; Zhang Yongsheng; Guo Guangcan; Nha, Hyunchul
2010-01-01
We propose an entanglement criterion based on local uncertainty relations (LURs) in a stronger form than the original LUR criterion introduced by Hofmann and Takeuchi [H. F. Hofmann and S. Takeuchi, Phys. Rev. A 68, 032103 (2003)]. Using arbitrarily chosen operators A_k and B_k of subsystems A and B, the tighter LUR criterion, which may be used not only for discrete variables but also for continuous variables, can detect more entangled states than the original criterion.
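To see how the original Hofmann-Takeuchi criterion works (the paper's tighter version is not reproduced here), take A_k = B_k = Pauli operators: every single qubit obeys Σ_k δ²(σ_k) ≥ 2, so separable two-qubit states obey Σ_k δ²(σ_k ⊗ 1 + 1 ⊗ σ_k) ≥ 4, while the singlet state gives 0 and is thereby detected as entangled.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Singlet state (|01> - |10>)/sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def variance(op, v):
    m = (v.conj() @ op @ v).real
    return (v.conj() @ op @ op @ v).real - m ** 2

lur_sum = sum(variance(np.kron(s, I2) + np.kron(I2, s), psi)
              for s in (sx, sy, sz))
# separable states satisfy lur_sum >= 4; the singlet violates it maximally
```

The violation is maximal here because the total-spin operators annihilate the singlet, so every variance in the sum vanishes.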
On uncertainty relations in quantum mechanics
International Nuclear Information System (INIS)
Ignatovich, V.K.
2004-01-01
Uncertainty relations (UR) are shown to have nothing specific to quantum mechanics (QM), being a general property valid for arbitrary functions. A wave function of a particle simultaneously having a precisely defined position and momentum in QM is demonstrated. Interference on two slits in a screen is shown to exist in classical mechanics. A nonlinear classical system of equations replacing the QM Schrödinger equation is suggested. This approach is shown to have nothing in common with Bohmian mechanics.
Uncertainties and demonstration of compliance with numerical risk standards
International Nuclear Information System (INIS)
Preyssl, C.; Cullingford, M.C.
1987-01-01
When dealing with numerical results of a probabilistic risk analysis performed for a complex system, such as a nuclear power plant, one major objective may be to deal with the problem of compliance or non-compliance with a prefixed risk standard. The uncertainties in the risk results associated with the consequences and their probabilities of occurrence may be considered by representing the risk as a risk band. Studying the area and distance between the upper and lower bound of the risk band provides consistent information on the uncertainties in terms of risk, not by means of scalars only but also by real functions. Criteria can be defined for determining compliance with a numerical risk standard, and the 'weighting functional' method, representing a possible tool for testing compliance of risk results, is introduced. By shifting the upper confidence bound due to redefinition, part of the risk band may exceed the standard without changing the underlying results. Using the concept described it is possible to determine the amount of risk, i.e. uncertainty, exceeding the standard. The mathematical treatment of uncertainties therefore allows probabilistic risk assessment results to be compared. A realistic example illustrates the method. (author)
Approaches to handling uncertainty when setting environmental exposure standards
DEFF Research Database (Denmark)
Budtz-Jørgensen, Esben; Keiding, Niels; Grandjean, Philippe
2009-01-01
… attempts for the first time to cover the full range of issues related to model uncertainties, from the subjectivity of setting up a conceptual model of a given system, all the way to communicating the nature of model uncertainties to non-scientists and accounting for model uncertainties in policy decisions. … Theoretical chapters, providing background information on specific steps in the modelling process and in the adoption of models by end-users, are complemented by illustrative case studies dealing with soils and global climate change. All the chapters are authored by recognized experts in their respective …
Stronger Schrödinger-like uncertainty relations
International Nuclear Information System (INIS)
Song, Qiu-Cheng; Qiao, Cong-Feng
2016-01-01
Highlights:
• A stronger Schrödinger-like uncertainty relation in the sum of variances of two observables is obtained.
• An improved Schrödinger-like uncertainty relation in the product of variances of two observables is obtained.
• A stronger uncertainty relation in the sum of variances of three observables is proposed.
Abstract: The uncertainty relation is one of the fundamental building blocks of quantum theory. Nevertheless, the traditional uncertainty relations do not fully capture the concept of incompatible observables. Here we present a stronger Schrödinger-like uncertainty relation, which is stronger than the relation recently derived by Maccone and Pati (2014) [11]. Furthermore, we give an additive uncertainty relation which holds for three incompatible observables and is stronger than the relation newly obtained by Kechrimparis and Weigert (2014) [12] and the simple extension of the Schrödinger uncertainty relation.
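The Maccone-Pati relation that this abstract builds on has the standard published form: for any state |ψ⊥⟩ orthogonal to the system state |ψ⟩, and with the sign chosen so that the right-hand side is positive,

```latex
\Delta A^2 + \Delta B^2 \;\ge\; \pm\, i \,\langle \psi | [A,B] | \psi \rangle
\;+\; \bigl| \langle \psi | \, A \pm i B \, | \psi^{\perp} \rangle \bigr|^2 .
```

Unlike the product-form Robertson bound, this sum-form bound is nontrivial even when |ψ⟩ is an eigenstate of one of the observables.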
Role of information theoretic uncertainty relations in quantum theory
Energy Technology Data Exchange (ETDEWEB)
Jizba, Petr, E-mail: p.jizba@fjfi.cvut.cz [FNSPE, Czech Technical University in Prague, Břehová 7, 115 19 Praha 1 (Czech Republic); ITP, Freie Universität Berlin, Arnimallee 14, D-14195 Berlin (Germany); Dunningham, Jacob A., E-mail: J.Dunningham@sussex.ac.uk [Department of Physics and Astronomy, University of Sussex, Falmer, Brighton, BN1 9QH (United Kingdom); Joo, Jaewoo, E-mail: j.joo@surrey.ac.uk [Advanced Technology Institute and Department of Physics, University of Surrey, Guildford, GU2 7XH (United Kingdom)
2015-04-15
Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.
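The Rényi-entropy relations discussed here generalize the Maassen-Uffink bound; for conjugate Rényi indices the standard discrete form reads

```latex
H_\alpha(A) + H_\beta(B) \;\ge\; -2 \ln c, \qquad
\frac{1}{\alpha} + \frac{1}{\beta} = 2, \qquad
c = \max_{i,j}\, \bigl|\langle a_i | b_j \rangle\bigr| ,
```

recovering the Shannon-entropy (α = β = 1) bound as a special case.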
Reiner, Bruce I
2018-04-01
Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in this analysis, including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate it with outcomes analysis for the purpose of context- and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
Entropy-power uncertainty relations: towards a tight inequality for all Gaussian pure states
International Nuclear Information System (INIS)
Hertz, Anaelle; Jabbour, Michael G; Cerf, Nicolas J
2017-01-01
We show that a proper expression of the uncertainty relation for a pair of canonically-conjugate continuous variables relies on entropy power, a standard notion in Shannon information theory for real-valued signals. The resulting entropy-power uncertainty relation is equivalent to the entropic formulation of the uncertainty relation due to Bialynicki-Birula and Mycielski, but can be further extended to rotated variables. Hence, based on a reasonable assumption, we give a partial proof of a tighter form of the entropy-power uncertainty relation taking correlations into account and provide extensive numerical evidence of its validity. Interestingly, it implies the generalized (rotation-invariant) Schrödinger–Robertson uncertainty relation exactly as the original entropy-power uncertainty relation implies Heisenberg relation. It is saturated for all Gaussian pure states, in contrast with hitherto known entropic formulations of the uncertainty principle. (paper)
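In Shannon information theory, the entropy power of a real-valued random variable X with differential entropy h(X) is N(X) = (2πe)^(-1) e^(2h(X)); in these terms the entropy-power uncertainty relation for a pair of conjugate quadratures takes the standard form

```latex
N(x)\, N(p) \;\ge\; \frac{\hbar^2}{4} ,
```

which is equivalent to the Bialynicki-Birula-Mycielski entropic inequality and implies the Heisenberg relation Δx Δp ≥ ħ/2, since N(X) ≤ Δ²X for any distribution.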
Workshop on Squeezed States and Uncertainty Relations
International Nuclear Information System (INIS)
Han, D.; Kim, Y.S.; Zachary, W.W.
1992-02-01
The proceedings from the workshop are presented, and the focus was on the application of squeezed states. There are many who say that the potential for industrial applications is enormous, as the history of the conventional laser suggests. All those who worked so hard to produce squeezed states of light are continuing their efforts to construct more efficient squeezed-state lasers. Quite naturally, they are looking for new experiments using these lasers. The physical basis of squeezed states is the uncertainty relation in Fock space, which is also the basis for the creation and annihilation of particles in quantum field theory. Indeed, squeezed states provide a unique opportunity for field theoreticians to develop a measurement theory for quantum field theory
Uncertainty relations and topological-band insulator transitions in 2D gapped Dirac materials
International Nuclear Information System (INIS)
Romera, E; Calixto, M
2015-01-01
Uncertainty relations are studied for a characterization of topological-band insulator transitions in 2D gapped Dirac materials isostructural with graphene. We show that the relative or Kullback–Leibler entropy in position and momentum spaces, and the standard variance-based uncertainty relation give sharp signatures of topological phase transitions in these systems. (paper)
International Nuclear Information System (INIS)
Silva, T.A. da
1988-01-01
A comparison between the uncertainty estimation methods recommended by the International Atomic Energy Agency (IAEA) and by the International Committee for Weights and Measures (CIPM) is presented, for the calibration of clinical dosimeters in a Secondary Standard Dosimetry Laboratory (SSDL). (C.G.C.) [pt
Some applications of uncertainty relations in quantum information
Majumdar, A. S.; Pramanik, T.
2016-08-01
We discuss some applications of various versions of uncertainty relations for both discrete and continuous variables in the context of quantum information theory. The Heisenberg uncertainty relation enables demonstration of the Einstein, Podolsky and Rosen (EPR) paradox. Entropic uncertainty relations (EURs) are used to reveal quantum steering for non-Gaussian continuous variable states. EURs for discrete variables are studied in the context of quantum memory where fine-graining yields the optimum lower bound of uncertainty. The fine-grained uncertainty relation is used to obtain connections between uncertainty and the nonlocality of retrieval games for bipartite and tripartite systems. The Robertson-Schrödinger (RS) uncertainty relation is applied for distinguishing pure and mixed states of discrete variables.
Advancing Uncertainty: Untangling and Discerning Related Concepts
Directory of Open Access Journals (Sweden)
Janice Penrod
2002-12-01
Methods of advancing concepts within the qualitative paradigm have been developed and articulated. In this section, I describe methodological perspectives of a project designed to advance the concept of uncertainty using multiple qualitative methods. Through a series of earlier studies, the concept of uncertainty arose repeatedly in varied contexts, working its way into prominence, and warranting further investigation. Processes of advanced concept analysis were used to initiate the formal investigation into the meaning of the concept. Through concept analysis, the concept was deconstructed to identify conceptual components and gaps in understanding. Using this skeletal framework of the concept identified through concept analysis, subsequent studies were carried out to add ‘flesh’ to the concept. First, a concept refinement using the literature as data was completed. Findings revealed that the current state of the concept of uncertainty failed to incorporate what was known of the lived experience. Therefore, using interview techniques as the primary data source, a phenomenological study of uncertainty among caregivers was conducted. Incorporating the findings of the phenomenology, the skeletal framework of the concept was further fleshed out using techniques of concept correction to produce a more mature conceptualization of uncertainty. In this section, I describe the flow of this qualitative project investigating the concept of uncertainty, with special emphasis on a particular threat to validity (called conceptual tunnel vision that was identified and addressed during the phases of concept correction. Though in this article I employ a study of uncertainty for illustration, limited substantive findings regarding uncertainty are presented to retain a clear focus on the methodological issues.
Measurement uncertainty. A practical guide for Secondary Standards Dosimetry Laboratories
International Nuclear Information System (INIS)
2008-05-01
The need for international traceability for radiation dose measurements has been understood since the early nineteen-sixties. The benefits of high dosimetric accuracy were recognized, particularly in radiotherapy, where the outcome of treatments is dependent on the radiation dose delivered to patients. When considering radiation protection dosimetry, the uncertainty may be greater than for therapy, but proper traceability of the measurements is no less important. To ensure harmonization and consistency in radiation measurements, the International Atomic Energy Agency (IAEA) and the World Health Organization (WHO) created a Network of Secondary Standards Dosimetry Laboratories (SSDLs) in 1976. An SSDL is a laboratory that has been designated by the competent national authorities to undertake the duty of providing the necessary link in the traceability chain of radiation dosimetry to the international measurement system (SI, for Systeme International) for radiation metrology users. The role of the SSDLs is crucial in providing traceable calibrations; they disseminate calibrations at specific radiation qualities appropriate for the use of radiation measuring instruments. Historically, although the first SSDLs were established mainly to provide radiotherapy level calibrations, the scope of their work has expanded over the years. Today, many SSDLs provide traceability for radiation protection measurements and diagnostic radiology in addition to radiotherapy. Some SSDLs, with the appropriate facilities and expertise, also conduct quality audits of the clinical use of the calibrated dosimeters - for example, by providing postal dosimeters for dose comparisons for medical institutions or on-site dosimetry audits with an ion chamber and other appropriate equipment. The requirements for traceable and reliable calibrations are becoming more important. For example, for international trade where radiation products are manufactured within strict quality control systems, it is
Automated system for calculating the uncertainty of standards
International Nuclear Information System (INIS)
Harvel, C.D.
1990-01-01
Working Calibration and Test Material (WCTM) solutions are essential as standards in the surveillance of analytical methods, the calibration of equipment and methods, and the training and testing of laboratory personnel. Before the WCTM can be used it must be characterized. That is, the WCTM concentration and its associated uncertainty must be estimated. The characterization of a WCTM is a tedious process. The chemistry and subsequent statistical analysis require a significant amount of care. For a nonstatistician, the statistical analysis of a WCTM characterization can be quite difficult. In addition, the WCTM traceability and characterization must be thoroughly documented as required by DOE Order 5633.3 [1]. An automated system can easily do the statistical analysis and provide the necessary documentation. 3 refs., 2 figs
EDITORIAL: Squeezed states and uncertainty relations
Jauregue-Renaud, Rocio; Kim, Young S.; Man'ko, Margarita A.; Moya-Cessa, Hector
2004-06-01
This special issue of Journal of Optics B: Quantum and Semiclassical Optics is composed mainly of extended versions of talks and papers presented at the Eighth International Conference on Squeezed States and Uncertainty Relations held in Puebla, Mexico on 9-13 June 2003. The Conference was hosted by Instituto de Astrofísica, Óptica y Electrónica, and the Universidad Nacional Autónoma de México. This series of meetings began at the University of Maryland, College Park, USA, in March 1991. The second and third workshops were organized by the Lebedev Physical Institute in Moscow, Russia, in 1992 and by the University of Maryland Baltimore County, USA, in 1993, respectively. Afterwards, it was decided that the workshop series should be held every two years. Thus the fourth meeting took place at the University of Shanxi in China and was supported by the International Union of Pure and Applied Physics (IUPAP). The next three meetings in 1997, 1999 and 2001 were held in Lake Balatonfüred, Hungary, in Naples, Italy, and in Boston, USA, respectively. All of them were sponsored by IUPAP. The ninth workshop will take place in Besançon, France, in 2005. The conference has now become one of the major international meetings on quantum optics and the foundations of quantum mechanics, where most of the active research groups throughout the world present their new results. Accordingly this conference has been able to align itself to the current trend in quantum optics and quantum mechanics. The Puebla meeting covered most extensively the following areas: quantum measurements, quantum computing and information theory, trapped atoms and degenerate gases, and the generation and characterization of quantum states of light. The meeting also covered squeeze-like transformations in areas other than quantum optics, such as atomic physics, nuclear physics, statistical physics and relativity, as well as optical devices. There were many new participants at this meeting, particularly
State-independent uncertainty relations and entanglement detection
Qian, Chen; Li, Jun-Li; Qiao, Cong-Feng
2018-04-01
The uncertainty relation is one of the key ingredients of quantum theory. Despite the great efforts devoted to this subject, most of the variance-based uncertainty relations are state-dependent and suffer from the triviality problem of zero lower bounds. Here we develop a method to obtain uncertainty relations with state-independent lower bounds. The method works by exploring the eigenvalues of a Hermitian matrix composed of Bloch vectors of incompatible observables and is applicable to both pure and mixed states and to an arbitrary number of N-dimensional observables. The uncertainty relations for the incompatible observables can be explained by geometric relations related to the parallel postulate and by the inequalities in Horn's conjecture on Hermitian matrix sums. Practical entanglement criteria are also presented based on the derived uncertainty relations.
New class of uncertainty relations for partially coherent light
Bastiaans, M.J.
1984-01-01
A class of uncertainty relations for partially coherent light is derived; the uncertainty relations in this class express the fact that the product of the effective widths of the space-domain intensity and the spatial-frequency-domain intensity of the light has a lower bound and that this lower
Uncertainty relations, zero point energy and the linear canonical group
Sudarshan, E. C. G.
1993-01-01
The close relationship between the zero point energy, the uncertainty relations, coherent states, squeezed states, and correlated states for one mode is investigated. This group-theoretic perspective enables the parametrization and identification of their multimode generalization. In particular the generalized Schroedinger-Robertson uncertainty relations are analyzed. An elementary method of determining the canonical structure of the generalized correlated states is presented.
Uncertainty relations for information entropy in wave mechanics
International Nuclear Information System (INIS)
Bialynicki-Birula, I.; Pittsburgh Univ., Pa.; Mycielski, J.
1975-01-01
New uncertainty relations in quantum mechanics are derived. They express restrictions imposed by quantum theory on probability distributions of canonically conjugate variables in terms of corresponding information entropies. The Heisenberg uncertainty relation follows from those inequalities and so does the Gross-Nelson inequality. (orig.) [de
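The inequality derived in this paper, now usually called the Bialynicki-Birula-Mycielski relation, states that for the position and momentum densities ρ(x) = |ψ(x)|² and γ(p) = |φ(p)|²

```latex
-\int \rho(x) \ln \rho(x)\, dx \;-\; \int \gamma(p) \ln \gamma(p)\, dp
\;\ge\; \ln(e\pi\hbar) \;=\; 1 + \ln(\pi\hbar) ,
```

with equality for Gaussian wave functions; the Heisenberg relation follows because the Gaussian maximizes entropy at fixed variance.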
Generalized uncertainty relations and characteristic invariants for the multimode states
International Nuclear Information System (INIS)
Sudarshan, E.C.G.; Chiu, C.B.; Bhamathi, G.
1995-01-01
The close relationship between the zero-point energy, the uncertainty relation, coherent states, squeezed states, and correlated states for one mode is investigated. This group-theoretic perspective of the problem enables the parametrization and identification of their multimode generalization. A simple and efficient method of determining the canonical structure of the generalized correlated states is presented. The implications of the canonical commutation relations for correlations are not exhausted by the Heisenberg uncertainty relation, nor even by the Schroedinger-Robertson uncertainty inequality; there are relations in the multimode case that are generalizations of the Schroedinger-Robertson relation
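For the single-mode case, the Schroedinger-Robertson inequality Δ²A Δ²B ≥ (½⟨{A,B}⟩ − ⟨A⟩⟨B⟩)² + (½|⟨[A,B]⟩|)² can be checked numerically. A minimal sketch, assuming NumPy is available; the qubit observables σx and σy are chosen purely for illustration:

```python
import numpy as np

# Pauli matrices: an illustrative pair of incompatible observables
SX = np.array([[0, 1], [1, 0]], dtype=complex)
SY = np.array([[0, -1j], [1j, 0]], dtype=complex)

def expval(op, psi):
    # <psi|op|psi>; real for a Hermitian operator
    return (psi.conj() @ op @ psi).real

def rs_gap(psi, A=SX, B=SY):
    """Left minus right side of the Schroedinger-Robertson inequality (>= 0)."""
    varA = expval(A @ A, psi) - expval(A, psi) ** 2
    varB = expval(B @ B, psi) - expval(B, psi) ** 2
    comm = psi.conj() @ (A @ B - B @ A) @ psi          # <[A,B]>, purely imaginary
    anti = (psi.conj() @ (A @ B + B @ A) @ psi).real   # <{A,B}>, real
    cov = 0.5 * anti - expval(A, psi) * expval(B, psi)
    return varA * varB - (cov ** 2 + (0.5 * abs(comm)) ** 2)

rng = np.random.default_rng(42)
for _ in range(1000):                  # random pure qubit states
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi /= np.linalg.norm(psi)
    assert rs_gap(psi) >= -1e-12       # inequality holds up to rounding
```

For σx and σy the bound is saturated (zero gap) by the σz eigenstates, which is easy to confirm with `rs_gap(np.array([1, 0], dtype=complex))`.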
Some uncertainties associated with preparation of standards in organic matrix
International Nuclear Information System (INIS)
Cholewa, M.; Hanson, A.L.; Jones, K.W.; McNally, W.P.; Fand, I.
1986-01-01
Until recently, no techniques existed which combine multielement detection capability, high sensitivity, and good spatial resolution in relatively thick tissue sections. The use of proton-induced x-ray emission (PIXE) and synchrotron-radiation-induced x-ray emission (SRIXE) with proton and x-ray microbeams changed this situation. However, there are some difficulties with the existing standards for quantitatively calibrating trace element concentrations in biological materials. For the purpose of our experiments a special technique of standard production was applied. 7 refs., 1 tab., 1 fig
Banks, H T; Holm, Kathleen; Robbins, Danielle
2010-11-01
We computationally investigate two approaches for uncertainty quantification in inverse problems for nonlinear parameter dependent dynamical systems. We compare the bootstrapping and asymptotic theory approaches for problems involving data with several noise forms and levels. We consider both constant variance absolute error data and relative error which produces non-constant variance data in our parameter estimation formulations. We compare and contrast parameter estimates, standard errors, confidence intervals, and computational times for both bootstrapping and asymptotic theory methods.
Measurement Uncertainty Relations for Discrete Observables: Relative Entropy Formulation
Barchielli, Alberto; Gregoratti, Matteo; Toigo, Alessandro
2018-02-01
We introduce a new information-theoretic formulation of quantum measurement uncertainty relations, based on the notion of relative entropy between measurement probabilities. In the case of a finite-dimensional system and for any approximate joint measurement of two target discrete observables, we define the entropic divergence as the maximal total loss of information occurring in the approximation at hand. For fixed target observables, we study the joint measurements minimizing the entropic divergence, and we prove the general properties of its minimum value. Such a minimum is our uncertainty lower bound: the total information lost by replacing the target observables with their optimal approximations, evaluated at the worst possible state. The bound turns out to be also an entropic incompatibility degree, that is, a good information-theoretic measure of incompatibility: indeed, it vanishes if and only if the target observables are compatible, it is state-independent, and it enjoys all the invariance properties which are desirable for such a measure. In this context, we point out the difference between general approximate joint measurements and sequential approximate joint measurements; to do this, we introduce a separate index for the tradeoff between the error of the first measurement and the disturbance of the second one. By exploiting the symmetry properties of the target observables, exact values, lower bounds and optimal approximations are evaluated in two different concrete examples: (1) a couple of spin-1/2 components (not necessarily orthogonal); (2) two Fourier conjugate mutually unbiased bases in prime power dimension. Finally, the entropic incompatibility degree straightforwardly generalizes to the case of many observables, still maintaining all its relevant properties; we explicitly compute it for three orthogonal spin-1/2 components.
Sensitivity, uncertainty assessment, and target accuracies related to radiotoxicity evaluation
International Nuclear Information System (INIS)
Palmiotti, G.; Salvatores, M.; Hill, R.N.
1994-01-01
Time-dependent sensitivity techniques, which have been used in the past for standard reactor applications, are adapted to calculate the impact of data uncertainties and to estimate target data accuracies in radiotoxicity evaluations. The methodology is applied to different strategies of radioactive waste management connected with the European Fast Reactor and the Integral Fast Reactor fuel cycles. Results are provided in terms of sensitivity coefficients of basic data (cross sections and decay constants), uncertainties of global radiotoxicity at different times of storing after discharge, and target data accuracies needed to satisfy maximum uncertainty limits
Reconsiderations of long debated subjects: uncertainty relations and Planck's constant
International Nuclear Information System (INIS)
Dumitru, S.
2005-01-01
Some earlier unresolved controversies about uncertainty relations and quantum measurements have persisted to this day. They originate in the shortcomings of the conventional interpretation of uncertainty relations. In this paper, we showed that those shortcomings exposed credible, unavoidable facts making it imperative that the conventional interpretation should be dropped. So, the primitive uncertainty relations appeared as being either figments or fluctuation formulae. Subsequently, we showed that for quantum microparticles the Planck constant h acted as an indicator of stochasticity, a role entirely similar to the one the Boltzmann constant k played in respect of the thermodynamic stochasticity of macroscopic systems. (author)
Uncertainty relation and simultaneous measurements in quantum theory
International Nuclear Information System (INIS)
Busch, P.
1982-01-01
In this thesis the question of the interpretation of the uncertainty relation is taken up, and a program for the justification of its individualistic interpretation is formulated. By means of quantum mechanical models of the position and momentum measurement, a justification of the interpretation is attempted by reconstructing the origin of the uncertainties from the conditions of the measuring devices and by determining the relation of the measured results to the object. By means of a model of the joint measurement it could be shown how the uncertainty relation results from the ineliminable mutual disturbance of the devices and from the uncertainty relation for the measuring system. So finally the commutation relation is conclusive. As an illustration the slit experiment is discussed, first according to Heisenberg with a fixed slit, then for the quantum mechanical, movable slit (Bohr-Einstein). (orig./HSI) [de
Do the Uncertainty Relations Really have Crucial Significances for Physics?
Directory of Open Access Journals (Sweden)
Dumitru S.
2010-10-01
It is argued that the idea that the Uncertainty Relations (UR) have crucial significance for physics is false. Additionally, one argues for the necessity of a UR-disconnected quantum philosophy.
Tightness Entropic Uncertainty Relation in Quantum Markovian-Davies Environment
Zhang, Jun; Liu, Liang; Han, Yan
2018-05-01
In this paper, we investigate the tightness of the entropic uncertainty relation in the absence (presence) of quantum memory, where the memory particle is weakly coupled to a decohering Davies-type Markovian environment. The results show that the tightness of the uncertainty relation can be controlled by the energy relaxation time F, the dephasing time G, and the rescaled temperature p; perfect tightness is reached when dephasing and energy relaxation satisfy F = 2G and p = 1/2. In addition, the tightness of both the memory-assisted entropic uncertainty relation and the plain entropic uncertainty relation is influenced mainly by the purity, while in the memory-assisted model the purity and quantum correlations affect the tightness strongly and quantum entanglement affects it only slightly.
Uncertainty relations and semi-groups in B-algebras
International Nuclear Information System (INIS)
Papaloucas, L.C.
1980-07-01
Starting from a B-algebra which satisfies the conditions of a structure theorem, we obtain directly a Lie algebra for which the Lie ring satisfies automatically the Heisenberg uncertainty relations. (author)
Stakeholder relations in the oil sands : managing uncertainty
Energy Technology Data Exchange (ETDEWEB)
NONE
2009-05-15
Alberta's oil sands are now at the crossroads of a series of significant and complex global issues that will require careful negotiation by all stakeholders involved in the oil sands industry. This paper discussed methods of managing uncertainty and risk related to the oil sands industry's agenda for the future. Oil sands developers must continue to secure permission from communities and other key stakeholders in order to develop oil sand projects. Stakeholder relations between oil sands operators, First Nations, and Metis Nation communities must ensure that respect is maintained while environmental impacts are minimized and long-term economic benefits are secured for all parties. Environmental non-governmental organizations (ENGOs) must ensure that oil sands resources are developed responsibly, and that environmental standards are maintained. Seven key shifts in stakeholder relations resulting from the recent economic crisis were identified. These included (1) withdrawal from the multi-stakeholder process, (2) increased focus on government to demonstrate policy leadership, (3) a stronger push from ENGOs to express environmental concerns, (4) global lobby and public relations efforts from ENGOs, (5) companies retreating to local community stakeholders, (6) more active demands from First Nations and Metis Nations groups, and (7) companies challenging ENGO campaigns. The study concluded by suggesting that government leadership is needed to clear policy and regulatory frameworks for Canada's oil sands.
Squeezed States and Uncertainty Relations. Abstracts
International Nuclear Information System (INIS)
Masahito, Hayashi; Reynaud, S.; Jaekel, M.Th.; Fiurášek, J.; Garcia-Patron, R.; Cerf, N.J.; Hage, B.; Chelkowski, S.; Franzen, A.; Lastzka, N.; Vahlbruch, N.; Danzmann, K.; Schnabel, R.; Hassan, S.S.; Joshi, A.; Jakob, M.; Bergou, J.A.; Kozlovskii, A.V.; Prakash, H.; Kumar, R.
2005-01-01
The purpose of the conference was to bring together people working in the field of quantum optics, with special emphasis on non-classical light sources and related areas, quantum computing, statistical mechanics and mathematical physics. As a novelty, this edition will include the topics of quantum imaging, quantum phase noise and number theory in quantum mechanics. This document gives the program of the conference and gathers the abstracts
Uncertainty relation on a world crystal and its applications to micro black holes
International Nuclear Information System (INIS)
Jizba, Petr; Kleinert, Hagen; Scardigli, Fabio
2010-01-01
We formulate generalized uncertainty relations in a crystal-like universe - a 'world crystal' - whose lattice spacing is of the order of the Planck length. In the particular case when energies lie near the border of the Brillouin zone, i.e., for Planckian energies, the uncertainty relation for position and momentum does not impose any lower bound on the uncertainties involved. We apply our results to micro black hole physics, where we derive a new mass-temperature relation for Schwarzschild micro black holes. In contrast to standard results based on Heisenberg and stringy uncertainty relations, our mass-temperature formula predicts both a finite Hawking temperature and a zero rest-mass remnant at the end of the micro black hole evaporation. We also briefly mention some connections of the world-crystal paradigm with 't Hooft's quantization and doubly special relativity.
International Nuclear Information System (INIS)
Landsberg, P.T.
1990-01-01
This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)
Universal quantum uncertainty relations between nonergodicity and loss of information
Awasthi, Natasha; Bhattacharya, Samyadeb; SenDe, Aditi; Sen, Ujjwal
2018-03-01
We establish uncertainty relations between information loss in general open quantum systems and the amount of nonergodicity of the corresponding dynamics. The relations hold for arbitrary quantum systems interacting with an arbitrary quantum environment. The elements of the uncertainty relations are quantified via distance measures on the space of quantum density matrices. The relations hold for arbitrary distance measures satisfying a set of intuitively satisfactory axioms. The relations show that as the nonergodicity of the dynamics increases, the lower bound on information loss decreases, which validates the belief that nonergodicity plays an important role in preserving information of quantum states undergoing lossy evolution. We also consider a model of a central qubit interacting with a fermionic thermal bath and derive its reduced dynamics to subsequently investigate the information loss and nonergodicity in such dynamics. We comment on the "minimal" situations that saturate the uncertainty relations.
An analysis of combined standard uncertainty for radiochemical measurements of environmental samples
International Nuclear Information System (INIS)
Berne, A.
1996-01-01
It is anticipated that future data acquisitions intended for use in radiological risk assessments will require the incorporation of uncertainty analysis. Often, only one aliquot of the sample is taken and a single determination is made. Under these circumstances, the total uncertainty is calculated using the "propagation of errors" approach. However, there is no agreement in the radioanalytical community as to the exact equations to use. The Quality Assurance/Metrology Division of the Environmental Measurements Laboratory has developed a systematic process to compute uncertainties in constituent components of the analytical procedure, as well as the combined standard uncertainty (CSU). The equations for computation are presented here, with examples of their use. They have also been incorporated into a code for use in the spreadsheet application Quattro Pro. Using the spreadsheet with appropriate inputs permits an analysis of the variations in the CSU as a function of several different variables. The relative importance of the "counting uncertainty" can also be ascertained.
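For a multiplicative measurement model with independent components, the combined standard uncertainty reduces to a quadrature sum of the relative standard uncertainties. The sketch below illustrates this first-order GUM propagation; the component names and magnitudes are hypothetical, not taken from the Environmental Measurements Laboratory procedure.

```python
import math

def combined_standard_uncertainty(value, rel_components):
    """Combine independent relative standard uncertainties in quadrature
    (first-order GUM propagation for a multiplicative model)."""
    rel_csu = math.sqrt(sum(u ** 2 for u in rel_components.values()))
    return rel_csu, value * rel_csu

# Illustrative (made-up) components for one radiochemical result,
# expressed as relative standard uncertainties:
components = {
    "counting": 0.030,        # Poisson counting statistics
    "tracer_yield": 0.015,    # chemical-yield (tracer) determination
    "aliquot_mass": 0.002,    # balance calibration
    "efficiency": 0.020,      # detector efficiency calibration
}
rel_csu, abs_csu = combined_standard_uncertainty(12.4, components)  # e.g. Bq/kg
print(f"relative CSU = {rel_csu:.4f}, absolute CSU = {abs_csu:.3f}")
```

Inspecting the individual squared terms against their sum is exactly how the relative importance of the counting uncertainty can be ascertained.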
The role of general relativity in the uncertainty principle
International Nuclear Information System (INIS)
Padmanabhan, T.
1986-01-01
The role played by general relativity in quantum mechanics (especially as regards the uncertainty principle) is investigated. It is confirmed that the validity of the time-energy uncertainty relation does depend on gravitational time dilation. It is also shown that there exists an intrinsic lower bound to the accuracy with which acceleration due to gravity can be measured. The notion of the equivalence principle in quantum mechanics is clarified. (author)
Uncertainties related to the fault tree reliability data
International Nuclear Information System (INIS)
Apostol, Minodora; Nitoi, Mirela; Farcasiu, M.
2003-01-01
Uncertainty analyses related to fault trees evaluate the system variability that arises from the uncertainties of the basic event probabilities. Having a logical model which describes a system, obtaining outcomes means evaluating it, using estimates for each basic event of the model. If the model has basic events that incorporate uncertainties, then the results of the model should incorporate the uncertainties of the events. Estimating the uncertainty in the final result of the fault tree means first evaluating the uncertainties of the basic event probabilities and then combining these uncertainties to calculate the top event uncertainty. To propagate the uncertainty, knowledge of the probability density function as well as the range of possible values of the basic event probabilities is required. The following data are defined using suitable probability density functions: the component failure rates, the human error probabilities, and the initiating event frequencies. It was assumed that the distribution of possible values of the basic event probabilities is given by the lognormal probability density function. To know the range of possible values of the basic event probabilities, the error factor (uncertainty factor) is required. The aim of this paper is to estimate the error factors for the failure rates and for the human error probabilities from the reliability database used in the Cernavoda Probabilistic Safety Evaluation. The top event chosen as an example is FEED3, from the Pressure and Inventory Control System. The quantitative evaluation of this top event was made using the EDFT code, developed at the Institute for Nuclear Research Pitesti (INR). It was assumed that the error factors for the component failures are the same as for the failure rates. Uncertainty analysis was made with the INCERT application, which uses the moment method and the Monte Carlo method. The reliability database used at INR Pitesti does not contain the error factors (ef)
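For a lognormal basic-event probability, the error factor EF is conventionally the ratio of the 95th percentile to the median, giving sigma = ln(EF)/1.645. The sketch below propagates two such basic events through a simple OR-gate top event by Monte Carlo; the medians, error factors, and gate are illustrative, not the FEED3 model.

```python
import math
import random

def sample_lognormal(median, error_factor, rng):
    """Sample a basic-event probability with lognormal uncertainty.
    EF = 95th percentile / median, so sigma = ln(EF) / 1.645."""
    sigma = math.log(error_factor) / 1.645
    return median * math.exp(rng.gauss(0.0, sigma))

rng = random.Random(42)
n = 20000
# Top event = OR of two independent basic events (illustrative medians / EFs):
samples = []
for _ in range(n):
    p1 = sample_lognormal(1e-3, 3.0, rng)
    p2 = sample_lognormal(5e-4, 10.0, rng)
    samples.append(p1 + p2 - p1 * p2)  # exact OR-gate probability
samples.sort()
median = samples[n // 2]
p95 = samples[int(0.95 * n)]
print(f"top-event median = {median:.2e}, 95th percentile = {p95:.2e}")
print(f"effective top-event error factor = {p95 / median:.2f}")
```

The moment method mentioned in the abstract would instead combine the analytic moments of each lognormal; Monte Carlo is shown here because it generalizes to arbitrary gate logic.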
New Inequalities and Uncertainty Relations on Linear Canonical Transform Revisit
Directory of Open Access Journals (Sweden)
Xu Guanlei
2009-01-01
The uncertainty principle plays an important role in mathematics, physics, signal processing, and so on. Firstly, based on the definition of the linear canonical transform (LCT) and the traditional Pitt's inequality, a novel Pitt's inequality in the LCT domains is obtained, which is connected with the LCT parameters a and b. Then a novel logarithmic uncertainty principle is derived from this Pitt's inequality in the LCT domains, which is associated with the parameters of the two LCTs. Secondly, from the relation between the original function and its LCT, an entropic uncertainty principle and a Heisenberg uncertainty principle in the LCT domains are derived, which are associated with the LCT parameters a and b. The reason why the three lower bounds are only associated with the LCT parameters a and b, and are independent of c and d, is presented. The results show that it is possible for the bounds to tend to zero.
Decoherence effect on quantum-memory-assisted entropic uncertainty relations
Ming, Fei; Wang, Dong; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu
2018-01-01
The uncertainty principle provides a bound on the precision with which any two incompatible observables can be predicted, and thereby plays a nontrivial role in quantum precision measurement. In this work, we observe the dynamical features of the quantum-memory-assisted entropic uncertainty relations (EUR) for a pair of incompatible measurements in an open system characterized by local generalized amplitude damping (GAD) noises. Herein, we derive the dynamical evolution of the entropic uncertainty with respect to the measurement affected by the canonical GAD noises when particle A is initially entangled with quantum memory B. Specifically, we examine the dynamics of the EUR in the frame of three realistic scenarios: one case in which particle A is affected by environmental (GAD) noise while particle B, as quantum memory, is free from any noise; another in which particle B is affected by the external noise while particle A is not; and a last case in which both particles suffer from the noise. By analytical methods, it turns out that the uncertainty is not fully determined by the quantum correlation evolution of the composite system consisting of A and B, but rather by the minimal conditional entropy of the measured subsystem. Furthermore, we present a possible physical interpretation for the behavior of the uncertainty evolution in terms of the mixedness of the observed system; we argue that the uncertainty might be strongly correlated with the systematic mixedness. Furthermore, we put forward a simple and effective strategy to reduce the measurement uncertainty of interest using quantum partially collapsed measurement. Therefore, our explorations might offer an insight into the dynamics of the entropic uncertainty relation in a realistic system, and be of importance to quantum precision measurement during quantum information processing.
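The quantum-memory-assisted entropic uncertainty relation underlying this kind of analysis is the bound of Berta et al.:

```latex
S(X|B) + S(Z|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad
c = \max_{x,z} \left| \langle \psi_x | \phi_z \rangle \right|^2 ,
```

where $S(X|B)$ is the conditional von Neumann entropy of the outcome of measuring $X$ on particle A given the quantum memory B, and $c$ quantifies the complementarity of the two measurements. A negative $S(A|B)$, signalling entanglement between A and B, lowers the bound; this is why noise acting on either particle, by degrading that entanglement, can drive the entropic uncertainty up.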
Symmetry, Contingency, Complexity: Accommodating Uncertainty in Public Relations Theory.
Murphy, Priscilla
2000-01-01
Explores the potential of complexity theory as a unifying theory in public relations, where scholars have recently raised problems involving flux, uncertainty, adaptiveness, and loss of control. Describes specific complexity-based methodologies and their potential for public relations studies. Offers an account of complexity theory, its…
A new uncertainty relation for angular momentum and angle
International Nuclear Information System (INIS)
Kranold, H.U.
1984-01-01
An uncertainty relation of the form ΔL_z · ΔS₀ ≥ ℏ/2 is derived for angular momentum and angle. The non-linear operator S₀ measures angles and has a simple interpretation. Subject to very general conditions of rotational invariance, the above relation is unique. Radial momentum is not quantized
Generalization of uncertainty relation for quantum and stochastic systems
Koide, T.; Kodama, T.
2018-06-01
The generalized uncertainty relation applicable to quantum and stochastic systems is derived within the stochastic variational method. This relation not only reproduces the well-known inequality in quantum mechanics but also is applicable to the Gross-Pitaevskii equation and the Navier-Stokes-Fourier equation, showing that the finite minimum uncertainty between the position and the momentum is not an inherent property of quantum mechanics but a common feature of stochastic systems. We further discuss the possible implication of the present study in discussing the application of the hydrodynamic picture to microscopic systems, like relativistic heavy-ion collisions.
The Second International Workshop on Squeezed States and Uncertainty Relations
Han, D. (Editor); Kim, Y. S.; Manko, V. I.
1993-01-01
This conference publication contains the proceedings of the Second International Workshop on Squeezed States and Uncertainty Relations held in Moscow, Russia, on 25-29 May 1992. The purpose of this workshop was to study possible applications of squeezed states of light. The Workshop brought together many active researchers in squeezed states of light and those who may find the concept of squeezed states useful in their research, particularly in understanding the uncertainty relations. It was found at this workshop that the squeezed state has a much broader implication than the two-photon coherent states in quantum optics, since the squeeze transformation is one of the most fundamental transformations in physics.
Uncertainty related to Environmental Data and Estimated Extreme Events
DEFF Research Database (Denmark)
Burcharth, H. F.
The design loads on rubble mound breakwaters are almost entirely determined by the environmental conditions, i.e. sea state, water levels, sea bed characteristics, etc. It is the objective of sub-group B to identify the most important environmental parameters and evaluate the related uncertainties … including those corresponding to extreme estimates typically used for design purposes. Basically a design condition is made up of a set of parameter values stemming from several environmental parameters. To be able to evaluate the uncertainty related to design states one must know the corresponding joint … Consequently this report deals mainly with each parameter separately. Multi-parameter problems are briefly discussed in section 9. It is important to notice that the quantified uncertainties reported in section 7.7 represent what might be regarded as typical figures to be used only when no more qualified …
Correlated quadratures of resonance fluorescence and the generalized uncertainty relation
Arnoldus, Henk F.; George, Thomas F.; Gross, Rolf W. F.
1994-01-01
Resonance fluorescence from a two-state atom has been predicted to exhibit quadrature squeezing below the Heisenberg uncertainty limit, provided that the optical parameters (Rabi frequency, detuning, laser linewidth, etc.) are chosen carefully. When the correlation between two quadratures of the radiation field does not vanish, however, the Heisenberg limit for quantum fluctuations might be an unrealistic lower bound. A generalized uncertainty relation, due to Schrödinger, takes into account the possible correlation between the quadrature components of the radiation, and it suggests a modified definition of squeezing. We show that the coherence between the two levels of a laser-driven atom is responsible for the correlation between the quadrature components of the emitted fluorescence, and that the Schrödinger uncertainty limit increases monotonically with the coherence. On the other hand, the fluctuations in the quadrature field diminish with an increasing coherence, and can disappear completely when the coherence reaches 1/2, provided that certain phase relations hold.
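The generalized relation due to Schrödinger (often called the Schrödinger–Robertson relation) adds the covariance of the two observables to the familiar commutator bound:

```latex
\sigma_A^2\,\sigma_B^2 \;\ge\;
\left| \frac{1}{2i}\,\langle [\hat A,\hat B] \rangle \right|^2
+ \left| \frac{1}{2}\,\langle \{\hat A,\hat B\} \rangle
        - \langle \hat A \rangle \langle \hat B \rangle \right|^2 .
```

The second term is the symmetrized covariance of $\hat A$ and $\hat B$; when the quadrature correlation vanishes it drops out and the relation reduces to the Heisenberg–Robertson bound, which is why a nonzero correlation makes the Heisenberg limit an unrealistic reference point for squeezing.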
Interpretation of uncertainty relations for three or more observables
International Nuclear Information System (INIS)
Shirokov, M.I.
2003-01-01
Conventional quantum uncertainty relations (URs) contain dispersions of two observables. Generalized URs are known which contain three or more dispersions. They are derived here starting with suitable generalized Cauchy inequalities. It is shown what new information the generalized URs provide. Similar interpretation is given to generalized Cauchy inequalities
Fifth International Conference on Squeezed States and Uncertainty Relations
Han, D. (Editor); Janszky, J. (Editor); Kim, Y. S. (Editor); Man'ko, V. I. (Editor)
1998-01-01
The Fifth International Conference on Squeezed States and Uncertainty Relations was held at Balatonfured, Hungary, on 27-31 May 1997. This series was initiated in 1991 at the College Park Campus of the University of Maryland as the Workshop on Squeezed States and Uncertainty Relations. The scientific purpose of this series was to discuss squeezed states of light, but in recent years the scope has become broad enough to include studies of uncertainty relations and squeeze transformations in all branches of physics, including quantum optics and the foundations of quantum mechanics. Quantum optics will continue playing the pivotal role in the future, but future meetings will include all branches of physics where squeeze transformations are basic. As the meeting attracted more participants and started covering more diversified subjects, the fourth meeting was called an international conference. The Fourth International Conference on Squeezed States and Uncertainty Relations, held in 1995, was hosted by Shanxi University in Taiyuan, China. The fifth meeting of this series, held at Balatonfured, Hungary, was also supported by the IUPAP. The Sixth International Conference will be hosted by the University of Naples in 1999. The meeting will take place in Ravello near Naples.
International Nuclear Information System (INIS)
Cacais, F.L.; Delgado, J.U.; Loayza, V.M.
2016-01-01
In preparing solutions for the production of radionuclide metrology standards, it is necessary to measure activity per unit mass. The gravimetric method by elimination is applied to perform weighings with smaller uncertainties. In this work, the uncertainty calculation approach implemented by Lourenco and Bobin according to the ISO GUM for the method by elimination is validated by the Monte Carlo method. The results obtained by the two uncertainty calculation methods were consistent, indicating that the conditions for the application of the ISO GUM in the preparation of radioactive standards were fulfilled. (author)
Differentiating intolerance of uncertainty from three related but distinct constructs.
Rosen, Natalie O; Ivanova, Elena; Knäuper, Bärbel
2014-01-01
Individual differences in uncertainty have been associated with heightened anxiety, stress and approach-oriented coping. Intolerance of uncertainty (IU) is a trait characteristic that arises from negative beliefs about uncertainty and its consequences. Researchers have established the central role of IU in the development of problematic worry and maladaptive coping, highlighting the importance of this construct to anxiety disorders. However, there is a need to improve our understanding of the phenomenology of IU. The goal of this paper was to present hypotheses regarding the similarities and differences between IU and three related constructs--intolerance of ambiguity, uncertainty orientation, and need for cognitive closure--and to call for future empirical studies to substantiate these hypotheses. To assist with achieving this goal, we conducted a systematic review of the literature, which also served to identify current gaps in knowledge. This paper differentiates these constructs by outlining each definition and general approaches to assessment, reviewing the existing empirical relations, and proposing theoretical similarities and distinctions. Findings may assist researchers in selecting the appropriate construct to address their research questions. Future research directions for the application of these constructs, particularly within the field of clinical and health psychology, are discussed.
Implementation of unscented transform to estimate the uncertainty of a liquid flow standard system
Energy Technology Data Exchange (ETDEWEB)
Chun, Sejong; Choi, Hae-Man; Yoon, Byung-Ro; Kang, Woong [Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of)
2017-03-15
First-order partial derivatives of the mathematical model are an essential part of evaluating the measurement uncertainty of a liquid flow standard system according to the Guide to the expression of uncertainty in measurement (GUM). Although the GUM provides a straightforward method to evaluate the measurement uncertainty of volume flow rate, the first-order partial derivatives can be complicated. The mathematical model of volume flow rate in a liquid flow standard system has a cross-correlation between liquid density and buoyancy correction factor. This cross-correlation can make derivation of the first-order partial derivatives difficult. Monte Carlo simulation can be used as an alternative method to circumvent the difficulty of partial derivation. However, Monte Carlo simulation requires large computational resources for a correct simulation, because it must address whether an ideal or a real operator conducts the experiment to evaluate the measurement uncertainty. Thus, Monte Carlo simulation needs a large number of samples to ensure that the uncertainty evaluation is as close to the GUM as possible. Unscented transform can alleviate this problem, because unscented transform can be regarded as a Monte Carlo simulation with an infinite number of samples. This means that unscented transform treats the uncertainty evaluation with respect to the ideal operator. Thus, unscented transform evaluates the same measurement uncertainty as the GUM provides.
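The idea can be sketched in one dimension: instead of differentiating the model, the unscented transform pushes a few deterministically chosen sigma points through the nonlinearity and recombines them with fixed weights. This is the classic Julier–Uhlmann construction, not the flow-standard model from the abstract; the quadratic test function is purely illustrative.

```python
import math

def unscented_propagate(f, mean, var, kappa=2.0):
    """1-D unscented transform (Julier-Uhlmann sigma points, n = 1):
    propagate a mean/variance pair through a nonlinear function f
    without computing any partial derivatives."""
    n = 1
    spread = math.sqrt((n + kappa) * var)
    points = [mean, mean + spread, mean - spread]
    weights = [kappa / (n + kappa),
               1.0 / (2 * (n + kappa)),
               1.0 / (2 * (n + kappa))]
    y = [f(x) for x in points]
    y_mean = sum(w * yi for w, yi in zip(weights, y))
    y_var = sum(w * (yi - y_mean) ** 2 for w, yi in zip(weights, y))
    return y_mean, y_var

# Quadratic test case y = x^2 with x ~ N(2, 0.5^2); the exact moments are
# E[y] = mu^2 + sigma^2 = 4.25 and Var[y] = 4 mu^2 sigma^2 + 2 sigma^4 = 4.125.
m, v = unscented_propagate(lambda x: x * x, 2.0, 0.25)
print(m, v)  # reproduces both exact moments for this quadratic model
```

A first-order GUM propagation of the same model would give Var[y] = 4 mu^2 sigma^2 = 4.0, missing the 2 sigma^4 term; the three sigma points recover it exactly, which is the sense in which the transform behaves like an exhaustive Monte Carlo run.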
Another two dark energy models motivated from Karolyhazy uncertainty relation
Energy Technology Data Exchange (ETDEWEB)
Sun, Cheng-Yi; Yang, Wen-Li; Song, Yu [Northwest University, Institute of Modern Physics, Xi'an (China); Yue, Rui-Hong [Ningbo University, Faculty of Science, Ningbo (China)
2012-03-15
The Karolyhazy uncertainty relation indicates that there exists a minimal detectable cell δt³ over the region t³ in Minkowski space-time. Due to the energy-time uncertainty relation, the energy of the cell δt³ cannot be less than δt⁻¹. We thus obtain a new energy density of metric fluctuations of Minkowski spacetime, δt⁻⁴. Motivated by this energy density, we propose two new dark-energy models. One model is characterized by the age of the universe and the other by the conformal age of the universe. We find that in both models the dark energy mimics a cosmological constant at late times. (orig.)
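The chain of estimates in the abstract can be written compactly (with ħ = c = 1, t_p the Planck time, and λ a dimensionless constant of order one, as in the usual statement of the Karolyhazy relation):

```latex
\delta t = \lambda\, t_p^{2/3}\, t^{1/3}, \qquad
E_{\delta t^{3}} \gtrsim \delta t^{-1}
\;\;\Longrightarrow\;\;
\rho \sim \frac{E_{\delta t^{3}}}{\delta t^{3}}
      \sim \delta t^{-4}
      = \frac{1}{\lambda^{4}\, t_p^{8/3}\, t^{4/3}} .
```

Identifying the time scale t with the age of the universe gives one model, and with the conformal age the other.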
Indian Academy of Sciences (India)
To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and chemistry of all the substances are needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle and its feedback into climate, to be ...
The Sobolev inequality and the Tsallis entropic uncertainty relation
International Nuclear Information System (INIS)
Rajagopal, A.K.
1995-01-01
The Heisenberg uncertainty relation is expressed in terms of the Tsallis entropies associated with the conjugate coordinate and momentum probability densities. By rewriting this in terms of a positive joint probability distribution suggested by Cohen and coworkers, a different insight into the statistical dependence of the quantum variables is obtained. A discussion of how this improves the previous results on this subject is given. (orig.)
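For orientation (this is standard background, not the paper's specific bound): the Tsallis entropy of a probability density generalizes the Shannon form, and in the q → 1 limit the entropic formulation of the Heisenberg relation reduces to the Białynicki-Birula–Mycielski inequality for conjugate position and momentum densities:

```latex
S_q[\rho] = \frac{1}{q-1}\left(1 - \int \rho(x)^q \, dx\right)
\;\xrightarrow{\,q \to 1\,}\; -\int \rho \ln \rho \, dx,
\qquad
S[\rho_x] + S[\rho_p] \;\ge\; \ln (e \pi \hbar).
```

The entropic form is stronger than the variance-based relation in the sense that the variance bound can be derived from it, which is why recasting it with Tsallis entropies yields new insight into the statistical dependence of the quantum variables.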
Fourth International Conference on Squeezed States and Uncertainty Relations
Han, D. (Editor); Peng, Kunchi (Editor); Kim, Y. S. (Editor); Manko, V. I. (Editor)
1996-01-01
The Fourth International Conference on Squeezed States and Uncertainty Relations was held at Shanxi University, Taiyuan, Shanxi, China, on June 5 - 9, 1995. This conference was jointly organized by Shanxi University, the University of Maryland (U.S.A.), and the Lebedev Physical Institute (Russia). The first meeting of this series was called the Workshop on Squeezed States and Uncertainty Relations, and was held in 1991 at College Park, Maryland. The second and third meetings in this series were hosted in 1992 by the Lebedev Institute in Moscow, and in 1993 by the University of Maryland Baltimore County, respectively. The scientific purpose of this series was initially to discuss squeezed states of light, but in recent years the scope has become broad enough to include studies of uncertainty relations and squeeze transformations in all branches of physics, including, of course, quantum optics and the foundations of quantum mechanics. Quantum optics will continue playing the pivotal role in the future, but future meetings will include all branches of physics where squeeze transformations are a basic transformation. This transition took place at the fourth meeting of this series held at Shanxi University in 1995. The fifth meeting in this series will be held in Budapest (Hungary) in 1997, and the principal organizer will be Jozsef Janszky of the Laboratory of Crystal Physics, P.O. Box 132, H-1052 Budapest, Hungary.
Uncertainties of exposure-related quantities in mammographic x-ray unit quality control
International Nuclear Information System (INIS)
Gregory, Kent J.; Pattison, John E.; Bibbo, Giovanni
2006-01-01
Breast screening programs operate in many countries with mammographic x-ray units subject to stringent quality control tests. These tests include the evaluation of quantities based on exposure measurements, such as half value layer, automatic exposure control reproducibility, average glandular dose, and radiation output rate. There are numerous error sources that contribute to the uncertainty of these exposure-related quantities, some of which are unique to the low-energy x-ray spectrum produced by mammographic x-ray units. For each of these exposure-related quantities, the applicable error sources and their magnitudes vary, depending on the test equipment used to make the measurement and whether or not relevant corrections have been applied. This study has identified and quantified a range of error sources that may be used to estimate the combined uncertainty of these exposure-related quantities, given the test equipment used and corrections applied. The uncertainty analysis uses methods described by the International Organization for Standardization's Guide to the Expression of Uncertainty in Measurement. Examples of how these error sources combine to give the uncertainty of the exposure-related quantities are presented. Using the best test equipment evaluated in this study, uncertainties of the four exposure-related quantities at the 95% confidence interval were found to be ±1.6% (half value layer), ±0.0008 (automatic exposure control reproducibility), ±2.3% (average glandular dose), and ±2.1% (radiation output rate). In some cases, using less precise test equipment, or failing to apply corrections, resulted in uncertainties more than double in magnitude.
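The 95%-confidence figures quoted above are expanded uncertainties of the GUM type: each error source is first reduced to a standard uncertainty (Type B sources with stated limits are commonly treated as rectangular, u = a/√3), the results are combined in quadrature, and the combined value is multiplied by a coverage factor k = 2. The sketch below illustrates that recipe; the component values are hypothetical, not the paper's error budget.

```python
import math

def standard_uncertainty_rect(half_width):
    """Type B: rectangular distribution of half-width a -> u = a / sqrt(3)."""
    return half_width / math.sqrt(3)

# Illustrative (made-up) relative error sources for one exposure-related
# quantity, in percent:
u_components = [
    0.5,                             # Type A: repeatability (std. dev. of mean)
    standard_uncertainty_rect(1.0),  # chamber calibration, +/-1.0% rectangular
    standard_uncertainty_rect(0.5),  # energy dependence, +/-0.5% rectangular
]
u_c = math.sqrt(sum(u ** 2 for u in u_components))  # combined standard unc.
U = 2.0 * u_c  # expanded uncertainty, k = 2 (approx. 95% confidence)
print(f"u_c = {u_c:.2f}%, U(k=2) = {U:.2f}%")
```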
Heisenberg's uncertainty relation: Violation and reformulation
International Nuclear Information System (INIS)
Ozawa, Masanao
2014-01-01
The uncertainty relation formulated by Heisenberg in 1927 describes a trade-off between the error of a measurement of one observable and the disturbance caused on another complementary observable so that their product should be no less than a limit set by Planck's constant. In 1980, Braginsky, Vorontsov, and Thorne claimed that this relation leads to a sensitivity limit for gravitational wave detectors. However, in 1988 a model of position measurement was constructed that breaks both this limit and Heisenberg's relation. Here, we discuss the problems as to how we reformulate Heisenberg's relation to be universally valid and how we experimentally quantify the error and the disturbance to refute the old relation and to confirm the new relation.
Cauchy inequality and uncertainty relations for mixed states
International Nuclear Information System (INIS)
Shirokov, M.I.
2004-01-01
The Cauchy inequality (CI) relates the scalar product of two vectors to their norms. I point out other similar inequalities (SIs). Starting with the CI, Schrödinger derived his uncertainty relation (UR). By using the SIs, various other URs can be obtained. It is shown that they follow from the Schrödinger UR. Two generalizations of the CI are obtained for mixed states described by density matrices. Using them, two generalizations of the UR for mixed states are derived. Both differ from the UR generalization known from the literature. A discussion of these generalizations is given
Uncertainty Analysis of Spectral Irradiance Reference Standards Used for NREL Calibrations
Energy Technology Data Exchange (ETDEWEB)
Habte, A.; Andreas, A.; Reda, I.; Campanelli, M.; Stoffel, T.
2013-05-01
Spectral irradiance produced by lamp standards such as the National Institute of Standards and Technology (NIST) FEL-type tungsten halogen lamps are used to calibrate spectroradiometers at the National Renewable Energy Laboratory. Spectroradiometers are often used to characterize spectral irradiance of solar simulators, which in turn are used to characterize photovoltaic device performance, e.g., power output and spectral response. Therefore, quantifying the calibration uncertainty of spectroradiometers is critical to understanding photovoltaic system performance. In this study, we attempted to reproduce the NIST-reported input variables, including the calibration uncertainty in spectral irradiance for a standard NIST lamp, and quantify uncertainty for measurement setup at the Optical Metrology Laboratory at the National Renewable Energy Laboratory.
Realistic Approach of the Relations of Uncertainty of Heisenberg
Directory of Open Access Journals (Sweden)
Paul E. Sterian
2013-01-01
Due to the requirements of the principle of causality in the theory of relativity, one cannot make a device for the simultaneous measurement of canonically conjugate variables in conjugate Fourier spaces. Instead of stating that a particle's position and its conjugate momentum cannot be accurately measured at the same time, we consider only the probabilities, which can be determined when working at the subatomic level, to be valid. On the other hand, based on Schwinger's action principle and using the four-dimensional form of the unitary transformation generator function of the quantum operators, the general form of the evolution equation for these operators is established. In the nonrelativistic case one obtains evolution equations of Heisenberg type, which can be particularized to derive Heisenberg's uncertainty relations. The analysis of the uncertainty relations as implicit evolution equations allows us to put into evidence the intrinsic nature of the correlation expressed by these equations, in direct relation to the measuring process. The independence of the quantization postulate from the causal evolution postulate of quantum mechanics is also discussed.
Bound entangled states violate a nonsymmetric local uncertainty relation
International Nuclear Information System (INIS)
Hofmann, Holger F.
2003-01-01
As a consequence of having a positive partial transpose, bound entangled states lack many of the properties otherwise associated with entanglement. It is therefore interesting to identify properties that distinguish bound entangled states from separable states. In this paper, it is shown that some bound entangled states violate a nonsymmetric class of local uncertainty relations [H. F. Hofmann and S. Takeuchi, Phys. Rev. A 68, 032103 (2003)]. This result indicates that the asymmetry of nonclassical correlations may be a characteristic feature of bound entanglement
Uncertainties related to numerical methods for neutron spectra unfolding
International Nuclear Information System (INIS)
Glodic, S.; Ninkovic, M.; Adarougi, N.A.
1987-10-01
One of the techniques often used for neutron detection in radiation protection is the Bonner multisphere spectrometer. Besides its advantages and universal applicability for evaluating integral parameters of neutron fields in health physics practice, the outstanding problems of the method are data analysis and the accuracy of the results. This paper briefly discusses some numerical problems related to neutron spectrum unfolding, such as the uncertainty of the response matrix as a source of error, and the possibility of real-time data reduction using spectrometers. (author)
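Unfolding means recovering a spectrum phi from measured sphere counts c = R @ phi, where R is the response matrix whose uncertainty the abstract discusses. A common family of solutions uses iterative multiplicative updates; the sketch below is a minimal MLEM-type update on a toy two-sphere, two-bin problem with made-up numbers, not a real Bonner-sphere response matrix.

```python
def mlem_unfold(response, counts, n_iter=3000):
    """Iterative multiplicative (MLEM-type) unfolding: response[i][j] maps
    spectrum bin j to sphere i; starts from a flat guess and rescales each
    bin by the ratio of measured to predicted counts, back-projected."""
    n_det = len(response)
    n_bin = len(response[0])
    phi = [1.0] * n_bin  # flat starting spectrum
    col_sum = [sum(response[i][j] for i in range(n_det)) for j in range(n_bin)]
    for _ in range(n_iter):
        predicted = [sum(response[i][j] * phi[j] for j in range(n_bin))
                     for i in range(n_det)]
        for j in range(n_bin):
            back = sum(response[i][j] * counts[i] / predicted[i]
                       for i in range(n_det))
            phi[j] *= back / col_sum[j]
    return phi

# Toy example with a known spectrum (illustrative numbers):
R = [[1.0, 0.5],
     [0.3, 1.0]]
phi_true = [2.0, 1.0]
counts = [sum(R[i][j] * phi_true[j] for j in range(2)) for i in range(2)]
phi_est = mlem_unfold(R, counts)
print(phi_est)  # converges toward [2.0, 1.0]
```

In practice the problem is underdetermined (more energy bins than spheres) and the response matrix itself is uncertain, which is exactly why the uncertainty of the unfolded spectrum is hard to quantify.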
Detecting quantum entanglement. Entanglement witnesses and uncertainty relations
International Nuclear Information System (INIS)
Guehne, O.
2004-01-01
This thesis deals with methods for the detection of entanglement. After recalling some facts and definitions concerning entanglement and separability, we investigate two methods for detecting entanglement. In the first part of this thesis we consider so-called entanglement witnesses, mainly with a view to detecting multipartite entanglement. Entanglement witnesses are observables for which a negative expectation value indicates entanglement. We first present a simple method to construct these witnesses. Since witnesses are nonlocal observables, they are not easy to measure in a real experiment. However, as we will show, one can circumvent this problem by decomposing the witness into several local observables which can be measured separately. We calculate the local decompositions for several interesting witnesses for two, three and four qubits. Local decompositions can be optimized in the number of measurement settings needed for an experimental implementation. We present a method to prove that a given local decomposition is optimal and use it to discuss the optimality of our decompositions. Then we present another method of designing witnesses which are, by construction, measurable with local measurements. Finally, we briefly report on experiments where some of the witnesses derived in this part have been used to detect three- and four-partite entanglement of polarized photons. The second part of this thesis deals with separability criteria written in terms of uncertainty relations. There are two different formulations of uncertainty relations, since the uncertainty of an observable can be measured by its variance as well as by entropic quantities. We show that both formulations are useful tools for the derivation of separability criteria for finite-dimensional systems and investigate the resulting criteria. Our results in this part also exhibit some more fundamental properties of entanglement: We show how known separability criteria for
Statistical analysis of the uncertainty related to flood hazard appraisal
Notaro, Vincenza; Freni, Gabriele
2015-12-01
The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides the evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable or piecemeal, which prevents a reliable flood hazard analysis; hazard analysis is therefore often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be attributed to the hazard evaluation, owing to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds used to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
Characterizing quantum correlations. Entanglement, uncertainty relations and exponential families
Energy Technology Data Exchange (ETDEWEB)
Niekamp, Soenke
2012-04-20
This thesis is concerned with different characterizations of multi-particle quantum correlations and with entropic uncertainty relations. The effect of statistical errors on the detection of entanglement is investigated. First, general results on the statistical significance of entanglement witnesses are obtained. Then, using an error model for experiments with polarization-entangled photons, it is demonstrated that Bell inequalities with lower violation can have higher significance. The question for the best observables to discriminate between a state and the equivalence class of another state is addressed. Two measures for the discrimination strength of an observable are defined, and optimal families of observables are constructed for several examples. A property of stabilizer bases is shown which is a natural generalization of mutual unbiasedness. For sets of several dichotomic, pairwise anticommuting observables, uncertainty relations using different entropies are constructed in a systematic way. Exponential families provide a classification of states according to their correlations. In this classification scheme, a state is considered as k-correlated if it can be written as thermal state of a k-body Hamiltonian. Witness operators for the detection of higher-order interactions are constructed, and an algorithm for the computation of the nearest k-correlated state is developed.
Characterizing quantum correlations. Entanglement, uncertainty relations and exponential families
International Nuclear Information System (INIS)
Niekamp, Soenke
2012-01-01
This thesis is concerned with different characterizations of multi-particle quantum correlations and with entropic uncertainty relations. The effect of statistical errors on the detection of entanglement is investigated. First, general results on the statistical significance of entanglement witnesses are obtained. Then, using an error model for experiments with polarization-entangled photons, it is demonstrated that Bell inequalities with lower violation can have higher significance. The question for the best observables to discriminate between a state and the equivalence class of another state is addressed. Two measures for the discrimination strength of an observable are defined, and optimal families of observables are constructed for several examples. A property of stabilizer bases is shown which is a natural generalization of mutual unbiasedness. For sets of several dichotomic, pairwise anticommuting observables, uncertainty relations using different entropies are constructed in a systematic way. Exponential families provide a classification of states according to their correlations. In this classification scheme, a state is considered as k-correlated if it can be written as thermal state of a k-body Hamiltonian. Witness operators for the detection of higher-order interactions are constructed, and an algorithm for the computation of the nearest k-correlated state is developed.
Sixth International Conference on Squeezed States and Uncertainty Relations
Han, D. (Editor); Kim, Y. S. (Editor); Solimento, S. (Editor)
2000-01-01
These proceedings contain contributions from about 200 participants to the 6th International Conference on Squeezed States and Uncertainty Relations (ICSSUR'99) held in Naples May 24-29, 1999, and organized jointly by the University of Naples "Federico II," the University of Maryland at College Park, and the Lebedev Institute, Moscow. This was the sixth of a series of very successful meetings started in 1990 at the College Park Campus of the University of Maryland. The other meetings in the series were held in Moscow (1992), Baltimore (1993), Taiyuan P.R.C. (1995) and Balatonfüred, Hungary (1997). The present one was held at the campus Monte Sant'Angelo of the University "Federico II" of Naples. The meeting sought to provide a forum for updating and reviewing a wide range of quantum optics disciplines, including device developments and applications, and related areas of quantum measurements and quantum noise. Over the years, the ICSSUR Conference evolved from a meeting on the quantum measurement sector of quantum optics to one spanning a wide range of quantum optics themes, including multifaceted aspects of the generation, measurement, and applications of nonclassical light (squeezed and Schrödinger-cat radiation fields, etc.), and encompassing several related areas, ranging from quantum measurement to quantum noise. ICSSUR'99 brought together about 250 people active in the field of quantum optics, with special emphasis on nonclassical light sources and related areas. The Conference was organized in 8 Sections: Squeezed states and uncertainty relations; Harmonic oscillators and squeeze transformations; Methods of quantum interference and correlations; Quantum measurements; Generation and characterisation of non-classical light; Quantum noise; Quantum communication and information; and Quantum-like systems.
How to: understanding SWAT model uncertainty relative to measured results
Watershed models are being relied upon to contribute to most policy-making decisions of watershed management, and the demand for an accurate accounting of complete model uncertainty is rising. Generalized likelihood uncertainty estimation (GLUE) is a widely used method for quantifying uncertainty i...
Fisher information, kinetic energy and uncertainty relation inequalities
International Nuclear Information System (INIS)
Luo Shunlong
2002-01-01
By interpolating between Fisher information and mechanical kinetic energy, we introduce a general notion of kinetic energy with respect to a parameter of Schroedinger wavefunctions from a statistical inference perspective. Kinetic energy is the sum of Fisher information and an integral of a parametrized analogue of the quantum mechanical current density related to phase. A family of integral inequalities concerning kinetic energy and moments is established, among which the Cramér-Rao inequality and the Weyl-Heisenberg inequality are special cases. In particular, the integral inequalities involving negative-order moments are relevant to the study of electron systems. Moreover, by specifying the parameter to a scale, we obtain a family of inequalities of uncertainty relation type which incorporate the position and momentum observables symmetrically in a single quantity. (author)
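For orientation, the two special cases named in this abstract have the following standard forms (with I(θ) the Fisher information of the estimation problem); these are the textbook statements, not reproductions of the paper's parametrized versions:

```latex
\operatorname{Var}(\hat{\theta}) \;\ge\; \frac{1}{I(\theta)}
\quad\text{(Cram\'er--Rao)},
\qquad
\operatorname{Var}(x)\,\operatorname{Var}(p) \;\ge\; \frac{\hbar^{2}}{4}
\quad\text{(Weyl--Heisenberg)}.
```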
Directory of Open Access Journals (Sweden)
Rodrigo Fernandes Malaquias
2017-04-01
Full Text Available This paper examines whether differences in the perception of uncertainty expressions persist over time. The empirical analysis involved two approaches: quantitative (tests comparing means and medians, regression analysis with ordinary least squares, and quantile regression) and qualitative (interviews). The principal findings are that the differences in participants' perceptions of uncertainty expressions were not statistically significant, which differs from the findings reported in previous studies. This may indicate a tendency toward the elimination of potential differences in the interpretation of accounting standards over time.
The density-salinity relation of standard seawater
Schmidt, Hannes; Seitz, Steffen; Hassel, Egon; Wolf, Henning
2018-01-01
The determination of salinity by means of electrical conductivity relies on stable salt proportions in the North Atlantic Ocean, because standard seawater, which is required for salinometer calibration, is produced from water of the North Atlantic. To verify the long-term stability of the standard seawater composition, it was proposed to perform measurements of the standard seawater density. Since the density is sensitive to all salt components, a density measurement can detect any change in the composition. A conversion of the density values to salinity can be performed by means of a density-salinity relation. To use such a relation with a target uncertainty in salinity comparable to that in salinity obtained from conductivity measurements, a density measurement with an uncertainty of 2 g m-3 is mandatory. We present a new density-salinity relation based on such accurate density measurements. The substitution measurement method used is described and density corrections for uniform isotopic and chemical compositions are reported. The comparison of densities calculated using the new relation with those calculated using the present reference equations of state TEOS-10 suggests that the density accuracy of TEOS-10 (as well as that of EOS-80) has been overestimated, as the accuracy of some of its underlying density measurements had been overestimated. The new density-salinity relation may be used to verify the stable composition of standard seawater by means of routine density measurements.
King, B
2001-11-01
The new laboratory accreditation standard, ISO/IEC 17025, reflects current thinking on good measurement practice by requiring more explicit and more demanding attention to a number of activities. These include client interactions, method validation, traceability, and measurement uncertainty. Since the publication of the standard in 1999 there has been extensive debate about its interpretation. It is the author's view that if good quality practices are already in place and if the new requirements are introduced in a manner that is fit for purpose, the additional work required to comply with the new requirements can be expected to be modest. The paper argues that the rigour required in addressing the issues should be driven by customer requirements and the factors that need to be considered in this regard are discussed. The issues addressed include the benefits, interim arrangements, specifying the analytical requirement, establishing traceability, evaluating the uncertainty and reporting the information.
Uncertainty of Water-hammer Loads for Safety Related Systems
Energy Technology Data Exchange (ETDEWEB)
Lee, Seung Chan; Yoon, Duk Joo [Korea Hydro and Nuclear Power Co., LT., Daejeon (Korea, Republic of)
2013-10-15
In this study, the basic methodology is based on the ISO GUM (Guide to the Expression of Uncertainty in Measurement). For a given gas void volume in the discharge piping, the maximum water-hammer pressure is defined by an equation, from which U_s (the superficial velocity for the specific pipe size and corresponding area) is selected as the uncertainty parameter. The main uncertainty parameter (U_s) is estimated by a measurement method and by Monte Carlo simulation, and the two methods are in good agreement on the extended uncertainty: 1.30 and 1.34, respectively, at the 95% confidence interval, and 1.95 and 1.97, respectively, at the 99% confidence interval. NRC Generic Letter 2008-01 requires nuclear power plant operators to evaluate the possibility of noncondensable gas accumulation in the Emergency Core Cooling System. In particular, gas accumulation can cause a system pressure transient in the pump discharge piping at pump start, which evolves into a gas-water, water-hammer event and force imbalances on the piping segments. In this paper, the MCS (Monte Carlo Simulation) method is introduced for estimating the uncertainty of water hammer. The aim is to evaluate the uncertainty of the water-hammer estimation results produced by KHNP CRI in 2013.
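The ISO GUM-style Monte Carlo propagation described in this abstract can be sketched as follows. The velocity statistics and the Joukowsky-type pressure model below are illustrative placeholders, not values or equations from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Hypothetical input: superficial velocity U_s with an assumed mean and
# standard uncertainty (illustrative values, not taken from the paper).
u_s = rng.normal(loc=2.0, scale=0.15, size=N)    # m/s

# Stand-in pressure model: a Joukowsky-type relation p = rho * c * U_s
# (the paper's actual equation is not reproduced here).
rho, c = 1000.0, 1200.0                          # kg/m^3, m/s
p_max = rho * c * u_s                            # Pa

mean = p_max.mean()
for coverage in (0.95, 0.99):
    lo, hi = np.quantile(p_max, [(1 - coverage) / 2, (1 + coverage) / 2])
    half_width = (hi - lo) / 2
    print(f"{coverage:.0%} expanded uncertainty: ±{half_width:.3e} Pa "
          f"(relative {half_width / mean:.2%})")
```

The expanded uncertainty at each confidence level is read directly off the empirical quantiles of the simulated output, which is the step the abstract compares against the measurement-based estimate.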
Leblanc, Thierry; Sica, Robert J.; van Gijsel, Joanna A. E.; Godin-Beekmann, Sophie; Haefele, Alexander; Trickl, Thomas; Payen, Guillaume; Liberti, Gianluigi
2016-08-01
A standardized approach for the definition, propagation, and reporting of uncertainty in the ozone differential absorption lidar data products contributing to the Network for the Detection of Atmospheric Composition Change (NDACC) database is proposed. One essential aspect of the proposed approach is the propagation in parallel of all independent uncertainty components through the data processing chain before they are combined together to form the ozone combined standard uncertainty. The independent uncertainty components contributing to the overall budget include random noise associated with signal detection, uncertainty due to saturation correction, background noise extraction, the absorption cross sections of O3, NO2, SO2, and O2, the molecular extinction cross sections, and the number densities of the air, NO2, and SO2. The expressions for the individual uncertainty components and their step-by-step propagation through the ozone differential absorption lidar (DIAL) processing chain are derived in detail. All sources of uncertainty except detection noise imply correlated terms in the vertical dimension, which requires knowledge of the covariance matrix when the lidar signal is vertically filtered. In addition, the covariance terms must be taken into account if the same detection hardware is shared by the lidar receiver channels at the absorbed and non-absorbed wavelengths. The ozone uncertainty budget is presented as much as possible in a generic form (i.e., as a function of instrument performance and wavelength) so that all NDACC ozone DIAL investigators across the network can estimate, for their own instrument and in a straightforward manner, the expected impact of each reviewed uncertainty component. In addition, two actual examples of full uncertainty budget are provided, using nighttime measurements from the tropospheric ozone DIAL located at the Jet Propulsion Laboratory (JPL) Table Mountain Facility, California, and nighttime measurements from the JPL
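The "propagate in parallel, then combine" step reduces, for independent (uncorrelated) components, to summation in quadrature as in the GUM. A minimal sketch with placeholder component magnitudes follows; the correlated terms the abstract emphasizes would additionally require covariance contributions not shown here:

```python
import math

# Illustrative per-altitude-bin uncertainty components for an ozone DIAL
# retrieval, each already propagated to ozone number-density units
# (magnitudes are placeholders, not NDACC data).
components = {
    "detection_noise":       2.0e10,  # molecules cm^-3
    "saturation_correction": 0.8e10,
    "background_extraction": 0.5e10,
    "o3_cross_section":      1.2e10,
    "rayleigh_extinction":   0.6e10,
}

# Independent components combine in quadrature (GUM, uncorrelated case).
combined = math.sqrt(sum(u**2 for u in components.values()))
print(f"combined standard uncertainty: {combined:.3e} molecules cm^-3")
```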
Impact of model uncertainty on soil quality standards for cadmium in rice paddy fields
Energy Technology Data Exchange (ETDEWEB)
Roemkens, P.F.A.M., E-mail: paul.romkens@wur.nl [Soil Science Center, Alterra, WageningenUR. P.O. Box 47, 6700AA Wageningen (Netherlands); Brus, D.J. [Soil Science Center, Alterra, WageningenUR. P.O. Box 47, 6700AA Wageningen (Netherlands); Guo, H.Y.; Chu, C.L.; Chiang, C.M. [Taiwan Agricultural Research Institute (TARI), Wufong, Taiwan (China); Koopmans, G.F. [Soil Science Center, Alterra, WageningenUR. P.O. Box 47, 6700AA Wageningen (Netherlands); Department of Soil Quality, Wageningen University, WageningenUR. P.O. Box 47, 6700AA, Wageningen (Netherlands)
2011-08-01
At present, soil quality standards used for agriculture do not consider the influence of pH and CEC on the uptake of pollutants by crops. A database with 750 selected paired samples of cadmium (Cd) in soil and paddy rice was used to calibrate soil-to-plant transfer models using the soil metal content, pH, and CEC or soil Cd and Zn extracted by 0.01 M CaCl₂ as explanatory variables. The models were validated against a set of 2300 data points not used in the calibration. These models were then used inversely to derive soil quality standards for Japonica and Indica rice cultivars based on the food quality standards for rice. To account for model uncertainty, strict soil quality standards were derived considering a maximum probability that rice exceeds the food quality standard equal to 10 or 5%. Model derived soil standards based on Aqua Regia ranged from less than 0.3 mg kg⁻¹ for Indica at pH 4.5 to more than 6 mg kg⁻¹ for Japonica-type cultivars in clay soils at pH 7. Based on the CaCl₂ extract, standards ranged from 0.03 mg kg⁻¹ Cd for Indica cultivars to 0.1 mg kg⁻¹ Cd for Japonica cultivars. For both Japonica and Indica-type cultivars, the soil quality standards must be reduced by a factor of 2 to 3 to obtain the strict standards. The strong impact of pH and CEC on soil quality standards implies that it is essential to correct for soil type when deriving national or local standards. Validation on the remaining 2300 samples indicated that both types of models were able to accurately predict (> 92%) whether rice grown on a specific soil will meet the food quality standard used in Taiwan. - Research highlights: → Cadmium uptake by Japonica and Indica rice varieties depends on soil pH and CEC. → Food safety based soil standards range from 0.3 (Indica) to 6 mg kg⁻¹ (Japonica). → Model uncertainty leads to strict soil standards of less than 0.1 mg kg⁻¹ for Indica. → Soil pH and CEC should be considered to obtain meaningful standards for agriculture.
Impact of model uncertainty on soil quality standards for cadmium in rice paddy fields
International Nuclear Information System (INIS)
Roemkens, P.F.A.M.; Brus, D.J.; Guo, H.Y.; Chu, C.L.; Chiang, C.M.; Koopmans, G.F.
2011-01-01
At present, soil quality standards used for agriculture do not consider the influence of pH and CEC on the uptake of pollutants by crops. A database with 750 selected paired samples of cadmium (Cd) in soil and paddy rice was used to calibrate soil-to-plant transfer models using the soil metal content, pH, and CEC or soil Cd and Zn extracted by 0.01 M CaCl₂ as explanatory variables. The models were validated against a set of 2300 data points not used in the calibration. These models were then used inversely to derive soil quality standards for Japonica and Indica rice cultivars based on the food quality standards for rice. To account for model uncertainty, strict soil quality standards were derived considering a maximum probability that rice exceeds the food quality standard equal to 10 or 5%. Model derived soil standards based on Aqua Regia ranged from less than 0.3 mg kg⁻¹ for Indica at pH 4.5 to more than 6 mg kg⁻¹ for Japonica-type cultivars in clay soils at pH 7. Based on the CaCl₂ extract, standards ranged from 0.03 mg kg⁻¹ Cd for Indica cultivars to 0.1 mg kg⁻¹ Cd for Japonica cultivars. For both Japonica and Indica-type cultivars, the soil quality standards must be reduced by a factor of 2 to 3 to obtain the strict standards. The strong impact of pH and CEC on soil quality standards implies that it is essential to correct for soil type when deriving national or local standards. Validation on the remaining 2300 samples indicated that both types of models were able to accurately predict (> 92%) whether rice grown on a specific soil will meet the food quality standard used in Taiwan. - Research highlights: → Cadmium uptake by Japonica and Indica rice varieties depends on soil pH and CEC. → Food safety based soil standards range from 0.3 (Indica) to 6 mg kg⁻¹ (Japonica). → Model uncertainty leads to strict soil standards of less than 0.1 mg kg⁻¹ for Indica. → Soil pH and CEC should be considered to obtain meaningful standards for agriculture.
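The inverse use of a transfer model described in this record can be sketched as follows. Given a hypothetical calibrated log-linear model with residual scatter, the strict soil standard is the soil Cd level at which the probability of rice exceeding the food limit equals the chosen 5%. All coefficients and limits below are illustrative, not the study's fitted values:

```python
import math
from statistics import NormalDist

# Hypothetical calibrated soil-to-plant transfer model (illustrative only):
#   log10(Cd_rice) = a + b*log10(Cd_soil) + c*pH + eps,  eps ~ N(0, sigma)
a, b, c, sigma = 1.2, 0.9, -0.35, 0.25
food_limit = 0.4      # mg/kg Cd in rice (assumed food quality standard)

def soil_standard(ph, max_exceed_prob=0.05):
    """Largest soil Cd (mg/kg) with P(rice > food_limit) <= max_exceed_prob."""
    z = NormalDist().inv_cdf(1 - max_exceed_prob)
    # Require: a + b*log10(Cd_soil) + c*ph + z*sigma = log10(food_limit)
    log_soil = (math.log10(food_limit) - a - c * ph - z * sigma) / b
    return 10 ** log_soil

for ph in (4.5, 5.5, 6.5, 7.0):
    print(f"pH {ph}: strict soil standard {soil_standard(ph):.3f} mg/kg")
```

With a negative pH coefficient the derived standard rises with pH, reproducing the qualitative pattern reported in the abstract (stricter standards in acidic soils); raising `max_exceed_prob` to 0.10 relaxes the standard, which is the 10% vs 5% choice the study describes.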
Reconsideration of the Uncertainty Relations and Quantum Measurements
Directory of Open Access Journals (Sweden)
Dumitru S.
2008-04-01
Full Text Available Discussions on uncertainty relations (UR) and quantum measurements (QMS) have persisted to the present day in publications on quantum mechanics (QM). They originate mainly from the conventional interpretation of UR (CIUR). Most of the QM literature underestimates the fact that, over the years, many deficiencies in CIUR have been pointed out. As a rule, these deficiencies were noted in isolation and discussed as narrow, non-essential questions. Here we investigate the deficiencies collected into a comprehensive ensemble, and subsequently present a reconsideration of the major problems concerning UR and QMS. We show that all the basic presumptions of CIUR are undermined by insurmountable deficiencies, which entail the failure of CIUR and its necessary abandonment. Therefore the UR must be stripped of their status as crucial pieces of physics. The original versions of UR thus appear as either (i) thought-experimental fictions or (ii) simple QM formulae, and no version of them has any connection with QMS. QMS must then be viewed as a subject additional to the usual questions of QM. For a theoretical description of QMS we propose an information-transmission model in which quantum observables are treated as random variables. Our approach leads to natural solutions and simplifications for many problems regarding UR and QMS.
Entanglement witness via quantum-memory-assisted entropic uncertainty relation
Shi, Jiadong; Ding, Zhiyong; Wu, Tao; He, Juan; Yu, Lizhi; Sun, Wenyang; Wang, Dong; Ye, Liu
2017-12-01
By virtue of the quantum-memory-assisted entropic uncertainty relation (EUR), we analyze entanglement witnessing via the efficiencies of the estimates proposed by Berta (2010 Nat. Phys. 6 659) and Pati (2012 Phys. Rev. A 86 042105). The results show that, without a structured reservoir, the entanglement regions witnessed by these EUR estimates are determined only by the chosen estimation setup and have no correlation with the explicit form of the initial state. On the other hand, with structured reservoirs, the time regions during which entanglement can be witnessed, and the corresponding entanglement regions, depend closely on the choice of the estimation setup, the initial state, and the state purity p. Concretely, for a pure state with p = 1 the entanglement can be witnessed by both estimates, while for mixed states with p = 0.78 it can only be witnessed using the Pati estimate. Furthermore, we find that the time regions incorporating the Pati estimate become discontinuous for the initial state |φ′⟩ = (|01⟩ + |10⟩)/√2, while the corresponding entanglement regions remain the same; however, the entanglement can only be witnessed once when utilizing the Berta estimate.
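For orientation, the Berta estimate referenced in this abstract rests on the memory-assisted entropic uncertainty relation, whose standard form is (with Q and R the two measurements on system A, B the quantum memory, and c the maximal overlap of the two measurement bases):

```latex
S(Q|B) + S(R|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad
c = \max_{i,j}\,\bigl|\langle \psi_i | \phi_j \rangle\bigr|^2 .
```

Since the conditional entropy S(A|B) is negative only for entangled states, measured entropies satisfying S(Q|B) + S(R|B) < log₂(1/c) witness entanglement between A and B.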
Energy levels of one-dimensional systems satisfying the minimal length uncertainty relation
Energy Technology Data Exchange (ETDEWEB)
Bernardo, Reginald Christian S., E-mail: rcbernardo@nip.upd.edu.ph; Esguerra, Jose Perico H., E-mail: jesguerra@nip.upd.edu.ph
2016-10-15
The standard approach to calculating the energy levels for quantum systems satisfying the minimal length uncertainty relation is to solve an eigenvalue problem involving a fourth- or higher-order differential equation in quasiposition space. It is shown that the problem can be reformulated so that the energy levels of these systems can be obtained by solving only a second-order quasiposition eigenvalue equation. Through this formulation the energy levels are calculated for the following potentials: particle in a box, harmonic oscillator, Pöschl–Teller well, Gaussian well, and double-Gaussian well. For the particle in a box, the second-order quasiposition eigenvalue equation is a second-order differential equation with constant coefficients. For the harmonic oscillator, Pöschl–Teller well, Gaussian well, and double-Gaussian well, a method that involves using Wronskians has been used to solve the second-order quasiposition eigenvalue equation. It is observed for all of these quantum systems that the introduction of a nonzero minimal length uncertainty induces a positive shift in the energy levels. It is shown that the calculation of energy levels in systems satisfying the minimal length uncertainty relation is not limited to a small number of problems like particle in a box and the harmonic oscillator but can be extended to a wider class of problems involving potentials such as the Pöschl–Teller and Gaussian wells.
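The minimal length uncertainty relation referred to in this abstract is commonly modeled by a deformed commutator (the Kempf-Mangano-Mann formulation, stated here for orientation), which implies a smallest attainable position uncertainty:

```latex
[\hat{x},\hat{p}] = i\hbar\left(1+\beta \hat{p}^{2}\right)
\;\;\Longrightarrow\;\;
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1+\beta(\Delta p)^{2}\right),
\qquad
(\Delta x)_{\min} = \hbar\sqrt{\beta}.
```

The positive energy shift reported for all the potentials studied is consistent with β > 0 in this relation.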
Concept of uncertainty in relation to the foresight research
Directory of Open Access Journals (Sweden)
Magruk Andrzej
2017-03-01
Full Text Available Uncertainty is one of the most important features of many areas of social and economic life, especially in a forward-looking context. On the one hand, the degree of uncertainty is associated with the objective randomness of the phenomenon; on the other, with the subjective perspective of a person. Future-oriented perception of human activities is burdened by the incomplete specification of the analysed phenomena, their volatility, and their lack of continuity. One cannot determine, with complete certainty, the further course of these phenomena. According to the author of this article, in order to significantly reduce uncertainty while making strategic decisions in a complex environment, we should focus our actions on the future through the systemic research of foresight. This article attempts to answer the following research questions: (1) What is the relationship between foresight studies, in the systems perspective, and studies of uncertainty? (2) What classes of foresight methods enable the study of uncertainty in the process of systemic inquiry into the future? The study employs deductive reasoning based on the results of analysis and critique of the literature.
International Nuclear Information System (INIS)
Sakurai, Hiromu; Ehara, Kensei
2011-01-01
We evaluated uncertainties in current measurement by the electrometer at the current level on the order of femtoamperes. The electrometer was the one used in the Faraday-cup aerosol electrometer of the Japanese national standard for number concentration of aerosol nanoparticles in which the accuracy of the absolute current is not required, but the net current which is obtained as the difference in currents under two different conditions must be measured accurately. The evaluation was done experimentally at the current level of 20 fA, which was much smaller than the intervals between the electrometer's calibration points at +1, +0.5, −0.5 and −1 pA. The slope of the response curve for the relationship between the 'true' and measured current, which is crucial in the above measurement, was evaluated locally at many different points within the ±1 pA range for deviation from the slope determined by a linear regression of the calibration data. The sum of the current induced by a flow of charged particles and a bias current from a current-source instrument was measured by the electrometer while the particle current was toggled on and off. The net particle current was obtained as the difference in the measured currents between the toggling, while at the same time the current was estimated from the particle concentration read by a condensation particle counter. The local slope was calculated as the ratio of the measured to estimated currents at each bias current setting. The standard deviation of the local slope values observed at varied bias currents was about 0.003, which was calculated by analysis of variance (ANOVA) for the treatment of the bias current. The combined standard uncertainty of the slope, which was calculated from the uncertainty of the slope by linear regression and the variability of the slope, was calculated to be about 0.004
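The local-slope check described above can be sketched as follows: at each bias setting the particle current is toggled on and off, the bias cancels in the on/off difference, and the ratio of measured to estimated net current gives a local slope whose spread across settings is then examined. All currents and noise levels below are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical linearity check of an electrometer at the femtoampere level:
# at each bias-current setting, toggle a ~20 fA particle current on and off
# and compare the measured difference with an independent estimate from a
# condensation particle counter.
bias_settings = np.linspace(-900, 900, 10)   # fA, within the ±1 pA range
true_particle_current = 20.0                 # fA
noise = 0.05                                 # fA per reading (assumed)

local_slopes = []
for bias in bias_settings:
    estimated = true_particle_current             # from the particle counter
    meas_on = bias + true_particle_current + rng.normal(0, noise)
    meas_off = bias + rng.normal(0, noise)
    # The bias current cancels in the on/off difference.
    local_slopes.append((meas_on - meas_off) / estimated)

local_slopes = np.array(local_slopes)
print(f"local slope SD across bias settings: {local_slopes.std(ddof=1):.4f}")
```

The standard deviation of the local slopes across bias settings is the quantity the abstract reports (about 0.003); here it is set by the assumed per-reading noise, not by the actual instrument data.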
Review of studies related to uncertainty in risk analysis
International Nuclear Information System (INIS)
Rish, W.R.; Marnicio, R.J.
1988-08-01
The Environmental Protection Agency's Office of Radiation Programs (ORP) is responsible for regulating on a national level the risks associated with technological sources of ionizing radiation in the environment. A critical activity of the ORP is analyzing and evaluating risk. The ORP believes that the analysis of uncertainty should be an integral part of any risk assessment; therefore, the ORP has initiated a project to develop a framework for the treatment of uncertainty in risk analysis. Summaries of recent studies in five areas are presented.
Mapping of uncertainty relations between continuous and discrete time.
Chiuchiù, Davide; Pigolotti, Simone
2018-03-01
Lower bounds on fluctuations of thermodynamic currents depend on the nature of time, discrete or continuous. To understand the physical reason, we compare current fluctuations in discrete-time Markov chains and continuous-time master equations. We prove that current fluctuations in the master equations are always larger, due to the random timing of transitions. This comparison leads to a mapping of the moments of a current between discrete and continuous time. We exploit this mapping to obtain uncertainty bounds. Our results reduce the quest for uncertainty bounds in discrete and continuous time to a single problem.
Review of studies related to uncertainty in risk analysis
Energy Technology Data Exchange (ETDEWEB)
Rish, W.R.; Marnicio, R.J.
1988-08-01
The Environmental Protection Agency's Office of Radiation Programs (ORP) is responsible for regulating on a national level the risks associated with technological sources of ionizing radiation in the environment. A critical activity of the ORP is analyzing and evaluating risk. The ORP believes that the analysis of uncertainty should be an integral part of any risk assessment; therefore, the ORP has initiated a project to develop a framework for the treatment of uncertainty in risk analysis. Summaries of recent studies in five areas are presented.
Best Estimate plus Uncertainty (BEPU) Analyses in the IAEA Safety Standards
International Nuclear Information System (INIS)
Dusic, Milorad; )
2013-01-01
The Safety Standards Series establishes an essential basis for safety and represents the broadest international consensus. Safety Standards Series publications are categorized into: Safety Fundamentals (presenting the overall objectives, concepts and principles of protection and safety; these are the policy documents of the safety standards), Safety Requirements (establishing requirements that must be met to ensure the protection and safety of people and the environment, both now and in the future), and Safety Guides (providing guidance, in the form of more detailed actions, conditions or procedures, that can be used to comply with the Requirements). The incorporation of more detailed requirements, in accordance with national practice, may still be necessary. There should be only one set of international safety standards. Each safety standard is reviewed by the relevant committee or by the Commission every five years. Best Estimate plus Uncertainty (BEPU) analyses are addressed in the following IAEA Safety Standards: - Safety Requirements SSR 2/1: Safety of NPPs, Design (revision of NS-R-1); - General Safety Requirements GSR Part 4: Safety Assessment for Facilities and Activities; - Safety Guide SSG-2: Deterministic Safety Analysis for Nuclear Power Plants. NUSSC suggested that new safety guides should be accompanied by documents such as TECDOCs or Safety Reports describing their recommendations in detail where appropriate. A special review is currently underway to identify needs for revision in the light of the Fukushima accident. The revision will concern, first, the Safety Requirements, and then the selected Safety Guides.
International Nuclear Information System (INIS)
Silva, Cosme Norival Mello da; Rosado, Paulo Henrique Goncalves
2011-01-01
The National Metrology Laboratory of Ionizing Radiation (LNMRI) is the laboratory designated by INMETRO in the field of ionizing radiation metrology and is a Secondary Standard Dosimetry Laboratory (SSDL). One of its mandates is to maintain and disseminate the LNMRI standard of absorbed dose to water, used as the national standard for dosimetry in radiotherapy. For this standard to remain metrologically acceptable, its accuracy and uncertainties must be assessed over time. The objective of this study is to analyze the uncertainties involved in determining the absorbed dose rate in water and the standard uncertainty of the absorbed-dose-to-water calibration of a clinical dosimeter. The largest sources of uncertainty in determining the absorbed dose rate in water are due to: the calibration coefficient from the calibration certificate supplied by the BIPM, electrometer calibration, chamber stability over time, variation of pressure and humidity, and strong dependence on and non-uniformity of the field. The expanded uncertainty is 0.94% for k = 2. For the standard uncertainty of the absorbed-dose-to-water calibration of a clinical dosimeter, the major source of uncertainty is the absorbed dose rate in water (0.94%). The expanded uncertainty of calibrating a clinical dosimeter is 1.2% for k = 2. (author)
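The quoted expanded uncertainty (0.94% for k = 2) is the coverage-factor multiple of the quadrature sum of the individual budget entries. A minimal sketch, assuming independent relative components; the component magnitudes below are invented for illustration, since the abstract does not list the actual LNMRI budget values:

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-of-squares of independent relative uncertainty components (%)."""
    return math.sqrt(sum(u**2 for u in components))

def expanded_uncertainty(u_c, k=2):
    """Expanded uncertainty U = k * u_c (k = 2 gives ~95 % coverage)."""
    return k * u_c

# Illustrative component values (%): BIPM calibration coefficient,
# electrometer calibration, chamber stability, pressure/humidity, field
components = [0.29, 0.20, 0.15, 0.10, 0.25]
u_c = combined_standard_uncertainty(components)
U = expanded_uncertainty(u_c, k=2)
```

With these invented numbers U comes out close to 0.94%, but only the structure of the calculation, not the values, is taken from the abstract.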
Psychological Entropy: A Framework for Understanding Uncertainty-Related Anxiety
Hirsh, Jacob B.; Mar, Raymond A.; Peterson, Jordan B.
2012-01-01
Entropy, a concept derived from thermodynamics and information theory, describes the amount of uncertainty and disorder within a system. Self-organizing systems engage in a continual dialogue with the environment and must adapt themselves to changing circumstances to keep internal entropy at a manageable level. We propose the entropy model of…
Leblanc, Thierry; Sica, Robert J.; van Gijsel, Joanna A. E.; Haefele, Alexander; Payen, Guillaume; Liberti, Gianluigi
2016-08-01
A standardized approach for the definition, propagation, and reporting of uncertainty in the temperature lidar data products contributing to the Network for the Detection of Atmospheric Composition Change (NDACC) database is proposed. One important aspect of the proposed approach is the ability to propagate all independent uncertainty components in parallel through the data processing chain. The individual uncertainty components are then combined together at the very last stage of processing to form the temperature combined standard uncertainty. The identified uncertainty sources comprise major components such as signal detection, saturation correction, background noise extraction, temperature tie-on at the top of the profile, and absorption by ozone if working in the visible spectrum, as well as other components such as molecular extinction, the acceleration of gravity, and the molecular mass of air, whose magnitudes depend on the instrument, data processing algorithm, and altitude range of interest. The expression of the individual uncertainty components and their step-by-step propagation through the temperature data processing chain are thoroughly estimated, taking into account the effect of vertical filtering and the merging of multiple channels. All sources of uncertainty except detection noise imply correlated terms in the vertical dimension, which means that covariance terms must be taken into account when vertical filtering is applied and when temperature is integrated from the top of the profile. Quantitatively, the uncertainty budget is presented in a generic form (i.e., as a function of instrument performance and wavelength), so that any NDACC temperature lidar investigator can easily estimate the expected impact of individual uncertainty components in the case of their own instrument. Using this standardized approach, an example of uncertainty budget is provided for the Jet Propulsion Laboratory (JPL) lidar at Mauna Loa Observatory, Hawai'i, which is
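The key design point of the proposed approach (propagating each independent component separately and combining only at the final stage) can be sketched as follows. The per-altitude component values are hypothetical, and the sketch treats the components as independent at the combination step, ignoring the covariance terms that vertical filtering and top-down integration introduce in the real processing chain:

```python
import math

def combine_at_final_stage(component_profiles):
    """Each uncertainty component is carried through the processing chain
    as its own profile; only at the last step are they combined in
    quadrature, point by point, into the combined standard uncertainty."""
    n = len(component_profiles[0])
    return [math.sqrt(sum(p[i]**2 for p in component_profiles))
            for i in range(n)]

# Hypothetical per-altitude uncertainty components (K) for three levels:
detection  = [0.10, 0.20, 0.50]   # detection noise
background = [0.05, 0.10, 0.30]   # background noise extraction
tie_on     = [0.02, 0.05, 0.40]   # temperature tie-on at profile top

combined = combine_at_final_stage([detection, background, tie_on])
```

Keeping the components separate until the end is what lets an investigator see which source dominates at each altitude before they are merged.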
The Beam Dynamics and Beam Related Uncertainties in Fermilab Muon $g-2$ Experiment
Energy Technology Data Exchange (ETDEWEB)
Wu, Wanwei [Mississippi U.
2018-05-01
The anomaly of the muon magnetic moment, $a_{\\mu}\\equiv (g-2)/2$, has played an important role in constraining physics beyond the Standard Model for many years. Currently, the Standard Model prediction for $a_{\\mu}$ is accurate to 0.42 parts per million (ppm). The most recent muon $g-2$ experiment was done at Brookhaven National Laboratory (BNL) and determined $a_{\\mu}$ to 0.54 ppm, with a central value that differs from the Standard Model prediction by 3.3-3.6 standard deviations and provides a strong hint of new physics. The Fermilab Muon $g-2$ Experiment has a goal to measure $a_{\\mu}$ to unprecedented precision: 0.14 ppm, which could provide an unambiguous answer to the question whether there are new particles and forces that exist in nature. To achieve this goal, several items have been identified to lower the systematic uncertainties. In this work, we focus on the beam dynamics and beam associated uncertainties, which are important and must be better understood. We will discuss the electrostatic quadrupole system, particularly the hardware-related quad plate alignment and the quad extension and readout system. We will review the beam dynamics in the muon storage ring, present discussions on the beam related systematic errors, simulate the 3D electric fields of the electrostatic quadrupoles and examine the beam resonances. We will use a fast rotation analysis to study the muon radial momentum distribution, which provides the key input for evaluating the electric field correction to the measured $a_{\\mu}$.
Exploring entropic uncertainty relation in the Heisenberg XX model with inhomogeneous magnetic field
Huang, Ai-Jun; Wang, Dong; Wang, Jia-Ming; Shi, Jia-Dong; Sun, Wen-Yang; Ye, Liu
2017-08-01
In this work, we investigate the quantum-memory-assisted entropic uncertainty relation in a two-qubit Heisenberg XX model with an inhomogeneous magnetic field. It has been found that a larger coupling strength J between the two spin-chain qubits can effectively reduce the entropic uncertainty. Besides, we observe how the inhomogeneous field, characterized by the parameter b, influences the uncertainty. Intriguingly, the entropic uncertainty can shrink to zero when the coupling coefficients are relatively large, while it only reduces to 1 with the increase of the homogeneous magnetic field. Additionally, we examine the purity of the state and Bell non-locality and find that the entropic uncertainty is anticorrelated with both the purity and the Bell non-locality of the evolved state.
Position-momentum uncertainty relations in the presence of quantum memory
DEFF Research Database (Denmark)
Furrer, Fabian; Berta, Mario; Tomamichel, Marco
2014-01-01
A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear oper....... As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states....
International Nuclear Information System (INIS)
Fonseca Coelho, B.C. da.
1987-01-01
Equipment used to measure ionizing radiation in medicine requires appropriate technical qualification to fulfil its purpose and regular calibration to assure the correct evaluation of the associated quantities. By legal requirement, the annual calibration of users' dosemeters is to be done in a Secondary Standard Dosimetry Laboratory (SSDL), and the SSDL's standard dosemeters are referred to a Primary Standard Dosimetry Laboratory (PSDL), establishing a rigorous metrological network. The SSDL needs to maintain a regular quality control program for the short- and long-term stability of its standard dosemeters. The purpose of this work was to determine the uncertainties associated with the technical procedures of X-ray calibration at the SSDL/IRD. To evaluate the influence of the nine main parameters that can give rise to uncertainties, specific procedures and methods were established according to international requirements and recommendations. The methods are based on comparing the behaviour of the users' dosemeters with a standard dosemeter under the various measuring conditions set up for the secondary standard used as a reference. The total uncertainty obtained was 1.81%, using a conservative procedure to protect users and patients. When the calibration factor and its uncertainty need to be transferred, the procedure used was to determine the uncertainty under the worst possible operating conditions of the equipment, to obtain an overestimated value. This represents an excellent result for an SSDL of the IAEA network. (author) [pt
Tritium source-related systematic uncertainties of the KATRIN experiment
Energy Technology Data Exchange (ETDEWEB)
Seitz-Moskaliuk, Hendrik [Karlsruher Institut fuer Technologie, Institut fuer experimentelle Kernphysik, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany); Collaboration: KATRIN-Collaboration
2016-07-01
KATRIN will perform a direct, kinematics-based measurement of the neutrino mass with a sensitivity of 200 meV (90 % C. L.) reached after 3 years of measurement time. The neutrino mass is obtained by determining the shape of the spectrum of tritium β decay electrons close to the endpoint of 18.6 keV with a spectrometer of MAC-E filter type. To achieve the planned sensitivity, the systematic measurement uncertainties have to be carefully controlled and evaluated. Main sources of systematics are the MAC-E filter on the one hand and the source and transport section (STS) on the other hand. Most of the operational parameters of KATRIN have to be stable at or even below the per mille level and have to meet further strict requirements. This talk reviews the KATRIN systematics with a special focus on the STS. Early commissioning measurements to determine the main systematics are introduced.
SU-G-BRB-14: Uncertainty of Radiochromic Film Based Relative Dose Measurements
Energy Technology Data Exchange (ETDEWEB)
Devic, S; Tomic, N; DeBlois, F; Seuntjens, J [McGill University, Montreal, QC (Canada); Lewis, D [RCF Consulting, LLC, Monroe, CT (United States); Aldelaijan, S [King Faisal Specialist Hospital & Research Center, Riyadh (Saudi Arabia)
2016-06-15
Purpose: Due to its inherently non-linear dose response, measurement of a relative dose distribution with radiochromic film requires measurement of absolute dose using a calibration curve, following a previously established reference dosimetry protocol. On the other hand, a functional form that converts the inherently non-linear dose response curve of the radiochromic film dosimetry system into a linear one has been proposed recently [Devic et al., Med. Phys. 39, 4850-4857 (2012)]. However, the question remains what the uncertainty of such a measured relative dose would be. Methods: If the relative dose distribution is determined through the reference dosimetry system (conversion of the response into absolute dose using the calibration curve), the total uncertainty of the relative dose is calculated by summing in quadrature the total uncertainties of the doses measured at a given point and at the reference point. On the other hand, if the relative dose is determined using the linearization method, the new response variable is calculated as ζ = a(netOD)^n/ln(netOD). In this case, the total uncertainty in relative dose is calculated by summing in quadrature the uncertainties of the new response function (σζ) at a given point and at the reference point. Results: Except at very low doses, where the measurement uncertainty dominates, the total relative dose uncertainty is less than 1% for the linear response method, compared to an almost 2% uncertainty level for the reference dosimetry method. The result is not surprising, bearing in mind that the total uncertainty of the reference dose method is dominated by the fitting uncertainty, which is mitigated in the linearization method. Conclusion: Linearization of the radiochromic film dose response provides a convenient and more precise method for relative dose measurements, as it does not require reference dosimetry and the creation of a calibration curve. However, the linearity of the newly introduced function must be verified.
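The reference-dosimetry route compared in this abstract can be sketched numerically. The linearized response ζ = a(netOD)^n/ln(netOD) is taken from the abstract, but the parameter values and the 1.4% point/reference uncertainties below are invented for illustration; with two equal 1.4% contributions, the quadrature sum lands near the ~2% level quoted for the reference method.

```python
import math

def zeta(net_od, a, n):
    """Linearized response zeta = a * netOD**n / ln(netOD) proposed by
    Devic et al. (2012); a and n are fit parameters of the film batch
    (values used below are hypothetical)."""
    return a * net_od**n / math.log(net_od)

def relative_dose_uncertainty(u_point, u_ref):
    """Reference-dosimetry route: the total uncertainties of the doses at
    the point of interest and at the reference point add in quadrature."""
    return math.sqrt(u_point**2 + u_ref**2)

# Two equal 1.4 % absolute-dose uncertainties combine to ~2 % relative
u_rel = relative_dose_uncertainty(0.014, 0.014)
```

The same quadrature structure applies to the linearization route, with σζ at the two points replacing the absolute-dose uncertainties.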
DEFF Research Database (Denmark)
Kolarik, Jakub; Olesen, Bjarne W.
2015-01-01
European Standard EN 15 251 in its current version does not provide any guidance on how to handle uncertainty of long term measurements of indoor environmental parameters used for classification of buildings. The objective of the study was to analyse the uncertainty for field measurements...... measurements of operative temperature at two measuring points (south/south-west and north/northeast orientation). Results of the present study suggest that measurement uncertainty needs to be considered during assessment of thermal environment in existing buildings. When expanded standard uncertainty was taken...... into account in categorization of thermal environment according to EN 15251, the difference in prevalence of exceeded category limits were up to 17.3%, 8.3% and 2% of occupied hours for category I, II and III respectively....
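The effect reported here, that accounting for expanded measurement uncertainty changes the prevalence of exceeded category limits, can be illustrated with a toy calculation. The hourly temperatures, the category limit, and the 0.5 K expanded uncertainty below are invented, not EN 15251 values:

```python
def fraction_exceeding(temps, limit, u_expanded=0.0):
    """Fraction of occupied hours whose operative temperature exceeds a
    category limit; widening the limit by the expanded uncertainty
    discounts hours that may lie within measurement error."""
    n_exceed = sum(1 for t in temps if t > limit + u_expanded)
    return n_exceed / len(temps)

# Hypothetical hourly operative temperatures (deg C) and an upper
# category limit of 25.5 deg C with 0.5 K expanded uncertainty
temps = [24.8, 25.2, 25.6, 25.9, 26.3, 25.4, 25.7, 24.9, 26.1, 25.3]
without_u = fraction_exceeding(temps, 25.5)
with_u    = fraction_exceeding(temps, 25.5, u_expanded=0.5)
```

The gap between the two fractions is the kind of difference (up to 17.3% of occupied hours for category I in the study) that ignoring measurement uncertainty can introduce into building classification.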
The uncertainties in estimating measurement uncertainties
International Nuclear Information System (INIS)
Clark, J.P.; Shull, A.H.
1994-01-01
All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper discusses the concepts of measurement, measurement errors (accuracy or bias, and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample having a matrix that differs significantly from that of the calibration standards). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement itself. The process of calculating the uncertainty of a measurement has its own uncertainties that must be handled correctly. Examples of chemistry laboratory measurements are reviewed in this report and recommendations are made for improving measurement uncertainties.
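The paper's central point, that a result is an estimate of a true value plus uncertainty and is no better than the standard it is compared to, implies combining the standard's certificate uncertainty with the laboratory's own repeatability. A minimal sketch with invented numbers:

```python
import math

def total_uncertainty(u_random, u_systematic):
    """ISO-style combination: random (precision) and systematic (bias)
    standard uncertainties are added in quadrature."""
    return math.sqrt(u_random**2 + u_systematic**2)

# A measurement is a comparison to a standard, so the result carries the
# standard's uncertainty as well (illustrative relative values, %):
u_standard = 0.3   # from the standard's certificate
u_repeat   = 0.4   # from replicate measurements of the unknown
u_total = total_uncertainty(u_repeat, u_standard)
```

Even a perfectly repeatable measurement cannot have a total uncertainty below the certificate value of the standard it was compared against.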
Uncertainty in estimating and mitigating industrial related GHG emissions
International Nuclear Information System (INIS)
El-Fadel, M.; Zeinati, M.; Ghaddar, N.; Mezher, T.
2001-01-01
Global climate change has been one of the challenging environmental concerns facing policy makers in the past decade. The characterization of the wide range of greenhouse gas emissions sources and sinks as well as their behavior in the atmosphere remains an on-going activity in many countries. Lebanon, being a signatory to the Framework Convention on Climate Change, is required to submit and regularly update a national inventory of greenhouse gas emissions sources and removals. Accordingly, an inventory of greenhouse gases from various sectors was conducted following the guidelines set by the United Nations Intergovernmental Panel on Climate Change (IPCC). The inventory indicated that the industrial sector contributes about 29% to the total greenhouse gas emissions divided between industrial processes and energy requirements at 12 and 17%, respectively. This paper describes major mitigation scenarios to reduce emissions from this sector based on associated technical, economic, environmental, and social characteristics. Economic ranking of these scenarios was conducted and uncertainty in emission factors used in the estimation process was emphasized. For this purpose, theoretical and experimental emission factors were used as alternatives to default factors recommended by the IPCC and the significance of resulting deviations in emission estimation is presented. (author)
Uncertainty Analysis of RBMK-Related Experimental Data
International Nuclear Information System (INIS)
Urbonas, Rolandas; Kaliatka, Algirdas; Liaukonis, Mindaugas
2002-01-01
An attempt was made to validate the state-of-the-art thermal hydraulic code ATHLET (GRS, Germany) on the basis of the E-108 test facility. Originally this code was developed and validated for reactor types other than the RBMK. Since state-of-the-art thermal hydraulic codes are widely used for simulation of RBMK reactors, further code implementation and validation are required. The phenomena associated with channel-type flow instabilities and CHF were found to be an important step in the frame of the overall effort of state-of-the-art code validation and application for RBMK reactors. In the paper, a one-channel analysis is presented; thus, the oscillatory behaviour of the system was not detected. The results show a dependence on the nodalization used in the heated channels, the initial and boundary conditions, and the code models selected. It is shown that the code is able to predict a sudden heat structure temperature excursion when the critical heat flux is approached. The uncertainty and sensitivity methodology developed by GRS was employed in the analysis. (authors)
Experimental Test of Entropic Noise-Disturbance Uncertainty Relations for Spin-1/2 Measurements.
Sulyok, Georg; Sponar, Stephan; Demirel, Bülent; Buscemi, Francesco; Hall, Michael J W; Ozawa, Masanao; Hasegawa, Yuji
2015-07-17
Information-theoretic definitions for noise and disturbance in quantum measurements were given in [Phys. Rev. Lett. 112, 050401 (2014)] and a state-independent noise-disturbance uncertainty relation was obtained. Here, we derive a tight noise-disturbance uncertainty relation for complementary qubit observables and carry out an experimental test. Successive projective measurements on the neutron's spin-1/2 system, together with a correction procedure which reduces the disturbance, are performed. Our experimental results saturate the tight noise-disturbance uncertainty relation for qubits when an optimal correction procedure is applied.
Interpretation of the peak areas in gamma-ray spectra that have a large relative uncertainty
International Nuclear Information System (INIS)
Korun, M.; Maver Modec, P.; Vodenik, B.
2012-01-01
Empirical evidence is provided that the areas of peaks having a relative uncertainty in excess of 30% are overestimated. This systematic influence is of a statistical nature and originates in the way the peak-analyzing routine recognizes small peaks. It is not easy to detect this influence, since it is smaller than the peak-area uncertainty. However, the systematic influence can be revealed in repeated measurements under the same experimental conditions, e.g., in background measurements. To evaluate the systematic influence, background measurements were analyzed with the peak-analyzing procedure described by Korun et al. (2008). The magnitude of the influence depends on the relative uncertainty of the peak area and may amount, under the conditions used in the peak analysis, to a factor of 5 at relative uncertainties exceeding 60%. From the measurements, the probability of type-II errors, as a function of the relative uncertainty of the peak area, was extracted. This probability is near zero below an uncertainty of 30% and rises to 90% at uncertainties exceeding 50%. - Highlights: ► A systematic influence affecting small peak areas in gamma-ray spectra is described. ► The influence originates in the peak-locating procedure, which uses a predetermined sensitivity. ► The predetermined sensitivity causes peak areas with large uncertainties to be overestimated. ► The influence depends on the relative uncertainty of the number of counts in the peak. ► Corrections exceeding a factor of 3 are attained at peak-area uncertainties exceeding 60%.
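The overestimation mechanism described here is a selection effect: a peak-search routine with a predetermined sensitivity only reports marginal peaks whose noisy area estimate happens to fluctuate above the threshold, so the reported areas of such peaks are biased high. A Monte Carlo sketch of that effect follows; the true area, noise level, and threshold are invented, and the actual routine of Korun et al. is not reproduced.

```python
import random
import statistics

def simulate_selection_bias(true_area, sigma, threshold, n=20000, seed=1):
    """Draw n noisy area estimates of a peak with the given true area,
    keep only those that clear the detection threshold (as a peak-search
    routine would), and return the mean reported area relative to the
    true area. Values > 1 indicate overestimation."""
    random.seed(seed)
    reported = [a for a in (random.gauss(true_area, sigma) for _ in range(n))
                if a > threshold]
    return statistics.mean(reported) / true_area

# A peak with 50 % relative uncertainty and a threshold at the true area
overestimate = simulate_selection_bias(true_area=100.0, sigma=50.0,
                                       threshold=100.0)
```

For this toy configuration the surviving estimates are biased high by several tens of percent, qualitatively matching the overestimation the abstract reports for peaks with large relative uncertainties.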
Chen, Peng-Fei; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Wang, Dong; Ye, Liu
2018-01-01
Quantum objects are susceptible to noise from their surrounding environments, interaction with which inevitably gives rise to quantum decoherence or dissipation effects. In this work, we examine how different types of local noise in an open system affect entropic uncertainty relations for two incompatible measurements. Explicitly, we observe the dynamics of the entropic uncertainty in the presence of quantum memory under two canonical categories of noisy environments: unital (phase flip) and nonunital (amplitude damping). Our study shows that the measurement uncertainty exhibits non-monotonic dynamical behavior: the amount of uncertainty first inflates, and subsequently decreases, with the growth of the decoherence strengths in the two channels. In contrast, the uncertainty decreases monotonically with the growth of the purity of the initial state shared beforehand. In order to reduce the measurement uncertainty in noisy environments, we put forward a remarkably effective strategy to steer the magnitude of the uncertainty by means of a local non-unitary operation (i.e. weak measurement) on the qubit of interest. It turns out that this non-unitary operation can greatly reduce the entropic uncertainty, upon tuning the operation strength. Our investigations might thereby offer insight into the dynamics and steering of entropic uncertainty in open systems.
International Nuclear Information System (INIS)
Dijk, Eduard van; Kolkman-Deurloo, Inger-Karine K.; Damen, Patricia M. G.
2004-01-01
Different methods exist to determine the air kerma calibration factor of an ionization chamber for the spectrum of a ¹⁹²Ir high-dose-rate (HDR) or pulsed-dose-rate (PDR) source. An analysis of two methods to obtain such a calibration factor was performed: (i) the method recommended by [Goetsch et al., Med. Phys. 18, 462-467 (1991)] and (ii) the method employed by the Dutch national standards institute NMi [Petersen et al., Report S-EI-94.01 (NMi, Delft, The Netherlands, 1994)]. This analysis showed a systematic difference on the order of 1% in the determination of the strength of ¹⁹²Ir HDR and PDR sources, depending on the method used for determining the air kerma calibration factor. The definitive significance of the difference between these methods can only be addressed after performing an accurate analysis of the associated uncertainties. For an NE 2561 (or equivalent) ionization chamber and an in-air jig, a typical uncertainty budget of 0.94% was found with the NMi method. The largest contribution to the type-B uncertainty is the uncertainty in the air kerma calibration factor for isotope i, N_K^i, as determined by the primary or secondary standards laboratories. This uncertainty is dominated by the uncertainties in the physical constants for the average mass-energy absorption coefficient ratio and the stopping power ratios. This means that it is not foreseeable that the standards laboratories can decrease the uncertainty in the air kerma calibration factors for ionization chambers in the short term. When the results of the determination of the ¹⁹²Ir reference air kerma rates in, e.g., different institutes are compared, the uncertainties in the physical constants are the same. To compare the applied techniques, the ratio of the results can be judged by leaving out the uncertainties due to these physical constants. In that case an uncertainty budget of 0.40% (coverage factor = 2) should be taken into account. Due to the differences in approach between the
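The comparison logic at the end of the abstract, dropping the uncertainty components common to both institutes from the quadrature sum before judging the ratio of results, can be sketched as follows. The component magnitudes are illustrative only, chosen to mimic the quoted totals rather than taken from the paper's budget:

```python
import math

def budget_without_common(u_total, u_common):
    """When two institutes' results are compared as a ratio, uncertainty
    components they share (e.g. the same physical constants) cancel and
    are removed from the quadrature sum."""
    return math.sqrt(u_total**2 - u_common**2)

# Illustrative: a 0.47 % standard uncertainty (0.94 % at k = 2), of which
# a hypothetical 0.42 % comes from shared physical constants
u_total  = 0.47
u_common = 0.42
u_compare = budget_without_common(u_total, u_common)
U_compare_k2 = 2 * u_compare   # expanded (coverage factor 2) budget
```

Removing the shared contribution shrinks the expanded comparison budget from 0.94% to roughly the 0.40% level the abstract cites for judging inter-institute ratios.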
On entropic uncertainty relations in the presence of a minimal length
Rastegin, Alexey E.
2017-07-01
Entropic uncertainty relations for the position and momentum within the generalized uncertainty principle are examined. Studies of this principle are motivated by the existence of a minimal observable length. Then the position and momentum operators satisfy the modified commutation relation, for which more than one algebraic representation is known. One of them is described by auxiliary momentum so that the momentum and coordinate wave functions are connected by the Fourier transform. However, the probability density functions of the physically true and auxiliary momenta are different. As the corresponding entropies differ, known entropic uncertainty relations are changed. Using differential Shannon entropies, we give a state-dependent formulation with correction term. State-independent uncertainty relations are obtained in terms of the Rényi entropies and the Tsallis entropies with binning. Such relations allow one to take into account a finiteness of measurement resolution.
Ming, Fei; Wang, Dong; Shi, Wei-Nan; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu
2018-04-01
The uncertainty principle is recognized as an elementary ingredient of quantum theory and sets a significant bound on predicting the outcomes of measurements of a pair of incompatible observables. In this work, we develop the dynamical features of quantum-memory-assisted entropic uncertainty relations (QMA-EUR) in a two-qubit Heisenberg XXZ spin chain with an inhomogeneous magnetic field. We specifically derive the dynamical evolution of the entropic uncertainty with respect to the measurement in the Heisenberg XXZ model when spin A is initially correlated with the quantum memory B. It has been found that a larger coupling strength J of the ferromagnetic (J < 0) or antiferromagnetic (J > 0) chains can effectively degrade the measurement uncertainty. Besides, it turns out that a higher temperature induces inflation of the uncertainty, because the thermal entanglement becomes relatively weak in this scenario, and there exists a distinct dynamical behavior of the uncertainty when an inhomogeneous magnetic field emerges. With a growing magnetic field |B|, the variation of the entropic uncertainty is non-monotonic. Meanwhile, we compare several existing optimized bounds with the initial bound proposed by Berta et al. and conclude that Adabi et al.'s result is optimal. Moreover, we also investigate the mixedness of the system of interest, which is dramatically associated with the uncertainty. Remarkably, we put forward a possible physical interpretation of the evolutionary behavior of the uncertainty. Finally, we take advantage of a local filtering operation to steer the magnitude of the uncertainty. Our explorations may therefore shed light on the entropic uncertainty in the Heisenberg XXZ model and hence be of importance to quantum precision measurement in solid-state-based quantum information processing.
Do the Uncertainty Relations Really have Crucial Significances for Physics?
Directory of Open Access Journals (Sweden)
Dumitru S.
2010-10-01
Full Text Available It is proved that the idea that the Uncertainty Relations (UR) have crucial significances for physics is false. Additionally, one argues for the necessity of an UR-disconnected quantum philosophy.
Uncertainty of relative sensitivity factors in glow discharge mass spectrometry
Meija, Juris; Methven, Brad; Sturgeon, Ralph E.
2017-10-01
The concept of the relative sensitivity factors required for the correction of the measured ion beam ratios in pin-cell glow discharge mass spectrometry is examined in detail. We propose a data-driven model for predicting the relative response factors, which relies on a non-linear least squares adjustment and analyte/matrix interchangeability phenomena. The model provides a self-consistent set of response factors for any analyte/matrix combination of any element that appears as either an analyte or matrix in at least one known response factor.
Park, DaeKil
2018-06-01
The dynamics of entanglement and uncertainty relation is explored by solving the time-dependent Schrödinger equation for coupled harmonic oscillator system analytically when the angular frequencies and coupling constant are arbitrarily time dependent. We derive the spectral and Schmidt decompositions for vacuum solution. Using the decompositions, we derive the analytical expressions for von Neumann and Rényi entropies. Making use of Wigner distribution function defined in phase space, we derive the time dependence of position-momentum uncertainty relations. To show the dynamics of entanglement and uncertainty relation graphically, we introduce two toy models and one realistic quenched model. While the dynamics can be conjectured by simple consideration in the toy models, the dynamics in the realistic quenched model is somewhat different from that in the toy models. In particular, the dynamics of entanglement exhibits similar pattern to dynamics of uncertainty parameter in the realistic quenched model.
Overcoming uncertainty for within-network relational machine learning
Pfeiffer, Joseph J.
2015-01-01
People increasingly communicate through email and social networks to maintain friendships and conduct business, as well as share online content such as pictures, videos and products. Relational machine learning (RML) utilizes a set of observed attributes and network structure to predict corresponding labels for items; for example, to predict individuals engaged in securities fraud, we can utilize phone calls and workplace information to make joint predictions over the individuals. However, in...
Peest, Christian; Schinke, Carsten; Brendel, Rolf; Schmidt, Jan; Bothe, Karsten
2017-01-01
Spectrophotometers are operated in numerous fields of science and industry for a variety of applications. In order to provide confidence for the measured data, analyzing the associated uncertainty is valuable. However, the uncertainty of the measurement results is often unknown or reduced to sample-related contributions. In this paper, we describe our approach for the systematic determination of the measurement uncertainty of the commercially available two-channel spectrophotometer Agilent Cary 5000 in accordance with the Guide to the expression of uncertainty in measurements. We focus on the instrumentation-related uncertainty contributions rather than the specific application and thus outline a general procedure which can be adapted for other instruments. Moreover, we discover a systematic signal deviation due to the inertia of the measurement amplifier and develop and apply a correction procedure. Thereby we increase the usable dynamic range of the instrument by more than one order of magnitude. We present methods for the quantification of the uncertainty contributions and combine them into an uncertainty budget for the device.
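The specific Cary 5000 budget is not given in the abstract; a minimal GUM-style sketch of combining uncorrelated instrument contributions into a combined and expanded uncertainty, with invented contribution names and values, might look like:

```python
import math

# Illustrative uncertainty budget (all names and values invented, not
# real Cary 5000 data): (contribution, standard uncertainty in % of reading)
budget = [
    ("wavelength accuracy", 0.10),
    ("detector nonlinearity", 0.15),
    ("stray light", 0.05),
    ("sample positioning", 0.20),
    ("amplifier inertia (after correction)", 0.03),
]

# GUM: combined standard uncertainty is the root sum of squares of the
# uncorrelated contributions; the expanded uncertainty uses k = 2 (~95 %)
u_c = math.sqrt(sum(u**2 for _, u in budget))
U = 2 * u_c
for name, u in budget:
    print(f"{name:38s} {u:5.2f} %  ({100 * u**2 / u_c**2:4.1f} % of variance)")
print(f"combined u_c = {u_c:.3f} %, expanded U (k=2) = {U:.3f} %")
```

Listing each term's share of the variance, as above, shows at a glance which contribution dominates the budget.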
Towards standard testbeds for numerical relativity
International Nuclear Information System (INIS)
Alcubierre, Miguel; Allen, Gabrielle; Bona, Carles; Fiske, David; Goodale, Tom; Guzman, F Siddhartha; Hawke, Ian; Hawley, Scott H; Husa, Sascha; Koppitz, Michael; Lechner, Christiane; Pollney, Denis; Rideout, David; Salgado, Marcelo; Schnetter, Erik; Seidel, Edward; Shinkai, Hisa-aki; Shoemaker, Deirdre; Szilagyi, Bela; Takahashi, Ryoji; Winicour, Jeff
2004-01-01
In recent years, many different numerical evolution schemes for Einstein's equations have been proposed to address stability and accuracy problems that have plagued the numerical relativity community for decades. Some of these approaches have been tested on different spacetimes, and conclusions have been drawn based on these tests. However, differences in results originate from many sources, including not only formulations of the equations, but also gauges, boundary conditions, numerical methods and so on. We propose to build up a suite of standardized testbeds for comparing approaches to the numerical evolution of Einstein's equations that are designed to both probe their strengths and weaknesses and to separate out different effects, and their causes, seen in the results. We discuss general design principles of suitable testbeds, and we present an initial round of simple tests with periodic boundary conditions. This is a pivotal first step towards building a suite of testbeds to serve the numerical relativists and researchers from related fields who wish to assess the capabilities of numerical relativity codes. We present some examples of how these tests can be quite effective in revealing various limitations of different approaches, and illustrating their differences. The tests are presently limited to vacuum spacetimes, can be run on modest computational resources and can be used with many different approaches used in the relativity community
Towards standard testbeds for numerical relativity
Energy Technology Data Exchange (ETDEWEB)
Alcubierre, Miguel [Inst. de Ciencias Nucleares, Univ. Nacional Autonoma de Mexico, Apartado Postal 70-543, Mexico Distrito Federal 04510 (Mexico); Allen, Gabrielle; Goodale, Tom; Guzman, F Siddhartha; Hawke, Ian; Husa, Sascha; Koppitz, Michael; Lechner, Christiane; Pollney, Denis; Rideout, David [Max-Planck-Inst. fuer Gravitationsphysik, Albert-Einstein-Institut, 14476 Golm (Germany); Bona, Carles [Departament de Fisica, Universitat de les Illes Balears, Ctra de Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Fiske, David [Dept. of Physics, Univ. of Maryland, College Park, MD 20742-4111 (United States); Hawley, Scott H [Center for Relativity, Univ. of Texas at Austin, Austin, Texas 78712 (United States); Salgado, Marcelo [Inst. de Ciencias Nucleares, Univ. Nacional Autonoma de Mexico, Apartado Postal 70-543, Mexico Distrito Federal 04510 (Mexico); Schnetter, Erik [Inst. fuer Astronomie und Astrophysik, Universitaet Tuebingen, 72076 Tuebingen (Germany); Seidel, Edward [Max-Planck-Inst. fuer Gravitationsphysik, Albert-Einstein-Inst., 14476 Golm (Germany); Shinkai, Hisa-aki [Computational Science Div., Inst. of Physical and Chemical Research (RIKEN), Hirosawa 2-1, Wako, Saitama 351-0198 (Japan); Shoemaker, Deirdre [Center for Radiophysics and Space Research, Cornell Univ., Ithaca, NY 14853 (United States); Szilagyi, Bela [Dept. of Physics and Astronomy, Univ. of Pittsburgh, Pittsburgh, PA 15260 (United States); Takahashi, Ryoji [Theoretical Astrophysics Center, Juliane Maries Vej 30, 2100 Copenhagen, (Denmark); Winicour, Jeff [Max-Planck-Inst. fuer Gravitationsphysik, Albert-Einstein-Institut, 14476 Golm (Germany)
2004-01-21
In recent years, many different numerical evolution schemes for Einstein's equations have been proposed to address stability and accuracy problems that have plagued the numerical relativity community for decades. Some of these approaches have been tested on different spacetimes, and conclusions have been drawn based on these tests. However, differences in results originate from many sources, including not only formulations of the equations, but also gauges, boundary conditions, numerical methods and so on. We propose to build up a suite of standardized testbeds for comparing approaches to the numerical evolution of Einstein's equations that are designed to both probe their strengths and weaknesses and to separate out different effects, and their causes, seen in the results. We discuss general design principles of suitable testbeds, and we present an initial round of simple tests with periodic boundary conditions. This is a pivotal first step towards building a suite of testbeds to serve the numerical relativists and researchers from related fields who wish to assess the capabilities of numerical relativity codes. We present some examples of how these tests can be quite effective in revealing various limitations of different approaches, and illustrating their differences. The tests are presently limited to vacuum spacetimes, can be run on modest computational resources and can be used with many different approaches used in the relativity community.
Eisenberg, Stacy A; Kurita, Keiko; Taylor-Ford, Megan; Agus, David B; Gross, Mitchell E; Meyerowitz, Beth E
2015-02-01
Prostate cancer survivors have reported cognitive complaints following treatment, and these difficulties may be associated with survivors' ongoing cancer-related distress. Intolerance of uncertainty may exacerbate this hypothesized relationship by predisposing individuals to approach uncertain situations such as cancer survivorship in an inflexible and negative manner. We investigated whether greater cognitive complaints and higher intolerance of uncertainty would interact in their relation to more cancer-related distress symptoms. This cross-sectional, questionnaire-based study included 67 prostate cancer survivors who were 3 to 5 years post treatment. Hierarchical multiple regression analyses tested the extent to which intolerance of uncertainty, cognitive complaints, and their interaction were associated with cancer-related distress (measured with the Impact of Event Scale-Revised; IES-R) after adjusting for age, education, physical symptoms, and fear of cancer recurrence. Intolerance of uncertainty was positively associated with the IES-R avoidance and hyperarousal subscales. More cognitive complaints were associated with higher scores on the IES-R hyperarousal subscale. The interaction of intolerance of uncertainty and cognitive complaints was significantly associated with IES-R intrusion, such that greater cognitive complaints were associated with greater intrusive thoughts in survivors high in intolerance of uncertainty but not those low in it. Prostate cancer survivors who report cognitive difficulties or who find uncertainty uncomfortable and unacceptable may be at greater risk for cancer-related distress, even 3 to 5 years after completing treatment. It may be beneficial to address both cognitive complaints and intolerance of uncertainty in psychosocial interventions. Copyright © 2014 John Wiley & Sons, Ltd.
Uncertainty characterization of HOAPS 3.3 latent heat-flux-related parameters
Liman, Julian; Schröder, Marc; Fennig, Karsten; Andersson, Axel; Hollmann, Rainer
2018-03-01
Latent heat flux (LHF) is one of the main contributors to the global energy budget. As the density of in situ LHF measurements over the global oceans is generally poor, the potential of remotely sensed LHF for meteorological applications is enormous. However, to date none of the available satellite products have included estimates of systematic, random, and sampling uncertainties, all of which are essential for assessing their quality. Here, the challenge is taken on by matching LHF-related pixel-level data of the Hamburg Ocean Atmosphere Parameters and Fluxes from Satellite (HOAPS) climatology (version 3.3) to in situ measurements originating from a high-quality data archive of buoys and selected ships. Assuming the ground reference to be bias-free, this allows for deriving instantaneous systematic uncertainties as a function of four atmospheric predictor variables. The approach is regionally independent and therefore overcomes the issue of sparse in situ data densities over large oceanic areas. Likewise, random uncertainties are derived, which include not only a retrieval component but also contributions from in situ measurement noise and the collocation procedure. A recently published random uncertainty decomposition approach is applied to isolate the random retrieval uncertainty of all LHF-related HOAPS parameters. It makes use of two combinations of independent data triplets of both satellite and in situ data, which are analysed in terms of their pairwise variances of differences. Instantaneous uncertainties are finally aggregated, allowing for uncertainty characterizations on monthly to multi-annual timescales. Results show that systematic LHF uncertainties range between 15 and 50 W m-2 with a global mean of 25 W m-2. Local maxima are mainly found over the subtropical ocean basins as well as along the western boundary currents. Investigations indicate that contributions from qa (U) to the overall LHF uncertainty are on the order of 60 % (25 %). From an
2016-03-01
Relating Tropical Cyclone Track Forecast Error Distributions with Measurements of Forecast Uncertainty. Master's thesis by Nicholas M. Chisler, March 2016.
International Nuclear Information System (INIS)
Lo Bianco, A.S.; Oliveira, H.P.S.; Peixoto, J.G.P.
2009-01-01
To establish the primary standard of the quantity air kerma for X-rays between 10 and 50 keV, the National Metrology Laboratory of Ionizing Radiations (LNMRI) must evaluate all measurement uncertainties related to the Victoreen chamber. Accordingly, the uncertainty in air kerma resulting from inaccuracy in the active volume of the chamber was evaluated by Monte Carlo calculation using the PENELOPE code.
Position-momentum uncertainty relations in the presence of quantum memory
Energy Technology Data Exchange (ETDEWEB)
Furrer, Fabian, E-mail: furrer@eve.phys.s.u-tokyo.ac.jp [Department of Physics, Graduate School of Science, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Berta, Mario [Institute for Quantum Information and Matter, Caltech, Pasadena, California 91125 (United States); Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich (Switzerland); Tomamichel, Marco [School of Physics, The University of Sydney, Sydney 2006 (Australia); Centre for Quantum Technologies, National University of Singapore, Singapore 117543 (Singapore); Scholz, Volkher B. [Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich (Switzerland); Christandl, Matthias [Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich (Switzerland); Department of Mathematical Sciences, University of Copenhagen, Universitetsparken 5, 2100 Copenhagen (Denmark)
2014-12-15
A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes not including Heisenberg’s original setting of position and momentum observables. Here, we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.
Hyeon, Changbong; Hwang, Wonseok
2017-07-01
Using Brownian motion in periodic potentials V(x) tilted by a force f, we provide physical insight into the thermodynamic uncertainty relation, a recently conjectured principle for statistical errors and irreversible heat dissipation in nonequilibrium steady states. According to the relation, nonequilibrium output generated from dissipative processes necessarily incurs an energetic cost or heat dissipation q, and in order to limit the output fluctuation within a relative uncertainty ε, at least 2 kBT/ε² of heat must be dissipated. Our model shows that this bound is attained not only at near-equilibrium [f ≪ V'(x)] but also at far-from-equilibrium [f ≫ V'(x)], more generally when the dissipated heat is normally distributed. Furthermore, the energetic cost is maximized near the critical force when the barrier separating the potential wells is about to vanish and the fluctuation of Brownian particles is maximized. These findings indicate that the deviation of the heat distribution from Gaussianity gives rise to the inequality of the uncertainty relation, further clarifying the meaning of the uncertainty relation. Our derivation of the uncertainty relation also recognizes a bound on nonequilibrium fluctuations: the variance of the dissipated heat (σ_q²) increases with its mean (μ_q) and cannot be smaller than 2 kBT μ_q.
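A minimal simulation can illustrate the quoted bound q·ε² ≥ 2 kBT in the simplest limit of this setting, a flat potential (V = 0), where the dissipated heat is Gaussian and the bound should be nearly saturated; all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Overdamped Langevin dynamics in units with kB*T = 1 and mobility = 1:
#   dx = f dt + sqrt(2 dt) * noise   (free diffusion with drift, V = 0)
# Here the dissipated heat q = f * <x> is Gaussian, so the thermodynamic
# uncertainty relation q * eps^2 >= 2 kBT should be (nearly) saturated.
f, dt, steps, n_traj = 1.0, 0.01, 1000, 20000
x = np.zeros(n_traj)
for _ in range(steps):
    x += f * dt + np.sqrt(2 * dt) * rng.standard_normal(n_traj)

q = f * x.mean()                 # mean dissipated heat, in units of kBT
eps2 = x.var() / x.mean() ** 2   # squared relative uncertainty of the output
print(f"q = {q:.2f} kBT, eps^2 = {eps2:.3f}, q*eps^2 = {q * eps2:.3f} (bound: 2)")
```

Analytically, mean f·t and variance 2t give q·ε² = 2 exactly in this Gaussian case; adding a strong periodic potential with a near-critical tilt would, per the abstract, push the product above 2.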
Hertz, Anaelle; Vanbever, Luc; Cerf, Nicolas J.
2018-01-01
The uncertainty relation for continuous variables due to Białynicki-Birula and Mycielski [I. Białynicki-Birula and J. Mycielski, Commun. Math. Phys. 44, 129 (1975), 10.1007/BF01608825] expresses the complementarity between two n -tuples of canonically conjugate variables (x1,x2,...,xn) and (p1,p2,...,pn) in terms of Shannon differential entropy. Here we consider the generalization to variables that are not canonically conjugate and derive an entropic uncertainty relation expressing the balance between any two n -variable Gaussian projective measurements. The bound on entropies is expressed in terms of the determinant of a matrix of commutators between the measured variables. This uncertainty relation also captures the complementarity between any two incompatible linear canonical transforms, the bound being written in terms of the corresponding symplectic matrices in phase space. Finally, we extend this uncertainty relation to Rényi entropies and also prove a covariance-based uncertainty relation which generalizes the Robertson relation.
Madaniyazi, Lina; Guo, Yuming; Yu, Weiwei; Tong, Shilu
2015-02-01
Climate change may affect mortality associated with air pollutants, especially for fine particulate matter (PM2.5) and ozone (O3). Projection studies of such kind involve complicated modelling approaches with uncertainties. We conducted a systematic review of researches and methods for projecting future PM2.5-/O3-related mortality to identify the uncertainties and optimal approaches for handling uncertainty. A literature search was conducted in October 2013, using the electronic databases: PubMed, Scopus, ScienceDirect, ProQuest, and Web of Science. The search was limited to peer-reviewed journal articles published in English from January 1980 to September 2013. Fifteen studies fulfilled the inclusion criteria. Most studies reported that an increase of climate change-induced PM2.5 and O3 may result in an increase in mortality. However, little research has been conducted in developing countries with high emissions and dense populations. Additionally, health effects induced by PM2.5 may dominate compared to those caused by O3, but projection studies of PM2.5-related mortality are fewer than those of O3-related mortality. There is a considerable variation in approaches of scenario-based projection researches, which makes it difficult to compare results. Multiple scenarios, models and downscaling methods have been used to reduce uncertainties. However, few studies have discussed what the main source of uncertainties is and which uncertainty could be most effectively reduced. Projecting air pollution-related mortality requires a systematic consideration of assumptions and uncertainties, which will significantly aid policymakers in efforts to manage potential impacts of PM2.5 and O3 on mortality in the context of climate change. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
The grey relational approach for evaluating measurement uncertainty with poor information
International Nuclear Information System (INIS)
Luo, Zai; Wang, Yanqing; Zhou, Weihu; Wang, Zhongyu
2015-01-01
The Guide to the Expression of Uncertainty in Measurement (GUM) is the master document for measurement uncertainty evaluation. However, the GUM may encounter problems and does not work well when the measurement data have poor information. In most cases, poor information means a small data sample and an unknown probability distribution. In these cases, the evaluation of measurement uncertainty has become a bottleneck in practical measurement. To solve this problem, a novel method called the grey relational approach (GRA), different from the statistical theory, is proposed in this paper. The GRA does not require a large sample size or probability distribution information of the measurement data. Mathematically, the GRA can be divided into three parts. Firstly, according to grey relational analysis, the grey relational coefficients between the ideal and the practical measurement output series are obtained. Secondly, the weighted coefficients and the measurement expectation function will be acquired based on the grey relational coefficients. Finally, the measurement uncertainty is evaluated based on grey modeling. In order to validate the performance of this method, simulation experiments were performed and the evaluation results show that the GRA can keep the average error around 5%. Besides, the GRA was also compared with the grey method, the Bessel method, and the Monte Carlo method by a real stress measurement. Both the simulation experiments and real measurement show that the GRA is appropriate and effective to evaluate the measurement uncertainty with poor information. (paper)
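The full GRA evaluation procedure is not spelled out in the abstract; its first step, computing Deng's grey relational coefficients between an ideal (reference) series and measured series, can be sketched as follows with invented data:

```python
import numpy as np

# Toy example: grey relational coefficients between an ideal reference
# series and two small measured series (all values invented).
reference = np.array([10.0, 10.0, 10.0, 10.0])
series = np.array([[10.2,  9.9, 10.1, 10.0],
                   [10.8,  9.1, 10.5,  9.6]])

delta = np.abs(series - reference)   # absolute differences, point by point
d_min, d_max = delta.min(), delta.max()
rho = 0.5                            # distinguishing coefficient (common choice)

# Deng's grey relational coefficient and the grade (mean coefficient)
xi = (d_min + rho * d_max) / (delta + rho * d_max)
grades = xi.mean(axis=1)
print("coefficients:\n", np.round(xi, 3))
print("grades:", np.round(grades, 3))
```

A grade closer to 1 indicates a series closer to the ideal output; in the GRA method of the paper such coefficients feed the weighting and grey modeling steps, which this small-sample sketch does not attempt to reproduce.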
Ando, Amy W; Mallory, Mindy L
2012-04-24
Climate change is likely to alter the spatial distributions of species and habitat types but the nature of such change is uncertain. Thus, climate change makes it difficult to implement standard conservation planning paradigms. Previous work has suggested some approaches to cope with such uncertainty but has not harnessed all of the benefits of risk diversification. We adapt Modern Portfolio Theory (MPT) to optimal spatial targeting of conservation activity, using wetland habitat conservation in the Prairie Pothole Region (PPR) as an example. This approach finds the allocations of conservation activity among subregions of the planning area that maximize the expected conservation returns for a given level of uncertainty or minimize uncertainty for a given expected level of returns. We find that using MPT instead of simple diversification in the PPR can achieve a value of the conservation objective per dollar spent that is 15% higher for the same level of risk. MPT-based portfolios can also have 21% less uncertainty over benefits or 6% greater expected benefits than the current portfolio of PPR conservation. Total benefits from conservation investment are higher if returns are defined in terms of benefit-cost ratios rather than benefits alone. MPT-guided diversification can work to reduce the climate-change-induced uncertainty of future ecosystem-service benefits from many land policy and investment initiatives, especially when outcomes are negatively correlated between subregions of a planning area.
Ando, Amy W.; Mallory, Mindy L.
2012-01-01
Climate change is likely to alter the spatial distributions of species and habitat types but the nature of such change is uncertain. Thus, climate change makes it difficult to implement standard conservation planning paradigms. Previous work has suggested some approaches to cope with such uncertainty but has not harnessed all of the benefits of risk diversification. We adapt Modern Portfolio Theory (MPT) to optimal spatial targeting of conservation activity, using wetland habitat conservation in the Prairie Pothole Region (PPR) as an example. This approach finds the allocations of conservation activity among subregions of the planning area that maximize the expected conservation returns for a given level of uncertainty or minimize uncertainty for a given expected level of returns. We find that using MPT instead of simple diversification in the PPR can achieve a value of the conservation objective per dollar spent that is 15% higher for the same level of risk. MPT-based portfolios can also have 21% less uncertainty over benefits or 6% greater expected benefits than the current portfolio of PPR conservation. Total benefits from conservation investment are higher if returns are defined in terms of benefit–cost ratios rather than benefits alone. MPT-guided diversification can work to reduce the climate-change–induced uncertainty of future ecosystem-service benefits from many land policy and investment initiatives, especially when outcomes are negatively correlated between subregions of a planning area. PMID:22451914
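A stylized sketch of the mean-variance machinery behind this approach (not the paper's PPR data) is straightforward: with assumed expected conservation returns and an assumed covariance of outcomes across three subregions, the closed-form minimum-variance portfolio can be compared against naive equal weighting:

```python
import numpy as np

# Invented expected benefit per dollar and covariance of outcomes under
# climate scenarios for three hypothetical subregions; note the negative
# correlations, which is where diversification pays off most.
mu = np.array([1.00, 1.10, 0.95])
cov = np.array([[ 0.040,  0.010, -0.015],
                [ 0.010,  0.060, -0.020],
                [-0.015, -0.020,  0.030]])

def portfolio(w):
    """Expected return and risk (standard deviation) of allocation w."""
    return mu @ w, np.sqrt(w @ cov @ w)

# Naive diversification vs. the closed-form minimum-variance portfolio
# w_mv = C^{-1} 1 / (1' C^{-1} 1), which ignores any return target.
w_eq = np.ones(3) / 3
ones = np.ones(3)
w_mv = np.linalg.solve(cov, ones)
w_mv /= ones @ w_mv

r_eq, s_eq = portfolio(w_eq)
r_mv, s_mv = portfolio(w_mv)
print(f"equal weights : return {r_eq:.3f}, risk {s_eq:.3f}")
print(f"min-variance  : return {r_mv:.3f}, risk {s_mv:.3f}")
```

The paper's analysis additionally traces the whole efficient frontier (minimum risk for each target return) and defines returns as benefit-cost ratios, refinements omitted from this sketch.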
Standard definitions of terms relating to dosimetry - approved standard 1973
International Nuclear Information System (INIS)
Anon.
1975-01-01
Definitions are presented for terms related to radiation dosimetry. These definitions are the same as, or similar to, those recommended by the International Commission on Radiological Units and Measurements (ICRU) as presented in the National Bureau of Sandards Handbook 62, but attempt has been made to define some of the terms more exactly
Standards for Educational Public Relations and Communications Professionals.
Chappelow, Marsha A.
2003-01-01
Describes National School Public Relations Association standards for school public relations and communications professionals and program. Includes reactions and comments about new Association standards from seven superintendents and four school public-relations professionals. (PKP)
Event-by-event simulation of single-neutron experiments to test uncertainty relations
International Nuclear Information System (INIS)
Raedt, H De; Michielsen, K
2014-01-01
Results from a discrete-event simulation of a recent single-neutron experiment that tests Ozawa's generalization of Heisenberg's uncertainty relation are presented. The event-based simulation algorithm reproduces the results of the quantum theoretical description of the experiment but does not require the knowledge of the solution of a wave equation, nor does it rely on detailed concepts of quantum theory. In particular, the data from these non-quantum simulations satisfy uncertainty relations derived in the context of quantum theory. (paper)
Complementarity and the Nature of Uncertainty Relations in Einstein–Bohr Recoiling Slit Experiment
Directory of Open Access Journals (Sweden)
Shogo Tanimura
2015-07-01
Full Text Available A model of the Einstein–Bohr recoiling slit experiment is formulated in a fully quantum theoretical setting. In this model, the state and dynamics of a movable wall that has two slits in it, as well as the state of a particle incoming to the two slits, are described by quantum mechanics. Using this model, we analyzed complementarity between exhibiting an interference pattern and distinguishing the particle path. Comparing the Kennard–Robertson type and the Ozawa-type uncertainty relations, we conclude that the uncertainty relation involved in the double-slit experiment is not the Ozawa-type uncertainty relation but the Kennard-type uncertainty relation of the position and the momentum of the double-slit wall. A possible experiment to test the complementarity relation is suggested. It is also argued that various phenomena which occur at the interface of a quantum system and a classical system, including distinguishability, interference, decoherence, quantum eraser, and weak value, can be understood as aspects of entanglement. Quanta 2015; 4: 1–9.
Strekalova, Yulia A; James, Vaughan S
2017-09-01
User-generated information on the Internet provides opportunities for the monitoring of health information consumer attitudes. For example, information about cancer prevention may cause decisional conflict. Yet posts and conversations shared by health information consumers online are often not readily actionable for interpretation and decision-making due to their unstandardized format. This study extends prior research on the use of natural language as a predictor of consumer attitudes and provides a link to decision-making by evaluating the predictive role of uncertainty indicators expressed in natural language. Analyzed data included free-text comments and structured scale responses related to information about skin cancer prevention options. The study identified natural language indicators of uncertainty and showed that it can serve as a predictor of decisional conflict. The natural indicators of uncertainty reported here can facilitate the monitoring of health consumer perceptions about cancer prevention recommendations and inform education and communication campaign planning and evaluation.
Palenčár, Rudolf; Sopkuliak, Peter; Palenčár, Jakub; Ďuriš, Stanislav; Suroviak, Emil; Halaj, Martin
2017-06-01
Evaluation of uncertainties of the temperature measurement by standard platinum resistance thermometer calibrated at the defining fixed points according to ITS-90 is a problem that can be solved in different ways. The paper presents a procedure based on the propagation of distributions using the Monte Carlo method. The procedure employs generation of pseudo-random numbers for the input variables of resistances at the defining fixed points, supposing the multivariate Gaussian distribution for input quantities. This allows taking into account the correlations among resistances at the defining fixed points. Assumption of Gaussian probability density function is acceptable, with respect to the several sources of uncertainties of resistances. In the case of uncorrelated resistances at the defining fixed points, the method is applicable to any probability density function. Validation of the law of propagation of uncertainty using the Monte Carlo method is presented on the example of specific data for 25 Ω standard platinum resistance thermometer in the temperature range from 0 to 660 °C. Using this example, we demonstrate suitability of the method by validation of its results.
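A toy version of the propagation-of-distributions step, using invented resistance values and an invented correlation rather than real SPRT fixed-point data, can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(1)

# Measurand: a resistance ratio W = r2 / r1 from two correlated Gaussian
# input resistances (GUM Supplement 1 style Monte Carlo; values invented).
mean = np.array([25.5600, 28.1200])      # resistances at two fixed points, ohm
u = np.array([2.0e-4, 2.5e-4])           # standard uncertainties, ohm
corr = 0.6                               # assumed correlation between them
cov = np.array([[u[0]**2,           corr * u[0] * u[1]],
                [corr * u[0] * u[1], u[1]**2          ]])

r = rng.multivariate_normal(mean, cov, size=200_000)
w = r[:, 1] / r[:, 0]                    # propagate through the model

print(f"W = {w.mean():.7f} +/- {w.std(ddof=1):.2e} (Monte Carlo, k=1)")
# 95 % coverage interval read directly off the empirical distribution:
lo, hi = np.percentile(w, [2.5, 97.5])
print(f"95 % interval: [{lo:.7f}, {hi:.7f}]")
```

Because the correlated inputs are drawn from a joint multivariate Gaussian, the correlation between fixed-point resistances is carried into the result automatically, which is exactly the advantage the abstract highlights over the plain law of propagation of uncertainty.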
International Nuclear Information System (INIS)
Sumining; Agus Taftazani
2003-01-01
Uncertainties in the analysis of Ti, V, Cl, Ce, Cr, Cs, Sc, Co, Fe and Ca in solid samples by INAA (Instrumental Neutron Activation Analysis) using the comparative and standard-addition techniques were evaluated at the INAA laboratory of P3TM BATAN. The calculation for Ti is presented as an example. The uncertainty sources in INAA are sampling, sample and standard preparation, irradiation, and counting. The samples came from the IAEA (International Atomic Energy Agency) and were ready for analysis, so only the sample and standard preparation, irradiation, and counting factors were determined. Analyses were done by the relative technique, in which sample and standard are irradiated together in the same capsule, so that irradiation time, neutron flux, irradiation geometry, and isotopic properties cancel out. The counting uncertainty covers radioactive decay during counting, pulse losses caused by random coincidence, counting geometry, and counting rate. With the relative technique, the uncertainty from the counting times of sample and standard measured on the same counting equipment can be neglected. Uranium was not detected, so there is no contribution from fission products; the number of target nuclides did not vary because no burn-up occurred during irradiation, and the analytical results were not influenced by the chemical state. (author)
International Nuclear Information System (INIS)
Lerche, I.; Noeth, S.
2002-01-01
The influence of two fundamentally different types of uncertainty on the value of oil field production is investigated here. First considered is the uncertainty caused by the fact that the expected value estimate is not one of the possible outcomes. To correctly allow for the risk attendant upon using the expected value as a measure of worth, even with statistically sharp parameters, one needs to incorporate the uncertainty of the expected value. Using a simple example we show how such incorporation allows for a clear determination of the relative risk of projects that may have the same expected value but very different risks. We also show how each project can be risked on its own using the expected value and variance. This type of uncertainty is due to the possible pathways to different outcomes even when the parameters characterizing the system are taken to be known. Second considered is the risk due to the fact that parameters in oil field estimates are just estimates and, as such, have their own intrinsic errors that influence the possible outcomes and make them less certain. This sort of risk depends upon the uncertainty of each parameter, and also on the type of distribution the parameters are taken to be drawn from. In addition, not all uncertainties in parameter values are of equal importance in influencing an outcome probability. We show how one can determine the relative importance of the parameters and so determine where to place effort to resolve the dominant contributions to risk, if it is possible to do so. Considerations of whether to acquire new information, and also whether to undertake further studies under such an uncertain environment, are used as vehicles to address these concerns of risk due to uncertainty. In general, an oil field development project has to contend with all the above types of risk and uncertainty. It is therefore important to have quantitative measures of risk so that one can compare and contrast the various effects.
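The first point, that projects with equal expected values can carry very different risks, can be illustrated with a small Monte Carlo sketch using invented outcome distributions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two hypothetical field-development projects with the same expected value
# but different outcome spreads (values in arbitrary monetary units).
proj_a = rng.normal(100.0, 10.0, size=100_000)   # low-spread outcomes
proj_b = rng.normal(100.0, 40.0, size=100_000)   # high-spread outcomes

for name, outcomes in [("A", proj_a), ("B", proj_b)]:
    ev, sd = outcomes.mean(), outcomes.std(ddof=1)
    p_floor = (outcomes < 50.0).mean()           # chance of a poor outcome
    print(f"project {name}: EV = {ev:6.1f}, sd = {sd:5.1f}, "
          f"sd/EV = {sd / ev:.2f}, P(outcome < 50) = {p_floor:.3f}")
```

Ranking by expected value alone treats A and B as identical; the variance (or a downside probability, as here) is what separates them, which is the role of the expected-value-plus-variance risking the abstract describes.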
Regeneration decisions in forestry under climate change related uncertainties and risks
DEFF Research Database (Denmark)
Schou, Erik; Thorsen, Bo Jellesmark; Jacobsen, Jette Bredahl
2015-01-01
Future climate development and its effects on forest ecosystems are not easily predicted or described in terms of standard probability concepts. Nevertheless, forest managers continuously make long-term decisions that will be subject to climate change impacts. The manager's assessment of possible… …to generate a set of alternative outcomes, investigating effects on decision making of three aspects of uncertainty: (i) the perceived time horizon before there will be certainty on outcome, (ii) the spread of impacts across the set of alternative outcomes, and (iii) the subjective probability (belief) assigned to each outcome. Results show that the later a forest manager expects to obtain certainty about climate change or the more skewed their belief distribution, the more will decisions be based on ex ante assessments — suggesting that if forest managers believe that climate change uncertainty…
International Nuclear Information System (INIS)
Green, B.
2003-01-01
Standards for the electricity industry are developed to ensure quality and serve as a basis to which utilities should conform. Standards specify agreed upon properties for a manufactured product. They should be used for equipment specifications as well as operational procedures. Standardization is performed by regulators, transmission owners/operators, and organizations such as the National Electric Reliability Council (NERC), the Northeast Power Coordinating Council (NPCC), the North American Energy Standards Board (NAESB), and the Committee of Chief Risk Officers (CCRO). Before markets were opened to competition, operational standards were dictated by transmission owners and reliability issues were dealt with by NERC and NPCC. This presentation explained the process of standardization in the electric power industry in Canada, the derivation of standards, moving beyond NERC, and the transmission owners. Issues for Ontario Power Generation were highlighted. In contrast to the situation in the United States, there is no federal government backstop for developing Standards in Canada. There is no federal initiative toward open access. Canadian utilities participated in NERC, but compliance was voluntary. It is still questionable if Canadian utilities will implement NERC and NAESB Standards if they are codified
Uncertainties Related to Extreme Event Statistics of Sewer System Surcharge and Overflow
DEFF Research Database (Denmark)
Schaarup-Jensen, Kjeld; Johansen, C.; Thorndahl, Søren Liedtke
2005-01-01
Today it is common practice - in the major part of Europe - to base design of sewer systems in urban areas on recommended minimum values of flooding frequencies related to either pipe top level, basement level in buildings or level of road surfaces. Thus storm water runoff in sewer systems proceeds in an acceptable manner only if flooding of these levels has an average return period greater than a predefined value. This practice is also often used in functional analysis of existing sewer systems. Whether or not a sewer system can fulfil recommended flooding frequencies can only be verified by performing long-term simulations - using a sewer flow simulation model - and drawing up extreme event statistics from the model simulations. In this context it is important to realize that uncertainties related to the input parameters of rainfall runoff models will give rise to uncertainties related…
Energy Technology Data Exchange (ETDEWEB)
Farah, J., E-mail: jad.farah@irsn.fr; Clairand, I.; Huet, C. [External Dosimetry Department, Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP-17, 92260 Fontenay-aux-Roses (France); Trianni, A. [Medical Physics Department, Udine University Hospital S. Maria della Misericordia (AOUD), p.le S. Maria della Misericordia, 15, 33100 Udine (Italy); Ciraj-Bjelac, O. [Vinca Institute of Nuclear Sciences (VINCA), P.O. Box 522, 11001 Belgrade (Serbia); De Angelis, C. [Department of Technology and Health, Istituto Superiore di Sanità (ISS), Viale Regina Elena 299, 00161 Rome (Italy); Delle Canne, S. [Fatebenefratelli San Giovanni Calibita Hospital (FBF), UOC Medical Physics - Isola Tiberina, 00186 Rome (Italy); Hadid, L.; Waryn, M. J. [Radiology Department, Hôpital Jean Verdier (HJV), Avenue du 14 Juillet, 93140 Bondy Cedex (France); Jarvinen, H.; Siiskonen, T. [Radiation and Nuclear Safety Authority (STUK), P.O. Box 14, 00881 Helsinki (Finland); Negri, A. [Veneto Institute of Oncology (IOV), Via Gattamelata 64, 35124 Padova (Italy); Novák, L. [National Radiation Protection Institute (NRPI), Bartoškova 28, 140 00 Prague 4 (Czech Republic); Pinto, M. [Istituto Nazionale di Metrologia delle Radiazioni Ionizzanti (ENEA-INMRI), C.R. Casaccia, Via Anguillarese 301, I-00123 Santa Maria di Galeria (RM) (Italy); Knežević, Ž. [Ruđer Bošković Institute (RBI), Bijenička c. 54, 10000 Zagreb (Croatia)
2015-07-15
Purpose: To investigate the optimal use of XR-RV3 GafChromic® films to assess patient skin dose in interventional radiology while addressing the means to reduce uncertainties in dose assessment. Methods: XR-type R GafChromic films have been shown to represent the most efficient and suitable solution to determine patient skin dose in interventional procedures. As film dosimetry can be associated with high uncertainty, this paper presents the EURADOS WG 12 initiative to carry out a comprehensive study of film characteristics with a multisite approach. The considered sources of uncertainties include scanner-, film-, and fitting-related errors. The work focused on studying film behavior with clinical high-dose-rate pulsed beams (previously unavailable in the literature) together with reference standard laboratory beams. Results: First, the performance analysis of six different scanner models has shown that scan uniformity perpendicular to the lamp motion axis and long-term stability are the main sources of scanner-related uncertainties. These could induce errors of up to 7% on the film readings unless regularly checked and corrected. Typically, scan uniformity correction matrices and reading normalization to the scanner-specific and daily background reading should be applied. In addition, the analysis of multiple film batches has shown that XR-RV3 films generally have good uniformity within one batch (<1.5%), require 24 h to stabilize after irradiation, and that their response is roughly independent of dose rate (<5%). However, XR-RV3 films showed large variations (up to 15%) with radiation quality both in standard laboratory and in clinical conditions. As such, and prior to conducting patient skin dose measurements, it is mandatory to choose the appropriate calibration beam quality depending on the characteristics of the x-ray systems that will be used clinically. In addition, yellow-side film irradiations should be preferentially used since they showed a lower
Parameters-related uncertainty in modeling sugar cane yield with an agro-Land Surface Model
Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Ruget, F.; Gabrielle, B.
2012-12-01
Agro-Land Surface Models (agro-LSM) have been developed from the coupling of specific crop models and large-scale generic vegetation models. They aim at accounting for the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum, with a particular emphasis on how crop phenology and agricultural management practice influence the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty in these models is related to the many parameters included in the models' equations. In this study, we quantify the parameter-based uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS using a multi-regional approach with data from sites in Australia, La Reunion and Brazil. First, the main source of uncertainty for the output variables NPP, GPP, and sensible heat flux (SH) is determined through a screening of the main parameters of the model on a multi-site basis, leading to the selection of a subset of the most sensitive parameters causing most of the uncertainty. In a second step, a sensitivity analysis is carried out on the parameters selected from the screening analysis at a regional scale. For this, a Monte-Carlo sampling method associated with the calculation of Partial Rank Correlation Coefficients is used. First, we quantify the sensitivity of the output variables to individual input parameters on a regional scale for two regions of intensive sugar cane cultivation in Australia and Brazil. Then, we quantify the overall uncertainty in the simulation's outputs propagated from the uncertainty in the input parameters. Seven parameters are identified by the screening procedure as driving most of the uncertainty in the agro-LSM ORCHIDEE-STICS model output at all sites. These parameters control photosynthesis (optimal temperature of photosynthesis, optimal carboxylation rate), radiation interception (extinction coefficient), root
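The Monte-Carlo-plus-PRCC step described above can be sketched in a few lines. The following is a minimal, generic Partial Rank Correlation Coefficient implementation applied to a hypothetical three-parameter model (the sample size, parameters, and model are illustrative, not ORCHIDEE-STICS):

```python
import numpy as np

rng = np.random.default_rng(0)

def prcc(X, y):
    """Partial rank correlation of each column of X with output y.

    Rank-transform inputs and output, then correlate the residuals of each
    parameter and of y after linearly regressing out all other parameters.
    """
    R = np.argsort(np.argsort(X, axis=0), axis=0).astype(float)  # column ranks
    r = np.argsort(np.argsort(y)).astype(float)                  # output ranks
    n, k = X.shape
    out = []
    for j in range(k):
        others = np.column_stack([np.ones(n), np.delete(R, j, axis=1)])
        beta_x = np.linalg.lstsq(others, R[:, j], rcond=None)[0]
        beta_y = np.linalg.lstsq(others, r, rcond=None)[0]
        res_x = R[:, j] - others @ beta_x
        res_y = r - others @ beta_y
        out.append(np.corrcoef(res_x, res_y)[0, 1])
    return np.array(out)

# Hypothetical "model": output driven strongly by parameter 0, weakly by 1,
# not at all by 2.
X = rng.uniform(size=(500, 3))
y = 10 * X[:, 0] + X[:, 1] + 0.1 * rng.normal(size=500)
p = prcc(X, y)
print(p.round(2))  # first coefficient dominates, third is near zero
```

Screening with such coefficients is how a large parameter set is reduced to the handful (here, effectively parameter 0) responsible for most of the output uncertainty.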
Standard Terminology Relating to Wear and Erosion
American Society for Testing and Materials. Philadelphia
2010-01-01
1.1 The terms and their definitions given herein represent terminology relating to wear and erosion of solid bodies due to mechanical interactions such as occur with cavitation, impingement by liquid jets or drops or by solid particles, or relative motion against contacting solid surfaces or fluids. This scope interfaces with but generally excludes those processes where material loss is wholly or principally due to chemical action and other related technical fields as, for instance, lubrication. 1.2 This terminology is not exhaustive; the absence of any particular term from this collection does not necessarily imply that its use within this scope is discouraged. However, the terms given herein are the recommended terms for the concepts they represent unless otherwise noted. 1.3 Certain general terms and definitions may be restricted and interpreted, if necessary, to make them particularly applicable to the scope as defined herein. 1.4 The purpose of this terminology is to encourage uniformity and accuracy ...
A quantum uncertainty relation based on Fisher's information
Energy Technology Data Exchange (ETDEWEB)
Sanchez-Moreno, P; Plastino, A R; Dehesa, J S, E-mail: pablos@ugr.es, E-mail: arplastino@ugr.es, E-mail: dehesa@ugr.es [Departamento de Fisica Atomica, Molecular y Nuclear and Instituto Carlos I de Fisica Teorica y Computacional, University of Granada, Granada (Spain)
2011-02-11
We explore quantum uncertainty relations involving the Fisher information functionals I_x and I_p evaluated, respectively, on a wavefunction Ψ(x) defined on a D-dimensional configuration space and the concomitant wavefunction Ψ̃(p) on the conjugate momentum space. We prove that the associated Fisher functionals obey the uncertainty relation I_x I_p ≥ 4D² when either Ψ(x) or Ψ̃(p) is real. On the other hand, there is no lower bound to the above product for arbitrary complex wavefunctions. We give explicit examples of complex wavefunctions not obeying the above bound. In particular, we provide a parametrized wavefunction for which the product I_x I_p can be made arbitrarily small.
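The bound I_x I_p ≥ 4D² is saturated by a real Gaussian wavefunction, which makes it easy to check numerically. The sketch below is a generic verification for D = 1 (not code from the paper): both Fisher integrals I = ∫ (ρ′)²/ρ are computed on a grid from the position and momentum densities of a Gaussian, and their product comes out at the bound:

```python
import numpy as np

def fisher(x, rho):
    """Fisher information I = ∫ (ρ')²/ρ dx of a probability density ρ on a uniform grid."""
    dx = x[1] - x[0]
    drho = np.gradient(rho, x)
    mask = rho > 1e-12          # skip far tails to avoid division by ~0
    return np.sum(drho[mask] ** 2 / rho[mask]) * dx

a = 0.7                         # arbitrary Gaussian width parameter (ħ = 1)
x = np.linspace(-20, 20, 4001)
psi = (2 * a / np.pi) ** 0.25 * np.exp(-a * x ** 2)              # real ψ(x)
rho_x = psi ** 2

p = np.linspace(-20, 20, 4001)
phi = (1 / (2 * np.pi * a)) ** 0.25 * np.exp(-p ** 2 / (4 * a))  # its Fourier transform
rho_p = phi ** 2

Ix, Ip = fisher(x, rho_x), fisher(p, rho_p)
print(round(Ix * Ip, 3))        # ≈ 4 = 4·D² for D = 1: the bound is saturated
```

For this state I_x = 4a and I_p = 1/a analytically, so the product is exactly 4 regardless of the width a.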
Morphonological Description of Relative Morphemes in Standard ...
African Journals Online (AJOL)
The theoretical framework that has been used in this analysis is Generative Phonology as stipulated by Chomsky and Halle in their book called Sound Pattern of English (SPE) of 1968. I have considered that relative morpheme forms are the result of two different formatives, which are O- of reference (O-ref.) and the Subject ...
Implementation of standard testbeds for numerical relativity
Energy Technology Data Exchange (ETDEWEB)
Babiuc, M C [Department of Physics and Physical Science, Marshall University, Huntington, WV 25755 (United States); Husa, S [Friedrich Schiller University Jena, Max-Wien-Platz 1, 07743 Jena (Germany); Alic, D [Department of Physics, University of the Balearic Islands, Cra Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Hinder, I [Center for Gravitational Wave Physics, Pennsylvania State University, University Park, PA 16802 (United States); Lechner, C [Weierstrass Institute for Applied Analysis and Stochastics (WIAS), Mohrenstrasse 39, 10117 Berlin (Germany); Schnetter, E [Center for Computation and Technology, 216 Johnston Hall, Louisiana State University, Baton Rouge, LA 70803 (United States); Szilagyi, B; Dorband, N; Pollney, D; Winicour, J [Max-Planck-Institut fuer Gravitationsphysik (Albert-Einstein-Institut), Am Muehlenberg 1, 14076 Golm (Germany); Zlochower, Y [Center for Computational Relativity and Gravitation, School of Mathematical Sciences, Rochester Institute of Technology, 78 Lomb Memorial Drive, Rochester, New York 14623 (United States)
2008-06-21
We discuss results that have been obtained from the implementation of the initial round of testbeds for numerical relativity proposed in the first paper of the Apples with Apples Alliance. We present benchmark results for various codes, which provide templates for analyzing the testbeds and allow us to draw conclusions about various features of the codes. This allows us to sharpen the initial test specifications, design a new test, and add theoretical insight.
Measurability of quantum fields and the energy-time uncertainty relation
International Nuclear Information System (INIS)
Mensky, Mikhail B
2011-01-01
Quantum restrictions on the measurability of an electromagnetic field strength and their relevance to the energy-time uncertainty relation are considered. The minimum errors in measuring electromagnetic field strengths, as they were estimated by the author (1988) in the framework of the phenomenological method of restricted path integral (RPI), are compared with the analogous estimates found by Landau and Peierls (1931) and by Bohr and Rosenfeld (1933) with the help of certain measurement setups. RPI-based restrictions, including those of Landau and Peierls as a special case, hold for any measuring schemes meeting the strict definition of measurement. Their fundamental nature is confirmed by the fact that their associated field detectability condition has the form of the energy-time uncertainty relation. The weaker restrictions suggested by Bohr and Rosenfeld rely on an extended definition of measurement. The energy-time uncertainty relation, which is the condition for the electromagnetic field to be detectable, is applied to the analysis of how the near-field scanning microscope works. (methodological notes)
Loague, Keith; Blanke, James S; Mills, Melissa B; Diaz-Diaz, Ricardo; Corwin, Dennis L
2012-01-01
Precious groundwater resources across the United States have been contaminated due to decades-long nonpoint-source applications of agricultural chemicals. Assessing the impact of past, ongoing, and future chemical applications for large-scale agriculture operations is timely for designing best-management practices to prevent subsurface pollution. Presented here are the results from a series of regional-scale vulnerability assessments for the San Joaquin Valley (SJV). Two relatively simple indices, the retardation and attenuation factors, are used to estimate near-surface vulnerabilities based on the chemical properties of 32 pesticides and the variability of both soil characteristics and recharge rates across the SJV. The uncertainties inherent in these assessments, derived from the uncertainties within the chemical and soil databases, are estimated using first-order analyses. The results are used to screen and rank the chemicals based on mobility and leaching potential, without and with consideration of data-related uncertainties. Chemicals of historic high visibility in the SJV (e.g., atrazine, DBCP [dibromochloropropane], ethylene dibromide, and simazine) are ranked in the top half of those considered. Vulnerability maps generated for atrazine and DBCP, featured for their legacy status in the study area, clearly illustrate variations within and across the assessments. For example, the leaching potential is greater for DBCP than for atrazine, the leaching potential for DBCP is greater for the spatially variable recharge values than for the average recharge rate, and the leaching potentials for both DBCP and atrazine are greater for the annual recharge estimates than for the monthly recharge estimates. The data-related uncertainties identified in this study can be significant, targeting opportunities for improving future vulnerability assessments. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America
Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.
2016-11-01
Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
Effect of minimal length uncertainty on the mass-radius relation of white dwarfs
Mathew, Arun; Nandy, Malay K.
2018-06-01
The generalized uncertainty relation that carries the imprint of quantum gravity introduces a minimal length scale into the description of space-time. It effectively changes the invariant measure of the phase space through a factor (1 + βp²)⁻³, so that the equation of state for an electron gas undergoes a significant modification from the ideal case. It has been shown in the literature (Rashidi 2016) that the ideal Chandrasekhar limit ceases to exist when the modified equation of state due to the generalized uncertainty is taken into account. To assess the situation in a more complete fashion, we analyze in detail the mass-radius relation of Newtonian white dwarfs whose hydrostatic equilibria are governed by the equation of state of the degenerate relativistic electron gas subjected to the generalized uncertainty principle. As the constraint of minimal length imposes a severe restriction on the availability of high momentum states, it is speculated that the central Fermi momentum cannot have values arbitrarily higher than p_max ∼ β^(−1/2). When this restriction is imposed, it is found that the system approaches limiting mass values higher than the Chandrasekhar mass upon decreasing the parameter β to a value given by a legitimate upper bound. Instead, when the more realistic restriction due to inverse β-decay is considered, it is found that the mass and radius approach the values 1.4518 M⊙ and 601.18 km near the legitimate upper bound for the parameter β.
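One way to see why the minimal length restricts the high-momentum states: with the modified phase-space measure quoted above, a textbook-style count of the electron number density converges even as the Fermi momentum grows without bound (a standard calculation under the stated measure, not reproduced from the paper):

```latex
n(p_F) \;=\; \frac{8\pi}{h^{3}}\int_{0}^{p_F}\frac{p^{2}\,dp}{\bigl(1+\beta p^{2}\bigr)^{3}}
\;\xrightarrow{\;p_F\to\infty\;}\;
\frac{8\pi}{h^{3}}\cdot\frac{\pi}{16\,\beta^{3/2}}
\;=\;\frac{\pi^{2}}{2\,h^{3}\,\beta^{3/2}},
\qquad\text{via } u=\sqrt{\beta}\,p,\quad
\int_{0}^{\infty}\frac{u^{2}\,du}{(1+u^{2})^{3}}=\frac{\pi}{16}.
```

A finite maximum density of states is what makes a cutoff of order p_max ∼ β^(−1/2) on the central Fermi momentum natural.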
Impact of model uncertainty on soil quality standards for cadmium in rice paddy fields
Römkens, P.F.A.M.; Brus, D.J.; Guo, H.Y.; Chu, C.L.; Chiang, C.M.; Koopmans, G.F.
2011-01-01
At present, soil quality standards used for agriculture do not consider the influence of pH and CEC on the uptake of pollutants by crops. A database with 750 selected paired samples of cadmium (Cd) in soil and paddy rice was used to calibrate soil to plant transfer models using the soil metal
Quantum scattering in one-dimensional systems satisfying the minimal length uncertainty relation
Energy Technology Data Exchange (ETDEWEB)
Bernardo, Reginald Christian S., E-mail: rcbernardo@nip.upd.edu.ph; Esguerra, Jose Perico H., E-mail: jesguerra@nip.upd.edu.ph
2016-12-15
In quantum gravity theories, when the scattering energy is comparable to the Planck energy the Heisenberg uncertainty principle breaks down and is replaced by the minimal length uncertainty relation. In this paper, the consequences of the minimal length uncertainty relation on one-dimensional quantum scattering are studied using an approach involving a recently proposed second-order differential equation. An exact analytical expression for the tunneling probability through a locally-periodic rectangular potential barrier system is obtained. Results show that the existence of a non-zero minimal length uncertainty tends to shift the resonant tunneling energies to the positive direction. Scattering through a locally-periodic potential composed of double-rectangular potential barriers shows that the first band of resonant tunneling energies widens for minimal length cases when the double-rectangular potential barrier is symmetric but narrows down when the double-rectangular potential barrier is asymmetric. A numerical solution which exploits the use of Wronskians is used to calculate the transmission probabilities through the Pöschl–Teller well, Gaussian barrier, and double-Gaussian barrier. Results show that the probability of passage through the Pöschl–Teller well and Gaussian barrier is smaller in the minimal length cases compared to the non-minimal length case. For the double-Gaussian barrier, the probability of passage for energies that are more positive than the resonant tunneling energy is larger in the minimal length cases compared to the non-minimal length case. The approach is exact and applicable to many types of scattering potential.
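For orientation, the baseline against which such minimal-length results are compared is the ordinary (Heisenberg) transmission coefficient through a single rectangular barrier; the corrections studied in the paper shift the resonances relative to this benchmark. A minimal sketch of the standard formula for E < V₀, in illustrative units (ℏ = 1, m = 1/2):

```python
import numpy as np

def transmission(E, V0, a, hbar=1.0, m=0.5):
    """Transmission through one rectangular barrier of height V0 and width a
    in ordinary quantum mechanics (E < V0):
    T = [1 + V0² sinh²(κa) / (4E(V0 − E))]⁻¹, with κ = sqrt(2m(V0 − E))/ħ."""
    kappa = np.sqrt(2 * m * (V0 - E)) / hbar
    return 1.0 / (1.0 + (V0 ** 2 * np.sinh(kappa * a) ** 2) / (4 * E * (V0 - E)))

T = transmission(E=1.0, V0=4.0, a=1.0)
print(0.0 < T < 1.0)  # tunneling probability lies strictly between 0 and 1
```

Widening the barrier suppresses T exponentially through the sinh² term; minimal-length modifications enter by changing the wave equation inside and outside the barrier, shifting where the resonant (T → 1) energies fall.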
DEFF Research Database (Denmark)
Frutiger, Jerome; Marcarie, Camille; Abildskov, Jens
2016-01-01
…regression and outlier treatment have been applied to achieve high accuracy. Furthermore, linear error propagation based on the covariance matrix of the estimated parameters was performed. Therefore, every estimated property value of the flammability-related properties is reported together with its corresponding 95%-confidence interval of the prediction. Compared to existing models the developed ones have a higher accuracy, are simple to apply and provide uncertainty information on the calculated prediction. The average relative error and correlation coefficient are 11.5% and 0.99 for LFL, 15.9% and 0.91 for UFL, 2…
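The linear error propagation mentioned above follows the standard first-order rule var(f) ≈ J Σ Jᵀ, where J is the gradient of the property model with respect to the fitted parameters and Σ is their covariance matrix. A minimal sketch with a hypothetical two-parameter linear model (the numbers are illustrative, not the paper's group-contribution model):

```python
import numpy as np

# Assumed covariance of fitted parameters (a, b) of a toy model f(a, b) = a + b·x.
cov = np.array([[0.04, 0.01],
                [0.01, 0.09]])

x = 2.0
J = np.array([1.0, x])              # gradient (∂f/∂a, ∂f/∂b)
var_f = J @ cov @ J                 # first-order propagated variance
ci95 = 1.96 * np.sqrt(var_f)        # 95%-confidence half-width (Gaussian assumption)
print(round(ci95, 3))
```

The off-diagonal covariance term matters: ignoring the parameter correlation here would understate the propagated variance by 0.04.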
Uncertainty, joint uncertainty, and the quantum uncertainty principle
International Nuclear Information System (INIS)
Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad
2016-01-01
Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)
Characterisation of a reference site for quantifying uncertainties related to soil sampling
International Nuclear Information System (INIS)
Barbizzi, Sabrina; Zorzi, Paolo de; Belli, Maria; Pati, Alessandra; Sansone, Umberto; Stellato, Luisa; Barbina, Maria; Deluisa, Andrea; Menegon, Sandro; Coletti, Valter
2004-01-01
An integrated approach to quality assurance in soil sampling remains to be accomplished. The paper reports a methodology adopted to face problems related to quality assurance in soil sampling. The SOILSAMP project, funded by the Environmental Protection Agency of Italy (APAT), is aimed at (i) establishing protocols for soil sampling in different environments; (ii) assessing uncertainties associated with different soil sampling methods in order to select the 'fit-for-purpose' method; (iii) qualifying, in terms of trace element spatial variability, a reference site for national and international inter-comparison exercises. Preliminary results and considerations are illustrated
The angle-angular momentum and entropic uncertainty relations for quantum scattering
International Nuclear Information System (INIS)
Ion, D.B.; Ion, M.L.
1999-01-01
Recently the entropic uncertainty relations were obtained in a more general form by using Tsallis-like entropies for quantum scattering. Hence, using the Riesz theorem, the state-independent entropic angle-angular momentum uncertainty relations are proved for the Tsallis-like scattering entropies of spinless particles. The generalized entropic inequalities for the Tsallis-like entropies are presented. The two upper bounds are optimal bounds and can be obtained via Lagrange multipliers by extremizing the Tsallis-like entropies subject to the normalization constraints, respectively. The proof of the lower bound is provided by considering the condition that the angular distribution of probability, P(x), has, everywhere, a finite magnitude. Next, by using the Riesz theorem a general result was obtained, appearing as inequalities valid for the case of hadron-hadron scattering. An important entropic uncertainty relation for the scattering of spinless particles was thus obtained. For σ_el and dσ/dΩ fixed from experiment, we proved that the optimal scattering entropies are the maximum possible entropies in the scattering process. In a previous paper it was shown that the experimental values of the entropies for pion-nucleus scattering are systematically described by the optimal entropies at all available pion kinetic energies. In this sense the obtained results can also be considered as new experimental signatures for the validity of the principle of minimum distance in the space of scattering states. The extension of the optimal state analysis to the generalized non-extensive statistics case, as well as a test of the entropic inequalities, can be obtained in a similar way by using non-extensive optimal entropies. Since this kind of analysis is more involved, the numerical examples will be given in a following, more extended paper. Finally, we believe that the results obtained here are encouraging for further investigations of the entropic uncertainty relations as well
Adaptive relative pose control of spacecraft with model couplings and uncertainties
Sun, Liang; Zheng, Zewei
2018-02-01
The spacecraft pose tracking control problem for an uncertain pursuer approaching a space target is studied in this paper. After modeling the nonlinearly coupled dynamics of relative translational and rotational motions between two spacecraft, position tracking and attitude synchronization controllers are developed independently using a robust adaptive control approach. The unknown kinematic couplings, parametric uncertainties, and bounded external disturbances are handled with adaptive updating laws. It is proved via the Lyapunov method that the pose tracking errors converge to zero asymptotically. Spacecraft close-range rendezvous and proximity operations are introduced as an example to validate the effectiveness of the proposed control approach.
Radon contents in groundwater and the uncertainty related to risk assessment
Energy Technology Data Exchange (ETDEWEB)
Fukui, Masami [Kyoto Univ. (Japan)
1997-02-01
The United States has proposed 11 Bq/l (300 pCi/l) as the maximum contaminant level (MCL) for radon. Japan has not set up standards for drinking water. Problems with evaluating the effects of radon on organisms and with the MCLs of radon in groundwater and drinking water in 12 countries are reported. Some local areas contain high concentrations of radon, but generally low levels were observed in Nigeria, China and Mexico. Countries with high concentrations of radon were Greece, Slovakia, Bornholm Island and Scotland. There are both high- and low-concentration areas in the US and Japan. An uncertainty scheme for risk assessment of radon exposure is proposed. (S.Y.)
International Nuclear Information System (INIS)
Ankrum, A.R.; Bohlander, K.L.; Gilbert, E.R.; Pawlowski, R.A.; Spiesman, J.B.
1995-11-01
This report provides the results of comparisons of the cited and latest versions of ANSI standards cited in the NRC Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants (NUREG 0800) and related documents. The comparisons were performed by Battelle Pacific Northwest Laboratories in support of the NRC's Standard Review Plan Update and Development Program. Significant changes to the standards, from the cited version to the latest version, are described and discussed in a tabular format for each standard. Recommendations for updating each citation in the Standard Review Plan are presented. Technical considerations and suggested changes are included for related regulatory documents (i.e., Regulatory Guides and the Code of Federal Regulations) citing the standard. The results and recommendations presented in this document have not been subjected to NRC staff review
International Nuclear Information System (INIS)
Ankrum, A.R.; Bohlander, K.L.; Gilbert, E.R.; Pawlowski, R.A.; Spiesman, J.B.
1995-10-01
This report provides the results of comparisons of the cited and latest versions of ASTM standards cited in the NRC Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants (NUREG 0800) and related documents. The comparisons were performed by Battelle Pacific Northwest Laboratories in support of the NRC's Standard Review Plan Update and Development Program. Significant changes to the standards, from the cited version to the latest version, are described and discussed in a tabular format for each standard. Recommendations for updating each citation in the Standard Review Plan are presented. Technical considerations and suggested changes are included for related regulatory documents (i.e., Regulatory Guides and the Code of Federal Regulations) citing the standard. The results and recommendations presented in this document have not been subjected to NRC staff review
Sabouri, Sarah; Gerber, Markus; Lemola, Sakari; Becker, Stephen P; Shamsi, Mahin; Shakouri, Zeinab; Sadeghi Bahmani, Dena; Kalak, Nadeem; Holsboer-Trachsler, Edith; Brand, Serge
2016-07-01
The Dark Triad (DT) describes a set of three closely related personality traits: Machiavellianism, narcissism, and psychopathy. The aim of this study was to examine the associations between DT traits, sleep disturbances, anxiety sensitivity and intolerance of uncertainty. A total of 341 adults (M = 29 years) completed a series of questionnaires related to the DT traits, sleep disturbances, anxiety sensitivity, and intolerance of uncertainty. A higher DT total score was associated with increased sleep disturbances, and higher scores for anxiety sensitivity and intolerance of uncertainty. In regression analyses, Machiavellianism and psychopathy were predictors of sleep disturbances, anxiety sensitivity, and intolerance of uncertainty. Results indicate that specific DT traits, namely Machiavellianism and psychopathy, are associated with sleep disturbances, anxiety sensitivity and intolerance of uncertainty in young adults. Copyright © 2016 Elsevier Inc. All rights reserved.
Chen, Min-Nan; Sun, Wen-Yang; Huang, Ai-Jun; Ming, Fei; Wang, Dong; Ye, Liu
2018-01-01
In this work, we investigate the dynamics of quantum-memory-assisted entropic uncertainty relations in open systems, and how to steer the uncertainty under different types of decoherence. Specifically, we develop the dynamical behaviors of the uncertainty of interest under two typical categories of noise: bit-flipping and depolarizing channels. It is shown that the measurement uncertainty first increases and then decreases with the growth of the decoherence strength in bit-flipping channels. In contrast, the uncertainty monotonically increases with the decoherence strength in depolarizing channels. Notably, and to a large degree, it is shown that the uncertainty depends on both the systematic quantum correlation and the minimal conditional entropy of the observed subsystem. Moreover, we present a possible physical interpretation for these distinctive behaviors of the uncertainty within such scenarios. Furthermore, we propose a simple and effective strategy to reduce the entropic uncertainty by means of a partially collapsed operation: quantum weak measurement. Therefore, our investigations might offer an insight into the dynamics of the measurement uncertainty under decoherence, and be of importance to quantum precision measurement in open systems.
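The abstract does not quote the relation it tracks, but studies of this kind usually start from the quantum-memory-assisted entropic uncertainty relation in the standard form due to Berta et al. (2010); as a reference point, for measurements X and Z on subsystem A with quantum memory B, it reads:

```latex
% Quantum-memory-assisted entropic uncertainty relation (Berta et al., 2010).
% S(.|B) is conditional von Neumann entropy; |psi_x>, |phi_z> are eigenstates
% of the two incompatible observables X and Z.
S(X|B) + S(Z|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad c = \max_{x,z} \left| \langle \psi_x | \phi_z \rangle \right|^2
```

The negative conditional entropy term S(A|B) is what lets entanglement with the memory reduce the bound, which is why decoherence (degrading that entanglement) inflates the uncertainty.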
Yang, Ming; Zhu, X. Ronald; Park, Peter C.; Titt, Uwe; Mohan, Radhe; Virshup, Gary; Clayton, James E.; Dong, Lei
2012-07-01
The purpose of this study was to analyze factors affecting proton stopping-power-ratio (SPR) estimations and range uncertainties in proton therapy planning using the standard stoichiometric calibration. The SPR uncertainties were grouped into five categories according to their origins and then estimated based on previously published reports or measurements. For the first time, the impact of tissue composition variations on SPR estimation was assessed and the uncertainty estimates of each category were determined for low-density (lung), soft, and high-density (bone) tissues. A composite, 95th percentile water-equivalent-thickness uncertainty was calculated from multiple beam directions in 15 patients with various types of cancer undergoing proton therapy. The SPR uncertainties (1σ) were quite different (ranging from 1.6% to 5.0%) in different tissue groups, although the final combined uncertainty (95th percentile) for different treatment sites was fairly consistent at 3.0-3.4%, primarily because soft tissue is the dominant tissue type in the human body. The dominant contributing factor for uncertainties in soft tissues was the degeneracy of Hounsfield numbers in the presence of tissue composition variations. To reduce the overall uncertainties in SPR estimation, the use of dual-energy computed tomography is suggested. The values recommended in this study based on typical treatment sites and a small group of patients roughly agree with the commonly referenced value (3.5%) used for margin design. By using tissue-specific range uncertainties, one could estimate the beam-specific range margin by accounting for different types and amounts of tissues along a beam, which may allow for customization of range uncertainty for each beam direction.
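As a sketch of how per-category uncertainties of this kind are combined, assuming the categories are independent and normally distributed (the numbers below are placeholders, not the study's reported values):

```python
import math

def combined_sigma(category_sigmas):
    """Combine independent 1-sigma uncertainty components in quadrature."""
    return math.sqrt(sum(s * s for s in category_sigmas))

# Hypothetical per-category SPR uncertainties (%, 1-sigma) for one tissue
# group; the five values stand in for the five categories of the study.
tissue_sigmas = [1.0, 0.5, 1.2, 0.3, 0.8]
sigma = combined_sigma(tissue_sigmas)
p95 = 1.96 * sigma  # ~95th-percentile bound under a normality assumption
print(f"combined 1-sigma: {sigma:.3f}%, 95th percentile: {p95:.3f}%")
```

A composite water-equivalent-thickness uncertainty would then weight such tissue-group values by the amount of each tissue along a beam path.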
Covariant energy–momentum and an uncertainty principle for general relativity
Energy Technology Data Exchange (ETDEWEB)
Cooperstock, F.I., E-mail: cooperst@uvic.ca [Department of Physics and Astronomy, University of Victoria, P.O. Box 3055, Victoria, B.C. V8W 3P6 (Canada); Dupre, M.J., E-mail: mdupre@tulane.edu [Department of Mathematics, Tulane University, New Orleans, LA 70118 (United States)
2013-12-15
We introduce a naturally-defined totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. The extension links seamlessly to the action integral for the gravitational field. The demand that the general expression for arbitrary systems reduces to the Tolman integral in the case of stationary bounded distributions leads to the matter-localized Ricci integral for energy–momentum in support of the energy localization hypothesis. The role of the observer is addressed and, as an extension of the special relativistic case, the field of observers comoving with the matter is seen to compute the intrinsic global energy of a system. The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. It is suggested that in the extreme of strong gravity, the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum.
Highlights:
• We present a totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity.
• The demand that the general expression reduce to the Tolman integral for stationary systems supports the Ricci integral as energy–momentum.
• Localized energy via the Ricci integral is consistent with the energy localization hypothesis.
• The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving.
• We suggest the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum in the strong-gravity extreme.
Management of uncertainties related to renewable generation participation in electricity markets
International Nuclear Information System (INIS)
Bourry, Franck
2009-01-01
The operation of Renewable Energy Sources (RES) units, such as wind or solar plants, is intrinsically dependent on the variability of the wind or solar resource. This makes large scale integration of RES into power systems particularly challenging. The research work in the frame of this thesis focuses on the participation of renewable power producers in liberalized electricity markets, and more precisely on the management of the regulation costs incurred by the producer for any imbalance between the contracted and delivered energy. In such context, the main objective of the thesis is to model and evaluate different methods for the management of imbalance penalties related to the participation of renewable power producers in short-term electricity markets. First, the thesis gives a classification of the existing solutions for the management of these imbalance penalties. A distinction is made between physical solutions which are related to the generation portfolio, and financial solutions which are based on market products. The physical solutions are considered in the frame of a Virtual Power Plant. A generic model of the imbalance penalty resulting from the use of physical or financial solutions is formulated, based on a market rule model. Then, the decision-making problem relative to both physical and financial solutions is formulated as an optimization problem under uncertainty. The approach is based on a loss function derived from the generic imbalance penalty model. Finally, the uncertainty related to the RES production is considered in the risk-based decision making process. The methods are illustrated using case studies based on real world data. (author)
Archive of Census Related Products (ACRP): 1990 Standard Extract Files
National Aeronautics and Space Administration — The 1990 Standard Extract Files portion of the Archive of Census Related Products (ACRP) contains population and housing data derived from the U.S. Census Bureau's...
Wang, Dong; Huang, Aijun; Ming, Fei; Sun, Wenyang; Lu, Heping; Liu, Chengcheng; Ye, Liu
2017-06-01
The uncertainty principle provides a nontrivial bound on the precision of the outcomes of measurements on a pair of incompatible observables in a quantum system. It is therefore of essential importance for quantum precision measurement in the area of quantum information processing. Herein, we investigate the quantum-memory-assisted entropic uncertainty relation (QMA-EUR) in a two-qubit Heisenberg XYZ spin chain. Specifically, we observe the dynamics of the QMA-EUR in a realistic model in which two correlated sites are linked by a thermal entanglement in the spin chain with an inhomogeneous magnetic field. It turns out that the temperature, the external inhomogeneous magnetic field and the field inhomogeneity can lift the uncertainty of the measurement due to the reduction of the thermal entanglement; explicitly, higher temperature, a stronger magnetic field or larger inhomogeneity of the field results in inflation of the uncertainty. Besides, it is found that distinct dynamical behaviors of the uncertainty exist for ferromagnetic (J < 0) and antiferromagnetic (J > 0) chains. Moreover, we verify that the measurement uncertainty is dramatically anti-correlated with the purity of the bipartite spin system: greater purity results in a reduction of the measurement uncertainty, and vice versa. Therefore, our observations might provide a better understanding of the dynamics of the entropic uncertainty in the Heisenberg spin chain, and thus shed light on quantum precision measurement in versatile systems, particularly solid states.
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
There are a number of sources of uncertainty in regional climate change scenarios. When statistical downscaling is used to obtain regional climate change scenarios, the uncertainty may originate from the uncertainties in the global climate models used, the skill of the statistical model, and the forcing scenarios applied to the global climate model. The uncertainty associated with global climate models can be evaluated by examining the differences in the predictors and in the downscaled climate change scenarios based on a set of different global climate models. When standardized global climate model simulations such as the second phase of the Coupled Model Intercomparison Project (CMIP2) are used, the difference in the downscaled variables mainly reflects differences in the climate models and the natural variability in the simulated climates. It is proposed that the spread of the estimates can be taken as a measure of the uncertainty associated with global climate models. The proposed method is applied to the estimation of global-climate-model-related uncertainty in regional precipitation change scenarios in Sweden. Results from statistical downscaling based on 17 global climate models show that there is an overall increase in annual precipitation all over Sweden although a considerable spread of the changes in the precipitation exists. The general increase can be attributed to the increased large-scale precipitation and the enhanced westerly wind. The estimated uncertainty is nearly independent of region. However, there is a seasonal dependence. The estimates for winter show the highest level of confidence, while the estimates for summer show the least.
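The proposed spread-based uncertainty measure can be sketched as follows, with invented downscaled annual precipitation changes standing in for the 17-model ensemble of the study:

```python
import statistics

# Hypothetical downscaled annual precipitation changes (%) for one region,
# one value per global climate model (all values invented for illustration).
gcm_changes = [4.1, 7.3, 2.8, 9.0, 5.5, 6.2, 3.9, 8.4]

ensemble_mean = statistics.mean(gcm_changes)
# The spread of the estimates (here, the sample standard deviation) is taken
# as a measure of the GCM-related uncertainty, as proposed in the abstract.
spread = statistics.stdev(gcm_changes)
print(f"ensemble mean change: {ensemble_mean:.2f}%, GCM spread: {spread:.2f}%")
```

Note that with standardized forcing (as in CMIP2) this spread mixes genuine model differences with natural variability in the simulated climates, which is exactly the caveat the abstract raises.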
Managing uncertainty in multiple-criteria decision making related to sustainability assessment
DEFF Research Database (Denmark)
Dorini, Gianluca Fabio; Kapelan, Zoran; Azapagic, Adisa
2011-01-01
In real life, decisions are usually made by comparing different options with respect to several, often conflicting, criteria. This requires subjective judgements by decision makers (DMs) on the importance of different criteria and increases uncertainty in decision making. This article demonstrates how uncertainty can be handled in multi-criteria decision situations using Compromise Programming, one of the Multi-criteria Decision Analysis (MCDA) techniques. Uncertainty is characterised using a probabilistic approach and propagated using a Monte Carlo simulation technique. The methodological approach is illustrated for three situations: (1) no uncertainty, (2) uncertainty in data/models and (3) uncertainty in models and decision-makers' preferences. The results show how characterising and propagating uncertainty can help increase the effectiveness of multi-criteria decision making processes and lead to more informed decisions.
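A minimal sketch of propagating uncertainty through a Compromise Programming ranking by Monte Carlo simulation; the weights, criterion scores and noise model below are invented, not those of the article:

```python
import random

random.seed(42)

def cp_distance(scores, weights, p=2):
    """Compromise-programming distance (L_p) of an option's criteria scores
    from the ideal point; scores are normalised to [0, 1] with 1 = ideal,
    so lower distance is better."""
    return sum(w * (1.0 - s) ** p for s, w in zip(scores, weights)) ** (1.0 / p)

weights = [0.5, 0.3, 0.2]           # decision-maker preferences (assumed)
options = {                          # mean criterion scores (assumed)
    "A": [0.8, 0.6, 0.7],
    "B": [0.7, 0.9, 0.5],
}

# Monte Carlo propagation: perturb each score with uniform noise and count
# how often each option ends up closest to the ideal point.
wins = {name: 0 for name in options}
for _ in range(10_000):
    dists = {
        name: cp_distance([min(1.0, max(0.0, s + random.uniform(-0.1, 0.1)))
                           for s in scores], weights)
        for name, scores in options.items()
    }
    wins[min(dists, key=dists.get)] += 1

print(wins)
```

The resulting win frequencies show how robust a ranking is to the data uncertainty, rather than reporting a single deterministic best option.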
Directory of Open Access Journals (Sweden)
Haleigh A. Boswell
2015-12-01
Analysis of blood alcohol concentration (BAC) is a routine analysis performed in many forensic laboratories. This analysis commonly utilizes static headspace sampling, followed by gas chromatography combined with flame ionization detection (GC-FID). Studies have shown several “optimal” methods for instrumental operating conditions, which are intended to yield accurate and precise data. Given that different instruments, sampling methods, application-specific columns and parameters are often utilized, it is much less common to find information on the robustness of these reported conditions. A major problem can arise when these “optimal” conditions are not also robust, thus producing data with higher than desired uncertainty or potentially inaccurate results. The goal of this research was to incorporate the principles of quality by design (QBD) in the adjustment and determination of BAC instrumental headspace parameters, thereby ensuring that minor instrumental variations, which occur as a matter of normal work, do not appreciably affect the final results of this analysis. This study discusses both the QBD principles and the results of the experiments, which allow for determination of more favorable instrumental headspace conditions. Additionally, method detection limits will also be reported in order to determine a reporting threshold and the degree of uncertainty at the common threshold value of 0.08 g/dL. Furthermore, the comparison of two internal standards, n-propanol and t-butanol, will be investigated. The study showed that an altered parameter of 85 °C headspace oven temperature and 15 psi headspace vial pressurization produces the lowest percent relative standard deviation of 1.3% when t-butanol is implemented as an internal standard, at least for one very common platform. The study also showed that an altered parameter of 100 °C headspace oven temperature and 15 psi headspace vial pressurization
Entanglement criterion for tripartite systems based on local sum uncertainty relations
Akbari-Kourbolagh, Y.; Azhdargalam, M.
2018-04-01
We propose a sufficient criterion for the entanglement of tripartite systems based on local sum uncertainty relations for arbitrarily chosen observables of subsystems. This criterion generalizes the tighter criterion for bipartite systems introduced by Zhang et al. [C.-J. Zhang, H. Nha, Y.-S. Zhang, and G.-C. Guo, Phys. Rev. A 81, 012324 (2010), 10.1103/PhysRevA.81.012324] and can be used for both discrete- and continuous-variable systems. It enables us to detect the entanglement of quantum states without having a complete knowledge of them. Its utility is illustrated by some examples of three-qubit, qutrit-qutrit-qubit, and three-mode Gaussian states. It is found that, in comparison with other criteria, this criterion is able to detect some three-qubit bound entangled states more efficiently.
The small sample uncertainty aspect in relation to bullwhip effect measurement
DEFF Research Database (Denmark)
Nielsen, Erland Hejn
2009-01-01
The bullwhip effect as a concept has been known for almost half a century, starting with the Forrester effect. The bullwhip effect is observed in many supply chains, and it is generally accepted as a potential malice. Despite this, the bullwhip effect still seems to be first and foremost a conceptual phenomenon. This paper intends primarily to investigate why this might be so, and thereby to examine the various aspects, possibilities and obstacles that must be taken into account when considering the potential practical use and measurement of the bullwhip effect in order to actually get the supply chain under control. Special emphasis is put on the unavoidable small-sample uncertainty aspects relating to the measurement or estimation of the bullwhip effect.
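A common operational measure of the bullwhip effect is the ratio of order variance to demand variance; a minimal sketch with invented series (note that with only ten observations the estimate itself carries the considerable small-sample uncertainty the paper emphasises):

```python
import statistics

# Invented weekly demand seen by a supply-chain stage and the orders it
# places upstream. A bullwhip ratio above 1 means variance amplification.
demand = [100, 102, 98, 101, 99, 103, 97, 100, 102, 98]
orders = [100, 106, 92, 104, 96, 110, 88, 102, 106, 94]

bullwhip = statistics.variance(orders) / statistics.variance(demand)
print(f"bullwhip ratio: {bullwhip:.2f}")
```

Because both numerator and denominator are sample variances from a short series, a point estimate like this can be far from the true ratio, which is one obstacle to practical bullwhip measurement.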
Extreme Events in China under Climate Change: Uncertainty and related impacts (CSSP-FOREX)
Leckebusch, Gregor C.; Befort, Daniel J.; Hodges, Kevin I.
2016-04-01
Suitable adaptation strategies or the timely initiation of related mitigation efforts in East Asia will strongly depend on robust and comprehensive information about future near-term as well as long-term potential changes in the climate system. Therefore, understanding the driving mechanisms associated with the East Asian climate is of major importance. The FOREX project (Fostering Regional Decision Making by the Assessment of Uncertainties of Future Regional Extremes and their Linkage to Global Climate System Variability for China and East Asia) focuses on the investigation of extreme wind and rainfall related events over Eastern Asia and their possible future changes. Here, analyses focus on the link between local extreme events and their driving weather systems. This includes the coupling between local rainfall extremes and tropical cyclones, the Meiyu frontal system, extra-tropical teleconnections and monsoonal activity. Furthermore, the relation between these driving weather systems and large-scale variability modes, e.g. NAO, PDO, ENSO, is analysed. Thus, besides analysing future changes of local extreme events, the temporal variability of their driving weather systems and related large-scale variability modes will be assessed in current CMIP5 global model simulations to obtain more robust results. Beyond an overview of FOREX itself, first results regarding the link between local extremes and their steering weather systems based on observational and reanalysis data are shown. Special focus is laid on the contribution of monsoonal activity, tropical cyclones and the Meiyu frontal system to the inter-annual variability of the East Asian summer rainfall.
Servin, Christian
2015-01-01
On various examples ranging from geosciences to environmental sciences, this book explains how to generate an adequate description of uncertainty, how to justify semiheuristic algorithms for processing uncertainty, and how to make these algorithms more computationally efficient. It explains in what sense the existing approach to uncertainty as a combination of random and systematic components is only an approximation, presents a more adequate three-component model with an additional periodic error component, and explains how uncertainty propagation techniques can be extended to this model. The book provides a justification for a practically efficient heuristic technique (based on fuzzy decision-making). It explains how the computational complexity of uncertainty processing can be reduced. The book also shows how to take into account that in real life, the information about uncertainty is often only partially known, and, on several practical examples, explains how to extract the missing information about uncer...
Mezzasalma, Stefano A
2007-03-15
The theoretical basis of a recent theory of Brownian relativity for polymer solutions is deepened and reexamined. After the problem of relative diffusion in polymer solutions is addressed, its two postulates are formulated in all generality. The former builds a statistical equivalence between (uncorrelated) timelike and shapelike reference frames, that is, among dynamical trajectories of liquid molecules and static configurations of polymer chains. The latter defines the "diffusive horizon" as the invariant quantity to work with in the special version of the theory. Particularly, the concept of universality in polymer physics corresponds in Brownian relativity to that of covariance in the Einstein formulation. Here, a "universal" law consists of a privileged observation, performed from the laboratory rest frame and agreeing with any diffusive reference system. From the joint lack of covariance and simultaneity implied by the Brownian Lorentz-Poincaré transforms, a relative uncertainty arises, in a certain analogy with quantum mechanics. It is driven by the difference between local diffusion coefficients in the liquid solution. The same transformation class can be used to infer Fick's second law of diffusion, playing here the role of a gauge invariance preserving covariance of the spacetime increments. An overall, noteworthy conclusion emerging from this view concerns the statistics of (i) static macromolecular configurations and (ii) the motion of liquid molecules, which would be much more related than expected.
Quantifying uncertainties of climate signals related to the 11-year solar cycle
Kruschke, T.; Kunze, M.; Matthes, K. B.; Langematz, U.; Wahl, S.
2017-12-01
Although state-of-the-art reconstructions based on proxies and (semi-)empirical models converge in terms of total solar irradiance, they still significantly differ in terms of spectral solar irradiance (SSI) with respect to the mean spectral distribution of energy input and temporal variability. This study aims at quantifying uncertainties for the Earth's climate related to the 11-year solar cycle by forcing two chemistry-climate models (CCMs) - CESM1(WACCM) and EMAC - with five different SSI reconstructions (NRLSSI1, NRLSSI2, SATIRE-T, SATIRE-S, CMIP6-SSI) and the reference spectrum RSSV1-ATLAS3, derived from observations. We conduct a unique set of timeslice experiments. External forcings and boundary conditions are fixed and identical for all experiments, except for the solar forcing. The set of analyzed simulations consists of one solar minimum simulation, employing RSSV1-ATLAS3, and five solar maximum experiments. The latter are a result of adding the amplitude of solar cycle 22 according to the five reconstructions to RSSV1-ATLAS3. Our results show that the climate response to the 11-year solar cycle is generally robust across CCMs and SSI forcings. However, analyzing the variance of the solar maximum ensemble by means of ANOVA statistics reveals additional information on the uncertainties of the mean climate signals. The annual mean response agrees very well between the two CCMs for most parts of the lower and middle atmosphere. Only the upper mesosphere is subject to significant differences related to the choice of the model. However, the different SSI forcings lead to significant differences in ozone concentrations, shortwave heating rates, and temperature throughout large parts of the mesosphere and upper stratosphere. Regarding the seasonal evolution of the climate signals, our findings for shortwave heating rates and temperature are similar to the annual means with respect to the relative importance of the choice of the model or the SSI forcing for the
Badawy, B.; Fletcher, C. G.
2017-12-01
The parameterization of snow processes in land surface models is an important source of uncertainty in climate simulations. Quantifying the importance of snow-related parameters, and their uncertainties, may therefore lead to better understanding and quantification of uncertainty within integrated earth system models. However, quantifying the uncertainty arising from parameterized snow processes is challenging due to the high-dimensional parameter space, poor observational constraints, and parameter interaction. In this study, we investigate the sensitivity of the land simulation to uncertainty in snow microphysical parameters in the Canadian LAnd Surface Scheme (CLASS) using an uncertainty quantification (UQ) approach. A set of training cases (n=400) from CLASS is used to sample each parameter across its full range of empirical uncertainty, as determined from available observations and expert elicitation. A statistical learning model using support vector regression (SVR) is then constructed from the training data (CLASS output variables) to efficiently emulate the dynamical CLASS simulations over a much larger (n=220) set of cases. This approach is used to constrain the plausible range for each parameter using a skill score, and to identify the parameters with the largest influence on the land simulation in CLASS at global and regional scales, using a random forest (RF) permutation importance algorithm. Preliminary sensitivity tests indicate that the snow albedo refreshment threshold and the limiting snow depth, below which bare patches begin to appear, have the highest impact on snow output variables. The results also show a considerable reduction of the plausible ranges of the parameter values, and hence of their uncertainty ranges, which can lead to a significant reduction of the model uncertainty. The implementation and results of this study will be presented and discussed in detail.
Uncertainty enabled Sensor Observation Services
Cornford, Dan; Williams, Matthew; Bastin, Lucy
2010-05-01
Almost all observations of reality are contaminated with errors, which introduce uncertainties into the actual observation result. Such uncertainty is often held to be a data quality issue, and quantification of this uncertainty is essential for the principled exploitation of the observations. Many existing systems treat data quality in a relatively ad hoc manner; however, if the observation uncertainty is a reliable estimate of the error on the observation with respect to reality, then knowledge of this uncertainty enables optimal exploitation of the observations in further processes or decision making. We would argue that the most natural formalism for expressing uncertainty is Bayesian probability theory. In this work we show how the Open Geospatial Consortium Sensor Observation Service can be implemented to enable the support of explicit uncertainty about observations. We show how the UncertML candidate standard is used to provide a rich and flexible representation of uncertainty in this context. We illustrate this on a data set of user-contributed weather data where the INTAMAP interpolation Web Processing Service is used to help estimate the uncertainty on the observations of unknown quality, using observations with known uncertainty properties. We then go on to discuss the implications of uncertainty for a range of existing Open Geospatial Consortium standards including SWE Common and Observations and Measurements. We discuss the difficult decisions in the design of the UncertML schema and its relation and usage within existing standards and show various options. We conclude with some indications of the likely future directions for UncertML in the context of Open Geospatial Consortium services.
Energy Technology Data Exchange (ETDEWEB)
Panka, Istvan; Hegyi, Gyoergy; Maraczy, Csaba; Temesvari, Emese [Hungarian Academy of Sciences, Budapest (Hungary). Reactor Analysis Dept.
2017-11-15
The best-estimate KARATE code system has been widely used for core design calculations and simulations of slow transients of VVER reactors. Recently there has been an increasing need for assessing the uncertainties of such calculations by propagating the basic input uncertainties of the models through the full calculation chain. In order to determine the uncertainties of quantities of interest during the burnup, the statistical version of the KARATE code system has been elaborated. In the first part of the paper, the main features of the new code system are discussed. The applied statistical method is based on Monte-Carlo sampling of the considered input data taking into account mainly the covariance matrices of the cross sections and/or the technological uncertainties. In the second part of the paper, only the uncertainties of cross sections are considered and an equilibrium cycle related to a VVER-440 type reactor is investigated. The burnup dependence of the uncertainties of some safety related parameters (e.g. critical boron concentration, rod worth, feedback coefficients, assembly-wise radial power and burnup distribution) are discussed and compared to the recently used limits.
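A minimal sketch of the Monte Carlo sampling approach described above, with a toy two-parameter model standing in for the full KARATE calculation chain (the parameter values, uncertainties and model are all assumed for illustration):

```python
import random
import statistics

random.seed(1)

def model(sigma_a, sigma_f):
    """Toy stand-in for the calculation chain: maps two 'cross sections'
    to a single quantity of interest."""
    return sigma_f / (sigma_a + sigma_f)

# Sample the inputs from assumed normal distributions (here uncorrelated;
# the real method samples from the full covariance matrix of the cross
# sections) and propagate each sample through the model.
samples = []
for _ in range(5000):
    sa = random.gauss(0.10, 0.10 * 0.02)   # assumed 2% relative uncertainty
    sf = random.gauss(0.05, 0.05 * 0.03)   # assumed 3% relative uncertainty
    samples.append(model(sa, sf))

mean = statistics.mean(samples)
rel_unc = statistics.stdev(samples) / mean
print(f"output mean: {mean:.4f}, relative uncertainty: {100 * rel_unc:.2f}%")
```

The same sample statistics can be collected at each burnup step to obtain the burnup dependence of the output uncertainties.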
Uncertainties in relation to CO2 capture and sequestration. Preliminary results. Working Paper
International Nuclear Information System (INIS)
Gielen, D.
2003-03-01
This paper has been presented at an expert meeting on CO2 capture technology learning at the IEA headquarters, January 24th, 2003. The electricity sector is a key source of CO2 emissions and a strong increase of emissions is forecast in a business-as-usual scenario. A range of strategies have been proposed to reduce these emissions. This paper focuses on one of the promising strategies, CO2 capture and storage. The future role of CO2 capture in the electricity sector has been assessed, using the Energy Technology Perspectives model (ETP). Technology data have been collected and reviewed in cooperation with the IEA Greenhouse Gas R and D implementing agreement and other expert groups. CO2 capture and sequestration is based on relatively new technology. Therefore, its characteristics and its future role in the energy system is subject to uncertainties, as for any new technology. The analysis suggests that the choice of a reference electricity production technology and the characteristics of the CO2 storage option constitute the two main uncertainties, apart from a large number of other factors of lesser importance. Based on the choices made cost estimates can range from less than zero USD for coal fired power plants to more than 150 USD per ton of CO2 for gas fired power plants. The results suggest that learning effects are important, but they do not affect the CO2 capture costs significantly, other uncertainties dominate the cost estimates. The ETP model analysis, where choices are based on the ideal market hypothesis and rational price based decision making, suggest up to 18% of total global electricity production will be equipped with CO2 capture by 2040, in case of a penalty of 50 US$ per ton of CO2. However this high penetration is only achieved in case coal fired IGCC-SOFC power plants are developed successfully. Without such technology only a limited amount of CO2 is captured from gas fired power plants. Higher penalties may result in a higher share of CO2
Uncertainty Forecasts Improve Weather-Related Decisions and Attenuate the Effects of Forecast Error
Joslyn, Susan L.; LeClerc, Jared E.
2012-01-01
Although uncertainty is inherent in weather forecasts, explicit numeric uncertainty estimates are rarely included in public forecasts for fear that they will be misunderstood. Of particular concern are situations in which precautionary action is required at low probabilities, often the case with severe events. At present, a categorical weather…
Shen, Mingxi; Chen, Jie; Zhuan, Meijia; Chen, Hua; Xu, Chong-Yu; Xiong, Lihua
2018-01-01
Uncertainty estimation of climate change impacts on hydrology has received much attention in the research community. The choice of a global climate model (GCM) is usually considered the largest contributor to the uncertainty of climate change impacts. The temporal variation of GCM uncertainty needs to be investigated for making long-term decisions to deal with climate change. Accordingly, this study investigated the temporal variation (mainly long-term) of uncertainty related to the choice of a GCM in predicting climate change impacts on hydrology by using multiple GCMs over multiple continuous future periods. Specifically, twenty CMIP5 GCMs under RCP4.5 and RCP8.5 emission scenarios were adopted to adequately represent this uncertainty envelope, and fifty-one 30-year future periods moving from 2021 to 2100 at 1-year intervals were produced to express the temporal variation. Future climatic and hydrological regimes over all future periods were compared to those in the reference period (1971-2000) using a set of metrics, including mean and extremes. The periodicity of climatic and hydrological changes and their uncertainty were analyzed using wavelet analysis, while the trend was analyzed using the Mann-Kendall trend test and regression analysis. The results showed that both future climate change (precipitation and temperature) and the hydrological response predicted by the twenty GCMs were highly uncertain, and the uncertainty increased significantly over time. For example, the change of mean annual precipitation increased from 1.4% in 2021-2050 to 6.5% in 2071-2100 for RCP4.5 in terms of the median value of multi-models, but the projected uncertainty reached 21.7% in 2021-2050 and 25.1% in 2071-2100 for RCP4.5. The uncertainty under a high emission scenario (RCP8.5) was much larger than that under a relatively low emission scenario (RCP4.5). Almost all climatic and hydrological regimes and their uncertainty did not show significant periodicity at the P = .05 significance level.
Carcioppolo, Nick; Yang, Fan; Yang, Qinghua
2016-09-01
Uncertainty is a central characteristic of many aspects of cancer prevention, screening, diagnosis, and treatment. Brashers's (2001) uncertainty management theory details the multifaceted nature of uncertainty and describes situations in which uncertainty can both positively and negatively affect health outcomes. The current study extends theory on uncertainty management by developing four scale measures of uncertainty preferences in the context of cancer. Two national surveys were conducted to validate the scales and assess convergent and concurrent validity. Results support the factor structure of each measure and provide general support across multiple validity assessments. These scales can advance research on uncertainty and cancer communication by providing researchers with measures that address multiple aspects of uncertainty management.
DEFF Research Database (Denmark)
Heydorn, Kaj; Anglov, Thomas
2002-01-01
Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...
International Nuclear Information System (INIS)
Yu Watanabe; Masahito Ueda
2012-01-01
Full text: When we try to obtain information about a quantum system, we need to perform measurement on the system. The measurement process causes unavoidable state change. Heisenberg discussed a thought experiment of the position measurement of a particle by using a gamma-ray microscope, and found a trade-off relation between the error of the measured position and the disturbance in the momentum caused by the measurement process. The trade-off relation epitomizes the complementarity in quantum measurements: we cannot perform a measurement of an observable without causing disturbance in its canonically conjugate observable. However, at the time Heisenberg found the complementarity, quantum measurement theory was not established yet, and Kennard and Robertson's inequality was erroneously interpreted as a mathematical formulation of the complementarity. Kennard and Robertson's inequality actually implies the indeterminacy of the quantum state: non-commuting observables cannot have definite values simultaneously. However, Kennard and Robertson's inequality reflects the inherent nature of a quantum state alone, and does not concern any trade-off relation between the error and disturbance in the measurement process. In this talk, we report a resolution to the complementarity in quantum measurements. First, we find that it is necessary to involve the estimation process from the outcome of the measurement for quantifying the error and disturbance in the quantum measurement. We clarify the implicitly involved estimation process in Heisenberg's gamma-ray microscope and other measurement schemes, and formulate the error and disturbance for an arbitrary quantum measurement by using quantum estimation theory. The error and disturbance are defined in terms of the Fisher information, which gives the upper bound of the accuracy of the estimation. Second, we obtain uncertainty relations between the measurement errors of two observables [1], and between the error and disturbance in the
Present status of standards relating to radiation control and protection
International Nuclear Information System (INIS)
Minami, Kentaro
1996-01-01
Japanese and international standards related to radiation control and radiation protective management are presented, focusing on their forming conditions, significance, current situation, and mutual relationships. Japanese Industrial Standards (JIS) are quite useful in the field of atomic energy, as in other fields, in terms of optimization and rationalization of management. JIS includes JIS Z 4001 Atomic Energy Terminology, which corresponds to the international standard ISO 921 Nuclear Glossary, and JIS Z 4005 Medical Radiation Terminology, covering about 500 articles, which corresponds to IEC 788 Medical Radiology-Terminology. The first standard regarding radiation protection, for the X-ray film badge (a type of personal dosimeter), was established in 1956. Currently, 36 JIS standards have been established in the field of radiation management dosimeters and 3 are under preparation. As for radiation protective supplies, 9 JIS standards have been established so far. Before proposal of a JIS, investigations had been conducted to improve, simplify, and standardize radiation dosimetric techniques, dosimeters, and dosimetric procedures. In this article, the results of material surface contamination monitoring and body surface monitoring conducted by the Atomic Energy Safety Association and the Radiation Dosimetry Association are reported, and ISO and IEC standards are also treated. (S.Y.)
Mazziotti, David A.; Erdahl, Robert M.
2001-04-01
For the description of ground-state correlation phenomena an accurate mapping of many-body quantum mechanics onto four particles is developed. The energy for a quantum system with no more than two-particle interactions may be expressed in terms of a two-particle reduced density matrix (2-RDM), but variational optimization of the 2-RDM requires that it corresponds to an N-particle wave function. We derive N-representability conditions on the 2-RDM that guarantee the validity of the uncertainty relations for all operators with two-particle interactions. One of these conditions is shown to be necessary and sufficient to make the RDM solutions of the dispersion condition equivalent to those from the contracted Schrödinger equation (CSE) [Mazziotti, Phys. Rev. A 57, 4219 (1998)]. In general, the CSE is a stronger N-representability condition than the dispersion condition because the CSE implies the dispersion condition as well as additional N-representability constraints from the Hellmann-Feynman theorem. Energy minimization subject to the representability constraints is performed for a boson model with 10, 30, and 75 particles. Even when traditional wave-function methods fail at large perturbations, the present method yields correlation energies within 2%.
International Nuclear Information System (INIS)
Amitabh, J.; Vaccaro, J.A.; Hill, K.E.
1998-01-01
We study the recently defined number-phase Wigner function S_NP(n,θ) for a single-mode field considered to be in binomial and negative binomial states. These states interpolate between Fock and coherent states and between coherent and quasi-thermal states, respectively, and thus provide a set of states with properties ranging from uncertain phase and sharp photon number to sharp phase and uncertain photon number. The distribution function S_NP(n,θ) gives a graphical representation of the complementary nature of the number and phase properties of these states. We highlight important differences between Wigner's quasi-probability function, which is associated with the position and momentum observables, and S_NP(n,θ), which is associated directly with the photon number and phase observables. We also discuss the number-phase entropic uncertainty relation for the binomial and negative binomial states, and we show that negative binomial states give a lower phase entropy than states which minimize the phase variance.
Computer-related standards for the petroleum industry
International Nuclear Information System (INIS)
Winczewski, L.M.
1992-01-01
Rapid application of the computer to all areas of the petroleum industry is straining the capabilities of corporations and vendors to efficiently integrate computer tools into the work environment. Barriers to this integration arose from decades of competitive development of proprietary application formats, along with compilation of databases in isolation. Rapidly emerging industry-wide standards relating to computer applications and data management are poised to topple these barriers. This paper identifies the most active players within a rapidly evolving group of cooperative standardization activities sponsored by the petroleum industry. Summarized are their objectives, achievements, current activities, and relationships to each other. The trends of these activities are assessed and projected.
DeWeber, Jefferson T.; Wagner, Tyler
2018-01-01
Predictions of the projected changes in species distributions and potential adaptation action benefits can help guide conservation actions. There is substantial uncertainty in projecting species distributions into an unknown future, however, which can undermine confidence in predictions or misdirect conservation actions if not properly considered. Recent studies have shown that the selection of alternative climate metrics describing very different climatic aspects (e.g., mean air temperature vs. mean precipitation) can be a substantial source of projection uncertainty. It is unclear, however, how much projection uncertainty might stem from selecting among highly correlated, ecologically similar climate metrics (e.g., maximum temperature in July, maximum 30‐day temperature) describing the same climatic aspect (e.g., maximum temperatures) known to limit a species’ distribution. It is also unclear how projection uncertainty might propagate into predictions of the potential benefits of adaptation actions that might lessen climate change effects. We provide probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty stemming from the selection of four maximum temperature metrics for brook trout (Salvelinus fontinalis), a cold‐water salmonid of conservation concern in the eastern United States. Projected losses in suitable stream length varied by as much as 20% among alternative maximum temperature metrics for mid‐century climate projections, which was similar to variation among three climate models. Similarly, the regional average predicted increase in brook trout occurrence probability under an adaptation action scenario of full riparian forest restoration varied by as much as .2 among metrics. Our use of Bayesian inference provides probabilistic measures of vulnerability and adaptation action benefits for individual stream reaches that properly address statistical uncertainty and can help guide conservation
Relations between the technological standards and technological appropriation
Directory of Open Access Journals (Sweden)
Carlos Alberto PRADO GUERRERO
2010-06-01
The objective of this study is to analyze educational practices of using Blackboard in blended learning environments with students in higher education, in order to understand the relationship between technological appropriation and educational technology standards. To achieve that goal, the following research question was raised: To what extent are educational technology standards related to the appropriation of technology in blended learning environments in higher education? The contextual framework of this work includes the following topics: the institution, teaching, teachers, and students. A correlational design was used: correlations were carried out to determine the frequency and level of the technological standards as well as of the appropriation of technology. In comparing the results obtained from the students, the teachers, and the platform, we found that students in the school under study showed a high degree of technological appropriation, and the same was true for their performance on the technological standards. It was established that teachers play a key role in developing students' technological appropriation and their performance on technology standards.
Li, Xi-Zeng; Su, Bao-Xia
1996-01-01
It is found that the field of the combined mode of the probe wave and the phase-conjugate wave in the process of non-degenerate four-wave mixing exhibits higher-order squeezing to all even orders. The generalized uncertainty relations for this process are also presented.
Analysing public relations education through international standards: The Portuguese case
Gonçalves, Gisela Marques Pereira; Spínola, Susana de Carvalho; Padamo, Celma
2013-01-01
By using international reports on PR education as a benchmark, we analyse the status of PR higher education in Portugal. Despite differences among the study programs, the findings reveal that the standard five-course recommendation by the Commission on Public Relations Education (CPRE) is part of the Portuguese undergraduate curriculum. This includes 12 of the 14 content field guidelines needed to achieve the ideal master's program. The data show, however, the difficulty of positioning public rel...
Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul
2012-11-26
The aim of this work is to develop group-contribution(+) (GC(+)) method based property models (combining the group-contribution (GC) method and the atom connectivity index (CI) method) to provide reliable estimations of environment-related properties of organic chemicals together with uncertainties of the estimated property values. For this purpose, a systematic methodology for property modeling and uncertainty analysis is used. The methodology includes a parameter estimation step to determine the parameters of the property models and an uncertainty analysis step to establish statistical information about the quality of the parameter estimation, such as the parameter covariance, the standard errors in predicted properties, and the confidence intervals. For parameter estimation, large data sets of experimentally measured property values for a wide range of chemicals (hydrocarbons, oxygenated chemicals, nitrogenated chemicals, polyfunctional chemicals, etc.), taken from the database of the US Environmental Protection Agency (EPA) and from the USEtox database, are used. For property modeling and uncertainty analysis, the Marrero and Gani GC method and the atom connectivity index method have been considered. In total, 22 environment-related properties have been modeled and analyzed, including the fathead minnow 96-h LC(50), Daphnia magna 48-h LC(50), oral rat LD(50), aqueous solubility, bioconcentration factor, permissible exposure limit (OSHA-TWA), photochemical oxidation potential, global warming potential, ozone depletion potential, acidification potential, and emissions (carcinogenic and noncarcinogenic) to urban air, continental rural air, continental fresh water, continental seawater, continental natural soil, and continental agricultural soil. The application
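The core of a first-order group-contribution model, and the way a parameter covariance turns into a standard error on a prediction, can be sketched as follows. The group values and variances below are purely illustrative (not the Marrero-Gani parameters), and the covariance is assumed diagonal for simplicity:

```python
import math

# Hypothetical group-contribution table: property = sum over groups of
# n_i * c_i, and with a diagonal parameter covariance the prediction
# variance is var(P) = sum n_i^2 * var(c_i).
contributions = {"CH3": 2.226, "CH2": 1.045}   # illustrative contribution values
variances     = {"CH3": 0.010, "CH2": 0.004}   # illustrative parameter variances

def estimate_property(groups):
    """Return (estimate, standard error) for a molecule given its group counts."""
    est = sum(n * contributions[g] for g, n in groups.items())
    var = sum(n**2 * variances[g] for g, n in groups.items())
    return est, math.sqrt(var)

# n-butane decomposed as CH3-CH2-CH2-CH3: two CH3 and two CH2 groups.
value, se = estimate_property({"CH3": 2, "CH2": 2})
```

A 95% confidence interval then follows as roughly `value ± 1.96 * se`; the paper's methodology additionally estimates the full (non-diagonal) covariance from the regression.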
Assessment of the impact of sampler change on the uncertainty related to geothermal water sampling
Wątor, Katarzyna; Mika, Anna; Sekuła, Klaudia; Kmiecik, Ewa
2018-02-01
The aim of this study is to assess the impact of a change of samplers on the uncertainty associated with the process of geothermal water sampling. The study was carried out on geothermal water exploited in the Podhale region, southern Poland (Małopolska province). To estimate the uncertainty associated with sampling, the results of determinations of metasilicic acid (H2SiO3) in normal and duplicate samples collected in two series were used (in each series the samples were collected by a qualified sampler). Chemical analyses were performed using the ICP-OES method in the certified Hydrogeochemical Laboratory of the Hydrogeology and Engineering Geology Department at the AGH University of Science and Technology in Krakow (Certificate of Polish Centre for Accreditation No. AB 1050). To evaluate the uncertainty arising from sampling, an empirical approach was implemented, based on double analysis of normal and duplicate samples taken from the same well in each series of testing. The results were analyzed using ROBAN software, based on the robust-statistics analysis of variance (rANOVA) technique. The research showed that, in the case of qualified and experienced samplers, the uncertainty connected with sampling can be reduced, which results in a small measurement uncertainty.
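The duplicate-sample approach can be illustrated with classical (non-robust) statistics; the study itself uses robust ANOVA (rANOVA), which resists outliers, and the concentrations below are invented for illustration:

```python
import math

def sampling_sd(duplicate_pairs):
    """Classical estimate of the within-target standard deviation
    (sampling + analysis combined) from normal/duplicate pairs:
    s^2 = mean(d^2) / 2, where d is the difference within each pair."""
    n = len(duplicate_pairs)
    mean_sq_diff = sum((a - b) ** 2 for a, b in duplicate_pairs) / n
    return math.sqrt(mean_sq_diff / 2.0)

# Illustrative H2SiO3 results (mg/L): (normal, duplicate) per sampling event.
pairs = [(55.2, 54.8), (60.1, 59.7), (57.4, 57.0)]
s = sampling_sd(pairs)
```

Relative to a mean concentration near 57 mg/L, this standard deviation would correspond to a measurement uncertainty well under 1%, which is the kind of small figure the abstract reports for experienced samplers.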
Casola, J. H.; Huber, D.
2013-12-01
Many media, academic, government, and advocacy organizations have achieved sophistication in developing effective messages based on scientific information, and can quickly translate salient aspects of emerging climate research and evolving observations. However, there are several ways in which valid messages can be misconstrued by decision makers, leading them to inaccurate conclusions about the risks associated with climate impacts. Three cases will be discussed: 1) Issues of spatial scale in interpreting climate observations: Local climate observations may contradict summary statements about the effects of climate change on larger regional or global spatial scales. Effectively addressing these differences often requires communicators to understand local and regional climate drivers, and the distinction between a 'signal' associated with climate change and local climate 'noise.' Hydrological statistics in Missouri and California are shown to illustrate this case. 2) Issues of complexity related to extreme events: Climate change is typically invoked following a wide range of damaging meteorological events (e.g., heat waves, landfalling hurricanes, tornadoes), regardless of the strength of the relationship between anthropogenic climate change and the frequency or severity of that type of event. Examples are drawn from media coverage of several recent events, contrasting useful and potentially confusing word choices and frames. 3) Issues revolving around climate sensitivity: The so-called 'pause' or 'hiatus' in global warming has reverberated strongly through political and business discussions of climate change. Addressing the recent slowdown in warming yields an important opportunity to raise climate literacy in these communities. Attempts to use recent observations as a wedge between climate 'believers' and 'deniers' are likely to be counterproductive. Examples are drawn from Congressional testimony and media stories. All three cases illustrate ways that decision
Okubo, Sho; Nakayama, Hirotaka; Iwakuni, Kana; Inaba, Hajime; Sasada, Hiroyuki
2011-11-21
We determine the absolute frequencies of 56 rotation-vibration transitions of the ν3 band of CH4 from 88.2 to 90.5 THz with a typical uncertainty of 2 kHz, corresponding to a relative uncertainty of 2.2 × 10^-11 over an averaging time of a few hundred seconds. Saturated absorption lines are observed using a difference-frequency-generation source and a cavity-enhanced absorption cell, and the transition frequencies are measured with a fiber-laser-based optical frequency comb referenced to a rubidium atomic clock linked to International Atomic Time. The determined value of the P(7) F2(2) line is consistent with the International Committee for Weights and Measures recommendation within the uncertainty. © 2011 Optical Society of America
Relating Standardized Visual Perception Measures to Simulator Visual System Performance
Kaiser, Mary K.; Sweet, Barbara T.
2013-01-01
Human vision is quantified through the use of standardized clinical vision measurements. These measurements typically include visual acuity (near and far), contrast sensitivity, color vision, stereopsis (a.k.a. stereo acuity), and visual field periphery. Simulator visual system performance is specified in terms such as brightness, contrast, color depth, color gamut, gamma, resolution, and field-of-view. How do these simulator performance characteristics relate to the perceptual experience of the pilot in the simulator? In this paper, visual acuity and contrast sensitivity will be related to simulator visual system resolution, contrast, and dynamic range; similarly, color vision will be related to color depth/color gamut. Finally, we will consider how some characteristics of human vision not typically included in current clinical assessments could be used to better inform simulator requirements (e.g., relating dynamic characteristics of human vision to update rate and other temporal display characteristics).
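The link between visual acuity and display resolution sketched in this abstract reduces to a pixels-per-degree comparison: 20/20 acuity resolves about 1 arcminute (roughly 30 cycles/degree), and Nyquist sampling needs two pixels per cycle. A small sketch under those standard assumptions (the channel geometry is hypothetical):

```python
def required_pixels_per_degree(decimal_acuity=1.0):
    """20/20 (decimal 1.0) vision resolves detail of about 1 arcmin,
    i.e. ~30 cycles/degree; Nyquist sampling needs 2 pixels per cycle."""
    cycles_per_degree = 30.0 * decimal_acuity
    return 2.0 * cycles_per_degree

def display_pixels_per_degree(h_pixels, h_fov_deg):
    """Average horizontal pixel density of a simulator display channel
    (ignores off-axis geometric distortion for simplicity)."""
    return h_pixels / h_fov_deg

# A hypothetical 4K-wide channel spanning 60 degrees of field-of-view.
ppd = display_pixels_per_degree(3840, 60.0)
needed = required_pixels_per_degree(1.0)
```

Here the channel delivers 64 px/deg against the ~60 px/deg needed to match 20/20 acuity, so this hypothetical display would just meet the acuity requirement; real specifications would also consider contrast, luminance, and temporal characteristics.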
The Evolution of Classroom Physics Knowledge in Relation to Certainty and Uncertainty
Tiberghien, Andrée; Cross, David; Sensevy, Gérard
2014-01-01
This paper deals with the joint construction of knowledge by the teacher and the students in a physics classroom. It is focused on the status of epistemic certainty/uncertainty of knowledge. The same element of knowledge can be introduced as possible and thus uncertain and then evolve towards a status of epistemic certainty; the status of other…
Gauge-independent scales related to the Standard Model vacuum instability
International Nuclear Information System (INIS)
Espinosa, J.R.; Garny, M.; Konstandin, T.; Riotto, A.
2016-08-01
The measured (central) values of the Higgs and top quark masses indicate that the Standard Model (SM) effective potential develops an instability at high field values. The scale of this instability, determined as the Higgs field value at which the potential drops below the electroweak minimum, is about 10^11 GeV. However, such a scale is unphysical as it is not gauge-invariant and suffers from a gauge-fixing uncertainty of up to two orders of magnitude. Subjecting our system, the SM, to several probes of the instability (adding higher order operators to the potential; letting the vacuum decay through critical bubbles; heating up the system to very high temperature; inflating it) and asking in each case physical questions, we are able to provide several gauge-invariant scales related with the Higgs potential instability.
Gauge-Independent Scales Related to the Standard Model Vacuum Instability
Espinosa, Jose R.; Konstandin, Thomas; Riotto, Antonio
2017-01-01
The measured (central) values of the Higgs and top quark masses indicate that the Standard Model (SM) effective potential develops an instability at high field values. The scale of this instability, determined as the Higgs field value at which the potential drops below the electroweak minimum, is about $10^{11}$ GeV. However, such a scale is unphysical as it is not gauge-invariant and suffers from a gauge-fixing uncertainty of up to two orders of magnitude. Subjecting our system, the SM, to several probes of the instability (adding higher order operators to the potential; letting the vacuum decay through critical bubbles; heating up the system to very high temperature; inflating it) and asking in each case physical questions, we are able to provide several gauge-invariant scales related with the Higgs potential instability.
Directory of Open Access Journals (Sweden)
George Maldonado
2009-09-01
In a follow-up study of mortality among North American synthetic rubber industry workers, cumulative exposure to 1,3-butadiene was positively associated with leukemia. Problems with historical exposure estimation, however, may have distorted the association. To evaluate the impact of potential inaccuracies in exposure estimation, we conducted uncertainty analyses of the relation between cumulative exposure to butadiene and leukemia. We created 1,000 sets of butadiene estimates using job-exposure matrices consisting of exposure values that corresponded to randomly selected percentiles of the approximate probability distribution of plant-, work area/job group-, and year-specific butadiene ppm. We then analyzed the relation between cumulative exposure to butadiene and leukemia for each of the 1,000 sets of butadiene estimates. In the uncertainty analysis, the point estimate of the RR for the first non-zero exposure category (>0–<37.5 ppm-years) was most likely to be about 1.5. The rate ratio for the second exposure category (37.5–<184.7 ppm-years) was most likely to range from 1.5 to 1.8. The RR for the third exposure category (184.7–<425.0 ppm-years) was most likely between 2.1 and 3.0. The RR for the highest exposure category (425.0+ ppm-years) was likely to be between 2.9 and 3.7. This range of RR point estimates can best be interpreted as a probability distribution that describes our uncertainty in RR point estimates due to uncertainty in exposure estimation. After considering the complete probability distributions of butadiene exposure estimates, the exposure-response association of butadiene and leukemia was maintained. This exercise was a unique example of how uncertainty analyses can be used to investigate and support an observed measure of effect when occupational exposure estimates are employed in the absence of direct exposure measurements.
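The Monte Carlo scheme behind the 1,000 exposure sets can be sketched as repeatedly redrawing a job-exposure matrix from per-cell percentile distributions. The matrix cells, percentile values, and worker history below are invented for illustration:

```python
import random
import statistics

random.seed(1)  # fixed seed so the sketch is reproducible

# Hypothetical job-exposure matrix: each (work area, year) cell stores
# percentiles of the approximate butadiene (ppm) distribution rather
# than a single fixed estimate.
jem = {
    ("polymerization", 1970): [0.5, 1.2, 2.0, 3.5, 6.0],  # illustrative ppm percentiles
    ("maintenance",    1970): [0.2, 0.6, 1.1, 2.3, 4.0],
}

def one_realization():
    """Draw one complete exposure matrix by picking a random percentile per cell."""
    return {cell: random.choice(vals) for cell, vals in jem.items()}

# 1,000 alternative matrices -> 1,000 cumulative-exposure estimates (ppm-years)
# for a worker with 5 years in each job; their spread reflects the
# uncertainty due to exposure estimation.
cumulative = [
    5 * m[("polymerization", 1970)] + 5 * m[("maintenance", 1970)]
    for m in (one_realization() for _ in range(1000))
]
spread = statistics.stdev(cumulative)
```

In the study, each of the 1,000 realizations feeds a full exposure-response analysis, so the output is a distribution of RR point estimates rather than of cumulative exposures alone.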
International Nuclear Information System (INIS)
Sakai, Ryutaro; Munakata, Masahiro; Ohoka, Masao; Kameya, Hiroshi
2009-11-01
In the safety assessment for geological disposal of radioactive waste, it is important to develop a methodology for long-term estimation of regional groundwater flow, from data acquisition to numerical analyses. The uncertainties associated with estimation of regional groundwater flow include those concerning parameters and those concerning the hydrogeological evolution. The parameter uncertainties include measurement errors and parameter heterogeneity. The authors discussed the uncertainties of hydraulic conductivity as a significant parameter for regional groundwater flow analysis. This study suggests that hydraulic conductivities of rock mass are controlled by rock characteristics such as fractures and porosity and by test conditions such as hydraulic gradient, water quality, and water temperature, and that hydraulic conductivity can vary by more than a factor of ten due to differences in test conditions (such as hydraulic gradient) or in rock characteristics (such as fractures and porosity). In addition, this study demonstrated that confining pressure changes caused by uplift and subsidence, and changes of hydraulic gradient under the long-term evolution of the hydrogeological environment, could produce variations of more than a factor of ten in hydraulic conductivity. It was also shown that the effect of water quality change on hydraulic conductivity is not negligible and that the replacement of fresh water and saline water caused by sea level change could reduce current hydraulic conductivities to about 0.6 times their value in the case of the Horonobe site. (author)
Concept for an International Standard related to Space Weather Effects on Space Systems
Tobiska, W. Kent; Tomky, Alyssa
There is great interest in developing an international standard related to space weather in order to specify the tools and parameters needed for space systems operations. In particular, a standard is important for satellite operators who may not be familiar with space weather. In addition, there are others who participate in space systems operations that would also benefit from such a document. For example, the developers of software systems that provide LEO satellite orbit determination, radio communication availability for scintillation events (GEO-to-ground L and UHF bands), GPS uncertainties, and the radiation environment from ground-to-space for commercial space tourism. These groups require recent historical data, current epoch specification, and forecast of space weather events into their automated or manual systems. Other examples are national government agencies that rely on space weather data provided by their organizations such as those represented in the International Space Environment Service (ISES) group of 14 national agencies. Designers, manufacturers, and launchers of space systems require real-time, operational space weather parameters that can be measured, monitored, or built into automated systems. Thus, a broad scope for the document will provide a useful international standard product to a variety of engineering and science domains. The structure of the document should contain a well-defined scope, consensus space weather terms and definitions, and internationally accepted descriptions of the main elements of space weather, its sources, and its effects upon space systems. Appendices will be useful for describing expanded material such as guidelines on how to use the standard, how to obtain specific space weather parameters, and short but detailed descriptions such as when best to use some parameters and not others; appendices provide a path for easily updating the standard since the domain of space weather is rapidly changing with new advances
Uncertainty in the area-related QPF for heavy convective precipitation
Czech Academy of Sciences Publication Activity Database
Řezáčová, Daniela; Zacharov, Petr, jr.; Sokol, Zbyněk
2009-01-01
Roč. 93, 1-3 (2009), s. 238-246 ISSN 0169-8095. [European Conference on Severe Storms /4./. Miramare -Trieste, 10.09.2007-14.09.2007] R&D Projects: GA ČR GA205/07/0905; GA MŠk OC 112 Institutional research plan: CEZ:AV0Z30420517 Keywords: Convective storm * Quantitative precipitation forecast * Uncertainty in precipitation forecasting * Ensemble forecasting * Numerical weather prediction model Subject RIV: DG - Atmosphere Sciences, Meteorology Impact factor: 1.811, year: 2009 http://www.sciencedirect.com/science/journal/01698095
Standard guide for corrosion-related failure analysis
American Society for Testing and Materials. Philadelphia
2000-01-01
1.1 This guide covers key issues to be considered when examining metallic failures when corrosion is suspected as either a major or minor causative factor. 1.2 Corrosion-related failures could include one or more of the following: change in surface appearance (for example, tarnish, rust, color change), pin hole leak, catastrophic structural failure (for example, collapse, explosive rupture, implosive rupture, cracking), weld failure, loss of electrical continuity, and loss of functionality (for example, seizure, galling, spalling, swelling). 1.3 Issues covered include overall failure site conditions, operating conditions at the time of failure, history of equipment and its operation, corrosion product sampling, environmental sampling, metallurgical and electrochemical factors, morphology (mode) of failure, and, by considering the preceding, deducing the cause(s) of corrosion failure. This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibili...
High-voltage measurements on the 5 ppm relative uncertainty level with collinear laser spectroscopy
Krämer, J.; König, K.; Geppert, Ch; Imgram, P.; Maaß, B.; Meisner, J.; Otten, E. W.; Passon, S.; Ratajczyk, T.; Ullmann, J.; Nörtershäuser, W.
2018-04-01
We present the results of high-voltage collinear laser spectroscopy measurements at the 5 ppm relative uncertainty level, using a pump-and-probe scheme on the 4s ²S₁/₂ → 4p ²P₃/₂ transition of ⁴⁰Ca⁺ involving the 3d ²D₅/₂ metastable state. With two-stage laser interaction and a reference measurement we can eliminate systematic effects such as differences in the contact potentials due to different electrode materials and thermoelectric voltages, and the unknown starting potential of the ions in the ion source. Voltage measurements were performed between -5 kV and -19 kV, and parallel measurements with stable high-voltage dividers calibrated to 5 ppm relative uncertainty were used as a reference. Our measurements are compatible with the uncertainty limits of the high-voltage dividers and demonstrate an unprecedented (factor of 20) increase in the precision of direct laser-based high-voltage measurements.
New entropic uncertainty relations and tests of PMD-SQS-optimal limits in pion-nucleus scattering
International Nuclear Information System (INIS)
Ion, D.B.; Ion, M.L.
2002-01-01
In this paper we define a new kind of quantum entropy, namely the nonextensivity-conjugated entropy S̄_Jθ(p,q). We then prove the optimal nonextensivity-conjugated entropic uncertainty relations (ONC-EUR) as well as the optimal nonextensivity-conjugated entropic uncertainty bands (ONC-EUB). The results of the first experimental test of ONC-EUB in pion-nucleus scattering, obtained by using 49 sets of experimental phase-shift analyses, are presented. Strong evidence for the saturation of the PMD-SQS optimum limit is obtained with high accuracy (confidence level > 99%) for the nonextensivities 1/2 ≤ p ≤ 2/3 and q = p/(2p-1). (authors)
The Scalable Coherent Interface and related standards projects
International Nuclear Information System (INIS)
Gustavson, D.B.
1991-09-01
The Scalable Coherent Interface (SCI) project (IEEE P1596) found a way to avoid the limits that are inherent in bus technology. SCI provides bus-like services by transmitting packets on a collection of point-to-point unidirectional links. The SCI protocols support cache coherence in a distributed-shared-memory multiprocessor model, message passing, I/O, and local-area-network-like communication over fiber optic or wire links. VLSI circuits that operate parallel links at 1000 MByte/s and serial links at 1000 Mbit/s will be available early in 1992. Several ongoing SCI-related projects are applying the SCI technology to new areas or extending it to more difficult problems. P1596.1 defines the architecture of a bridge between SCI and VME; P1596.2 compatibly extends the cache coherence mechanism for efficient operation with kiloprocessor systems; P1596.3 defines new low-voltage (about 0.25 V) differential signals suitable for low power interfaces for CMOS or GaAs VLSI implementations of SCI; P1596.4 defines a high performance memory chip interface using these signals; P1596.5 defines data transfer formats for efficient interprocessor communication in heterogeneous multiprocessor systems. This paper reports the current status of SCI, related standards, and new projects. 16 refs
Post, W. M.; Dale, V. H.; DeAngelis, D. L.; Mann, L. K.; Mulholland, P. J.; O`Neill, R. V.; Peng, T. -H.; Farrell, M. P.
1990-02-01
The global carbon cycle is the dynamic interaction among the earth's carbon sources and sinks. Four reservoirs can be identified: the atmosphere, terrestrial biosphere, oceans, and sediments. Atmospheric CO₂ concentration is determined by the characteristics of carbon fluxes among the major reservoirs of the global carbon cycle. The objective of this paper is to document the knowns, unknowns, and uncertainties associated with key questions that, if answered, will increase understanding of the portion of past, present, and future atmospheric CO₂ attributable to fossil fuel burning. Documented increases in atmospheric CO₂ levels are thought to result primarily from fossil fuel use and, perhaps, deforestation. However, the observed atmospheric CO₂ increase is less than expected from current understanding of the global carbon cycle because of poorly understood interactions among the major carbon reservoirs.
Fox, Jesse; Anderegg, Courtney
2014-11-01
Due to their pervasiveness and unique affordances, social media play a distinct role in the development of modern romantic relationships. This study examines how a social networking site is used for information seeking about a potential or current romantic partner. In a survey, Facebook users (N=517) were presented with Facebook behaviors categorized as passive (e.g., reading a partner's profile), active (e.g., "friending" a common third party), or interactive (e.g., commenting on the partner's wall) uncertainty reduction strategies. Participants reported how normative they perceived these behaviors to be during four possible stages of relationship development (before meeting face-to-face, after meeting face-to-face, casual dating, and exclusive dating). Results indicated that as relationships progress, perceived norms for these behaviors change. Sex differences were also observed, as women perceived passive and interactive strategies as more normative than men during certain relationship stages.
Directory of Open Access Journals (Sweden)
Mélanie Trudel
2017-03-01
Low flow is the flow of water in a river during prolonged dry weather. This paper investigated the uncertainty originating from hydrological model calibration and structure in low-flow simulations under climate change conditions. Two hydrological models of contrasting complexity, GR4J and SWAT, were applied to four sub-watersheds of the Yamaska River, Canada. The two models were calibrated using seven different objective functions, including the Nash-Sutcliffe coefficient (NSEQ) and six other objective functions more related to low flows. The uncertainty in the model parameters was evaluated using a PARAmeter SOLutions procedure (PARASOL). Twelve climate projections from different combinations of General Circulation Models (GCMs) and Regional Circulation Models (RCMs) were used to simulate low-flow indices in a reference (1970-2000) and future (2040-2070) horizon. Results indicate that the NSEQ objective function does not properly represent low-flow indices for either model. The NSE objective function applied to the log of the flows shows the lowest total variance for all sub-watersheds. In addition, these hydrological models should be used with care for low-flow studies, since both show some inconsistent results. The uncertainty is higher for SWAT than for GR4J. With GR4J, the uncertainties in the simulations of the 7Q2 index (the 7-day low-flow value with a 2-year return period) are lower for the future period than for the reference period; this can be explained by analysis of the hydrological processes. In the future horizon, a significant worsening of low-flow conditions was projected.
Duffy, P.; Keller, M. M.; Morton, D. C.
2016-12-01
Carbon accounting for REDD+ requires knowledge of deforestation, degradation, and associated changes in forest carbon stocks. Degradation is more difficult to detect than deforestation, so SilvaCarbon, a US inter-agency effort, has made it a priority to better characterize the effects of forest degradation on carbon loss. By combining information from forest inventory and lidar data products, the impacts of deforestation, degradation, and associated changes in forest carbon stocks can be characterized more accurately across space. Our approach employs a hierarchical Bayesian modeling (HBM) framework in which the assimilation of information from multiple sources is accomplished using a change-of-support (COS) technique. The COS formulation allows data from multiple spatial resolutions to be assimilated at an intermediate resolution. This approach is being applied in Paragominas, a jurisdiction in the eastern Brazilian Amazon with a high proportion of logged and burned degraded forests, where political change has opened the way for REDD+. We build on a long history of research, including our extensive studies of logging damage. Our primary objective is to quantify above-ground carbon stocks and the corresponding uncertainty in a spatially explicit manner. A secondary objective is to quantify the relative contribution of lower-level data products to the overall uncertainty, allowing for more focused subsequent data collection in the context of uncertainty reduction. This approach provides a mechanism to assimilate information from multiple sources to produce spatially explicit maps of carbon stocks and changes with corresponding spatially explicit maps of uncertainty. Importantly, it also provides a mechanism that can be used to assess the value of information from specific data products.
Binary trading relations and the limits of EDI standards
DEFF Research Database (Denmark)
Damsgaard, Jan; Truex, D.
2000-01-01
This paper provides a critical examination of electronic data interchange (EDI) standards and their application in different types of trading relationships. It argues that EDI standards are not directly comparable to more stable sets of technical standards in that they are dynamically tested...... and negotiated in use with each trading exchange. It takes the position that EDI standards are an emergent language form and must mean different things at the institutional and local levels. Using the lens of emergent linguistic analysis it shows how the institutional and local levels must always be distinct...... and yet can coexist. EDI standards can never represent the creation of an 'Esperanto of institutional communication'. Instead we believe that standards must be developed such that they support and accommodate general basic grammatical forms that can be customised to individual needs. The analysis...
Changes in IEC standards related to diagnostic radiology
International Nuclear Information System (INIS)
Porubszky, T.; Barsai, J.
2007-01-01
Complete text of publication follows. Purposes: Technical Committee TC62 of the International Electrotechnical Commission (IEC) deals with medical electrical equipment (i.e. medical devices using electricity). Standardization concerning diagnostic radiology equipment is the task of its Sub-Committee SC62B. An overview of its activities and present situation, especially of radiation protection aspects, is given. Materials and methods: The third edition of the basic safety standard for medical electrical equipment, IEC 60601-1, was issued in 2005. Elaboration of new collateral and particular standards, applicable together with it, is in progress. These standards are generally also European (EN) and national standards. Radiation protection is of great importance in diagnostic X-ray equipment. The collateral standard on it, IEC 60601-1-3, was first issued in 1994. The rapid development of imaging technology demands updating of its requirements. In 2003 SC62B founded a maintenance team, MT37, to prepare the second edition of this standard. According to the new safety philosophy of the IEC, all modality-specific requirements are to be collected in 'safety and essential performance' particular standards. A new working group, WG42, founded in 2006, is elaborating a new particular standard, IEC 60601-2-54, for radiographic and radioscopic equipment. Maintenance team MT32 deals with safety and performance standards for X-ray tube assemblies. The authors actively participate in these activities. Results and discussion: The present and future system of IEC diagnostic radiology standards and some interesting details are presented. Conclusions: International standards, although not 'obligatory', are generally the basis of safety and performance certification of diagnostic radiology equipment and often also of its quality assurance.
International Nuclear Information System (INIS)
Mizokami, Shinya; Hotta, Akitoshi; Kudo, Yoshiro; Yonehara, Tadashi; Watada, Masayuki; Sakaba, Hiroshi
2009-01-01
Current licensing practice in Japan consists of using conservative boundary and initial conditions (BIC), assumptions, and analytical codes. The safety analyses for licensing purposes are inherently deterministic; therefore, conservative BIC and assumptions, such as single failure, must be employed for the analyses. However, using conservative analytical codes is not considered essential. The standard committee of the Atomic Energy Society of Japan (AESJ) drew up the standard for using best-estimate codes for safety analyses in 2008, after three years of discussion reflecting recent domestic and international findings. (author)
Energy Technology Data Exchange (ETDEWEB)
Uslar, Mathias; Specht, Michael; Daenekas, Christian; Trefke, Joern; Rohjans, Sebastian; Gonzalez, Jose M.; Rosinger, Christine; Bleiker, Robert [OFFIS - Institut fuer Informatik, Oldenburg (Germany)
2013-03-01
Introduction to standardization for Smart Grids; presents a tutorial and best practices from Smart Grid prototype projects; written by leading experts in the field. Besides the regulatory and market aspects, the technical level, dealing with knowledge from multiple disciplines and with the aspects of technical system integration needed to achieve interoperability, has been a strong focus in the Smart Grid. This topic is typically addressed by means of (technical) standards for processes, data models, functions and communication links. Standardization is a key issue for Smart Grids due to the involvement of many different sectors along the value chain, from generation to appliances. The scope of the Smart Grid is broad; the standards landscape is therefore unfortunately very large and complex. This is why the three European Standards Organizations ETSI, CEN and CENELEC created a so-called Joint Working Group (JWG). This was the first harmonized effort in Europe to bring together the needed disciplines and experts, delivering its final report in May 2011. After this approach proved useful, the Commission used Mandate M/490 (Standardization Mandate to European Standardization Organizations (ESOs)) to support European Smart Grid deployment. The focal point addressing the ESOs' response to M/490 is the CEN, CENELEC and ETSI Smart Grids Coordination Group (SG-CG). Based on this mandate, meaningful standardization of architectures, use cases, communication technologies, data models and security standards takes place in the four existing working groups. This book provides an overview of the various building blocks and standards identified as the most prominent ones by the JWG report as well as by the first set of standards: IEC 61850 and CIM, IEC PAS 62559 for documenting Smart Grid use cases, security requirements from the SGIS group, and an introduction on how to apply the Smart Grid Architecture Model (SGAM) for utilities. In addition
International Nuclear Information System (INIS)
Ankrum, A.R.; Bohlander, K.L.; Gilbert, E.R.; Spiesman, J.B.
1995-11-01
This report provides the results of comparisons of the cited and latest versions of ANS, ASME, AWS and NFPA standards cited in the NRC Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants (NUREG 0800) and related documents. The comparisons were performed by Battelle Pacific Northwest Laboratories in support of the NRC's Standard Review Plan Update and Development Program. Significant changes to the standards, from the cited version to the latest version, are described and discussed in a tabular format for each standard. Recommendations for updating each citation in the Standard Review Plan are presented. Technical considerations and suggested changes are included for related regulatory documents (i.e., Regulatory Guides and the Code of Federal Regulations) citing the standard. The results and recommendations presented in this document have not been subjected to NRC staff review
Sanchez-Vila, X.; de Barros, F.; Bolster, D.; Nowak, W.
2010-12-01
Assessing the potential risk of hydro(geo)logical supply systems to human populations is an interdisciplinary field. It relies on expertise in fields as distant as hydrogeology, medicine, and anthropology, and needs powerful translation concepts to provide decision support and policy making. Reliable health risk estimates need to account for the uncertainties in hydrological, physiological and human behavioral parameters. We propose the use of fault trees to address the task of probabilistic risk analysis (PRA) and to support related management decisions. Fault trees allow decomposing the assessment of health risk into individually manageable modules, thus tackling a complex system by a structural “Divide and Conquer” approach. The complexity within each module can be chosen individually according to data availability, parsimony, relative importance and stage of analysis. The separation into modules allows for a truly inter- and multi-disciplinary approach. This presentation highlights the three novel features of our work: (1) we define failure in terms of risk being above a threshold value, whereas previous studies used auxiliary events such as exceedance of critical concentration levels; (2) we plot an integrated fault tree that handles uncertainty in both the hydrological and health components in a unified way; and (3) we introduce a new form of stochastic fault tree that allows us to weaken the assumption of independent subsystems that is required by the classical fault tree approach. We illustrate our concept in a simple groundwater-related setting.
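The module decomposition described in this abstract can be made concrete with a few lines of code. The sketch below shows how a classical fault tree combines module failure probabilities through OR and AND gates under the independence assumption that the stochastic variant mentioned above relaxes; the event names and probability values are invented for illustration and are not from the study.

```python
# Sketch: composing failure probabilities in a fault tree under the
# classical independence assumption (event names and values are
# illustrative only, not from the study).

def or_gate(probs):
    """Top event occurs if ANY input event occurs (independent inputs)."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

def and_gate(probs):
    """Top event occurs only if ALL input events occur (independent inputs)."""
    p_all = 1.0
    for q in probs:
        p_all *= q
    return p_all

# Hypothetical modules: risk exceeds the threshold if the contaminant
# reaches the well AND (exposure occurs OR metabolism is insufficient).
p_reach = 0.05
p_exposure = 0.30
p_low_metabolism = 0.10

p_top = and_gate([p_reach, or_gate([p_exposure, p_low_metabolism])])
print(round(p_top, 4))
```

Defining failure directly as "risk above a threshold", as the abstract proposes, only changes what the leaf events mean; the gate algebra stays the same.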
Energy Technology Data Exchange (ETDEWEB)
Cacais, F.L.; Delgado, J.U., E-mail: facacais@gmail.com [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Loayza, V.M. [Instituto Nacional de Metrologia (INMETRO), Rio de Janeiro, RJ (Brazil). Qualidade e Tecnologia
2016-07-01
In preparing solutions for the production of radionuclide metrology standards, it is necessary to measure the quantity activity by mass. The gravimetric method by elimination is applied to perform weighings with smaller uncertainties. In this work, the uncertainty calculation approach implemented by Lourenco and Bobin according to the ISO GUM for the elimination method is validated by the Monte Carlo method. The results obtained by the two uncertainty calculation methods were consistent, indicating that the conditions for the application of the ISO GUM in the preparation of radioactive standards were fulfilled. (author)
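The validation strategy this abstract describes, comparing a GUM first-order uncertainty budget against a Monte Carlo simulation in the spirit of GUM Supplement 1, can be sketched on a toy weighing-by-difference model. The model, numbers, and sample size below are illustrative assumptions, not the actual procedure of Lourenco and Bobin.

```python
# Sketch: checking a GUM-style uncertainty budget with Monte Carlo.
# Toy model (assumed): mass by elimination, m = m_before - m_after,
# each weighing with standard uncertainty u (grams).

import math
import random

m_before, m_after, u = 105.23, 5.11, 0.0002  # illustrative values

# GUM first-order propagation for m = m_before - m_after:
u_gum = math.sqrt(u**2 + u**2)

# Monte Carlo propagation: sample the inputs, inspect the output spread.
random.seed(1)
n = 200_000
samples = [random.gauss(m_before, u) - random.gauss(m_after, u)
           for _ in range(n)]
mean = sum(samples) / n
u_mc = math.sqrt(sum((s - mean) ** 2 for s in samples) / (n - 1))

# For this linear model the two estimates should agree closely,
# which is the kind of consistency the validation looks for.
print(abs(u_mc - u_gum) / u_gum < 0.02)
```

For a linear model the agreement is guaranteed; the point of a Monte Carlo validation is that it remains trustworthy when the measurement model is nonlinear and the first-order GUM approximation might not be.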
International Nuclear Information System (INIS)
Pickles, W.L.; McClure, J.W.; Howell, R.H.
1978-01-01
A sophisticated nonlinear multiparameter fitting program was used to produce a best-fit calibration curve for the response of an x-ray fluorescence analyzer to uranium nitrate, freeze-dried, 0.2%-accurate, gravimetric standards. The program is based on the unconstrained minimization subroutine VA02A. The program treats the mass values of the gravimetric standards as parameters to be fit along with the normal calibration curve parameters. The fitting procedure weights both the system errors and the mass errors in a consistent way. The resulting best-fit calibration curve parameters reflect the fact that the masses of the standard samples are measured quantities with a known error. Error estimates for the calibration curve parameters can be obtained from the curvature of the 'chi-squared matrix' or from error relaxation techniques. It was shown that nondispersive XRFA of 0.1 to 1 mg of freeze-dried UNO₃ can have an accuracy of 0.2% in 1000 s. 5 figures
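A much-simplified version of the idea above, fitting a calibration curve while acknowledging that the standard masses themselves carry a known error, can be sketched with an effective-variance weighted fit that folds the mass error into the weights iteratively. This is a generic stand-in, not the VA02A-based program of the paper; the straight-line model and the data are invented.

```python
# Sketch: weighted calibration fit y = a * m that treats the standard
# masses m as measured quantities with known error, via the
# "effective variance" iteration. Synthetic data, not the paper's.

def fit_slope(m, y, sig_m, sig_y, iters=20):
    """Iteratively reweighted slope for y = a*m with errors in both axes."""
    # Start from the unweighted through-origin slope.
    a = sum(yi * mi for mi, yi in zip(m, y)) / sum(mi * mi for mi in m)
    for _ in range(iters):
        # Effective variance: response error plus mass error scaled by slope.
        w = [1.0 / (sy**2 + (a * sm) ** 2) for sm, sy in zip(sig_m, sig_y)]
        a = (sum(wi * mi * yi for wi, mi, yi in zip(w, m, y))
             / sum(wi * mi * mi for wi, mi in zip(w, m)))
    return a

# Synthetic standards: masses (mg), detector response, 0.2% mass errors.
m = [0.1, 0.25, 0.5, 0.75, 1.0]
y = [10.1, 24.8, 50.3, 74.6, 100.2]
sig_m = [0.002 * mi for mi in m]
sig_y = [0.3] * len(y)

a = fit_slope(m, y, sig_m, sig_y)
print(round(a, 2))
```

The paper's approach goes further by promoting each standard mass to a fitted parameter constrained by its known uncertainty, but the weighting principle, every error source entering the chi-squared consistently, is the same.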
[Possible relation between clinical guidelines and legal standard of medicine].
Furukawa, Toshiharu; Kitagawa, Yuko
2010-10-01
The legal standard of medicine is not uniform across all medical institutions. Each medical institution is held to its respective standard of medicine, under which its doctors are expected to have studied the medical information that has spread among institutions with similar characteristics. Therefore, in principle, clinical guidelines for the treatment of a disease formulated by public committees do not directly become the medical standard for treating that disease. However, doctors may be legally required to practice medicine with reference to clinical guidelines, because medical information, mediated by the internet and many other kinds of media, now spreads very quickly to all medical institutions. Moreover, doctors may be required to inform their patients of non-standardized new treatments, even if such treatments are not listed in clinical guidelines, in cases where patients have special concern about new treatments.
International Nuclear Information System (INIS)
Zou, Hong-Mei; Fang, Mao-Fa; Yang, Bai-Yuan; Guo, You-Neng; He, Wei; Zhang, Shi-Yang
2014-01-01
The quantum entropic uncertainty relation and entanglement witness in a two-atom system coupled to non-Markovian environments are studied using the time-convolutionless master-equation approach. The influence of the non-Markovian effect and of detuning on the lower bound of the quantum entropic uncertainty relation and on the entanglement witness is discussed in detail. The results show that, only if the two non-Markovian reservoirs are identical, increasing the detuning and the non-Markovian effect can reduce the lower bound of the entropic uncertainty relation, lengthen the time region during which the entanglement can be witnessed, and effectively protect the entanglement region witnessed by the lower bound of the entropic uncertainty relation. The results can be applied in quantum measurement, quantum cryptography tasks and quantum information processing. (paper)
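For reference, the memory-assisted entropic uncertainty relation that lower-bound analyses of this kind typically build on (assuming the Berta-type formulation, which the abstract does not name explicitly) reads:

```latex
% Entropic uncertainty relation with quantum memory B, for observables
% Q and R measured on system A (memory-assisted, Berta-type form):
H(Q \mid B) + H(R \mid B) \;\geq\; \log_2 \frac{1}{c} + S(A \mid B),
\qquad c = \max_{i,j} \bigl|\langle \psi_i \vert \phi_j \rangle\bigr|^2 ,
```

where |ψ_i⟩ and |φ_j⟩ are the eigenstates of Q and R. A negative conditional entropy S(A|B) signals entanglement and lowers the bound, which is why the lower bound of the relation can serve as an entanglement witness, as in the abstract above.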
Directory of Open Access Journals (Sweden)
Kleist David
2018-04-01
The Multilateral Convention to Implement Tax Treaty Related Measures to Prevent Base Erosion and Profit Shifting (MLI), which was signed in June 2017, raises a multitude of questions relating not only to the text of the treaty provisions but also to the way the MLI will interact with tax treaties, for instance, and what it will mean for the future development of tax treaty law and international cooperation in tax matters. This article focuses on two aspects of the MLI. First, it deals with the substance of the MLI by providing an overview of its background and content, including the many options available to the contracting states under the MLI. Second, some thoughts are presented on the effects of the MLI in terms of complexity and uncertainty.
Peixoto, J. G. P.; de Almeida, C. E.
2001-09-01
It is recognized by international guidelines that it is necessary to offer calibration services for mammography beams in order to improve the quality of clinical diagnosis. Major efforts have been made by several laboratories to establish an appropriate and traceable calibration infrastructure and to provide the basis for a quality control programme in mammography. The contribution of the radiation metrology network to the users of mammography is reviewed in this work, and the steps required for the implementation of a mammography calibration system using a constant-potential x-ray unit and a clinical mammography x-ray machine are presented. The various mammography radiation qualities discussed in this work are in accordance with IEC 61674 and the AAPM recommendations. They are at present available at several primary standard dosimetry laboratories (PSDLs), namely the PTB, NIST and BEV, and at a few secondary standard dosimetry laboratories (SSDLs), such as the University of Wisconsin and the IAEA's SSDL. We discuss the uncertainties involved in all steps of the calibration chain in accordance with the ISO recommendations.
Investment and uncertainty in the international oil and gas industry
International Nuclear Information System (INIS)
Mohn, Klaus; Misund, Baard
2009-01-01
The standard theory of irreversible investments and real options suggests a negative relation between investment and uncertainty. Richer models with compound option structures open for a positive relationship. This paper presents a micro-econometric study of corporate investment and uncertainty in a period of market turbulence and restructuring in the international oil and gas industry. Based on data for 115 companies over the period 1992-2005, we estimate four different specifications of the q model of investment, with robust results for the uncertainty variables. The estimated models suggest that macroeconomic uncertainty creates a bottleneck for oil and gas investment and production, whereas industry-specific uncertainty has a stimulating effect. (author)
Investigative study of standards for Digital Repositories and related services
Foulonneau, Muriel; André, Francis
2007-01-01
This study is meant for institutional repository managers, service providers, repository software developers and generally, all players taking an active part in the creation of the digital repository infrastructure for e-research and e-learning. It reviews the current standards, protocols and
Energy Technology Data Exchange (ETDEWEB)
Scott, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Daly, Don S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hathaway, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lansing, Carina S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Ying [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McJeon, Haewon C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Moss, Richard H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Patel, Pralit L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Peterson, Marty J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rice, Jennie S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhou, Yuyu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2014-12-06
This report presents data and assumptions employed in an application of PNNL’s Global Change Assessment Model with a newly-developed Monte Carlo analysis capability. The model is used to analyze the impacts of more aggressive U.S. residential and commercial building-energy codes and equipment standards on energy consumption and energy service costs at the state level, explicitly recognizing uncertainty in technology effectiveness and cost, socioeconomics, presence or absence of carbon prices, and climate impacts on energy demand. The report provides a summary of how residential and commercial buildings are modeled, together with assumptions made for the distributions of state–level population, Gross Domestic Product (GDP) per worker, efficiency and cost of residential and commercial energy equipment by end use, and efficiency and cost of residential and commercial building shells. The cost and performance of equipment and of building shells are reported separately for current building and equipment efficiency standards and for more aggressive standards. The report also details assumptions concerning future improvements brought about by projected trends in technology.
Uncertainties in downscaled relative humidity for a semi-arid region ...
Indian Academy of Sciences (India)
variables are extracted from the (1) National Centers for Environmental Prediction ... and (2) simulations of the third generation Canadian Coupled Global Climate ... Ef, MAE and P. Cumulative distribution functions were prepared from the ... Climate change; downscaling; hydroclimatology; relative humidity; multi-step linear ...
New conducted electrical weapons: Electrical safety relative to relevant standards.
Panescu, Dorin; Nerheim, Max; Kroll, Mark W; Brave, Michael A
2017-07-01
We have previously published about TASER® conducted electrical weapons (CEW) compliance with international standards. CEWs deliver electrical pulses that can inhibit a person's neuromuscular control or temporarily incapacitate. An eXperimental Rotating-Field (XRF) waveform CEW and the X2 CEW are new 2-shot electrical weapon models designed to target a precise amount of delivered charge per pulse. They both can deploy 1 or 2 dart pairs, delivered by 2 separate cartridges. Additionally, the XRF controls delivery of incapacitating pulses over 4 field vectors, in a rotating sequence. As in our previous study, we were motivated by the need to understand the cardiac safety profile of these new CEWs. The goal of this paper is to analyze the nominal electrical outputs of TASER XRF and X2 CEWs in reference to provisions of all relevant international standards that specify safety requirements for electrical medical devices and electrical fences. Although these standards do not specifically mention CEWs, they are the closest electrical safety standards and hence give very relevant guidance. The outputs of several TASER XRF and X2 CEWs were measured under normal operating conditions. The measurements were compared against manufacturer specifications. CEWs electrical output parameters were reviewed against relevant safety requirements of UL 69, IEC 60335-2-76 Ed 2.1, IEC 60479-1, IEC 60479-2, AS/NZS 60479.1, AS/NZS 60479.2, IEC 60601-1 and BS EN 60601-1. Our study confirmed that the nominal electrical outputs of TASER XRF and X2 CEWs lie within safety bounds specified by relevant standards.
Energy Technology Data Exchange (ETDEWEB)
Molotkov, S. N., E-mail: sergei.molotkov@gmail.com [Russian Federation, Academy of Cryptography (Russian Federation)
2012-12-15
Any key-generation session contains a finite number of quantum-state messages, and it is therefore important to understand the fundamental restrictions imposed on the minimal length of a string required to obtain a secret key of a specified length. The entropy uncertainty relations for smooth min- and max-entropies considerably simplify and shorten the proof of security. A proof of security of quantum key distribution with phase-temporal encryption is presented. This protocol provides the maximum critical error, compared to other protocols, up to which secure key distribution is guaranteed. In addition, unlike other basic protocols (of the BB84 type), which are vulnerable to an attack by 'blinding' of avalanche photodetectors, this protocol is stable with respect to such an attack and guarantees key security.
International Nuclear Information System (INIS)
Carlson, A.D.
1984-01-01
The accuracy of neutron cross-section measurements is limited by the uncertainty in the standard cross section and the errors associated with using it. Any improvement in the standard immediately improves all cross-section measurements that have been made relative to that standard. Light-element, capture and fission standards are discussed. (U.K.)
Deterministic uncertainty analysis
International Nuclear Information System (INIS)
Worley, B.A.
1987-12-01
This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well-known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by standard statistical methods and by the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions, compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
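The contrast the abstract draws between derivative-based (DUA-style) propagation and conventional statistical sampling can be illustrated with a minimal sketch. The `flow` function below is a hypothetical stand-in for the borehole model, and all input values and uncertainties are invented for illustration:

```python
import math
import random

def flow(k, h):
    """Toy stand-in for a borehole flow model: rate grows with
    permeability k and head difference h (hypothetical form)."""
    return 2.0 * math.pi * k * h / math.log(100.0)

# Nominal inputs and their standard uncertainties (illustrative values).
k0, sk = 5.0, 0.5
h0, sh = 10.0, 1.0

# Derivative-based propagation: one reference run plus finite-difference
# derivatives replaces many random model executions.
eps = 1e-6
dfdk = (flow(k0 + eps, h0) - flow(k0, h0)) / eps
dfdh = (flow(k0, h0 + eps) - flow(k0, h0)) / eps
u_dua = math.sqrt((dfdk * sk) ** 2 + (dfdh * sh) ** 2)

# Conventional statistical approach: many Monte Carlo model runs.
random.seed(0)
samples = [flow(random.gauss(k0, sk), random.gauss(h0, sh)) for _ in range(50000)]
mean = sum(samples) / len(samples)
u_mc = math.sqrt(sum((s - mean) ** 2 for s in samples) / (len(samples) - 1))

print(f"derivative-based u = {u_dua:.3f}, Monte Carlo u = {u_mc:.3f}")
```

For a nearly linear model the two estimates agree closely, while the derivative-based route needs only a handful of model evaluations, which is the efficiency argument the abstract makes.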
Investigative study of standards for digital repositories and related services
Foulonneau, Muriel; Badolato, Anne-Marie
2008-01-01
This study is meant for institutional repository managers, service providers, repository software developers and, generally, all players taking an active part in the creation of the digital repository infrastructure for e-research and e-learning. It reviews the current standards, protocols and applications in the domain of digital repositories. Special attention is paid to the interoperability of repositories to enhance the exchange of data in repositories. It aims to stimulate discussion about these topics and supports initiatives for the integration of and, where needed, development of ...
Energy Technology Data Exchange (ETDEWEB)
Masahito, Hayashi [ERATO, Quantum Computation and Information Project, Japan Science and Technology Agency, Tokyo (Japan); Reynaud, S. [Universite Pierre et Marie Curie, Lab. Kastler Brossel, 75 - Paris (France); Jaekel, M.Th. [Ecole Nationale Superieure de Chimie, Lab. de Physique Theorique 75 - Paris (France); Fiurášek, J. [Palacky Univ., Dept. of Optics (Czech Republic); Garcia-Patron, R.; Cerf, N.J. [QUIC, Ecole Polytechnique, Universite Libre de Bruxelles, Brussels (Belgium); Hage, B.; Chelkowski, S.; Franzen, A.; Lastzka, N.; Vahlbruch, N.; Danzmann, K.; Schnabel, R. [Hannover Univ., Institut für Atom- und Molekülphysik, Max-Planck-Institut, Gravitationsphysik (Albert-Einstein-Institut) (Germany); Hassan, S.S. [Bahrain Univ., Dept. of Mathematics, College of Science (Bahrain); Joshi, A. [Arkansas, Univ., Dept. of Physics, Fayetteville, AR (United States); Jakob, M. [ARC Seibersdorf Research GmbH (ARCS), Tech Gate Vienna, Vienna (Austria); Bergou, J.A. [New York City Univ., Dept. of Physics, Hunter College, NY (United States); Kozlovskii, A.V. [P.N. Lebedev Physical Institute, Moscow (Russian Federation); Prakash, H. [Allahabad Univ., Dept. of Physics (India)]|[Allahabad Univ., M. N. Saha Centre of Space Studies, Institute of Interdisciplinary Studies (India); Kumar, R. [Allahabad Univ., Dept. of Physics (India)]|[Udai Pratap Autonomous College (India)
2005-07-01
The purpose of the conference was to bring together people working in the field of quantum optics, with special emphasis on non-classical light sources and related areas, quantum computing, statistical mechanics and mathematical physics. As a novelty, this edition will include the topics of quantum imaging, quantum phase noise and number theory in quantum mechanics. This document gives the program of the conference and gathers the abstracts.
Chen, Xing; Zhang, Xing
2016-01-01
Despite its importance for the diffusion of mobile technology in the big data era, the adoption of mobile health services by organizations has received minimal attention in the literature. This study investigates how relative advantage and perceived credibility affect an organization's adoption of mobile health services, as well as how environmental uncertainty changes the relationship of relative advantage and perceived credibility with adoption. A research model that integrates relative advantage, perceived credibility, environmental uncertainty, and an organization's intention to use mobile health services is developed. Quantitative data were collected from senior managers and information systems managers in 320 Chinese healthcare organizations. The empirical findings show that while relative advantage and perceived credibility both have positive effects on an organization's intention to use mobile health services, relative advantage plays a more important role than perceived credibility. Moreover, environmental uncertainty positively moderates the effect of relative advantage on an organization's adoption of mobile health services. Thus, mobile health services in environments characterized by high levels of uncertainty are more likely to be adopted because of relative advantage than in environments with low levels of uncertainty.
Quantification of uncertainties of modeling and simulation
International Nuclear Information System (INIS)
Ma Zhibo; Yin Jianwei
2012-01-01
The principles of Modeling and Simulation (M and S) are interpreted by a functional relation, from which the total uncertainties of M and S are identified and sorted into three parts considered to vary with the conceptual models' parameters. Following the idea of verification and validation, the parameter space is partitioned into verified and applied domains: uncertainties in the verified domain are quantified by comparison between numerical and standard results, and those in the applied domain are quantified by a newly developed extrapolation method. Examples are presented to demonstrate and qualify the ideas, with the aim of building a framework to quantify the uncertainties of M and S. (authors)
The Rhetoric of Arrogance: The Public Relations Response of the Standard Oil Trust.
Boyd, Josh
2001-01-01
Illustrates one of the earliest American public relations debacles (ending in the dissolution of the Standard Oil Trust in 1911). Presents background on Standard Oil and offers an overview of Ida Tarbell's influential "History of the Standard Oil Company." Argues that Standard failed to respond to these accounts adequately, reinforcing…
Energy Technology Data Exchange (ETDEWEB)
Lo Bianco, A.S.; Oliveira, H.P.S.; Peixoto, J.G.P., E-mail: abianco@ird.gov.b [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. Nacional de Metrologia das Radiacoes Ionizantes (LNMRI)
2009-07-01
To implement the primary standard of the quantity air kerma for X-rays between 10 and 50 keV, the National Metrology Laboratory of Ionizing Radiations (LNMRI) must evaluate all measurement uncertainties related to the Victoreen chamber. The uncertainty in air kerma resulting from inaccuracy in the active volume of the chamber was therefore evaluated by Monte Carlo calculation using the PENELOPE software
A review of standards related to biomass combustion
Energy Technology Data Exchange (ETDEWEB)
Villeneuve, J.; Savoie, P. [Agriculture and Agri-Food Canada, Quebec City, PQ (Canada)
2010-07-01
Air quality is evaluated by the concentration of particulate matter (PM) per unit of air volume. PM10 refers to all particles smaller than 10 micrometers in diameter. The European Commission has established acceptable levels of PM10, but the rules are less precise for evaluating the amount of PM that can be emitted from a furnace's chimney. The province of Quebec allows up to 340 mg/m³ of PM for large furnaces and 600 mg/m³ for smaller furnaces. Although wood products can be burned in the province, the burning of all other biomass such as straw, stover and grass is forbidden. The City of Vancouver has stricter emissions standards for PM, notably 50 mg/m³ for large furnaces and 35 mg/m³ for smaller furnaces. The reason for this difference is that most furnaces in Quebec are used in rural areas whereas the densely populated City of Vancouver must control emissions at the source. It was concluded that although a universal standard on combustion emissions is not feasible because of different socio-economic conditions and population density, furnaces should emit levels of PM which decrease as the surrounding area population concentration increases. Stringent regulations may be met through advances in technology such as chimney height, bag filters, multicyclones, and precipitators.
A consolidated and standardized relational database for ER data
International Nuclear Information System (INIS)
Zygmunt, B.C.
1995-01-01
The three US Department of Energy (DOE) installations on the Oak Ridge Reservation (ORR) (Oak Ridge National Laboratory, Y-12, and K-25) were established during World War II as part of the Manhattan Project that "built the bomb." That research, and work in more recent years, has resulted in the generation of radioactive materials and other toxic wastes. Lockheed Martin Energy Systems manages the three Oak Ridge installations (as well as the Environmental Restoration (ER) programs at the DOE plants in Portsmouth, Ohio, and Paducah, Kentucky). DOE Oak Ridge Operations has been mandated by federal and state agreements to provide a consolidated repository of environmental data and is tasked to support environmental data management activities at all five installations. The Oak Ridge Environmental Information System (OREIS) was initiated to fulfill these requirements. The primary use of OREIS data is to provide access to project results by regulators. A secondary use is to serve as background data for other projects. This paper discusses the benefits of a consolidated and standardized database; reasons for resistance to the consolidation of data; implementing a consolidated database, including attempts at standardization, deciding what to include in the consolidated database, establishing lists of valid values, and addressing quality control (QC) issues; and the evolution of a consolidated database, which includes developing and training a user community, dealing with configuration control issues, and incorporating historical data. OREIS is used to illustrate these topics
International Nuclear Information System (INIS)
Wang, Gang; Zhang, Zhonghua; Li, Zhengkun; Xu, Jinxin; You, Qiang
2016-01-01
Measurement of the mutual inductance is one of the key techniques in the joule balance to determine the Planck constant h, where a standard-square-wave compensation method was proposed to accurately measure the dc value of the mutual inductance. With this method, analog switches are used to compose an analog-switch signal generator to synthesize the excitation and compensation voltages. However, the accuracy of the compensation voltage is influenced by the non-ideal behaviors of the analog switches. In this paper, the effect of these non-ideal switches is analyzed in detail and evaluated with equivalent circuits. A programmable Josephson voltage standard (PJVS) is used to generate a reference compensation voltage to measure the time integration of the voltage waveform generated by the analog-switch signal generator. Moreover, the effect is also evaluated experimentally by comparing the difference between the mutual inductance measured with the analog-switch signal generator and the value determined by the PJVS-analog-switch generator alternately in the same mutual inductance measurement system. The result shows that the impact of analog switches is 1.97 × 10⁻⁷ with an uncertainty of 1.83 × 10⁻⁷ (k = 1) and confirms that the analog switch method can be used regularly instead of the PJVS in the mutual inductance measurement for the joule balance experiment. (paper)
International Nuclear Information System (INIS)
Fontenot, Jonas D; Bloch, Charles; Followill, David; Titt, Uwe; Newhauser, Wayne D
2010-01-01
Theoretical calculations have shown that proton therapy can reduce the incidence of radiation-induced secondary malignant neoplasms (SMN) compared with photon therapy for patients with prostate cancer. However, the uncertainties associated with calculations of SMN risk had not been assessed. The objective of this study was to quantify the uncertainties in projected risks of secondary cancer following contemporary proton and photon radiotherapies for prostate cancer. We performed a rigorous propagation of errors and several sensitivity tests to estimate the uncertainty in the ratio of relative risk (RRR) due to the largest contributors to the uncertainty: the radiation weighting factor for neutrons, the dose-response model for radiation carcinogenesis and interpatient variations in absorbed dose. The interval of values for the radiation weighting factor for neutrons and the dose-response model were derived from the literature, while interpatient variations in absorbed dose were taken from actual patient data. The influence of each parameter on a baseline RRR value was quantified. Our analysis revealed that the calculated RRR was insensitive to the largest contributors to the uncertainty. Uncertainties in the radiation weighting factor for neutrons, the shape of the dose-risk model and interpatient variations in therapeutic and stray doses introduced a total uncertainty of 33% to the baseline RRR calculation.
Regulations and standardization relative to the biomass combustion
International Nuclear Information System (INIS)
Autret, E.
2009-01-01
There are no regulations on pollutant emissions from domestic wood-burning furnaces; however, these appliances are subject to European and French standardization concerning safety rules, rules of use and test methods. Since 2007, wood-burning appliances on the market must carry the European Community label. The "green flame" label was developed by the environment and energy management agency (A.D.E.M.E.) and manufacturers of domestic appliances to promote the use of competitive wood-burning appliances. Concerning collective and industrial heating, installations of more than 2 MW are framed by different categories of the classified installations for environment protection (I.C.P.E.) regulation, according to their fuel and power. Combustion installations of less than 2 MW are a particular case: they are framed by a sanitary department regulation and controlled by the departmental directions of sanitary and social affairs. The emission limit values are summarized in tables. (N.C.)
DEFF Research Database (Denmark)
Hukkerikar, Amol; Kalakul, Sawitree; Sarup, Bent
2012-01-01
The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI)) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncertainties of estimated...... property values. For this purpose, a systematic methodology for property modeling and uncertainty analysis is used. The methodology includes a parameter estimation step to determine parameters of property models and an uncertainty analysis step to establish statistical information about the quality......, poly-functional chemicals, etc.) taken from the database of the US Environmental Protection Agency (EPA) and from the database of USEtox is used. For property modeling and uncertainty analysis, the Marrero and Gani GC method and atom connectivity index method have been considered. In total, 22......
The relation between bone demineralization, physical activity and anthropometric standards
Directory of Open Access Journals (Sweden)
Milena Barbosa Camara
2017-03-01
This paper aimed to verify the correlation between bone mineral density and the level of physical activity, as well as food intake and anthropometric parameters. It analysed the bone mineral density (BMD) of menopausal women through bone densitometry in the lumbar region (L1 to L4), femoral neck and total femur, and also used Bouchard's self-recall of daily activities, employing the food record from Buker and Stuart to quantify the daily intake of calcium and vitamin D. The data were analysed via the Kolmogorov-Smirnov test, with a significance level of α = 0.05 set to compare the BMD averages. It was observed that one hundred percent of the assessed individuals had a BMD level below the average fixed by the WHO: 14.4% with osteopenia and 85.6% with osteoporosis; the lowest BMD was in the femoral area (0.721 g) and the biggest loss was among the sedentary ones (0.698 g). It was noticed that there was a correlation between physical activity and BMD only when associated with anthropometric standards and the daily ingestion of vitamin D.
International Nuclear Information System (INIS)
Thomas, R.E.
1982-03-01
An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
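The abstract's closing claim, that for a linear model Ax = b a complete sensitivity analysis needs no repeated forward solves, can be sketched as follows. A single adjoint solve Aᵀλ = c gives every sensitivity dr/db_i = λ_i of a response r = c·x. The matrix, right-hand side and response functional below are invented toy values, and the 2×2 solver exists only to keep the sketch self-contained:

```python
def solve2(a, rhs):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    x0 = (rhs[0] * a[1][1] - a[0][1] * rhs[1]) / det
    x1 = (a[0][0] * rhs[1] - rhs[0] * a[1][0]) / det
    return [x0, x1]

A = [[4.0, 1.0], [2.0, 3.0]]
b = [1.0, 2.0]
c = [1.0, 1.0]          # response r = x0 + x1

# One adjoint solve with A transposed yields all sensitivities at once.
At = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]
lam = solve2(At, c)     # dr/db_i = lam[i]

# Cross-check against brute-force finite differences on the primal problem.
eps = 1e-7
x = solve2(A, b)
r0 = c[0] * x[0] + c[1] * x[1]
for i in range(2):
    bp = list(b)
    bp[i] += eps
    xp = solve2(A, bp)
    rp = c[0] * xp[0] + c[1] * xp[1]
    assert abs((rp - r0) / eps - lam[i]) < 1e-5

print("adjoint sensitivities:", lam)
```

For n inputs the brute-force loop costs n extra solves, while the adjoint route always costs exactly one, which is the efficiency the abstract attributes to the adjoint method.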
Lindley, Dennis V
2013-01-01
Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." - Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.
The standardization of data relational mode in the materials database for nuclear power engineering
International Nuclear Information System (INIS)
Wang Xinxuan
1996-01-01
A relational database needs standard data relationships, which include hierarchical structures and repeated set records. A code database is created, and relational links are established between spare parts, materials and the properties of the materials. Data relationships which are not standard are eliminated, and all the relation modes are made to meet the demands of the Third Normal Form (3NF)
International Nuclear Information System (INIS)
Al-Hashimi, M.H.; Wiese, U.-J.
2012-01-01
We consider a 1-parameter family of self-adjoint extensions of the Hamiltonian for a particle confined to a finite interval with perfectly reflecting boundary conditions. In some cases, one obtains negative energy states which seem to violate the Heisenberg uncertainty relation. We use this as a motivation to derive a generalized uncertainty relation valid for an arbitrarily shaped quantum dot with general perfectly reflecting walls in d dimensions. In addition, a general uncertainty relation for non-Hermitian operators is derived and applied to the non-Hermitian momentum operator in a quantum dot. We also consider minimal uncertainty wave packets in this situation, and we prove that the spectrum depends monotonically on the self-adjoint extension parameter. In addition, we construct the most general boundary conditions for semiconductor heterostructures such as quantum dots, quantum wires, and quantum wells, which are characterized by a 4-parameter family of self-adjoint extensions. Finally, we consider perfectly reflecting boundary conditions for relativistic fermions confined to a finite volume or localized on a domain wall, which are characterized by a 1-parameter family of self-adjoint extensions in the (1+1)-d and (2+1)-d cases, and by a 4-parameter family in the (3+1)-d and (4+1)-d cases. - Highlights: ► Finite volume Heisenberg uncertainty relation. ► General self-adjoint extensions for relativistic fermions. ► New perspective on the problem of a particle in a box.
Uncertainty in spatial planning proceedings
Directory of Open Access Journals (Sweden)
Aleš Mlakar
2009-01-01
Uncertainty is distinctive of spatial planning, as it arises from the necessity to co-ordinate the various interests within an area, from the urgency of adopting spatial planning decisions, from the complexity of the environment, physical space and society, from addressing the uncertainty of the future, and from the uncertainty of actually making the right decision. The response to uncertainty is a series of measures that mitigate the effects of uncertainty itself. These measures are based on two fundamental principles: standardization and optimization. They relate to knowledge enhancement and the comprehension of spatial planning, to the legal regulation of changes, to the existence of spatial planning as a means of co-ordinating different interests, to active planning and the constructive resolution of current spatial problems, to the integration of spatial planning and the environmental protection process, to the implementation of analysis as the foundation of spatial planners' activities, to methods of thinking outside the parameters, to forming clear spatial concepts, to creating a transparent spatial management system, and to the enforcement of participatory processes.
Standardized reporting for rapid relative effectiveness assessments of pharmaceuticals.
Kleijnen, Sarah; Pasternack, Iris; Van de Casteele, Marc; Rossi, Bernardette; Cangini, Agnese; Di Bidino, Rossella; Jelenc, Marjetka; Abrishami, Payam; Autti-Rämö, Ilona; Seyfried, Hans; Wildbacher, Ingrid; Goettsch, Wim G
2014-11-01
Many European countries perform rapid assessments of the relative effectiveness (RE) of pharmaceuticals as part of the reimbursement decision making process. Increased sharing of information on RE across countries may save costs and reduce duplication of work. The objective of this article is to describe the development of a tool for rapid assessment of RE of new pharmaceuticals that enter the market, the HTA Core Model® for Rapid Relative Effectiveness Assessment (REA) of Pharmaceuticals. Eighteen member organisations of the European Network of Health Technology Assessment (EUnetHTA) participated in the development of the model. Different versions of the model were developed and piloted in this collaboration and adjusted accordingly based on feedback on the content and feasibility of the model. The final model deviates from the traditional HTA Core Model® used for assessing other types of technologies. This is due to the limited scope (strong focus on RE), the timing of the assessment (just after market authorisation), and strict timelines (e.g. 90 days) required for performing the assessment. The number of domains and assessment elements was limited and it was decided that the primary information sources should preferably be a submission file provided by the marketing authorisation holder and the European Public Assessment Report. The HTA Core Model® for Rapid REA (version 3.0) was developed to produce standardised transparent RE information of pharmaceuticals. Further piloting can provide input for possible improvements, such as further refining the assessment elements and new methodological guidance on relevant areas.
Conditional uncertainty principle
Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun
2018-04-01
We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.
Zhang, Zuo-Yuan; Wei, DaXiu; Liu, Jin-Ming
2018-06-01
The precision of measurements of two incompatible observables in a physical system can be improved with the assistance of quantum memory. In this paper, we investigate the quantum-memory-assisted entropic uncertainty relation for a spin-1 Heisenberg model in the presence of external magnetic fields; the systemic quantum entanglement (characterized by the negativity) is analyzed for comparison. Our results show that for the XY spin chain in thermal equilibrium, the entropic uncertainty can be reduced by reinforcing the coupling between the two particles or decreasing the temperature of the environment. At zero temperature, a strong magnetic field can result in growth of the entropic uncertainty. Moreover, in the Ising case, the variation trends of the uncertainty depend on the choice of anisotropy parameters. Taking the influence of intrinsic decoherence into account, we find that strong coupling accelerates the inflation of the uncertainty over time, whereas a high magnetic field contributes to its reduction during the temporal evolution. Furthermore, we also verify that the evolution of the entropic uncertainty is roughly anti-correlated with that of the entanglement throughout the dynamical process. Our results could offer new insights into quantum precision measurement for high-spin solid-state systems.
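The quantum-memory-assisted entropic uncertainty relation that this abstract builds on, not spelled out there, is in its standard form (due to Berta et al.) a bound on the conditional entropies of two incompatible measurements X and Z on system A, given a memory B:

```latex
S(X|B) + S(Z|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad
c = \max_{x,z} \left| \langle \psi_x | \phi_z \rangle \right|^2 ,
```

where $c$ is the maximal overlap between the two measurement bases $\{|\psi_x\rangle\}$ and $\{|\phi_z\rangle\}$. A negative conditional entropy $S(A|B)$, which signals entanglement between A and B, lowers the bound; this is why the abstract finds the uncertainty roughly anti-correlated with the entanglement.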
International Nuclear Information System (INIS)
Sakai, K.; Hishida, H.
1978-01-01
Probabilistic fuel pin gap distributions within a wire-spaced fuel subassembly and sensitivities of the related uncertainties to fuel pin gaps are discussed. The analyses consist mainly of expressing a local fuel pin gap in terms of sensitivity functions of the related uncertainties and calculating the corresponding probabilistic distribution through taking all the possible combinations of the distribution of uncertainties. The results of illustrative calculations show that with the reliability level of 0.9987, the maximum deviation of the pin gap at the cladding hot spot of a center fuel subassembly is 8.05% from its nominal value and the corresponding probabilistic pin gap distribution is shifted to the narrower side due to the external confinement of a pin bundle with a wrapper tube. (Auth.)
Koch, Michael
Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded or not. Estimation of measurement uncertainty is often not trivial. Several strategies have been developed for this purpose and are briefly described in this chapter. In addition, the different possibilities for taking uncertainty into account in compliance assessment are explained.
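The compliance-assessment decision the chapter deals with can be sketched as a simple guard-band rule; the function name, coverage factor and all numerical values below are illustrative assumptions, not a reproduction of the chapter's method:

```python
def assess_compliance(result, u_std, limit, k=2.0):
    """Illustrative decision rule for an upper specification limit.

    Expands the standard uncertainty u_std by coverage factor k
    (k = 2 gives roughly 95 % coverage for a normal distribution)
    and compares the resulting interval with the limit.
    """
    U = k * u_std                      # expanded uncertainty
    if result - U > limit:
        return "non-compliant"         # whole interval above the limit
    if result + U < limit:
        return "compliant"             # whole interval below the limit
    return "indeterminate"             # interval straddles the limit

print(assess_compliance(12.0, 0.5, 10.0))   # -> non-compliant
print(assess_compliance(8.5, 0.5, 10.0))    # -> compliant
print(assess_compliance(9.8, 0.5, 10.0))    # -> indeterminate
```

The "indeterminate" branch is the case the abstract alludes to: without an agreed decision rule, a result whose uncertainty interval straddles the limit cannot be declared compliant or non-compliant.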
International Nuclear Information System (INIS)
Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.
2005-01-01
In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)
2010-10-01
... Regarding Auditing, Attestation, and Related Professional Practice Standards Related To Brokers and Dealers... Oversight Board in the Dodd-Frank Wall Street Reform and Consumer Protection Act to establish auditing...
IEEE Std 382-1980: IEEE standard for qualification of safety-related valve actuators
International Nuclear Information System (INIS)
Anon.
1992-01-01
This standard describes the qualification of all types of power-driven valve actuators, including damper actuators, for safety-related functions in nuclear power generating stations. This standard may also be used to separately qualify actuator components. This standard establishes the minimum requirements for, and guidance regarding, the methods and procedures for qualification of all safety-related functions of power-driven valve actuators
Uncertainty in hydrological signatures
McMillan, Hilary; Westerberg, Ida
2015-04-01
Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
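The Monte Carlo approach described above can be sketched for a single signature. The sketch below is illustrative only: the rainfall and flow totals and the 10%/5% relative data uncertainties are assumed, not taken from the study. It propagates multiplicative Gaussian data errors into the runoff-ratio signature:

```python
import random
import statistics

def runoff_ratio_uncertainty(rain_total, flow_total,
                             rain_err=0.10, flow_err=0.05,
                             n=10000, seed=42):
    """Monte Carlo uncertainty for the runoff-ratio signature.

    rain_err / flow_err are assumed relative standard uncertainties
    (multiplicative, Gaussian) on the observed totals.
    """
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        rain = rain_total * rng.gauss(1.0, rain_err)
        flow = flow_total * rng.gauss(1.0, flow_err)
        samples.append(flow / rain)          # runoff ratio = Q / P
    mean = statistics.fmean(samples)
    sd = statistics.stdev(samples)
    return mean, sd

mean, sd = runoff_ratio_uncertainty(1200.0, 600.0)
# nominal ratio is 0.5; sd reflects the combined data uncertainty
```

The same resampling loop applies to any signature that is a function of the perturbed data, which is what makes the Monte Carlo method generally applicable.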
Lau, William K. M. (Technical Monitor); Bell, Thomas L.; Steiner, Matthias; Zhang, Yu; Wood, Eric F.
2002-01-01
The uncertainty of rainfall estimated from averages of discrete samples collected by a satellite is assessed using a multi-year radar data set covering a large portion of the United States. The sampling-related uncertainty of rainfall estimates is evaluated for all combinations of 100 km, 200 km, and 500 km space domains, 1 day, 5 day, and 30 day rainfall accumulations, and regular sampling time intervals of 1 h, 3 h, 6 h, 8 h, and 12 h. These extensive analyses are combined to characterize the sampling uncertainty as a function of space and time domain, sampling frequency, and rainfall characteristics by means of a simple scaling law. Moreover, it is shown that both parametric and non-parametric statistical techniques of estimating the sampling uncertainty produce comparable results. Sampling uncertainty estimates, however, do depend on the choice of technique for obtaining them. They can also vary considerably from case to case, reflecting the great variability of natural rainfall, and should therefore be expressed in probabilistic terms. Rainfall calibration errors are shown to affect comparison of results obtained by studies based on data from different climate regions and/or observation platforms.
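A toy version of such a sampling-uncertainty analysis can be sketched as follows. The intermittent "rainfall" record, the 10% wet fraction, and the 6-hourly sampling interval are all invented for illustration; the point is only that sparse temporal samples of an intermittent field carry a quantifiable relative error:

```python
import random
import statistics

def sampling_uncertainty(series, interval):
    """Relative error of an accumulation estimated from samples taken
    every `interval` steps, versus the full record."""
    truth = sum(series)
    est = sum(series[::interval]) * interval   # scale sparse samples up
    return (est - truth) / truth if truth else 0.0

rng = random.Random(0)
# toy intermittent 'rainfall' record: mostly zeros, occasional bursts
errors = []
for _ in range(500):
    series = [rng.expovariate(1.0) if rng.random() < 0.1 else 0.0
              for _ in range(720)]            # 30 days of hourly values
    errors.append(sampling_uncertainty(series, 6))  # 6-hourly sampling
rms = statistics.fmean(e * e for e in errors) ** 0.5
# rms grows with the sampling interval and shrinks with the space-time
# domain, which is the behaviour the paper's scaling law characterizes
```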
Goutzamanis, Stelliana; Doyle, Joseph S; Thompson, Alexander; Dietze, Paul; Hellard, Margaret; Higgs, Peter
2018-04-02
People who inject drugs (PWID) are most at risk of hepatitis C virus infection in Australia. The introduction of transient elastography (TE) (measuring hepatitis fibrosis) and direct acting antiviral medications will likely alter the experience of living with hepatitis C. We aimed to explore positive and negative influences on wellbeing and stress among PWID with hepatitis C. The Treatment and Prevention (TAP) study examines the feasibility of treating hepatitis C mono-infected PWID in community settings. Semi-structured interviews were conducted with 16 purposively recruited TAP participants. Participants were aware of their hepatitis C seropositive status and had received fibrosis assessment (measured by TE) prior to interview. Questions were open-ended, focusing on the impact of health status on wellbeing and self-reported stress. Interviews were voice recorded, transcribed verbatim and thematically analysed, guided by Mishel's (1988) theory of Uncertainty in Illness. In line with Mishel's theory of Uncertainty in Illness all participants reported hepatitis C-related uncertainty, particularly mis-information or a lack of knowledge surrounding liver health and the meaning of TE results. Those with greater fibrosis experienced an extra layer of prognostic uncertainty. Experiences of uncertainty were a key motivation to seek treatment, which was seen as a way to regain some stability in life. Treatment completion alleviated hepatitis C-related stress, and promoted feelings of empowerment and confidence in addressing other life challenges. TE scores seemingly provide some certainty. However, when paired with limited knowledge, particularly among people with severe fibrosis, TE may be a source of uncertainty and increased personal stress. This suggests the need for simple education programs and resources on liver health to minimise stress.
Moyers, M F
2014-06-01
Adequate evaluation of the results from multi-institutional trials involving light ion beam treatments requires consideration of the planning margins applied to both targets and organs at risk. A major uncertainty that affects the size of these margins is the conversion of x-ray computed tomography numbers (XCTNs) to relative linear stopping powers (RLSPs). Various facilities engaged in multi-institutional clinical trials involving proton beams have been applying significantly different margins in their patient planning. This study was performed to determine the variance in the conversion functions used at proton facilities in the U.S.A. wishing to participate in National Cancer Institute sponsored clinical trials. A simplified method of determining the conversion function was developed using a standard phantom containing only water and aluminum. The new method was based on the premise that all scanners have their XCTNs for air and water calibrated daily to constant values but that the XCTNs for high density/high atomic number materials are variable with different scanning conditions. The standard phantom was taken to 10 different proton facilities and scanned with the local protocols, resulting in 14 derived conversion functions, which were compared to the conversion functions used at the local facilities. For tissues within ±300 XCTN of water, all facility functions produced converted RLSP values within ±6% of the values produced by the standard function and within 8% of the values from any other facility's function. For XCTNs corresponding to lung tissue, converted RLSP values differed by as much as ±8% from the standard and up to 16% from the values of other facilities. For XCTNs corresponding to low-density immobilization foam, the maximum to minimum values differed by as much as 40%. The new method greatly simplifies determination of the conversion function, reduces ambiguity, and in the future could promote standardization between facilities. Although it

What is the uncertainty principle of non-relativistic quantum mechanics?
Riggs, Peter J.
2018-05-01
After more than ninety years of discussions over the uncertainty principle, there is still no universal agreement on what the principle states. The Robertson uncertainty relation (incorporating standard deviations) is given as the mathematical expression of the principle in most quantum mechanics textbooks. However, the uncertainty principle is not merely a statement of what any of the several uncertainty relations affirm. It is suggested that a better approach would be to present the uncertainty principle as a statement about the probability distributions of incompatible variables and the resulting restrictions on quantum states.
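For reference, the Robertson relation mentioned above, which most textbooks present as the mathematical expression of the principle, reads

```latex
\sigma_A \,\sigma_B \;\ge\; \frac{1}{2}\,\bigl|\langle [\hat{A},\hat{B}] \rangle\bigr| ,
```

which for position and momentum, where $[\hat{x},\hat{p}] = i\hbar$, reduces to the familiar $\sigma_x \sigma_p \ge \hbar/2$. The article's point is that this inequality between standard deviations is a consequence of the principle, not the principle itself.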
Kulasiri, Don; Liang, Jingyi; He, Yao; Samarasinghe, Sandhya
2017-04-21
We investigate the epistemic uncertainties of the parameters of a mathematical model that describes the dynamics of the CaMKII-NMDAR complex related to memory formation in synapses, using global sensitivity analysis (GSA). The model, which was published in this journal, is nonlinear and complex, with Ca²⁺ patterns of different frequencies as inputs. We explore the effects of the parameters on the key outputs of the model to discover the most sensitive ones, using GSA and partial rank correlation coefficients (PRCC), and to understand, based on the biology of the problem, why some are sensitive and others are not. We also extend the model to add presynaptic neurotransmitter vesicle release, so that action potentials of different frequencies serve as inputs. We perform GSA on this extended model to show that the parameter sensitivities differ for the extended model, as shown by the PRCC landscapes. Based on the results of GSA and PRCC, we reduce the original model to a less complex model that takes the most important biological processes into account, and validate the reduced model against the outputs of the original model. We show that the parameter sensitivities depend on the inputs, and that GSA helps us understand the sensitivities and the importance of the parameters. A thorough phenomenological understanding of the relationships involved is essential to interpret the results of GSA and hence for the possible model reduction. Copyright © 2017 Elsevier Ltd. All rights reserved.
Energy Technology Data Exchange (ETDEWEB)
Freitas, M C; Martinho, E [LNETI/ICEN, Sacavem (Portugal)
1989-04-15
Instrumental neutron activation analysis with the k₀-standardization method was applied to eight geological, environmental and biological reference materials, including leaves, blood, fish, sediments, soils and limestone. To a first approximation, the results were normally distributed around the certified values with a standard deviation of 10%. Results obtained by using the relative method based on well characterized multi-element standards for IAEA CRM Soil-7 are reported.
International Nuclear Information System (INIS)
Aven, Terje; Pedersen, Linda Martens
2014-01-01
Production assurance analyses of production systems are in practice typically carried out using flow network modelling and Monte Carlo simulations. Based on the network and probability distribution assumptions for equipment lifetime and restoration time, the simulation tool produces predictions/estimates and uncertainty distributions of the production availability, which is defined as the ratio of production to planned production, or any other reference level, over a specified period of time. To adequately communicate the results from the analyses, it is essential that there is in place a framework which clarifies how to understand the concepts introduced, including the uncertainty distributions produced. Some key elements of such a conceptual framework are well established in the industry, for example the use of probability models to represent the stochastic variation related to lifetimes and restoration times. However an overall framework linking this variation, as well as “model uncertainties”, to the epistemic uncertainty distribution for the output production availability, has been lacking. The purpose of the present paper is to present such a framework, and in this way provide new insights to and guidelines on how to understand and present the uncertainties in practical production assurance analyses. An example related to a subsea production system is used to illustrate the framework and the guidelines
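The Monte Carlo estimation of production availability described above can be sketched with a simple renewal simulation. The exponential lifetime and restoration assumptions and the MTBF, restoration-time, and horizon figures below are illustrative stand-ins, not taken from the paper:

```python
import random
import statistics

def production_availability(mtbf, mean_restore, horizon, rng):
    """Simulate one failure/restoration history and return the
    production availability: producing time / total time."""
    t, up = 0.0, 0.0
    while t < horizon:
        life = rng.expovariate(1.0 / mtbf)        # time to failure
        up += min(life, horizon - t)
        t += life
        if t >= horizon:
            break
        t += rng.expovariate(1.0 / mean_restore)  # restoration time
    return up / horizon

rng = random.Random(1)
# assumed figures: 2000 h MTBF, 50 h mean restoration, 1-year horizon
samples = [production_availability(2000.0, 50.0, 8760.0, rng)
           for _ in range(2000)]
mean_pa = statistics.fmean(samples)
# steady-state expectation is MTBF / (MTBF + MTTR) = 2000/2050 ≈ 0.976
```

Each simulated history yields one sample of production availability; the spread of `samples` is the kind of output distribution whose interpretation the paper's conceptual framework is meant to clarify.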
Liu, Baoding
2015-01-01
When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...
Review of international standards related to the design for control rooms on nuclear power plants
International Nuclear Information System (INIS)
Kitamura, Masashi; Yoshikawa, Hidekazu; Fujita, Yushi
2005-01-01
The improvement of Human-Machine Interface (HMI) design for control rooms in nuclear power plants (NPP) has been pursued worldwide, especially after the TMI-2 accident. The design process and guidelines are standardized internationally in IEC 60964 and supplemental standards. However, a technological update is required due to the increased use of computerized control and monitoring equipment and systems in NPP control rooms in recent years. Standards are becoming more important for computerized control rooms because there is more design freedom than with conventional hardware-based systems. For computerized control rooms, standards for the hardware and software of HMI systems should also be considered. Standards and guidelines for computerized NPP control rooms have recently been developed by bodies such as the IEC, ISO, and IEEE. Therefore, reviewing these standards and guidelines related to NPP control room design can be useful not only for the revision of international standards such as IEC 60964, but also for users of the standards and guidelines. In this paper, we review the international standards related to control room design, in the two aspects of HMI design and hardware/software design, considering the ongoing revision work and their application. (author)

77 FR 50757 - Charging Standard Administrative Fees for Nonprogram-Related Information
2012-08-22
... are announcing the standardized administrative fees we will charge to recover our full cost of... will ensure fees are consistent and that we collect the full cost of supplying our information when a... standard fees that are calculated to reflect the full cost of providing information for nonprogram-related...
Do U Txt? Event-Related Potentials to Semantic Anomalies in Standard and Texted English
Berger, Natalie I.; Coch, Donna
2010-01-01
Texted English is a hybrid, technology-based language derived from standard English modified to facilitate ease of communication via instant and text messaging. We compared semantic processing of texted and standard English sentences by recording event-related potentials in a classic semantic incongruity paradigm designed to elicit an N400 effect.…
Justification for recommended uncertainties
International Nuclear Information System (INIS)
Pronyaev, V.G.; Badikov, S.A.; Carlson, A.D.
2007-01-01
The uncertainties obtained in an earlier standards evaluation were considered to be unrealistically low by experts of the US Cross Section Evaluation Working Group (CSEWG). Therefore, the CSEWG Standards Subcommittee replaced the covariance matrices of evaluated uncertainties by expanded percentage errors assigned to the data over wide energy groups. There are a number of reasons that might lead to low uncertainties of the evaluated data: underestimation of the correlations existing between the results of different measurements; the presence of unrecognized systematic uncertainties in the experimental data, which can lead to biases in the evaluated data as well as to underestimation of the resulting uncertainties; and the fact that uncertainties for correlated data cannot be characterized by percentage uncertainties or variances alone. Covariances between the evaluated value at 0.2 MeV and other points, obtained in model (RAC R-matrix and PADE2 analytical expansion) and non-model (GMA) fits of the ⁶Li(n,t) TEST1 data, and the correlation coefficients are presented, and covariances between the evaluated value at 0.045 MeV and other points (along the line or column of the matrix), as obtained in EDA and RAC R-matrix fits of the data available for reactions that pass through the formation of the ⁷Li system, are discussed. The GMA fit with the GMA database is shown for comparison. The following diagrams are discussed: percentage uncertainties of the evaluated cross section for the ⁶Li(n,t) reaction and for the ²³⁵U(n,f) reaction; the estimation given by CSEWG experts; the GMA result with the full GMA database, including experimental data for the ⁶Li(n,t), ⁶Li(n,n) and ⁶Li(n,total) reactions; uncertainties in the GMA combined fit for the standards; and EDA and RAC R-matrix results, respectively. Uncertainties of absolute and ²⁵²Cf fission-spectrum-averaged cross section measurements, and deviations between measured and evaluated values for ²³⁵U(n,f) cross sections in the neutron energy range 1
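The point that percentage uncertainties discard covariance information can be made concrete with a small sketch (the 2×2 covariance matrix and the evaluated values are invented): two evaluated points can carry identical percentage errors while being strongly correlated, and only the full matrix records that correlation.

```python
import math

def percent_uncertainties(cov, values):
    """Percent standard uncertainties from a covariance matrix."""
    return [100.0 * math.sqrt(cov[i][i]) / values[i]
            for i in range(len(values))]

def correlation(cov):
    """Correlation matrix: the information discarded when only
    percentage errors are reported."""
    n = len(cov)
    sd = [math.sqrt(cov[i][i]) for i in range(n)]
    return [[cov[i][j] / (sd[i] * sd[j]) for j in range(n)]
            for i in range(n)]

# toy 2-point evaluation with strongly correlated uncertainties
cov = [[4.0e-4, 3.0e-4],
       [3.0e-4, 9.0e-4]]
vals = [2.0, 3.0]
pct = percent_uncertainties(cov, vals)   # both points: 1.0 %
rho = correlation(cov)[0][1]             # correlation 0.5
```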
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
Vaginismus and dyspareunia : Relationship with general and sex-related moral standards
Borg, Charmaine; de Jong, Peter J.; Schultz, Willibrord Weijmar
Introduction. Relatively strong adherence to conservative values and/or relatively strict sex-related moral standards logically restricts the sexual repertoire and will lower the threshold for experiencing negative emotions in a sexual context. In turn, this may generate withdrawal and avoidance
Shope, Christopher L.; Angeroth, Cory E.
2015-01-01
Effective management of surface waters requires a robust understanding of spatiotemporal constituent loadings from upstream sources and the uncertainty associated with these estimates. We compared the total dissolved solids loading into the Great Salt Lake (GSL) for water year 2013 with estimates of previously sampled periods in the early 1960s. We also provide updated results on GSL loading, quantitatively bounded by sampling uncertainties, which are useful for current and future management efforts. Our statistical loading results were more accurate than those from simple regression models. Our results indicate that TDS loading to the GSL in water year 2013 was 14.6 million metric tons with uncertainty ranging from 2.8 to 46.3 million metric tons, which varies greatly from previous regression estimates for water year 1964 of 2.7 million metric tons. Results also indicate that locations with increased sampling frequency are correlated with decreasing confidence intervals. Because time is incorporated into the LOADEST models, discrepancies are largely expected to be a function of temporally lagged salt storage delivery to the GSL associated with terrestrial and in-stream processes. By incorporating temporally variable estimates and statistically derived uncertainty of these estimates, we have provided quantifiable variability in the annual estimates of dissolved solids loading into the GSL. Further, our results support the need for increased monitoring of dissolved solids loading into saline lakes like the GSL by demonstrating the uncertainty associated with different levels of sampling frequency.
Sinner, K.; Teasley, R. L.
2016-12-01
Groundwater models serve as integral tools for understanding flow processes and informing stakeholders and policy makers in management decisions. Historically, these models tended towards a deterministic nature, relying on historical data to predict and inform future decisions based on model outputs. This research works towards developing a stochastic method of modeling recharge inputs from pipe main break predictions in an existing groundwater model, which subsequently generates desired outputs incorporating future uncertainty rather than deterministic data. The case study for this research is the Barton Springs segment of the Edwards Aquifer near Austin, Texas. Researchers and water resource professionals have modeled the Edwards Aquifer for decades due to its high water quality, fragile ecosystem, and stakeholder interest. The original case study and model that this research is built upon was developed as a co-design problem with regional stakeholders, and the model outcomes are generated specifically for communication with policy makers and managers. Recently, research in the Barton Springs segment demonstrated a significant contribution of urban, or anthropogenic, recharge to the aquifer, particularly during dry periods, using deterministic data sets. Due to the social and ecological importance of urban water loss to recharge, this study develops an evaluation method to help predict pipe breaks and their related recharge contribution within the Barton Springs segment of the Edwards Aquifer. To benefit groundwater management decision processes, the performance measures captured in the model results, such as springflow, head levels, storage, and others, were determined by previous work in elicitation of problem framing to determine stakeholder interests and concerns. The results of the previous deterministic model and the stochastic model are compared to determine gains to stakeholder knowledge through the additional modeling
Uncertainties in repository modeling
Energy Technology Data Exchange (ETDEWEB)
Wilson, J.R.
1996-12-31
The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.
Energy Technology Data Exchange (ETDEWEB)
Wedenberg, Minna, E-mail: minna.wedenberg@raysearchlabs.com
2013-11-15
Purpose: To apply a statistical bootstrap analysis to assess the uncertainty in the dose–response relation for the endpoints pneumonitis and myelopathy reported in the QUANTEC review. Methods and Materials: The bootstrap method assesses the uncertainty of the estimated population-based dose-response relation due to sample variability, which reflects the uncertainty due to limited numbers of patients in the studies. A large number of bootstrap replicates of the original incidence data were produced by random sampling with replacement. The analysis requires only the dose, the number of patients, and the number of occurrences of the studied endpoint, for each study. Two dose–response models, a Poisson-based model and the Lyman model, were fitted to each bootstrap replicate using maximum likelihood. Results: The bootstrap analysis generates a family of curves representing the range of plausible dose–response relations, and the 95% bootstrap confidence intervals give an estimated upper and lower toxicity risk. The curve families for the 2 dose–response models overlap for doses included in the studies at hand but diverge beyond that, with the Lyman model suggesting a steeper slope. The resulting distributions of the model parameters indicate correlation and non-Gaussian distribution. For both data sets, the likelihood of the observed data was higher for the Lyman model in >90% of the bootstrap replicates. Conclusions: The bootstrap method provides a statistical analysis of the uncertainty in the estimated dose–response relation for myelopathy and pneumonitis. It suggests likely values of model parameter values, their confidence intervals, and how they interrelate for each model. Finally, it can be used to evaluate to what extent data supports one model over another. For both data sets considered here, the Lyman model was preferred over the Poisson-based model.
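A minimal version of the described bootstrap can be sketched as follows. The per-study incidence data are invented, and the Lyman model is replaced by a one-parameter Poisson-based model p(d) = 1 − exp(−d/D) fitted by maximum likelihood with a golden-section search; resampling each study's event count binomially is equivalent to resampling patients with replacement. This is a sketch of the procedure, not of the paper's actual fits:

```python
import math
import random

# toy incidence data per study: (mean dose in Gy, patients, events)
STUDIES = [(10, 40, 2), (20, 35, 6), (30, 30, 12), (45, 25, 17)]

def neg_loglik(D, data):
    """Negative log-likelihood of p(d) = 1 - exp(-d/D)."""
    ll = 0.0
    for dose, n, k in data:
        p = 1.0 - math.exp(-dose / D)
        p = min(max(p, 1e-9), 1.0 - 1e-9)   # guard against log(0)
        ll += k * math.log(p) + (n - k) * math.log(1.0 - p)
    return -ll

def fit_D(data, lo=1.0, hi=500.0):
    """Golden-section search for the maximum-likelihood D."""
    g = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    for _ in range(80):
        c, d = b - g * (b - a), a + g * (b - a)
        if neg_loglik(c, data) < neg_loglik(d, data):
            b = d
        else:
            a = c
    return (a + b) / 2

rng = random.Random(7)
boots = []
for _ in range(500):                       # bootstrap replicates
    rep = [(dose, n, sum(rng.random() < k / n for _ in range(n)))
           for dose, n, k in STUDIES]      # resample events binomially
    boots.append(fit_D(rep))
boots.sort()
ci = (boots[int(0.025 * 500)], boots[int(0.975 * 500)])  # 95% CI for D
```

Each bootstrap replicate yields one fitted curve; the sorted fits give the percentile confidence interval, mirroring the family of plausible dose-response relations described in the abstract.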
Quantifying relative importance: Computing standardized effects in models with binary outcomes
Grace, James B.; Johnson, Darren; Lefcheck, Jonathan S.; Byrnes, Jarrett E.K.
2018-01-01
Scientists commonly ask questions about the relative importances of processes, and then turn to statistical models for answers. Standardized coefficients are typically used in such situations, with the goal being to compare effects on a common scale. Traditional approaches to obtaining standardized coefficients were developed with idealized Gaussian variables in mind. When responses are binary, complications arise that impact standardization methods. In this paper, we review, evaluate, and propose new methods for standardizing coefficients from models that contain binary outcomes. We first consider the interpretability of unstandardized coefficients and then examine two main approaches to standardization. One approach, which we refer to as the Latent-Theoretical or LT method, assumes that underlying binary observations there exists a latent, continuous propensity linearly related to the coefficients. A second approach, which we refer to as the Observed-Empirical or OE method, assumes responses are purely discrete and estimates error variance empirically via reference to a classical R2 estimator. We also evaluate the standard formula for calculating standardized coefficients based on standard deviations. Criticisms of this practice have been persistent, leading us to propose an alternative formula that is based on user-defined “relevant ranges”. Finally, we implement all of the above in an open-source package for the statistical software R.
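The LT method can be sketched in a few lines. The toy predictor and the logit coefficients below are assumed rather than estimated; the key step is inflating the variance of the linear predictor by the logistic error variance π²/3 before rescaling the coefficient:

```python
import math
import statistics

def lt_standardized(b, x, linpred):
    """Latent-Theoretical (LT) standardized coefficient for a logit
    model.

    Assumes a latent continuous propensity y* = Xb + e, with logistic
    error variance pi^2 / 3; the raw coefficient is rescaled by
    sd(x) / sd(y*).
    """
    var_latent = statistics.pvariance(linpred) + math.pi ** 2 / 3
    return b * statistics.pstdev(x) / math.sqrt(var_latent)

# toy single-predictor logit fit (coefficients assumed, not estimated)
x = [0.0, 1.0, 2.0, 3.0, 4.0]
b0, b1 = -2.0, 1.0
linpred = [b0 + b1 * xi for xi in x]
beta_std = lt_standardized(b1, x, linpred)
```

For a probit link, the same sketch applies with error variance 1 in place of π²/3, which is why LT-standardized coefficients depend on the assumed latent error distribution.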
Assessing Groundwater Model Uncertainty for the Central Nevada Test Area
International Nuclear Information System (INIS)
Pohll, Greg; Pohlmann, Karl; Hassan, Ahmed; Chapman, Jenny; Mihevc, Todd
2002-01-01
The purpose of this study is to quantify the flow and transport model uncertainty for the Central Nevada Test Area (CNTA). Six parameters were identified as uncertain, including the specified head boundary conditions used in the flow model, the spatial distribution of the underlying welded tuff unit, effective porosity, sorption coefficients, matrix diffusion coefficient, and the geochemical release function which describes nuclear glass dissolution. The parameter uncertainty was described by assigning prior statistical distributions for each of these parameters. Standard Monte Carlo techniques were used to sample from the parameter distributions to determine the full prediction uncertainty. Additional analysis is performed to determine the most cost-beneficial characterization activities. The maximum radius of the tritium and strontium-90 contaminant boundary was used as the output metric for evaluation of prediction uncertainty. The results indicate that combining all of the uncertainty in the parameters listed above propagates to a prediction uncertainty in the maximum radius of the contaminant boundary of 234 to 308 m and 234 to 302 m, for tritium and strontium-90, respectively. Although the uncertainty in the input parameters is large, the prediction uncertainty in the contaminant boundary is relatively small. The relatively small prediction uncertainty is primarily due to the small transport velocities such that large changes in the uncertain input parameters causes small changes in the contaminant boundary. This suggests that the model is suitable in terms of predictive capability for the contaminant boundary delineation
International Nuclear Information System (INIS)
Shaw, W.; Grindrod, P.
1989-01-01
This document encompasses two main items. The first consists of a review of four aspects of fuzzy sets, namely, the general framework, the role of expert judgment, mathematical and computational aspects, and present applications. The second consists of the application of fuzzy-set theory to simplified problems in radionuclide migration, with comparisons between fuzzy and probabilistic approaches, treated both analytically and computationally. A new approach to fuzzy differential equations is presented, and applied to simple ordinary and partial differential equations. It is argued that such fuzzy techniques represent a viable alternative to probabilistic risk assessment, for handling systems subject to uncertainties
On the Linear Relation between the Mean and the Standard Deviation of a Response Time Distribution
Wagenmakers, Eric-Jan; Brown, Scott
2007-01-01
Although it is generally accepted that the spread of a response time (RT) distribution increases with the mean, the precise nature of this relation remains relatively unexplored. The authors show that in several descriptive RT distributions, the standard deviation increases linearly with the mean. Results from a wide range of tasks from different…
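The linear mean-SD relation is easy to reproduce with a toy RT model: a fixed residual stage plus a scalable exponential decision stage, for which the SD equals the mean minus the residual exactly. The 300 ms residual and the scale values are arbitrary illustrations, not parameters from the paper:

```python
import random
import statistics

rng = random.Random(3)
# RT = 300 ms residual + exponentially distributed decision time;
# scaling the decision stage traces out a line in the (mean, sd) plane
points = []
for scale in [50, 100, 150, 200, 250]:
    rts = [300 + rng.expovariate(1.0 / scale) for _ in range(20000)]
    points.append((statistics.fmean(rts), statistics.stdev(rts)))

# least-squares slope and intercept of sd against mean
mx = statistics.fmean(m for m, _ in points)
ms = statistics.fmean(s for _, s in points)
slope = (sum((m - mx) * (s - ms) for m, s in points)
         / sum((m - mx) ** 2 for m, _ in points))
intercept = ms - slope * mx
# exact relation for this model: sd = mean - 300 (slope 1, intercept -300)
```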
Energy Technology Data Exchange (ETDEWEB)
Conover, David R.
2014-09-11
The purpose of this document is to identify the laws, rules, model codes, codes, standards, regulations, and specifications (CSR) related to safety that could apply to stationary energy storage systems (ESS), and to summarize experiences to date in securing approval of ESS in relation to CSR. This information is intended to assist in securing approval of ESS under current CSR and in identifying new CSR, revisions to existing CSR, and the supporting research and documentation needed to foster the deployment of safe ESS.
Directory of Open Access Journals (Sweden)
WANG Yong
2016-05-01
Full Text Available As points of interest (POI) on the internet widely exhibit incomplete addresses and inconsistent literal expressions, a fast standardization method for network POI address information based on spatial constraints is proposed. Based on an extensible address-expression model, the address information of each POI is first segmented and extracted, and address elements are updated by matching against the address tree layer by layer. Then, by defining four types of positional relations, corresponding sets are selected from a standard POI library as candidates for enriching and amending non-standard addresses. Finally, fast standardized processing of POI address information is achieved by backtracking address elements at the minimum granularity. Experiments in this paper show that this method can standardize an address with high accuracy, supporting the construction of an address database.
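The layer-by-layer matching against an address tree can be sketched as follows; the tree contents and the substring-matching rule are hypothetical stand-ins for the paper's extensible address-expression model, which is not specified in the abstract:

```python
# hypothetical address tree: province -> city -> district -> road
ADDRESS_TREE = {
    "Guangdong": {"Shenzhen": {"Nanshan": {"Keyuan Road": {}}}},
    "Beijing": {"Beijing": {"Haidian": {"Zhongguancun St": {}}}},
}

def standardize(elements, tree=ADDRESS_TREE):
    """Match extracted address elements layer by layer against the
    address tree; unmatched layers are left as gaps for later
    enrichment from the standard POI library."""
    matched, node = [], tree
    for elem in elements:
        # substring match in either direction handles abbreviations
        hit = next((k for k in node if elem in k or k in elem), None)
        if hit is None:
            break
        matched.append(hit)
        node = node[hit]
    return matched

result = standardize(["Guangdong", "Shenzhen", "Nanshan", "Keyuan"])
# the partial element "Keyuan" is completed to "Keyuan Road"
```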
Duerdoth, Ian
2009-01-01
The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…
Directory of Open Access Journals (Sweden)
R. Roofegari Nejad
2016-06-01
Full Text Available This paper presents novel methods for Demand Response (DR) programs that consider the welfare of consumers in order to deal with operational uncertainties, such as wind energy and energy price, within the framework of a smart microgrid. The total load of the microgrid is classified into two groups, each represented by a typical load: the first has energy-storage capability, represented by heater loads, and the second has curtailment capability, represented by lighting loads. Under the proposed DR methods, the consumed energy of all loads is coupled to the wind energy rate and the energy price. These methods are then applied to the operation of a smart microgrid consisting of a dispatchable supplier (microturbine), a non-dispatchable supplier (wind turbine), an energy storage system, and loads, with the capability of exchanging energy with the upstream distribution network. To account for uncertainties, the Monte Carlo simulation method is used: various scenarios are generated and applied to the operation of the microgrid. Finally, simulation results on a typical microgrid show that implementing the proposed DR methods increases the total operational profit of the smart microgrid while decreasing the risk of low profit.
Uncertainty in Forest Net Present Value Estimations
Directory of Open Access Journals (Sweden)
Ilona Pietilä
2010-09-01
Full Text Available Uncertainty related to inventory data, growth models and timber price fluctuation was investigated in the assessment of forest property net present value (NPV). The degree of uncertainty associated with inventory data was obtained from previous area-based airborne laser scanning (ALS) inventory studies. The study was performed applying Monte Carlo simulation, using stand-level growth and yield projection models and three alternative rates of interest (3, 4 and 5%). Timber price fluctuation was portrayed with geometric mean-reverting (GMR) price models. The analysis was conducted for four alternative forest properties with varying compartment structures: (A) a property with an even development-class distribution, (B) sapling stands, (C) young thinning stands, and (D) mature stands. Simulations resulted in predicted yield value (predicted NPV) distributions at both stand and property levels. Our results showed that ALS inventory errors were the most prominent source of uncertainty, leading to a 5.1–7.5% relative deviation of property-level NPV when an interest rate of 3% was applied. Interestingly, ALS inventory led to significant biases at the property level, ranging from 8.9% to 14.1% (3% interest rate). ALS inventory-based bias was most significant in mature stand properties. Errors related to the growth predictions led to a relative standard deviation in NPV varying from 1.5% to 4.1%. Growth model-related uncertainty was most significant in sapling stand properties. Timber price fluctuation caused relative standard deviations ranging from 3.4% to 6.4% (3% interest rate). The combined relative variation caused by inventory errors, growth model errors and timber price fluctuation varied, depending on the property type and applied rate of interest, from 6.4% to 12.6%. By applying the methodology described here, one may take into account the effects of various uncertainty factors in the prediction of forest yield value and to supply the
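The Monte Carlo combination of independent error sources can be illustrated with a toy sketch; the base NPV, the relative error magnitudes, and the multiplicative-independence assumption are hypothetical, chosen only to echo the ranges reported in the abstract:

```python
import random
import statistics

random.seed(42)

# Hypothetical relative standard deviations (as fractions) for the three
# error sources, loosely in the ranges reported in the abstract.
SD_INVENTORY, SD_GROWTH, SD_PRICE = 0.065, 0.03, 0.05
BASE_NPV = 100_000.0  # illustrative property-level NPV

def simulate_npv():
    # Each source perturbs the NPV multiplicatively and independently.
    f = 1.0
    for sd in (SD_INVENTORY, SD_GROWTH, SD_PRICE):
        f *= random.gauss(1.0, sd)
    return BASE_NPV * f

draws = [simulate_npv() for _ in range(20_000)]
rel_sd = statistics.stdev(draws) / statistics.mean(draws)
print(f"combined relative SD ~ {rel_sd:.3f}")
# Independent sources combine roughly in quadrature:
# sqrt(0.065^2 + 0.03^2 + 0.05^2) ~ 0.087
```

The real study propagates stand-level inventory and growth errors through projection models and a GMR price process rather than simple multiplicative noise.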
International Nuclear Information System (INIS)
Anon.
1978-01-01
This standard presents guidelines for evaluating site-related geotechnical parameters for nuclear power sites. Aspects considered include geology, ground water, foundation engineering, and earthwork engineering. These guidelines identify the basic geotechnical parameters to be considered in site evaluation, and in the design, construction, and performance of foundations and earthwork aspects for nuclear power plants. Also included are tabulations of typical field and laboratory investigative methods useful in identifying geotechnical parameters. Those areas where interrelationships with other standards may exist are indicated
Norms and international standards related to reduce risk management: A literature review
Directory of Open Access Journals (Sweden)
César Fuentes
2011-09-01
Full Text Available The current work reviews the literature on the main concepts in the international norms and standards related to risk management in companies. It analyses issues such as the COSO-ERM model, gives an introduction to the ISO 27000 and 31000 standards, and discusses project risk management according to the PMI.
A standard curve based method for relative real time PCR data processing
Directory of Open Access Journals (Sweden)
Krause Andreas
2005-03-01
Full Text Available Abstract Background Currently real-time PCR is the most precise method by which to measure gene expression. The method generates a large amount of raw numerical data, and processing may notably influence the final results. The data processing is based either on standard curves or on PCR efficiency assessment. At the moment, the PCR efficiency approach is preferred in relative PCR whilst the standard curve is often used for absolute PCR. However, there are no barriers to employing standard curves for relative PCR. This article provides an implementation of the standard curve method and discusses its advantages and limitations in relative real-time PCR. Results We designed a procedure for data processing in relative real-time PCR. The procedure completely avoids PCR efficiency assessment, minimizes operator involvement and provides a statistical assessment of intra-assay variation. The procedure includes the following steps. (I) Noise is filtered from raw fluorescence readings by smoothing, baseline subtraction and amplitude normalization. (II) The optimal threshold is selected automatically from regression parameters of the standard curve. (III) Crossing points (CPs) are derived directly from the coordinates of points where the threshold line crosses the fluorescence plots obtained after noise filtering. (IV) The means and their variances are calculated for CPs in PCR replicates. (V) The final results are derived from the CPs' means. The CPs' variances are traced to the results by the law of error propagation. A detailed description and analysis of this data processing is provided. The limitations associated with the use of parametric statistical methods and amplitude normalization are specifically analyzed and found fit for routine laboratory practice. Different options are discussed for the aggregation of data obtained from multiple reference genes. Conclusion A standard curve based procedure for PCR data processing has been compiled and validated. It illustrates that
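The core of such a procedure, deriving a crossing point by interpolation and inverting a standard curve, can be sketched as follows; the toy amplification curve, the slope and intercept values, and the function names are illustrative assumptions, not the article's implementation:

```python
# Hedged sketch: derive a crossing point (CP) by linear interpolation and map
# it to a starting quantity via a standard curve CP = slope*log10(N0) + intercept.
# The curve shape and calibration numbers are illustrative only.

def crossing_point(cycles, fluor, threshold):
    """Cycle at which the (noise-filtered) fluorescence crosses the threshold."""
    for i in range(1, len(fluor)):
        if fluor[i - 1] < threshold <= fluor[i]:
            # Linear interpolation between the bracketing cycles.
            frac = (threshold - fluor[i - 1]) / (fluor[i] - fluor[i - 1])
            return cycles[i - 1] + frac * (cycles[i] - cycles[i - 1])
    raise ValueError("threshold never crossed")

def quantity_from_cp(cp, slope=-3.32, intercept=38.0):
    """Invert the standard curve: log10(N0) = (cp - intercept) / slope."""
    return 10 ** ((cp - intercept) / slope)

cycles = list(range(1, 41))
# Toy sigmoid-like amplification curve with its midpoint at cycle 22.
fluor = [1 / (1 + 2.7 ** (22 - c)) for c in cycles]
cp = crossing_point(cycles, fluor, threshold=0.5)
print(cp, quantity_from_cp(cp))
```

In the article's scheme, CP variances from replicates would then be propagated through this inverse mapping by the law of error propagation.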
Development of a Dynamic Lidar Uncertainty Framework
Energy Technology Data Exchange (ETDEWEB)
Newman, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clifton, Andrew [WindForS; Bonin, Timothy [CIRES/NOAA ESRL; Choukulkar, Aditya [CIRES/NOAA ESRL; Brewer, W. Alan [NOAA ESRL; Delgado, Ruben [University of Maryland Baltimore County
2017-08-07
As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote-sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote-sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote-sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards for quantifying remote sensing device uncertainty for power performance testing consider uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device and are generally fixed, leading to climatic uncertainty values that apply to the entire measurement campaign. However, real-world experience and a consideration of the fundamentals of the measurement process have shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we describe the development of a new dynamic lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from a field measurement site to assess the ability of the framework to predict
ICYESS 2013: Understanding and Interpreting Uncertainty
Rauser, F.; Niederdrenk, L.; Schemann, V.; Schmidt, A.; Suesser, D.; Sonntag, S.
2013-12-01
We will report the outcomes and highlights of the Interdisciplinary Conference of Young Earth System Scientists (ICYESS) on Understanding and Interpreting Uncertainty in September 2013, Hamburg, Germany. This conference is aimed at early career scientists (Masters to Postdocs) from a large variety of scientific disciplines and backgrounds (natural, social and political sciences) and will enable 3 days of discussions on a variety of uncertainty-related aspects: 1) How do we deal with implicit and explicit uncertainty in our daily scientific work? What is uncertain for us, and for which reasons? 2) How can we communicate these uncertainties to other disciplines? E.g., is uncertainty in cloud parameterization and respectively equilibrium climate sensitivity a concept that is understood equally well in natural and social sciences that deal with Earth System questions? Or vice versa, is, e.g., normative uncertainty as in choosing a discount rate relevant for natural scientists? How can those uncertainties be reconciled? 3) How can science communicate this uncertainty to the public? Is it useful at all? How are the different possible measures of uncertainty understood in different realms of public discourse? Basically, we want to learn from all disciplines that work together in the broad Earth System Science community how to understand and interpret uncertainty - and then transfer this understanding to the problem of how to communicate with the public, or its different layers / agents. ICYESS is structured in a way that participation is only possible via presentation, so every participant will give their own professional input into how the respective disciplines deal with uncertainty. Additionally, a large focus is put onto communication techniques; there are no 'standard presentations' in ICYESS. Keynote lectures by renowned scientists and discussions will lead to a deeper interdisciplinary understanding of what we do not really know, and how to deal with it. Many
Uncertainties in Climatological Seawater Density Calculations
Dai, Hao; Zhang, Xining
2018-03-01
In most applications, seawater density, which has strong ties to ocean dynamics, is computed from conductivity, temperature, and pressure data measured in situ by various observation instruments, e.g., Conductivity-Temperature-Depth (CTD) instruments, according to equations of state for seawater. This paper, based on the density computational formulae in the Thermodynamic Equation of Seawater 2010 (TEOS-10), follows the Guide to the Expression of Uncertainty in Measurement (GUM) and assesses the main sources of uncertainty. By virtue of climatological decades-average temperature/Practical Salinity/pressure data sets for the global ocean provided by the National Oceanic and Atmospheric Administration (NOAA), correlation coefficients between uncertainty sources are determined and the combined standard uncertainties u_c(ρ) in seawater density calculations are evaluated. For grid points in the world ocean with 0.25° resolution, the standard deviations of u_c(ρ) in vertical profiles are of the order of 10^-4 kg m^-3. The u_c(ρ) means in vertical profiles of the Baltic Sea are about 0.028 kg m^-3, due to the larger scatter of the Absolute Salinity anomaly. The distribution of the u_c(ρ) means in vertical profiles of the world ocean outside the Baltic Sea, which covers the range (0.004, 0.01) kg m^-3, is related to the correlation coefficient r(S_A, p) between Absolute Salinity S_A and pressure p. The results in this paper are based on the measurement uncertainties of high-accuracy CTD sensors. Larger uncertainties in density calculations may arise with lower sensor specifications. This work may provide valuable uncertainty information required for reliability considerations in ocean circulation and global climate models.
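The GUM combination of correlated uncertainty sources has a compact generic form; the sensitivity coefficients, input uncertainties, and the correlation value below are illustrative numbers, not TEOS-10 results:

```python
import math

def combined_uncertainty(sens, u, r):
    """GUM combined standard uncertainty with correlated inputs.

    sens: sensitivity coefficients (partial derivatives df/dx_i)
    u:    standard uncertainties of the inputs
    r:    correlation coefficient matrix r[i][j]
    """
    n = len(sens)
    var = sum(sens[i] * sens[j] * r[i][j] * u[i] * u[j]
              for i in range(n) for j in range(n))
    return math.sqrt(var)

# Illustrative numbers only (not TEOS-10 sensitivities): density versus
# salinity, temperature, and pressure with a mild salinity-pressure correlation.
sens = [0.78, -0.21, 4.5e-3]     # d(rho)/dS, d(rho)/dT, d(rho)/dp
u = [0.004, 0.002, 1.0]          # u(S), u(T), u(p)
r = [[1.0, 0.0, 0.3],
     [0.0, 1.0, 0.0],
     [0.3, 0.0, 1.0]]
print(f"u_c(rho) = {combined_uncertainty(sens, u, r):.4f} kg/m^3")
```

The double sum reduces to the familiar quadrature formula when the off-diagonal correlations vanish.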
Vaginismus and dyspareunia: relationship with general and sex-related moral standards.
Borg, Charmaine; de Jong, Peter J; Weijmar Schultz, Willibrord
2011-01-01
Relatively strong adherence to conservative values and/or relatively strict sex-related moral standards logically restricts the sexual repertoire and lowers the threshold for experiencing negative emotions in a sexual context. In turn, this may generate withdrawal and avoidance behavior, which is at the nucleus of vaginismus. The aim was to examine whether strong adherence to conservative morals and/or strict sexual standards may indeed be involved in vaginismus. The Schwartz Value Survey (SVS) was used to investigate the individual's value pattern, and the Sexual Disgust Questionnaire (SDQ) to index the willingness to perform certain sexual activities as an indirect measure of sex-related moral standards. The SVS and SDQ were completed by three groups: women diagnosed with vaginismus (N=24), women diagnosed with dyspareunia (N=24), and a healthy control group of women without sexual complaints (N=32). The vaginismus group showed relatively low scores on liberal values together with comparatively high scores on conservative values. Additionally, the vaginismus group was more restricted in its readiness to perform particular sex-related behaviors than the control group. On both the SVS and the SDQ, the dyspareunia group fell between the vaginismus and control groups, but did not differ significantly from either. The findings are consistent with the view that low liberal and high conservative values, along with restricted sexual standards, are involved in the development/maintenance of vaginismus. © 2010 International Society for Sexual Medicine.
Evaluating prediction uncertainty
International Nuclear Information System (INIS)
McKay, M.D.
1995-03-01
The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
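The idea of comparing the prediction distribution with a conditional distribution (one input held fixed) can be sketched with a toy model; the Latin hypercube implementation and the model below are illustrative assumptions, not the report's code:

```python
import random
import statistics

random.seed(1)

def lhs(n, dims):
    """One Latin hypercube sample: n points in [0,1)^dims."""
    cols = []
    for _ in range(dims):
        strata = [(k + random.random()) / n for k in range(n)]
        random.shuffle(strata)
        cols.append(strata)
    return list(zip(*cols))

def model(x1, x2, x3):
    # Toy model: x1 dominates the output variance.
    return 10 * x1 + 1 * x2 + 0.1 * x3

# Variance-ratio importance: compare the unconditional prediction variance
# with the conditional variance obtained by fixing one input.
sample = lhs(2000, 3)
total_var = statistics.variance([model(*x) for x in sample])
fixed_var = statistics.variance([model(0.5, x2, x3) for _, x2, x3 in sample])
print(f"share of variance removed by fixing x1: {1 - fixed_var / total_var:.2f}")
```

Note that no linearity assumption is needed: the same comparison works for a nonlinear model, which is the point made in the abstract.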
Energy Technology Data Exchange (ETDEWEB)
Chen, Chang-Yuan, E-mail: yctcccy@163.net [School of Physics and Electronics, Yancheng Teachers University, Yancheng 224051 (China); You, Yuan; Lu, Fa-Lin [School of Physics and Electronics, Yancheng Teachers University, Yancheng 224051 (China); Dong, Shi-Hai, E-mail: dongsh2@yahoo.com [Departamento de Física, Escuela Superior de Física y Matemáticas, Instituto Politécnico Nacional, Edificio 9, Unidad Profesional Adolfo López Mateos, Mexico D.F. 07738 (Mexico)
2013-06-17
We present the position–momentum uncertainties for the Pöschl–Teller potential. We observe that Δx decreases with the potential depth λ but increases with the quantum number n. Interestingly, we find that Δp first increases and then decreases with n. The product ΔxΔp first decreases and then increases with λ, but becomes almost constant at (n+1/2)ℏ for larger λ. In particular, there exists a squeezed phenomenon in position x for the lower states. The squeezing in x is compensated for by an increase in momentum p, such that ΔxΔp⩾ℏ/2 is still satisfied.
International Nuclear Information System (INIS)
Chen, Chang-Yuan; You, Yuan; Lu, Fa-Lin; Dong, Shi-Hai
2013-01-01
We present the position–momentum uncertainties for the Pöschl–Teller potential. We observe that Δx decreases with the potential depth λ but increases with the quantum number n. Interestingly, we find that Δp first increases and then decreases with n. The product ΔxΔp first decreases and then increases with λ, but becomes almost constant at (n+1/2)ℏ for larger λ. In particular, there exists a squeezed phenomenon in position x for the lower states. The squeezing in x is compensated for by an increase in momentum p, such that ΔxΔp⩾ℏ/2 is still satisfied.
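The bound ΔxΔp⩾ℏ/2 can be checked numerically for a Pöschl–Teller well (here with ℏ = m = 1); the grid, the depth parameter, and the finite-difference discretization are illustrative choices, not the authors' analytical method:

```python
import numpy as np

# Numerical check (hbar = m = 1) of the uncertainty product for the ground
# state of the Poschl-Teller well V(x) = -lambda(lambda+1)/2 * sech(x)^2.
# Grid size and lambda are illustrative choices.
lam = 4.0
N, L = 1200, 15.0
x = np.linspace(-L, L, N)
h = x[1] - x[0]

# Finite-difference Hamiltonian H = -0.5 d^2/dx^2 + V(x).
V = -0.5 * lam * (lam + 1) / np.cosh(x) ** 2
main = 1.0 / h**2 + V
off = -0.5 / h**2 * np.ones(N - 1)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

E, psi = np.linalg.eigh(H)        # ground-state energy should be -lam^2/2 = -8
g = psi[:, 0] / np.sqrt(h)        # normalized ground-state wavefunction
dx = np.sqrt(np.sum(x**2 * g**2) * h - (np.sum(x * g**2) * h) ** 2)
# <p^2> = integral |psi'|^2 dx for a real wavefunction; <p> = 0 by symmetry.
dpsi = np.gradient(g, h)
dp = np.sqrt(np.sum(dpsi**2) * h)
print(f"dx*dp = {dx * dp:.3f}  (close to the bound hbar/2 = 0.5)")
```

For this depth the product sits just above 0.5, consistent with the near-saturation of the bound reported for large λ.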
Chapter 3: Traceability and uncertainty
International Nuclear Information System (INIS)
McEwen, Malcolm
2014-01-01
Chapter 3 presents: an introduction; Traceability (measurement standards, the role of the Bureau International des Poids et Mesures, Secondary Standards Laboratories, documentary standards, and traceability as process review); Uncertainty (Example 1 - Measurement, M_raw (SSD); Example 2 - Calibration data, N_D,w (60Co), k_Q; Example 3 - Correction factor, P_TP); and a Conclusion.
DEFF Research Database (Denmark)
Nguyen, Daniel Xuyen
This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models. … This retooling addresses several shortcomings. First, the imperfect correlation of demands reconciles the sales variation observed in and across destinations. Second, since demands for the firm's output are correlated across destinations, a firm can use previously realized demands to forecast unknown demands … in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles …
Technical Review of Law Enforcement Standards and Guides Relative to Incident Management
Energy Technology Data Exchange (ETDEWEB)
Stenner, Robert D.; Salter, R.; Stanton, J. R.; Fisher, D.
2009-03-24
In an effort to locate potential law enforcement-related standards that support incident management, a team from the Pacific Northwest National Laboratory (PNNL) contacted representatives from the National Institute of Standards and Technology's Office of Law Enforcement Standards (NIST-OLES), the National Institute of Justice (NIJ), the Federal Bureau of Investigation (FBI), the Secret Service, ASTM International committees that have a law enforcement focus, and a variety of individuals from local and regional law enforcement organizations. Discussions were held with various state and local law enforcement organizations. The NIJ has published several specific equipment-related law enforcement standards that were included in the review, but it appears that law enforcement program and process-type standards are developed principally by organizations that operate at the state and local level. Input is provided from state regulations and codes and from external non-government organizations (NGOs) that provide national standards. The standards that are adopted from external organizations or developed independently by state authorities are available for use by local law enforcement agencies on a voluntary basis. The extent to which they are used depends on the respective jurisdictions involved. In some instances, use of state and locally disseminated standards is mandatory, but in most cases, use is voluntary. Usually, the extent to which these standards are used appears to depend on whether or not jurisdictions receive certification from a “governing” entity due to their use of and compliance with the standards. In some cases, these certification-based standards are used in principle but without certification or other compliance monitoring. In general, these standards appear to be routinely used for qualification, selection for employment, and training. In these standards, the term “Peace Officer” is frequently used to refer to law enforcement personnel. This technical review of national law
Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian
2013-04-01
Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with another boost after the publication of the IPCC AR4 report. Over hundreds of impact studies a quasi-standard methodology has emerged, which is mainly shaped by the growing public demand for predicting how water resources management or flood protection should change in the close future. The ``standard'' workflow considers future climate under a specific IPCC emission scenario simulated by global circulation models (GCMs), possibly downscaled by a regional climate model (RCM) and/or a stochastic weather generator. The output from the climate models is typically corrected for bias before feeding it into a calibrated hydrological model, which is run on the past and future meteorological data to analyse the impacts of climate change on the hydrological indicators of interest. The impact predictions are as uncertain as any forecast that tries to describe the behaviour of an extremely complex system decades into the future. Future climate predictions are uncertain due to scenario uncertainty and GCM model uncertainty, the latter apparent at finer than continental resolution. As in any hierarchical model system, uncertainty propagates through the descendant components. Downscaling increases uncertainty with the deficiencies of RCMs and/or weather generators. Bias correction adds a strong deterministic shift to the input data. Finally the predictive uncertainty of the hydrological model ends the cascade that leads to the total uncertainty of the hydrological impact assessment. There is an emerging consensus among many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. There are only a few studies which found that the predictive uncertainty of hydrological models can be in the same range or even larger than the climatic uncertainty. We carried out a
DEFF Research Database (Denmark)
Diky, Vladimir; Chirico, Robert D.; Muzny, Chris
ThermoData Engine (TDE, NIST Standard Reference Databases 103a and 103b) is the first product that implements the concept of Dynamic Data Evaluation in the fields of thermophysics and thermochemistry, which includes maintaining a comprehensive and up-to-date database of experimentally measured … uncertainties, curve deviations, and inadequacies of the models. Uncertainty analysis shows the relative contributions to the total uncertainty from each component and pair of components.
Evaluation of uncertainty and detection limits in radioactivity measurements
Energy Technology Data Exchange (ETDEWEB)
Herranz, M. [Universidad del Pais Vasco/Euskal Herriko Unibertsitatea, Escuela Tecnica Superior de Ingenieria de Bilbao, Alda. Urquijo, s/n, 48013 Bilbao (Spain); Idoeta, R. [Universidad del Pais Vasco/Euskal Herriko Unibertsitatea, Escuela Tecnica Superior de Ingenieria de Bilbao, Alda. Urquijo, s/n, 48013 Bilbao (Spain)], E-mail: raquel.idoeta@ehu.es; Legarda, F. [Universidad del Pais Vasco/Euskal Herriko Unibertsitatea, Escuela Tecnica Superior de Ingenieria de Bilbao, Alda. Urquijo, s/n, 48013 Bilbao (Spain)
2008-10-01
The uncertainty associated with the assessment of the radioactive content of any sample depends on the net counting rate registered during the measuring process and on the different weighting factors needed to transform this counting rate into activity, activity per unit mass or activity concentration. This work analyses the standard uncertainties in these weighting factors as well as their contribution to the uncertainty in the activity reported for three typical determinations for environmental radioactivity measurements in the laboratory. It also studies the corresponding characteristic limits and their dependence on the standard uncertainty related to those weighting factors, offering an analysis of the effectiveness of the simplified characteristic limits as evaluated by various measuring software and laboratories.
Evaluation of uncertainty and detection limits in radioactivity measurements
International Nuclear Information System (INIS)
Herranz, M.; Idoeta, R.; Legarda, F.
2008-01-01
The uncertainty associated with the assessment of the radioactive content of any sample depends on the net counting rate registered during the measuring process and on the different weighting factors needed to transform this counting rate into activity, activity per unit mass or activity concentration. This work analyses the standard uncertainties in these weighting factors as well as their contribution to the uncertainty in the activity reported for three typical determinations for environmental radioactivity measurements in the laboratory. It also studies the corresponding characteristic limits and their dependence on the standard uncertainty related to those weighting factors, offering an analysis of the effectiveness of the simplified characteristic limits as evaluated by various measuring software and laboratories
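The propagation of counting statistics and weighting-factor uncertainties into a relative combined uncertainty can be sketched as follows; the measurement model and all numbers are illustrative assumptions, not the laboratory's procedure:

```python
import math

# Hedged sketch: relative combined uncertainty of an activity determination
#   a = (r_g - r_b) / (eps * P * m)
# (net count rate over detection efficiency, emission probability and sample
# mass). All numbers below are illustrative.

def activity_rel_uncertainty(rg, t_g, rb, t_b, rel_u_factors):
    """Counting uncertainty of the net rate plus weighting-factor terms.

    rg, rb: gross and background count rates (1/s)
    t_g, t_b: counting times (s)
    rel_u_factors: relative standard uncertainties of eps, P, m, ...
    """
    net = rg - rb
    u_net = math.sqrt(rg / t_g + rb / t_b)       # Poisson counting statistics
    terms = [u_net / net] + list(rel_u_factors)
    return math.sqrt(sum(t * t for t in terms))  # quadrature sum

u_rel = activity_rel_uncertainty(
    rg=0.50, t_g=60000, rb=0.10, t_b=60000,
    rel_u_factors=[0.05, 0.01, 0.001],           # u(eps)/eps, u(P)/P, u(m)/m
)
print(f"relative combined uncertainty ~ {100 * u_rel:.1f} %")
```

In this toy case the efficiency term dominates, which mirrors the paper's point that the weighting factors, not only the counting statistics, drive the reported uncertainty.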
International Nuclear Information System (INIS)
Obarski, Gregory E.; Splett, Jolene D.
2001-01-01
We have developed a transfer standard for the spectral density of relative intensity noise (RIN) of optical fiber sources near 1550 nm. Amplified spontaneous emission (ASE) from an erbium-doped fiber amplifier (EDFA), when it is optically filtered over a narrow band (<5 nm), yields a stable RIN spectrum that is practically constant to several tens of gigahertz. The RIN is calculated from the power spectral density as measured with a calibrated optical spectrum analyzer. For a typical device it is -110 dB/Hz, with uncertainty ≤0.12 dB/Hz. The invariance of the RIN under attenuation yields a considerable dynamic range with respect to rf noise levels. Results are compared with those from a second method that uses a distributed-feedback laser (DFB) that has a Poisson-limited RIN. Application of each method to the same RIN measurement system yields frequency-dependent calibration functions that, when they are averaged, differ by ≤0.2 dB. © 2001 Optical Society of America
Energy Technology Data Exchange (ETDEWEB)
Obarski, Gregory E.; Splett, Jolene D.
2001-06-01
We have developed a transfer standard for the spectral density of relative intensity noise (RIN) of optical fiber sources near 1550 nm. Amplified spontaneous emission (ASE) from an erbium-doped fiber amplifier (EDFA), when it is optically filtered over a narrow band (<5 nm), yields a stable RIN spectrum that is practically constant to several tens of gigahertz. The RIN is calculated from the power spectral density as measured with a calibrated optical spectrum analyzer. For a typical device it is −110 dB/Hz, with uncertainty ≤0.12 dB/Hz. The invariance of the RIN under attenuation yields a considerable dynamic range with respect to rf noise levels. Results are compared with those from a second method that uses a distributed-feedback laser (DFB) that has a Poisson-limited RIN. Application of each method to the same RIN measurement system yields frequency-dependent calibration functions that, when they are averaged, differ by ≤0.2 dB. © 2001 Optical Society of America
Adolescents' Motivation for Reading: Group Differences and Relation to Standardized Achievement
Wolters, Christopher A.; Denton, Carolyn A.; York, Mary J.; Francis, David J.
2014-01-01
The purpose of this study was to extend the research on adolescents' motivation for reading by examining important group differences and the relation of motivation to standardized achievement. Adolescents (N = 406) ranging from grade 7 to grade 12 completed a self-report survey that assessed 13 different aspects of their reading motivation…
P. Dorian Owen
2009-01-01
The relative standard deviation of win percentages, the most widely used measure of within-season competitive balance, has an upper bound which is very sensitive to variation in the numbers of teams and games played. Taking into account this upper bound provides additional insight into comparisons of competitive balance across leagues or over time.
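This balance measure is commonly reported as the ratio of the actual standard deviation of win percentages to the idealized value 0.5/√G for a balanced league of G games per team; a minimal sketch follows (with made-up league data, and leaving the paper's exact upper-bound formula aside):

```python
import math
import statistics

# Hedged sketch of the within-season competitive-balance measure: the ratio
# of the actual SD of win percentages to the "idealized" SD 0.5/sqrt(G).
# The league data below are invented for illustration.

def balance_ratio(win_pcts, games_per_team):
    actual_sd = statistics.pstdev(win_pcts)
    ideal_sd = 0.5 / math.sqrt(games_per_team)
    return actual_sd / ideal_sd

# A 4-team league, 162 games per team (an MLB-like schedule length).
ratio = balance_ratio([0.580, 0.540, 0.470, 0.410], games_per_team=162)
print(f"balance ratio = {ratio:.2f}")
```

Because the idealized SD shrinks with √G while the attainable maximum SD depends on both the number of teams and games, cross-league comparisons of this ratio need the upper-bound adjustment the abstract describes.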
Challenging the 3.0 GPA Eligibility Standard for Public Relations Internships.
Maynard, Michael L.
1999-01-01
Analyzes the appropriateness of a 3.0 GPA standard for public relations internship eligibility at one university. Seeks to determine at what GPA cutoff faculty can feel confident that the student will gain from the internship without damaging the program's reputation. Finds students with a 2.7 GPA did as well as students with GPAs ranging from 3.0…
2013-02-25
... wellness services and chronic disease management; and pediatric services, including oral and vision... Act uses the terms "dental" and "oral" interchangeably when referring to the pediatric dental care... Parts 147, 155, and 156 Patient Protection and Affordable Care Act; Standards Related to Essential...
Uncertainty calculations made easier
International Nuclear Information System (INIS)
Hogenbirk, A.
1994-07-01
The results are presented of a neutron cross section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy-integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy-integrated flux at the beginning of the cryostat in the no-gap geometry, compared to an uncertainty of only 5% in the gap geometry. Therefore, it is essential to take into account the exact geometry in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)
Manzini, E.; Karpechko, A.Yu.; Anstey, J.; Shindell, Drew Todd; Baldwin, M.P.; Black, R.X.; Cagnazzo, C.; Calvo, N.; Charlton-Perez, A.; Christiansen, B.;
2014-01-01
Future changes in the stratospheric circulation could have an important impact on northern winter tropospheric climate change, given that sea level pressure (SLP) responds not only to tropospheric circulation variations but also to vertically coherent variations in troposphere-stratosphere circulation. Here we assess northern winter stratospheric change and its potential to influence surface climate change in the Coupled Model Intercomparison Project-Phase 5 (CMIP5) multimodel ensemble. In the stratosphere at high latitudes, an easterly change in zonally averaged zonal wind is found for the majority of the CMIP5 models, under the Representative Concentration Pathway 8.5 scenario. Comparable results are also found in the 1% CO2 increase per year projections, indicating that the stratospheric easterly change is a common feature in future climate projections. This stratospheric wind change, however, shows a significant spread among the models. By using linear regression, we quantify the impact of tropical upper troposphere warming, polar amplification, and the stratospheric wind change on SLP. We find that the intermodel spread in stratospheric wind change contributes substantially to the intermodel spread in Arctic SLP change. The role of the stratosphere in determining part of the spread in SLP change is supported by the fact that the SLP change lags the stratospheric zonally averaged wind change. Taken together, these findings provide further support for the importance of simulating the coupling between the stratosphere and the troposphere, to narrow the uncertainty in the future projection of tropospheric circulation changes.
Zou, Xiao-Duan; Li, Jian-Yang; Clark, Beth Ellen; Golish, Dathon
2018-01-01
The OSIRIS-REx spacecraft, launched in September 2016, will study the asteroid Bennu and return a sample from its surface to Earth in 2023. Bennu is a near-Earth carbonaceous asteroid which will provide insight into the formation and evolution of the solar system. OSIRIS-REx will first approach Bennu in August 2018 and will study the asteroid for approximately two years before sampling. OSIRIS-REx will develop its photometric model (including Lommel-Seeliger, ROLO, McEwen, Minnaert and Akimov) of Bennu with OCAM and OVIRS during the Detailed Survey mission phase. The model developed during this phase will be used to photometrically correct the OCAM and OVIRS data. Here we present the analysis of the error for the photometric corrections. Based on our test data sets, we find: 1. The model uncertainties are correct only when calculated with the covariance matrix, because the parameters are highly correlated. 2. There is no evidence that any single parameter dominates in any of the models. 3. Model error and data error contribute comparably to the final correction error. 4. We tested the uncertainty module on synthetic and real data sets and found that model performance depends on data coverage and data quality. These tests gave us a better understanding of how the different models behave in different cases. 5. The L-S model is more reliable than the others, perhaps because the simulated data are based on the L-S model; however, the test on real data (SPDIF) also shows a slight advantage for L-S. ROLO is not reliable for calculating Bond albedo. The uncertainty of the McEwen model is large in most cases. Akimov behaves unphysically on the SOPIE 1 data. 6. L-S is therefore the better default choice; this conclusion is based mainly on our tests on the SOPIE and IPDIF data.
Saviano, Alessandro Morais; Francisco, Fabiane Lacerda; Ostronoff, Celina Silva; Lourenço, Felipe Rebello
2015-01-01
The aim of this study was to develop, optimize, and validate a microplate bioassay for relative potency determination of linezolid in pharmaceutical samples using quality-by-design and design space approaches. In addition, a procedure is described for estimating relative potency uncertainty based on microbiological response variability. The influence of culture media composition was studied using a factorial design, and a central composite design was adopted to study the influence of inoculum proportion and triphenyltetrazolium chloride on microbial growth. The microplate bioassay was optimized regarding the responses of low, medium, and high doses of linezolid, negative and positive controls, and the slope, intercept, and correlation coefficient of dose-response curves. According to optimization results, design space ranges were established using: (a) low (1.0 μg/mL), medium (2.0 μg/mL), and high (4.0 μg/mL) doses of pharmaceutical samples and linezolid chemical reference substance; (b) Staphylococcus aureus ATCC 653 in an inoculum proportion of 10%; (c) antibiotic No. 3 culture medium pH 7.0±0.1; (d) 6 h incubation at 37.0±0.1°C; and (e) addition of 50 μL of 0.5% (w/v) triphenyltetrazolium chloride solution. The microplate bioassay was linear (r²=0.992), specific, precise (repeatability RSD=2.3% and intermediate precision RSD=4.3%), accurate (mean recovery=101.4%), and robust. The overall measurement uncertainty was reasonable considering the increased variability inherent in microbiological response. Final uncertainty was comparable with that obtained with other microbiological assays, as well as chemical methods.
Ruymán Brito-Brito, Pedro; García-Tesouro, Esther; Fernández-Gutiérrez, Domingo Ángel; García-Hernández, Alfonso Miguel; Fernández-Gutiérrez, Raquel; Burillo-Putze, Guillermo
2018-01-01
To validate a Spanish adaptation of the Mishel Uncertainty of Illness Scale for use with emergency-department (ED) patients and their accompanying relatives or friends (the UIS-ED). We first developed a version of the questionnaire for Spanish ED situations. Next we assessed the content validity index for each of its items, revised it, and reassessed its face validity to produce a second version, which we then piloted in 20 hospital ED patients. A third revised version was then validated in a population of 320 adults (160 patients and 160 accompanying persons) who attended the ED between November 2015 and September 2016. The 12-item UIS-ED (60 points) was administered by 2 nurses while the patients and accompanying persons were in the ED. We gathered sociodemographic and clinical data as well as the subjects' perception about the information they were given. The mean (SD) uncertainty score among patients was 29 (11) points. Accompanying persons had a mean score of 36 (13) points. Factorial analysis confirmed the instrument's construct validity, finding that both dimensions of the original Mishel scale (complexity and ambiguity) were present in 6 items each. Factorial analysis explained 60% of the total variance in the patient version and 67% of the variance in the version for accompanying persons. Reliability statistics were good, with Cronbach's α values ranging from 0.912 to 0.938. Split-half reliability statistics ranged from 0.901 to 0.933. Correlations were significant in the analysis of convergent validity. The UIS-ED questionnaire may prove to be a simple, valid, and reliable way for assessing uncertainty in patients and their accompanying friends or relatives attending Spanish EDs.
International Nuclear Information System (INIS)
Jang, Misuk; Jeon, Jong Seon; Kang, Hyun Sik; Kim, Seoung Rae
2016-01-01
In this paper we introduce and review technical standards related to sodium fire and plutonium criticality safety. The paper may help identify considerations in the development of equipment, standards, etc., needed to meet the safety requirements in the design, construction, and operation of the TFFF, KAPF, and SFR. Feasibility and conceptual designs are being examined for related facilities in Korea, for example the TRU Fuel Fabrication Facility (TFFF), the Korea Advanced Pyro-process Facility (KAPF), and the Sodium-cooled Fast Reactor (SFR). However, the safety of these facilities has been controversial, in part because of sodium fire accidents and plutonium-related radiation safety in transport and handling accidents. Many studies have therefore been performed to ensure safety, and various documents including safety requirements have been developed. By separating and reducing the long-lived transuranic (TRU) radionuclides in spent nuclear fuel, reusing the potential energy of uranium fuel resources, and reducing high-level wastes, the TFFF has attracted wide attention, raising the question of whether it can comply with the technical standards that ensure safety. For a new facility design, one of the important tasks is the review of technical standards, especially for sodium and plutonium, because of sodium's high reactivity with water and plutonium's criticality hazard, respectively. We introduce and review two important technical standards for the TFFF, covering sodium fire and plutonium criticality safety. The paper provides brief guidance, on how to start and what is important, to those responsible for the TFFF from initial design through operation.
Röder, Mirjam; Thornley, Patricia
2018-04-01
Considering the urgent need to shift to low carbon energy carriers, waste wood resources could provide an alternative energy feedstock and at the same time reduce emissions from landfill. This research examines the climate change impacts and related emission uncertainties of waste wood based energy. For this, different grades of waste wood and energy application have been investigated using lifecycle assessment. Sensitivity analysis has then been applied for supply chain processes and feedstock properties for the main emission contributing categories: transport, processing, pelletizing, urea resin fraction, and related N2O formation. The results show, depending on the waste wood grade, the conversion option, scale, and the related reference case, that emission reductions of up to 91% are possible for non-treated wood waste. Compared to this, energy from treated wood waste with low contamination can achieve up to 83% emission savings, similar to untreated waste wood pellets, but in some cases emissions from waste wood based energy can exceed those of the fossil fuel reference, in the worst case by 126%. Emission reductions from highly contaminated feedstocks are largest when replacing electricity from large-scale coal and landfill. The highest emission uncertainties are related to the wood's resin fraction, N2O formation during combustion, and pelletizing. Comparing wood processing with diesel and electricity powered equipment also generated high variations in the results, while emission variations related to transport are relatively small. Using treated waste wood as a bioenergy feedstock can be a valid option to reduce emissions from energy production, but this is only realisable if coal and landfill gas are replaced. To achieve meaningful emission reductions in line with national and international climate change targets, pre-treatment of waste wood would be required to reduce components that form N2O during the energy conversion.
A New Framework for Quantifying Lidar Uncertainty
Energy Technology Data Exchange (ETDEWEB)
Newman, Jennifer, F.; Clifton, Andrew; Bonin, Timothy A.; Churchfield, Matthew J.
2017-03-24
As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards discuss uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device. However, real-world experience has shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we propose the development of a new lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from an operational wind farm to assess the ability of the framework to predict errors in lidar-measured wind speed.
A novel dose uncertainty model and its application for dose verification
International Nuclear Information System (INIS)
Jin Hosang; Chung Heetaek; Liu Chihray; Palta, Jatinder; Suh, Tae-Suk; Kim, Siyong
2005-01-01
Based on a statistical approach, a novel dose uncertainty model was introduced considering both nonspatial and spatial dose deviations. Non-space-oriented uncertainty is mainly caused by dosimetric uncertainties, and space-oriented dose uncertainty is the uncertainty caused by all spatial displacements. Assuming these two parts are independent, the dose difference between measurement and calculation is a linear combination of nonspatial and spatial dose uncertainties. Two assumptions were made: (1) the relative standard deviation of nonspatial dose uncertainty is inversely proportional to the dose standard deviation σ, and (2) the spatial dose uncertainty is proportional to the gradient of dose. The total dose uncertainty is a quadratic sum of the nonspatial and spatial uncertainties. The uncertainty model provides the tolerance dose bound for comparison between calculation and measurement. In the statistical uncertainty model based on a Gaussian distribution, a confidence level of 3σ theoretically confines 99.74% of measurements within the bound. By setting the confidence limit, the tolerance bound for dose comparison can be made analogous to that of existing dose comparison methods (e.g., a composite distribution analysis, a γ test, a χ evaluation, and a normalized agreement test method). However, the model considers the inherent dose uncertainty characteristics of the test points by taking into account the space-specific history of dose accumulation, while the previous methods apply a single tolerance criterion to the points, although dose uncertainty at each point is significantly different from others. Three types of one-dimensional test dose distributions (a single large field, a composite flat field made by two identical beams, and three-beam intensity-modulated fields) were made to verify the robustness of the model. For each test distribution, the dose bound predicted by the uncertainty model was compared with simulated measurements. The simulated
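The model's quadratic-sum structure lends itself to a compact numerical sketch. Everything below is illustrative: the sigmoid test profile, the relative dosimetric coefficient `a`, the displacement scale `b`, and the 3σ multiplier `k` are assumed values, not the authors' fitted parameters.

```python
import numpy as np

def dose_tolerance_bound(dose, dx, a=0.02, b=1.0, k=3.0):
    """Quadratic sum of a nonspatial (dosimetric) term and a spatial term
    proportional to the local dose gradient, scaled by a confidence
    multiplier k (k=3 for the ~99.7% level). a and b are assumed values."""
    grad = np.gradient(dose, dx)             # local dose gradient (Gy/mm)
    sigma_nonspatial = a * dose              # dosimetric component
    sigma_spatial = b * np.abs(grad)         # displacement-driven component
    return k * np.sqrt(sigma_nonspatial**2 + sigma_spatial**2)

# One-dimensional test profile: a 2 Gy plateau with a sigmoid falloff at x = 60 mm
x = np.linspace(0.0, 100.0, 201)                 # mm
dose = 2.0 / (1.0 + np.exp((x - 60.0) / 3.0))    # Gy
bound = dose_tolerance_bound(dose, x[1] - x[0])
# The bound widens in the penumbra: steep-gradient points tolerate
# larger measured deviations than flat, low-gradient points.
```

This captures the point made in the abstract: unlike a single global criterion, the tolerance bound adapts point by point to the local dose and gradient.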
Uncertainty analysis of environmental models
International Nuclear Information System (INIS)
Monte, L.
1990-01-01
In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of ¹³⁷Cs and ¹³¹I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition.
International Nuclear Information System (INIS)
Liu Shuyu; Hu Changqin
2007-01-01
This study introduces the general method of quantitative nuclear magnetic resonance (qNMR) for the calibration of reference standards of macrolide antibiotics. Several qNMR experimental conditions were optimized, including the relaxation delay, which is an important parameter for quantification. Three kinds of macrolide antibiotics were used to validate the accuracy of the qNMR method by comparison with the results obtained by the high performance liquid chromatography (HPLC) method. The purities of five common reference standards of macrolide antibiotics were measured by the ¹H qNMR method and the mass balance method, respectively, and the results of the two methods were compared. qNMR is quick and simple to use. In new drug research and development, qNMR provides a new and reliable method for purity analysis of reference standards.
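Relative ¹H qNMR purity determination follows a standard relation between signal integrals, proton counts, molar masses, and weighed masses. The sketch below applies it with made-up numbers; the choice of maleic acid as internal standard is an assumption for illustration, not taken from the abstract.

```python
def qnmr_purity(I_s, I_ref, n_s, n_ref, M_s, M_ref, m_s, m_ref, P_ref):
    """Relative 1H qNMR purity: integrals I, proton counts n, molar
    masses M (g/mol), weighed masses m (mg), and certified purity of
    the internal standard P_ref (as a fraction)."""
    return (I_s / I_ref) * (n_ref / n_s) * (M_s / M_ref) * (m_ref / m_s) * P_ref

# Illustrative numbers only: one macrolide proton integrated against the
# maleic acid singlet (2 H); integrals and masses are made up.
purity = qnmr_purity(I_s=0.776, I_ref=1.000, n_s=1, n_ref=2,
                     M_s=733.9, M_ref=116.07, m_s=50.0, m_ref=5.0,
                     P_ref=0.999)
# purity evaluates to roughly 0.98, i.e. ~98%
```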
DEFF Research Database (Denmark)
Larsman, P; Thorn, S; Søgaard, K
2009-01-01
The current study investigated the associations between work-related perceived stress and surface electromyographic (sEMG) parameters (muscle activity and muscle rest) during standardized simulated computer work (typing, editing, precision, and Stroop tasks). It was part of the European case-control study NEW (Neuromuscular assessment in the Elderly Worker). The present cross-sectional study was based on a questionnaire survey and sEMG measurements among Danish and Swedish female computer users aged 45 or older (n=49). The results show associations between work-related perceived stress and trapezius muscle activity and rest during standardized simulated computer work, and provide partial empirical support for the hypothesized pathway of stress-induced muscle activity in the association between an adverse psychosocial work environment and musculoskeletal symptoms in the neck and shoulder.
Effect of uncertainties on probabilistic-based design capacity of hydrosystems
Tung, Yeou-Koung
2018-02-01
Hydrosystems engineering designs involve analysis of hydrometric data (e.g., rainfall, floods) and use of hydrologic/hydraulic models, all of which contribute various degrees of uncertainty to the design process. Uncertainties in hydrosystem designs can be generally categorized into aleatory and epistemic types. The former arises from the natural randomness of hydrologic processes whereas the latter are due to knowledge deficiency in model formulation and model parameter specification. This study shows that the presence of epistemic uncertainties induces uncertainty in determining the design capacity. Hence, the designer needs to quantify the uncertainty features of design capacity to determine the capacity with a stipulated performance reliability under the design condition. Using detention basin design as an example, the study illustrates a methodological framework by considering aleatory uncertainty from rainfall and epistemic uncertainties from the runoff coefficient, curve number, and sampling error in design rainfall magnitude. The effects of including different items of uncertainty and performance reliability on the design detention capacity are examined. A numerical example shows that the mean value of the design capacity of the detention basin increases with the design return period and this relation is found to be practically the same regardless of the uncertainty types considered. The standard deviation associated with the design capacity, when subject to epistemic uncertainty, increases with both design frequency and items of epistemic uncertainty involved. It is found that the epistemic uncertainty due to sampling error in rainfall quantiles should not be ignored. Even with a sample size of 80 (relatively large for a hydrologic application) the inclusion of sampling error in rainfall quantiles resulted in a standard deviation about 2.5 times higher than that considering only the uncertainty of the runoff coefficient and curve number. Furthermore, the
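The framework described above can be mimicked with a small Monte Carlo sketch: epistemic inputs are sampled, propagated through a (here greatly simplified, rational-method-style) storage relation, and the design capacity is read off as a quantile matching the stipulated performance reliability. All distributions and numbers below are assumed for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20_000

# Epistemic inputs, all distributions assumed for illustration:
design_rain = rng.normal(100.0, 8.0, n)   # mm; sampling error in the rainfall quantile
runoff_coef = rng.normal(0.55, 0.05, n)   # runoff coefficient uncertainty
area_m2 = 2.0e6                           # contributing area (m^2), treated as known

# Greatly simplified storage requirement: runoff volume to be detained.
capacity = runoff_coef * (design_rain * 1.0e-3) * area_m2   # m^3

mean_cap = capacity.mean()
std_cap = capacity.std()

# Designing for 90% performance reliability means sizing the basin at the
# 90th percentile of the capacity distribution, not at its mean.
design_cap_90 = np.quantile(capacity, 0.90)
```

Adding further epistemic items (e.g., curve number uncertainty) widens the capacity distribution and raises the reliability-based design value, which is the qualitative behavior the abstract reports.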
Uncertainty in artificial intelligence
Levitt, TS; Lemmer, JF; Shachter, RD
1990-01-01
Clearly illustrated in this volume is the current relationship between Uncertainty and AI. It has been said that research in AI revolves around five basic questions asked relative to some particular domain: What knowledge is required? How can this knowledge be acquired? How can it be represented in a system? How should this knowledge be manipulated in order to provide intelligent behavior? How can the behavior be explained? In this volume, all of these questions are addressed. From the perspective of the relationship of uncertainty to the basic questions of AI, the book divides naturally i
Energy Technology Data Exchange (ETDEWEB)
Dehesa, J S [Departamento de Fisica Atomica, Molecular y Nuclear, Universidad de Granada, 18071-Granada (Spain); Gonzalez-Ferez, R [Departamento de Fisica Atomica, Molecular y Nuclear, Universidad de Granada, 18071-Granada (Spain); Sanchez-Moreno, P [Departamento de Fisica Atomica, Molecular y Nuclear, Universidad de Granada, 18071-Granada (Spain)
2007-02-23
The inequality ⟨p²⟩⟨r²⟩ ≥ (l + 1/2)²
International Nuclear Information System (INIS)
Nha, Hyunchul; Kim, Jaewan
2006-01-01
We derive a class of inequalities, from the uncertainty relations of the su(1,1) and the su(2) algebra in conjunction with partial transposition, that must be satisfied by any separable two-mode states. These inequalities are presented in terms of the su(2) operators J_x = (a†b + ab†)/2, J_y = (a†b − ab†)/2i, and the total photon number ⟨N_a + N_b⟩. They include as special cases the inequality derived by Hillery and Zubairy [Phys. Rev. Lett. 96, 050503 (2006)], and the one by Agarwal and Biswas [New J. Phys. 7, 211 (2005)]. In particular, optimization over the whole set of inequalities leads to the criterion obtained by Agarwal and Biswas. We show that this optimal criterion can detect entanglement for a broad class of non-Gaussian entangled states, i.e., the su(2) minimum-uncertainty states. Experimental schemes to test the optimal criterion are also discussed, especially one using linear optical devices and photodetectors
CSIR Research Space (South Africa)
Jansen van Rensburg, Gerhardus J
2011-10-01
In the present study, numerical results obtained on different but related shapes are compared by using a non-rigid mapping. Non-rigid registration is employed to obtain mesh representations of different human skull geometries with the same mesh...
Directory of Open Access Journals (Sweden)
Rami Ahmad El-Nabulsi
2015-08-01
Recently, non-standard Lagrangians have gained a growing importance in theoretical physics and in the theory of non-linear differential equations. However, their formulations and implications in general relativity are still in their infancy despite some advances in contemporary cosmology. The main aim of this paper is to fill the gap. Though non-standard Lagrangians may take a multitude of forms, in this paper we considered the exponential type. One basic feature of exponential non-standard Lagrangians concerns the modified Euler-Lagrange equation obtained from the standard variational analysis. Accordingly, when applied to spacetime geometries, one unsurprisingly expects modified geodesic equations. However, when taking into account the time-like path parameterization constraint, remarkably, it was observed that mutually discrete gravity and discrete spacetime emerge in the theory. Two different independent cases were obtained: a geometrical manifold with new spacetime coordinates augmented by a metric signature change, and a geometrical manifold characterized by a discretized spacetime metric. Both cases give rise to Einstein's field equations, yet the gravity is discretized and originates from "spacetime discreteness". A number of mathematical and physical implications of these results are discussed throughout this paper and perspectives are given accordingly.
International Nuclear Information System (INIS)
Crabol, B.
1985-04-01
An original concept on the difference in behaviour of the high-frequency (small-scale) and low-frequency (large-scale) atmospheric turbulence relative to the mean wind speed has been introduced. Through a dimensional analysis based on Taylor's formulation, it has been shown that the governing parameter of the atmospheric dispersion standard deviations is the travel distance near the source and the travel time far from the source. Using hypotheses on the energy spectrum in the atmosphere, a numerical application has made it possible to quantify the evolution of the horizontal standard deviation for different mean wind speeds between 0.2 and 10 m/s. The areas of validity of the parameter (travel distance or travel time) are clearly shown. The first one is confined to the near field and becomes smaller as the wind speed decreases. For t > 5000 s, the dependence on wind speed of the horizontal standard deviation expressed as a function of travel time becomes insignificant; the horizontal standard deviation is then a function of travel time only. Results are compared with experimental data obtained in the atmosphere. The similar evolution of the calculated and experimental curves confirms the validity of the hypotheses and input data in the calculation. This study can be applied to the transport of radioactive effluents in the atmosphere.
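The two regimes described above follow from Taylor's 1921 statistical theory of diffusion, which the sketch below evaluates; the turbulence intensity σ_v and Lagrangian time scale T_L are assumed values, not those of the report.

```python
import numpy as np

def sigma_y(t, sigma_v=0.5, T_L=1000.0):
    """Taylor (1921) horizontal dispersion:
    sigma_y^2 = 2 sigma_v^2 T_L^2 (t/T_L - 1 + exp(-t/T_L)).
    sigma_v (m/s) and T_L (s) are assumed turbulence parameters."""
    tau = t / T_L
    return np.sqrt(2.0 * sigma_v**2 * T_L**2 * (tau - 1.0 + np.exp(-tau)))

# Near the source (t << T_L): sigma_y ~ sigma_v * t, i.e. proportional to
# travel distance x = u * t for a given wind speed u.
# Far from the source (t >> T_L): sigma_y ~ sigma_v * sqrt(2 * T_L * t),
# a function of travel time only, independent of wind speed and distance.
```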
Statistical characterization of roughness uncertainty and impact on wind resource estimation
DEFF Research Database (Denmark)
Kelly, Mark C.; Ejsing Jørgensen, Hans
2017-01-01
In this work we relate uncertainty in background roughness length (z0) to uncertainty in wind speeds, where the latter are predicted at a wind farm location based on wind statistics observed at a different site. Sensitivity of predicted winds to roughness is derived analytically for the industry-standard European Wind Atlas method, which is based on the geostrophic drag law. We statistically consider roughness and its corresponding uncertainty, in terms of both z0 derived from measured wind speeds as well as that chosen in practice by wind engineers. We show the combined effect of roughness uncertainty ... between mean wind speed and AEP. Following our developments, we provide guidance on approximate roughness uncertainty magnitudes to be expected in industry practice, and we also find that sites with larger background roughness incur relatively larger uncertainties.
Garcia-Aristizabal, Alexander; Bucchignani, Edoardo; Marzocchi, Warner; Uhinga, Guido
2013-04-01
Extreme meteorological phenomena such as heavy precipitation, extreme temperature, or strong winds may have considerable impacts on the economy, infrastructure, and health, and may represent a non-negligible threat to human life. A changing climate may lead to changes in the frequency, intensity, spatial extent, duration, and timing of weather and climate extremes, and can result in unprecedented extreme events. Climatological parameters, which are reference variables for the assessment of climate-related hazards, can generally be obtained from data catalogues; nevertheless, it is often the case that the time window of the observations, if available at all, is too short for a correct analysis of the most extreme and less frequent events. For this reason there is a growing interest in the use of 'synthetic' data derived from climatological models, which in addition allow climate projections to be performed considering different plausible emission/concentration scenarios. Within this context, scenario-based climate projections can be useful to assess possible temporal variations in climatological parameters (and hence in climate-related hazards) under climate change conditions. Here we discuss the characterization of some climate-related hazards based on the analysis of climatological parameters, addressing relevant issues in the use of both observed and synthetic data, the consideration of climate-change scenarios, and the quantification and communication of uncertainties. In particular, to account for possible non-stationary conditions in the analysis of extremes under climate-change conditions, we have adopted a practical covariate approach recently used in different hydrological and meteorological applications, and used a Bayesian framework for parameter estimation and uncertainty propagation.
Estimation of the uncertainties considered in NPP PSA level 2
International Nuclear Information System (INIS)
Kalchev, B.; Hristova, R.
2005-01-01
The main approaches to uncertainty analysis are presented. The sources of uncertainty that should be considered in level 2 PSA for WWER reactors are defined, namely: uncertainties propagated from level 1 PSA; uncertainties in input parameters; uncertainties related to the modelling of physical phenomena during the accident progression; and uncertainties related to the estimation of source terms. The methods for estimating these uncertainties are also discussed in this paper.
The effect of short-range spatial variability on soil sampling uncertainty
Energy Technology Data Exchange (ETDEWEB)
Perk, Marcel van der [Department of Physical Geography, Utrecht University, P.O. Box 80115, 3508 TC Utrecht (Netherlands)], E-mail: m.vanderperk@geo.uu.nl; De Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria [Agenzia per la Protezione dell'Ambiente e per i Servizi Tecnici (APAT), Servizio Laboratori, Misure ed Attività di Campo, Via di Castel Romano, 100-00128 Roma (Italy)]; Fajgelj, Ales; Sansone, Umberto [International Atomic Energy Agency (IAEA), Agency's Laboratories Seibersdorf, A-1400 Vienna (Austria)]; Jeran, Zvonka; Jacimovic, Radojko [Jozef Stefan Institute, Jamova 39, 1000 Ljubljana (Slovenia)]
2008-11-15
This paper aims to quantify the soil sampling uncertainty arising from the short-range spatial variability of elemental concentrations in the topsoils of agricultural, semi-natural, and contaminated environments. For the agricultural site, the relative standard sampling uncertainty ranges between 1% and 5.5%. For the semi-natural area, the sampling uncertainties are 2-4 times larger than in the agricultural area. The contaminated site exhibited significant short-range spatial variability in elemental composition, which resulted in sampling uncertainties of 20-30%.
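A common way to estimate such short-range sampling uncertainty is from side-by-side duplicate pairs. The sketch below shows the paired-difference estimator with made-up concentrations; the actual sampling design and values in the paper may differ.

```python
import numpy as np

# Hypothetical duplicate design: at each target location a pair of
# side-by-side samples; their spread reflects short-range variability.
pairs = np.array([
    [12.1, 11.8], [10.5, 10.9], [11.7, 11.2],
    [12.4, 12.9], [10.8, 10.4], [11.1, 11.6],
])  # element concentration, mg/kg (made-up values)

diffs = pairs[:, 0] - pairs[:, 1]
# For duplicate pairs, var(d) = 2 s^2, so s = sqrt(mean(d^2) / 2).
s_sampling = np.sqrt(np.mean(diffs**2) / 2.0)
rel_uncert_pct = 100.0 * s_sampling / pairs.mean()   # relative standard uncertainty, %
```

With these illustrative numbers the relative sampling uncertainty comes out within the 1-5.5% range the abstract reports for the agricultural site.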
Trowbridge, Philip R; Kahl, J Steve; Sassan, Dari A; Heath, Douglas L; Walsh, Edward M
2010-07-01
Six watersheds in New Hampshire were studied to determine the effects of road salt on stream water quality. Specific conductance in streams was monitored every 15 min for one year using dataloggers. Chloride concentrations were calculated from specific conductance using empirical relationships. Stream chloride concentrations were directly correlated with development in the watersheds and were inversely related to streamflow. Exceedances of the EPA water quality standard for chloride were detected in the four watersheds with the most development. The number of exceedances during a year was linearly related to the annual average chloride concentration. Exceedances of the water quality standard were not predicted for streams with annual average concentrations less than 102 mg L⁻¹. Chloride was imported into three of the watersheds at rates ranging from 45 to 98 Mg Cl km⁻² yr⁻¹. Ninety-one percent of the imported chloride was road salt for deicing roadways and parking lots. A simple mass-balance equation was shown to predict annual average chloride concentrations from streamflow and chloride import rates to the watershed. This equation, combined with the apparent threshold for exceedances of the water quality standard, can be used for screening-level TMDLs for road salt in impaired watersheds.
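The screening described above rests on a steady-state mass balance: annual average chloride concentration equals annual chloride import divided by annual runoff volume. A minimal sketch under that assumed form (the paper's exact equation is not reproduced here, and the runoff figure below is illustrative):

```python
# Assumed steady-state mass balance: C = annual Cl import / annual runoff.
# Unit bookkeeping: 1 Mg/km2 = 1e9 mg / 1e6 m2 = 1000 mg/m2,
# and 1 mm of runoff = 1 L per m2 of watershed.

def annual_avg_chloride_mg_per_L(import_Mg_per_km2_yr, runoff_mm_per_yr):
    """Annual average Cl concentration (mg/L) from import rate and runoff."""
    mg_per_m2 = import_Mg_per_km2_yr * 1000.0  # Mg/km2 -> mg/m2
    L_per_m2 = runoff_mm_per_yr                # mm/yr -> L/m2/yr
    return mg_per_m2 / L_per_m2

# Illustrative: the highest reported import rate with a hypothetical
# 500 mm/yr of runoff lands well above the ~102 mg/L screening threshold.
c = annual_avg_chloride_mg_per_L(98, 500)
print(round(c, 1), "mg/L")  # -> 196.0 mg/L
```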
Uncertainties and Solutions Related to Use of WRB (2007) in the Boreo-nemoral zone, Case of Latvia
Kasparinskis, Raimonds; Nikodemus, Olgerts; Rolavs, Nauris
2014-05-01
A relatively high diversity of soil groups according to the WRB (2007) classification is observed in forest ecosystems in the boreo-nemoral zone in Latvia. This is due to the geological genesis of the area and environmental conditions (Kasparinskis, Nikodemus, 2012), as well as historical land use and management (Nikodemus et al., 2013). Because the soils are relatively young, Albic, Spodic and Cambic horizons are in many cases only weakly expressed. Relatively well developed Albic horizons occur in sandy forest soils, but unusually well expressed Spodic features are observed. In some cases there is a Cambic horizon; however, the position of Cambisols in the WRB (2007) soil classification sequence does not allow these soils to be classified as Cambisols, so they are classified as Arenosols instead. This sequence does not reflect the logical scheme of soil development, and therefore raises the question of the positions of Podzols, Arenosols and Cambisols in the WRB (2007) soil classification sequence. Soils with two parent materials (abrupt textural change) are relatively common in Latvia; conceptually, small-scale mapping would place them in the soil group Planosols, but in many cases Fluvic material occurs, as the parent material in the upper part of the soil profile is formed by Baltic Ice Lake sandy sediments. This raises questions about the positions of Fluvisols and Planosols in the WRB (2007) soil classification sequence. Soil research has found cases where a relatively well developed Spodic horizon was established as a result of the ground water table depth in areas of abrupt textural change. In these cases the profile corresponds to the soil group of Podzols, but in some cases to Gleysols rather than Planosols due to a high ground water table. Therefore the positions of Podzols and Planosols in the WRB (2007) soil classification sequence also need discussion. The above-mentioned questions raise
Characterizing Epistemic Uncertainty for Launch Vehicle Designs
Novack, Steven D.; Rogers, Jim; Hark, Frank; Al Hassan, Mohammad
2016-01-01
NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss-of-mission and loss-of-crew risk and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design would be greater than that of a design of well-understood heritage equipment. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods such as uncertainty-importance analyses, used to identify components that are significant contributors to uncertainty, are rendered obsolete since sensitivity to uncertainty changes is not reflected in propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation for complex systems and especially, due to nuances of launch vehicle logic, for launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper shows how to implement an uncertainty-importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.
Gallagher, C. B.; Ferraro, A.
2018-05-01
A possible alternative to the standard model of measurement-based quantum computation (MBQC) is offered by the sequential model of MBQC—a particular class of quantum computation via ancillae. Although these two models are equivalent under ideal conditions, their relative resilience to noise in practical conditions is not yet known. We analyze this relationship for various noise models in the ancilla preparation and in the entangling-gate implementation. The comparison of the two models is performed utilizing both the gate infidelity and the diamond distance as figures of merit. Our results show that in the majority of instances the sequential model outperforms the standard one in regard to a universal set of operations for quantum computation. Further investigation is made into the performance of sequential MBQC in experimental scenarios, thus setting benchmarks for possible cavity-QED implementations.
DEFF Research Database (Denmark)
Fabi, Valentina; Andersen, Rune Korsholm; Corgnati, Stefano
2016-01-01
The interactions between building occupants and control systems have a high influence on energy consumption and on indoor environmental quality. In the perspective of a future of "nearly-zero" energy buildings, it is crucial to analyse the energy-related interactions deeply to predict realistic e...... who take daylight level into account and switch on lights only if necessary and people who totally disregard the natural lighting. This underlines the importance of how individuality is at the base of the definition of the different types of users....
DEFF Research Database (Denmark)
Rudolph, David Philipp
Offshore wind farms are widely considered to become a cornerstone of energy transition for securing the energy supply and tackling climate change simultaneously. But recent developments have demonstrated that offshore wind farms are far from being conflict-free, evoking confrontations with other...... stakeholder interests. Drawing on comparative case studies in Scotland and Germany, this paper addresses and explores various conflict lines emerging from the installation of offshore wind farms and contesting local community interests and concerns. Local resistance against wind farms opens up a vast debate...... explanations due to obscuring underlying rationales. By going beyond the stigmatisation of NIMBYism, the notion of space-related conflicts is intended to turn the attention towards conflicting interests and values that are aimed at space. This does not imply that such interests can be simply located...
Directory of Open Access Journals (Sweden)
Heros Augusto Santos Lobo
2012-04-01
The general theory of systems is based on the integrated analysis of the spatiotemporal relations among the components, the system matrix, and the processes arising between them. In tourist systems, current studies focus on describing the components and some of their interaction relationships. The present contribution focuses on the processes between the components and the matrix of tourist systems, considering the inherent complexity of open systems, their homeostasis and entropy as a function of the carrying capacity for processing the received inputs, as well as questions linked to the steady state, self-maintenance, and the collapse of tourist systems generated by structural-deterministic or stochastic causes. The final considerations highlight the low similarity between processes developed in different tourist systems, and in different spatiotemporal conditions within the same system, underscoring the practical impossibility of generating universal models of tourist systems.
Relationship between non-standard work arrangements and work-related accident absence in Belgium.
Alali, Hanan; Braeckman, Lutgart; Van Hecke, Tanja; De Clercq, Bart; Janssens, Heidi; Wahab, Magd Abdel
2017-03-28
The main objective of this study is to examine the relationship between indicators of non-standard work arrangements, including precarious contracts, long working hours, multiple jobs, and shift work, and work-related accident absence, using a representative Belgian sample and considering several socio-demographic and work characteristics. This study was based on data from the fifth European Working Conditions Survey (EWCS). For the analysis, the sample was restricted to 3343 respondents from Belgium, all of whom were employed workers. The associations between non-standard work arrangements and work-related accident absence were studied with multivariate logistic regression modeling while adjusting for several confounders. During the last 12 months, about 11.7% of workers were absent from work because of a work-related accident. A multivariate regression model showed an increased injury risk for those performing shift work (OR 1.546, 95% CI 1.074-2.224). The relationship between contract type and occupational injuries was not significant (OR 1.163, 95% CI 0.739-1.831). Furthermore, no statistically significant differences were observed for those working long hours (OR 1.217, 95% CI 0.638-2.321) or holding multiple jobs (OR 1.361, 95% CI 0.827-2.240) in relation to work-related accident absence. Those who rated their health as bad, low-educated workers, workers from the construction sector, and those subject to biomechanical (BM) exposure were more frequent victims of work-related accident absence. No significant gender difference was observed. The indicators of non-standard work arrangements examined in this study, except shift work, were not significantly associated with work-related accident absence. To reduce the burden of occupational injuries, not only risk reduction strategies and interventions are needed but also policy efforts are to be undertaken to limit shift work. In general, preventive measures and more training on the job are needed to
Analysis of uncertainties in the IAEA/WHO TLD postal dose audit system
Energy Technology Data Exchange (ETDEWEB)
Izewska, J. [Department of Nuclear Sciences and Applications, International Atomic Energy Agency, Wagramer Strasse 5, Vienna (Austria)], E-mail: j.izewska@iaea.org; Hultqvist, M. [Department of Medical Radiation Physics, Karolinska Institute, Stockholm University, Stockholm (Sweden); Bera, P. [Department of Nuclear Sciences and Applications, International Atomic Energy Agency, Wagramer Strasse 5, Vienna (Austria)
2008-02-15
The International Atomic Energy Agency (IAEA) and the World Health Organisation (WHO) operate the IAEA/WHO TLD postal dose audit programme. Thermoluminescence dosimeters (TLDs) are used as transfer devices in this programme. In the present work the uncertainties in the dose determination from TLD measurements have been evaluated. The analysis of uncertainties comprises uncertainties in the calibration coefficient of the TLD system and uncertainties in factors correcting for dose response non-linearity, fading of TL signal, energy response and influence of TLD holder. The individual uncertainties have been combined to estimate the total uncertainty in the dose evaluated from TLD measurements. The combined relative standard uncertainty in the dose determined from TLD measurements has been estimated to be 1.2% for irradiations with Co-60 γ-rays and 1.6% for irradiations with high-energy X-rays. Results from irradiations by the Bureau international des poids et mesures (BIPM), Primary Standard Dosimetry Laboratories (PSDLs) and Secondary Standards Dosimetry Laboratories (SSDLs) compare favourably with the estimated uncertainties, whereas TLD results of radiotherapy centres show higher standard deviations than those derived theoretically.
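The combination step described above (individual uncertainties combined into a total) is, in GUM terms, a root-sum-square of independent relative standard uncertainties. A minimal sketch; the component values below are illustrative placeholders, not the IAEA's actual budget:

```python
import math

def combined_relative_uncertainty(components_percent):
    """Root-sum-square of independent relative standard uncertainties (GUM)."""
    return math.sqrt(sum(u ** 2 for u in components_percent))

# Hypothetical components (%): calibration coefficient, non-linearity,
# fading, energy response, holder influence.
components = [0.9, 0.5, 0.4, 0.3, 0.2]
print(round(combined_relative_uncertainty(components), 2))  # -> 1.16
```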
Metrology and process control: dealing with measurement uncertainty
Potzick, James
2010-03-01
Metrology is often used in designing and controlling manufacturing processes. A product sample is processed, some relevant property is measured, and the process adjusted to bring the next processed sample closer to its specification. This feedback loop can be remarkably effective for the complex processes used in semiconductor manufacturing, but there is some risk involved because measurements have uncertainty and product specifications have tolerances. There is a finite risk that good product will fail testing or that faulty product will pass. Standard methods for quantifying measurement uncertainty have been presented, but the question arises: how much measurement uncertainty is tolerable in a specific case? Or, how does measurement uncertainty relate to manufacturing risk? This paper looks at some of the components inside this process control feedback loop and describes methods to answer these questions.
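The risk question posed above can be made concrete: given Gaussian measurement error, the false-reject (or false-accept) probability at a specification limit follows from the normal CDF. A sketch with illustrative numbers (the spec limit and uncertainty are assumptions, not the paper's):

```python
from statistics import NormalDist

def prob_pass(true_value, upper_limit, u_meas):
    """Probability that a measurement of true_value reads below upper_limit,
    assuming Gaussian measurement error with standard uncertainty u_meas."""
    return NormalDist(true_value, u_meas).cdf(upper_limit)

# A good item sitting one standard uncertainty inside the limit still
# fails testing about 16% of the time:
p_fail_good = 1 - prob_pass(true_value=9.8, upper_limit=10.0, u_meas=0.2)
print(round(p_fail_good, 3))  # -> 0.159
```

Shrinking u_meas or guard-banding the test limit trades this false-reject rate against the false-accept rate for marginally bad items.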
Benchmarking observational uncertainties for hydrology (Invited)
McMillan, H. K.; Krueger, T.; Freer, J. E.; Westerberg, I.
2013-12-01
There is a pressing need for authoritative and concise information on the expected error distributions and magnitudes in hydrological data, to understand its information content. Many studies have discussed how to incorporate uncertainty information into model calibration and implementation, and shown how model results can be biased if uncertainty is not appropriately characterised. However, it is not always possible (for example due to financial or time constraints) to make detailed studies of uncertainty for every research study. Instead, we propose that the hydrological community could benefit greatly from sharing information on likely uncertainty characteristics and the main factors that control the resulting magnitude. In this presentation, we review the current knowledge of uncertainty for a number of key hydrological variables: rainfall, flow and water quality (suspended solids, nitrogen, phosphorus). We collated information on the specifics of the data measurement (data type, temporal and spatial resolution), error characteristics measured (e.g. standard error, confidence bounds) and error magnitude. Our results were primarily split by data type. Rainfall uncertainty was controlled most strongly by spatial scale, flow uncertainty was controlled by flow state (low, high) and gauging method. Water quality presented a more complex picture with many component errors. For all variables, it was easy to find examples where relative error magnitude exceeded 40%. We discuss some of the recent developments in hydrology which increase the need for guidance on typical error magnitudes, in particular when doing comparative/regionalisation and multi-objective analysis. Increased sharing of data, comparisons between multiple catchments, and storage in national/international databases can mean that data-users are far removed from data collection, but require good uncertainty information to reduce bias in comparisons or catchment regionalisation studies. Recently it has
Some remarks on modeling uncertainties
International Nuclear Information System (INIS)
Ronen, Y.
1983-01-01
Several topics related to the question of modeling uncertainties are considered. The first topic is related to the use of the generalized bias operator method for modeling uncertainties. The method is expanded to a more general form of operators. The generalized bias operator is also used in the inverse problem and applied to determine the anisotropic scattering law. The last topic discussed is related to the question of the limit to accuracy and how to establish its value. (orig.) [de
Energy Technology Data Exchange (ETDEWEB)
Coggins, J.L.
1979-12-14
An investigation of the relative utility and performance of nine major household consumer products covered by the Energy Policy and Conservation Act is summarized. The objective was to define the terms utility and performance, to recommend methods for quantifying these two concepts, and to recommend an approach for dealing with utility and performance issues in the energy efficiency standards program. The definitions developed are: performance of a consumer product is the objective measure of how well, with the expected level of consumer input (following the manufacturer's instructions for installation and operation), the product does its intended job; and utility of a consumer product is a subjective measure, based on the consumer's perception, of the capability of the product to satisfy human needs. Quantification is based on test procedures and consumer survey methods which are largely already in use by industry. Utility and performance issues are important in product classification for prescribing energy efficiency standards. The recommended approach to utility and performance issues and classification is: prior to setting standards, evaluate utility and performance issues in the most quantitative way allowed by resources and schedules in order to develop classification guidelines. This approach requires no changes in existing Department of Energy test procedures.
Evaluation of Uncertainties in the Determination of Phosphorus by RNAA
International Nuclear Information System (INIS)
Rick L. Paul
2000-01-01
A radiochemical neutron activation analysis (RNAA) procedure for the determination of phosphorus in metals and other materials has been developed and critically evaluated. Uncertainties evaluated as type A include those arising from measurement replication, yield determination, neutron self-shielding, irradiation geometry, measurement of the quantity for concentration normalization (sample mass, area, etc.), and analysis of standards. Uncertainties evaluated as type B include those arising from beta contamination corrections, beta decay curve fitting, and beta self-absorption corrections. The evaluation of uncertainties in the determination of phosphorus is illustrated for three different materials in Table I. The metal standard reference materials (SRMs) 2175 and 861 were analyzed for value assignment of phosphorus; implanted silicon was analyzed to evaluate the technique for certification of phosphorus. The most significant difference in the error evaluation of the three materials lies in the type B uncertainties. The relatively uncomplicated matrix of the high-purity silicon allows virtually complete purification of phosphorus from other beta emitters; hence, minimal contamination correction is needed. Furthermore, because the chemistry is less rigorous, the carrier yield is more reproducible, and self-absorption corrections are less significant. Improvements in the chemical purification procedures for phosphorus in complex matrices will decrease the type B uncertainties for all samples. Uncertainties in the determination of carrier yield, the most significant type A error in the analysis of the silicon, also need to be evaluated more rigorously and minimized in the future
International Nuclear Information System (INIS)
Pickles, W.L.; McClure, J.W.; Howell, R.H.
1978-05-01
A sophisticated nonlinear multiparameter fitting program was used to produce a best-fit calibration curve for the response of an x-ray fluorescence analyzer to uranium nitrate, freeze-dried, 0.2%-accurate gravimetric standards. The program is based on the unconstrained minimization subroutine VA02A. The program treats the mass values of the gravimetric standards as parameters to be fitted along with the normal calibration curve parameters. The fitting procedure weights the fit with the system errors and the mass errors in a consistent way. The resulting best-fit calibration curve parameters reflect the fact that the masses of the standard samples are measured quantities with a known error. Error estimates for the calibration curve parameters can be obtained from the curvature of the 'Chi-Squared Matrix' or from error relaxation techniques. It was shown that nondispersive XRFA of 0.1 to 1 mg of freeze-dried UNO₃ can achieve an accuracy of 0.2% in 1000 s
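One simple way to let the standards' measured masses enter the fit with their known errors, in the spirit described above, is the effective-variance method: the x-errors are folded into the weights through the current slope estimate. This is a sketch for a straight-line calibration with made-up data, not the original nonlinear VA02A fit:

```python
# Effective-variance weighted straight-line fit y = a + b*x, where each
# point has errors in both x (standard mass) and y (analyzer response):
# the weight of each point is 1 / (sy^2 + (b*sx)^2), updated iteratively.

def fit_line_xy_errors(x, y, sx, sy, iters=20):
    b = 1.0  # initial slope guess
    for _ in range(iters):
        w = [1.0 / (syi**2 + (b * sxi)**2) for sxi, syi in zip(sx, sy)]
        sw = sum(w)
        xb = sum(wi * xi for wi, xi in zip(w, x)) / sw
        yb = sum(wi * yi for wi, yi in zip(w, y)) / sw
        b = (sum(wi * (xi - xb) * (yi - yb) for wi, xi, yi in zip(w, x, y))
             / sum(wi * (xi - xb) ** 2 for wi, xi in zip(w, x)))
        a = yb - b * xb
    return a, b

# Illustrative data: masses (mg) with 0.2% error, responses with counting error
a, b = fit_line_xy_errors([0.1, 0.4, 0.7, 1.0], [21, 79, 141, 200],
                          sx=[0.002] * 4, sy=[2.0] * 4)
print(round(a, 1), round(b, 1))  # fitted intercept and slope
```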
Farrance, Ian; Frenkel, Robert
2014-02-01
The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more 'constants', each of which has an empirically derived numerical value. Such empirically derived 'constants' must also have associated uncertainties which propagate through the functional relationship
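The MCS procedure described above translates directly into a few lines of code: draw Gaussian inputs, push them through the functional relationship, and read the mean and standard deviation off the simulated output. A minimal sketch; the measurand and the input means/uncertainties are illustrative:

```python
import random
import statistics

def mc_propagate(f, inputs, n=100_000, seed=1):
    """Monte Carlo propagation: inputs is a list of (mean, standard
    uncertainty) pairs; returns (mean, standard deviation) of f's output."""
    rng = random.Random(seed)
    outputs = [f(*(rng.gauss(m, s) for m, s in inputs)) for _ in range(n)]
    return statistics.mean(outputs), statistics.stdev(outputs)

# Example measurand: a ratio of two measured quantities.
# GUM-style propagation predicts a relative uncertainty of
# sqrt((2/100)^2 + (1/50)^2) ~ 2.8%, i.e. ~0.057 on a result of 2.0.
mean, sd = mc_propagate(lambda a, b: a / b, [(100.0, 2.0), (50.0, 1.0)])
print(round(mean, 2), round(sd, 3))  # close to 2.0 and 0.057
```

The same function handles any measurand, including the empirically derived formulas with uncertain 'constants' mentioned above: each constant simply becomes another (mean, uncertainty) input.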
Systematic review: work-related stress and the HSE management standards.
Brookes, K; Limbert, C; Deacy, C; O'Reilly, A; Scott, S; Thirlaway, K
2013-10-01
The Health and Safety Executive (HSE) has defined six management standards representing aspects of work that, if poorly managed, are associated with lower levels of employee health and productivity, and increased sickness absence. The HSE indicator tool aims to measure organizations' performance in managing the primary stressors identified by the HSE management standards. The aims of the study are to explore how the HSE indicator tool has been implemented within organizations and to identify contexts in which the tool has been used, its psychometric properties and relationships with alternative measures of well-being and stress. Studies that matched specific criteria were included in the review. Abstracts were considered by two researchers to ensure a reliable process. Full texts were obtained when abstracts met the inclusion criteria. Thirteen papers were included in the review. Using factor analysis and measures of reliability, the studies suggest that the HSE indicator tool is a psychometrically sound measure. The tool has been used to measure work-related stress across different occupational groups, with a clear relationship between the HSE tool and alternative measures of well-being. Limitations of the tool and recommendations for future research are discussed. The HSE indicator tool is a psychometrically sound measure of organizational performance against the HSE management standards. As such it can provide a broad overview of sources of work-related stress within organizations. More research is required to explore the use of the tool in the design of interventions to reduce stress, and its use in different contexts and with different cultural and gender groups.
Uncertainty Analyses and Strategy
International Nuclear Information System (INIS)
Kevin Coppersmith
2001-01-01
performance difficult. Likewise, a demonstration of the magnitude of conservatisms in the dose estimates that result from conservative inputs is difficult to determine. To respond to these issues, the DOE explored the significance of uncertainties and the magnitude of conservatisms in the SSPA Volumes 1 and 2 (BSC 2001 [DIRS 155950]; BSC 2001 [DIRS 154659]). The three main goals of this report are: (1) To briefly summarize and consolidate the discussion of much of the work that has been done over the past few years to evaluate, clarify, and improve the representation of uncertainties in the TSPA and performance projections for a potential repository. This report does not contain any new analyses of those uncertainties, but it summarizes in one place the main findings of that work. (2) To develop a strategy for how uncertainties may be handled in the TSPA and supporting analyses and models to support a License Application, should the site be recommended. It should be noted that the strategy outlined in this report is based on current information available to DOE. The strategy may be modified pending receipt of additional pertinent information, such as the Yucca Mountain Review Plan. (3) To discuss issues related to communication about uncertainties, and propose some approaches the DOE may use in the future to improve how it communicates uncertainty in its models and performance assessments to decision-makers and to technical audiences
Uncertainty in prostate cancer. Ethnic and family patterns.
Germino, B B; Mishel, M H; Belyea, M; Harris, L; Ware, A; Mohler, J
1998-01-01
Prostate cancer occurs 37% more often in African-American men than in white men. Patients and their family care providers (FCPs) may have different experiences of cancer and its treatment. This report addresses two questions: 1) What is the relationship of uncertainty to family coping, psychological adjustment to illness, and spiritual factors? and 2) Are these patterns of relationship similar for patients and their family care givers and for whites and African-Americans? A sample of white and African-American men and their family care givers (N = 403) was drawn from an ongoing study, testing the efficacy of an uncertainty management intervention with men with stage B prostate cancer. Data were collected at study entry, either 1 week after post-surgical catheter removal or at the beginning of primary radiation treatment. Measures of uncertainty, adult role behavior, problem solving, social support, importance of God in one's life, family coping, psychological adjustment to illness, and perceptions of health and illness met standard criteria for internal consistency. Analyses of baseline data using Pearson's product moment correlations were conducted to examine the relationships of person, disease, and contextual factors to uncertainty. For family coping, uncertainty was significantly and positively related to two domains in white family care providers only. In African-American and white family care providers, the more uncertainty experienced, the less positive they felt about treatment. Uncertainty for all care givers was related inversely to positive feelings about the patient recovering from the illness. For all patients and for white family members, uncertainty was related inversely to the quality of the domestic environment. For everyone, uncertainty was related inversely to psychological distress. Higher levels of uncertainty were related to a poorer social environment for African-American patients and for white family members. For white patients and their
Heitmann, Carina Yvonne; Feldker, Katharina; Neumeister, Paula; Zepp, Britta Maria; Peterburs, Jutta; Zwitserlood, Pienie; Straube, Thomas
2016-04-01
Our understanding of altered emotional processing in social anxiety disorder (SAD) is hampered by a heterogeneity of findings, which is probably due to the vastly different methods and materials used so far. This is why the present functional magnetic resonance imaging (fMRI) study investigated immediate disorder-related threat processing in 30 SAD patients and 30 healthy controls (HC) with a novel, standardized set of highly ecologically valid, disorder-related complex visual scenes. SAD patients rated disorder-related as compared with neutral scenes as more unpleasant, arousing and anxiety-inducing than HC. On the neural level, disorder-related as compared with neutral scenes evoked differential responses in SAD patients in a widespread emotion processing network including (para-)limbic structures (e.g. amygdala, insula, thalamus, globus pallidus) and cortical regions (e.g. dorsomedial prefrontal cortex (dmPFC), posterior cingulate cortex (PCC), and precuneus). Functional connectivity analysis yielded an altered interplay between PCC/precuneus and paralimbic (insula) as well as cortical regions (dmPFC, precuneus) in SAD patients, which emphasizes a central role for PCC/precuneus in disorder-related scene processing. Hyperconnectivity of globus pallidus with amygdala, anterior cingulate cortex (ACC) and medial prefrontal cortex (mPFC) additionally underlines the relevance of this region in socially anxious threat processing. Our findings stress the importance of specific disorder-related stimuli for the investigation of altered emotion processing in SAD. Disorder-related threat processing in SAD reveals anomalies at multiple stages of emotion processing which may be linked to increased anxiety and to dysfunctionally elevated levels of self-referential processing reported in previous studies. © 2016 Wiley Periodicals, Inc.
Uncertainty in artificial intelligence
Shachter, RD; Henrion, M; Lemmer, JF
1990-01-01
This volume, like its predecessors, reflects the cutting edge of research on the automation of reasoning under uncertainty. A more pragmatic emphasis is evident: although some papers address fundamental issues, the majority address practical ones. Topics include the relations between alternative formalisms (including possibilistic reasoning), Dempster-Shafer belief functions, non-monotonic reasoning, Bayesian and decision-theoretic schemes, and new inference techniques for belief nets. New techniques are applied to important problems in medicine, vision, robotics, and natural language understanding.
International Nuclear Information System (INIS)
Robouch, P.; Arana, G.; Eguskiza, M.; Etxebarria, N.
2000-01-01
The concepts of the Guide to the Expression of Uncertainty in Measurement (GUM), as applied to chemical measurements, and the recommendations of the Eurachem document 'Quantifying Uncertainty in Analytical Methods' are applied to set up the uncertainty budget for k0-NAA. The 'universally applicable spreadsheet technique' described by Kragten is applied to the k0-NAA basic equations for the computation of uncertainties. The variance components (the individual standard uncertainties) highlight the contribution and the importance of the different parameters to be taken into account. (author)
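The Kragten technique referenced above replaces analytic partial derivatives with finite perturbations: each input is shifted by its own standard uncertainty, the result is recomputed, and the changes are combined in quadrature. A minimal sketch in Python, using a made-up measurement function for illustration rather than the actual k0-NAA equations:

```python
import math

def kragten_uncertainty(f, x, u):
    """Kragten's numerical propagation: perturb each input by its
    standard uncertainty and combine the resulting changes in f."""
    y0 = f(*x)
    contributions = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += u[i]                      # shift one input by u(x_i)
        contributions.append(f(*xp) - y0)  # resulting change in the result
    u_y = math.sqrt(sum(c * c for c in contributions))
    return y0, u_y, contributions

# Hypothetical example: concentration c = m / (V * k)
f = lambda m, V, k: m / (V * k)
x = (10.0, 2.0, 0.98)    # mass, volume, correction factor (made up)
u = (0.05, 0.02, 0.01)   # their standard uncertainties (made up)
y0, u_y, parts = kragten_uncertainty(f, x, u)
```

In a spreadsheet this is one extra column per input; the squared column sums reproduce the variance components mentioned in the abstract.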
Measurement uncertainty: Friend or foe?
Infusino, Ilenia; Panteghini, Mauro
2018-02-02
The definition and enforcement of a reference measurement system, based on the implementation of metrological traceability of patients' results to higher-order reference methods and materials, together with a clinically acceptable level of measurement uncertainty, are fundamental requirements to produce accurate and equivalent laboratory results. The uncertainty associated with each step of the traceability chain should be governed to obtain a final combined uncertainty on clinical samples that fulfils the requested performance specifications. It is important that end-users (i.e., clinical laboratories) can know and verify how in vitro diagnostics (IVD) manufacturers have implemented the traceability of their calibrators and estimated the corresponding uncertainty. However, full information about the traceability and combined uncertainty of calibrators is currently very difficult to obtain. Laboratory professionals should investigate the need to reduce the uncertainty of the higher-order metrological references and/or to increase the precision of commercial measuring systems. Accordingly, measurement uncertainty should not be considered a parameter calculated by clinical laboratories just to fulfil accreditation standards; it must become a key quality indicator describing both the performance of an IVD measuring system and the laboratory itself. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Uncertainty in relative cost investigation
International Nuclear Information System (INIS)
Bunn, D.; Viahos, K.
1989-01-01
One of the consequences of the privatization of the Central Electricity Generating Board has been a weakening of the economic case for nuclear generation over coal. Nuclear has higher capital but lower operating costs than coal and is therefore favoured in capital budgeting by discounting at lower rates of return. In the Sizewell case (in 1987), discounting at the public sector rate of 5 per cent favoured nuclear. However, the private sector will require higher rates of return, thus rendering nuclear less attractive. Hence the imposition by the government of a diversity constraint on the privatized industry to ensure that contracts are made for a minimum fraction of non-fossil (essentially nuclear) energy. An electricity capacity planning model was developed to estimate the costs of imposing various non-fossil energy constraints on the planning decision of a privatized electricity supply industry, as a function of various discount rates. Using a large-scale linear programming technique, the model optimizes over a 50 year horizon the schedule of installation and mix of generating capacity, both with and without a minimum non-fossil constraint. The conclusion is that the opportunity cost of diversity may be a complex joint substitution of more than one type of plant (e.g. coal and gas) depending on the discount rate. (author)
Using Options to Manage Dynamic Uncertainty in Acquisition Projects
National Research Council Canada - National Science Library
Ceylan, B. K; Ford, David N
2002-01-01
Uncertainty in acquisition projects and environments can degrade performance. Traditional project planning, management tools, and methods can effectively deal with uncertainties in relatively stable environments...
2011-07-15
... adjustment model, in an annually updated Federal notice of benefit and payment parameters. In addition to the... uncertainty of insurance risk in the individual market by making payments for high- cost cases. The temporary... program is intended to provide adequate payments to health insurance issuers that attract high-risk...
Determination of Formula for Vickers Hardness Measurements Uncertainty
International Nuclear Information System (INIS)
Purba, Asli
2007-01-01
The purpose of the formula determination is to obtain a formula for the uncertainty of Vickers hardness measurements. The approach comprises identification of the influencing parameters, creation of a cause-and-effect diagram, determination of the sensitivity coefficients, and determination of the standard uncertainties. The result is a formula for determining the uncertainty of Vickers hardness measurements. (author)
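As an illustration of the kind of formula such an approach yields (the paper's own cause-and-effect analysis is not reproduced here, and the terms below are a simplified assumption), the Vickers relation HV ∝ F/d² gives sensitivity coefficients HV/F for the force and −2HV/d for the mean diagonal, so a reduced two-term combined standard uncertainty can be sketched as:

```python
import math

def vickers_hv(force_N, diag_mm):
    """Vickers hardness from test force (N) and mean indentation
    diagonal (mm): HV = 0.1891 * F / d^2."""
    return 0.1891 * force_N / diag_mm ** 2

def vickers_uncertainty(force_N, u_force, diag_mm, u_diag):
    """Simplified combined standard uncertainty of HV, keeping only the
    force and diagonal terms (a real budget adds machine and indenter
    contributions): u(HV)/HV = sqrt((u_F/F)^2 + (2*u_d/d)^2)."""
    hv = vickers_hv(force_N, diag_mm)
    rel = math.sqrt((u_force / force_N) ** 2 + (2 * u_diag / diag_mm) ** 2)
    return hv, hv * rel

# Hypothetical HV30 measurement: F = 294.2 N, d = 0.430 mm
hv, u_hv = vickers_uncertainty(294.2, 0.6, 0.430, 0.002)
```

The factor 2 on the diagonal term is the key feature of any such formula: HV depends on d squared, so diagonal-reading uncertainty usually dominates.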
Glysson, G. Douglas; Skinner, John V.
1991-01-01
In the late 1950's, intense demands for water and growing concerns about declines in the quality of water generated the need for more water-resources data. About thirty Federal agencies, hundreds of State, county and local agencies, and many private organizations had been collecting water data. However, because of differences in procedures and equipment, many of the data bases were incompatible. In 1964, as a step toward establishing more uniformity, the Bureau of the Budget (now the Office of Management and Budget, OMB) issued 'Circular A-67', which presented guidelines for collecting water data and also served as a catalyst for creating the Office of Water Data Coordination (OWDC) within the U.S. Geological Survey. This paper discusses past, present, and future aspects of the relation between methods in the National Handbook and standards published by Subcommittee D-19.07 on Sediment, Geomorphology, and Open Channel Flow of ASTM (American Society for Testing and Materials) Committee D-19 on Water. The discussion also covers historical aspects of standards-development work jointly conducted by OWDC and ASTM.
Stereo-particle image velocimetry uncertainty quantification
International Nuclear Information System (INIS)
Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J
2017-01-01
Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment of the subject and potentially lays foundations applicable to volumetric PIV.
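The propagation step can be illustrated under a simplified symmetric two-camera geometry that is only an assumption here (the paper's actual framework propagates uncertainty through the calibration mapping coefficients, not this closed form): for the out-of-plane component w = (u1 − u2)/(tan a1 + tan a2), first-order propagation combines the planar and angle uncertainties as follows.

```python
import math

def stereo_w_uncertainty(u1, u2, a1, a2, su1, su2, sa1, sa2):
    """First-order propagation of planar displacement uncertainties
    (su1, su2) and camera-angle uncertainties (sa1, sa2, in radians)
    to w = (u1 - u2) / (tan a1 + tan a2)."""
    T = math.tan(a1) + math.tan(a2)
    w = (u1 - u2) / T
    dw_du = 1.0 / T                                    # |dw/du1| = |dw/du2|
    dw_da1 = -(u1 - u2) / (T * T * math.cos(a1) ** 2)  # dw/d(a1)
    dw_da2 = -(u1 - u2) / (T * T * math.cos(a2) ** 2)  # dw/d(a2)
    var = ((dw_du * su1) ** 2 + (dw_du * su2) ** 2
           + (dw_da1 * sa1) ** 2 + (dw_da2 * sa2) ** 2)
    return w, math.sqrt(var)

# Hypothetical symmetric setup: cameras at +/-35 degrees, 0.1 px planar
# uncertainty per camera, 2 mrad angle uncertainty
w, sw = stereo_w_uncertainty(2.0, -2.0, math.radians(35), math.radians(35),
                             0.1, 0.1, 0.002, 0.002)
```

With these illustrative numbers the planar terms dominate, matching the abstract's observation that the method is more sensitive to planar uncertainty than to angle uncertainty when disparity is small.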
American Society for Testing and Materials. Philadelphia
2007-01-01
1.1 This standard provides a practice whereby industrial radiographic imaging systems may be comparatively assessed using the concept of relative image quality response (RIQR). The RIQR method presented within this practice is based upon the use of equivalent penetrameter sensitivity (EPS) described within Practice E 1025 and subsection 5.2 of this practice. Figure 1 illustrates a relative image quality indicator (RIQI) that has four different steel plaque thicknesses (.015, .010, .008, and .005 in.) sequentially positioned (from top to bottom) on a ¾-in. thick steel plate. The four plaques contain a total of 14 different arrays of penetrameter-type hole sizes designed to render varied conditions of threshold visibility ranging from 1.92 % EPS (at the top) to .94 % EPS (at the bottom) when exposed to nominal 200 keV X-ray radiation. Each “EPS” array consists of 30 identical holes, thus providing the user with a quantity of threshold sensitivity levels suitable for relative image quality response comparisons.
Quantitative angle-insensitive flow measurement using relative standard deviation OCT.
Zhu, Jiang; Zhang, Buyun; Qi, Li; Wang, Ling; Yang, Qiang; Zhu, Zhuqing; Huo, Tiancheng; Chen, Zhongping
2017-10-30
Incorporating different data processing methods, optical coherence tomography (OCT) has the ability to provide high-resolution angiography and quantitative flow velocity measurements. However, OCT angiography cannot provide quantitative information on flow velocities, and velocity measurement based on Doppler OCT requires the determination of Doppler angles, which is a challenge in a complex vascular network. In this study, we report on a relative standard deviation OCT (RSD-OCT) method which provides both vascular network mapping and quantitative information on flow velocities within a wide range of Doppler angles. The RSD values are angle-insensitive within a wide range of angles, and a nearly linear relationship was found between the RSD values and the flow velocities. The RSD-OCT measurement in a rat cortex shows that it can quantify blood flow velocities as well as map the vascular network in vivo.
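The core statistic is simple: at each pixel, the standard deviation of repeated intensity samples divided by their mean, so fluctuating (flow) voxels score high while static tissue scores near zero. A minimal sketch, with synthetic numbers that are purely illustrative:

```python
import math
import random

def rsd(values):
    """Relative standard deviation of repeated intensity samples at one
    pixel: sigma / mean. High RSD marks dynamic (flow) voxels; static
    tissue gives values near zero."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    return math.sqrt(var) / mean if mean else 0.0

# Synthetic check: a flow-like pixel fluctuates, a static one does not
random.seed(0)
static_pixel = [100.0] * 50
flow_pixel = [100.0 + random.gauss(0, 30) for _ in range(50)]
```

Applying `rsd` across all pixels of a repeated B-scan stack yields the angle-insensitive flow map described in the abstract.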
Calculation of uncertainties; Calculo de incertidumbres
Energy Technology Data Exchange (ETDEWEB)
Diaz-Asencio, Misael [Centro de Estudios Ambientales de Cienfuegos (Cuba)
2012-07-01
One of the most important aspects of quality assurance in any analytical activity is the estimation of measurement uncertainty. There is general agreement that 'the expression of the result of a measurement is not complete without specifying its associated uncertainty'. An analytical process is the mechanism for obtaining methodological information (the measurand) about a material system (the population). This implies the need to define the problem, to choose the methods for sampling and measurement, and to execute these activities properly in order to obtain the information. The result of a measurement is only an approximation or estimate of the value of the measurand, and it is complete only when accompanied by an estimate of the uncertainty of the analytical process. According to the 'Vocabulary of Basic and General Terms in Metrology', measurement uncertainty is the parameter associated with the result of a measurement that characterizes the dispersion of the values that could reasonably be attributed to the measurand (or magnitude). This parameter could be a standard deviation or a confidence interval. The uncertainty evaluation requires a detailed look at all possible sources, but not a disproportionate one: a good estimate can be made by concentrating effort on the largest contributions. The key steps in determining the uncertainty of a measurement are: specification of the measurand; identification of the sources of uncertainty; quantification of the individual uncertainty components; calculation of the combined standard uncertainty; and reporting of the uncertainty.
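The combination step in the list above follows the usual GUM quadrature for uncorrelated inputs, u_c = sqrt(sum (c_i * u_i)^2). A minimal sketch with a hypothetical three-line budget (the components and numbers are made up for illustration):

```python
import math

def combined_standard_uncertainty(components):
    """components: list of (sensitivity coefficient c_i, standard
    uncertainty u_i) pairs. Returns u_c = sqrt(sum (c_i*u_i)^2),
    the GUM combination for uncorrelated inputs."""
    return math.sqrt(sum((c * u) ** 2 for c, u in components))

# Hypothetical budget for a concentration result (consistent units)
budget = [
    (1.0, 0.012),   # repeatability
    (1.0, 0.008),   # calibration standard
    (0.5, 0.020),   # volume / dilution
]
u_c = combined_standard_uncertainty(budget)
U = 2.0 * u_c       # expanded uncertainty, coverage factor k = 2
```

Reporting then quotes the result together with U and the coverage factor used.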
Realising the Uncertainty Enabled Model Web
Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.
2012-12-01
The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address
Ma, Weina; Yang, Liu; Lv, Yanni; Fu, Jia; Zhang, Yanmin; He, Langchong
2017-06-23
The equilibrium dissociation constant (K_D) of drug-membrane receptor affinity is the basic parameter that reflects the strength of interaction. The cell membrane chromatography (CMC) method is an effective technique for studying the characteristics of drug-membrane receptor affinity. In this study, a CMC relative standard method for determining the K_D value of drug-membrane receptor affinity was established and used to analyze the relative K_D values of drugs binding to two membrane receptors (the epidermal growth factor receptor and the angiotensin II receptor). The K_D values obtained by the CMC relative standard method correlated strongly with those obtained by the frontal analysis method. Additionally, the K_D values obtained by the CMC relative standard method correlated with the pharmacological activity of the drugs evaluated. The CMC relative standard method is a convenient and effective method to evaluate drug-membrane receptor affinity. Copyright © 2017 Elsevier B.V. All rights reserved.
Kawanishi, Hideki; Akiba, Takashi; Masakane, Ikuto; Tomo, Tadashi; Mineshima, Michio; Kawasaki, Tadayuki; Hirakata, Hideki; Akizawa, Tadao
2009-04-01
The Committee of Scientific Academy of the Japanese Society for Dialysis Therapy (JSDT) proposes a new standard on microbiological management of fluids for hemodialysis and related therapies. This standard is within the scope of the International Organization for Standardization (ISO), which is currently under revision. This standard is to be applied to the central dialysis fluid delivery systems (CDDS), which are widely used in Japan. In this standard, microbiological qualities for dialysis water and dialysis fluids are clearly defined by endotoxin level and bacterial count. The qualities of dialysis fluids were classified into three levels: standard, ultrapure, and online prepared substitution fluid. In addition, the therapeutic application of each dialysis fluid is clarified. Since high-performance dialyzers are frequently used in Japan, the standard recommends that ultrapure dialysis fluid be used for all dialysis modalities at all dialysis facilities. It also recommends that the dialysis equipment safety management committee at each facility should validate the microbiological qualities of online prepared substitution fluid.
Physical and Model Uncertainty for Fatigue Design of Composite Material
DEFF Research Database (Denmark)
Toft, Henrik Stensgaard; Sørensen, John Dalsgaard
The main aim of the present report is to establish stochastic models for the uncertainties related to fatigue design of composite materials. The uncertainties considered are the physical uncertainty related to the static and fatigue strength and the model uncertainty related to Miner's rule...
Gentes, Emily L; Ruscio, Ayelet Meron
2011-08-01
Intolerance of uncertainty (IU) has been suggested to reflect a specific risk factor for generalized anxiety disorder (GAD), but there have been no systematic attempts to evaluate the specificity of IU to GAD. This meta-analysis examined the cross-sectional association of IU with symptoms of GAD, major depressive disorder (MDD), and obsessive-compulsive disorder (OCD). Random effects analyses were conducted for two common definitions of IU, one that has predominated in studies of GAD (56 effect sizes) and another that has been favored in studies of OCD (29 effect sizes). Using the definition of IU developed for GAD, IU shared a mean correlation of .57 with GAD, .53 with MDD, and .50 with OCD. Using the alternate definition developed for OCD, IU shared a mean correlation of .46 with MDD and .42 with OCD, with no studies available for GAD. Post-hoc significance tests revealed that IU was more strongly related to GAD than to OCD when the GAD-specific definition of IU was used. No other differences were found in the magnitude of associations between IU and the three syndromes. We discuss implications of these findings for models of shared and specific features of emotional disorders and for future research efforts. Copyright © 2011 Elsevier Ltd. All rights reserved.
Some illustrative examples of model uncertainty
International Nuclear Information System (INIS)
Bier, V.M.
1994-01-01
In this paper, we first discuss the view of model uncertainty proposed by Apostolakis. We then present several illustrative examples related to model uncertainty, some of which are not well handled by this formalism. Thus, Apostolakis' approach seems to be well suited to describing some types of model uncertainty, but not all. Since a comprehensive approach for characterizing and quantifying model uncertainty is not yet available, it is hoped that the examples presented here will serve as a springboard for further discussion.
Directory of Open Access Journals (Sweden)
Mujtaba Baqar
2017-06-01
The current study is a follow-up to our previously published letter to the editor, which reported the relationship of pain prevalence (work-related musculoskeletal symptoms) and trouble preventing normal work (work-related musculoskeletal disorders) with descriptive variables among motorcycle mechanics, using the standardized Nordic musculoskeletal questionnaire (NMQ) as a tool. The results demonstrate that pain prevalence was significantly higher in the shoulders, neck, lower back, wrists, ankles and elbows than in other body parts. Among the variables, participant age and working hours were found to be directly associated, whereby shoulder and neck pain correlated significantly with lower age groups and longer working hours. Regarding trouble preventing normal work, a total of 121 participants (46%) reported hindrance in normal work, with serious complaints about the shoulders in the younger age group; the wrists and hips in the middle age group; and the neck, lower back, knees and ankles in the older age group. Finally, frequencies and cross-tabulations indicated that prolonged working hours were significantly associated with the emergence of musculoskeletal symptoms. This high prevalence of work-related musculoskeletal symptoms leading to work-related musculoskeletal disorders among motorcycle mechanics reflects the neglect of occupational health duties. The recommendations of this research include the development and implementation of health and safety guidelines for the industry concerned.
Non-standard employment relations and wages among school-leavers in the Netherlands
de Vries, M.R.; Wolbers, M.H.J.
2005-01-01
Non-standard (alternatively, flexible) employment has become common in the Netherlands, and viewed as an important weapon for combating youth unemployment. However, if such jobs are 'bad', non-standard employment becomes a matter of concern. In addition, non-standard employment may hit the least
78 FR 57445 - Charging Standard Administrative Fees for Nonprogram-Related Information
2013-09-18
... the Federal Register a schedule of standardized administrative fees we charge to recover the full cost... fee schedule is outdated and incongruent with the agency's current costs for this service. New... new standard fee on our most recent cost calculations for supplying this information and the standard...
Model uncertainty: Probabilities for models?
International Nuclear Information System (INIS)
Winkler, R.L.
1994-01-01
Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.
National Oceanic and Atmospheric Administration, Department of Commerce — This dataset represents sediment size prediction uncertainty from a sediment spatial model developed for the New York offshore spatial planning area. The model also...
2011-04-22
... criteria. The revised air quality criteria reflect advances in scientific knowledge on the effects of the... National Ambient Air Quality Standards, contains staff analyses of the scientific bases for alternative... Document Related to the Review of the National Ambient Air Quality Standards for Particulate Matter AGENCY...
Effects of standard humic materials on relative bioavailability of NDL-PCBs in juvenile swine.
Directory of Open Access Journals (Sweden)
Matthieu Delannoy
Young children, with their hand-to-mouth activity, may be exposed to contaminated soils. However, few studies have assessed exposure to organic compounds sequestered in soil. The present study explores the impact of different organic matters on the retention of NDL-PCBs during digestive processes, using commercial humic substances in a close digestive model of children: the piglet. Six artificial soils were used: one standard soil devoid of organic matter, and five versions of this standard soil amended with either fulvic acid, humic acid, Sphagnum peat, activated carbon, or a 95:5 mix of Sphagnum peat and activated carbon (SPAC). Spiked oil and negative control animals were used to compare the treatments. Forty male piglets were randomly distributed into seven contaminated groups and one control group (n = 5 per group). For 10 days, the piglets were fed artificial soil or corn oil spiked with 19,200 ng of Aroclor 1254 per g of dry matter (6,000 ng g⁻¹ of NDL-PCBs) to achieve an exposure dose of 1,200 ng NDL-PCBs kg⁻¹ of body weight per day. NDL-PCBs in adipose tissue were analyzed by GC-MS. Fulvic acid reduced the bioavailability of NDL-PCBs slightly compared with oil; humic acid and Sphagnum peat reduced it markedly more, and activated carbon reduced it the most. Piglets exposed to soil containing both activated carbon and Sphagnum peat exhibited a smaller reduction than those exposed to soil with activated carbon only, suggesting competition between Sphagnum peat and activated carbon. The treatment groups, ordered by decreasing relative bioavailability, were: oil ≥ fulvic acid > Sphagnum peat ≥ Sphagnum peat and activated carbon ≥ humic acid >> activated carbon. The present study highlights that the quality of organic matter has a significant effect on the bioavailability of sequestered organic compounds.
Sampling based uncertainty analysis of 10% hot leg break LOCA in large scale test facility
International Nuclear Information System (INIS)
Sengupta, Samiran; Kraina, V.; Dubey, S. K.; Rao, R. S.; Gupta, S. K.
2010-01-01
Sampling-based uncertainty analysis was carried out to quantify the uncertainty in predictions of the best-estimate code RELAP5/MOD3.2 for a thermal hydraulic test (10% hot leg break LOCA) performed in the Large Scale Test Facility (LSTF) as part of an IAEA coordinated research project. The nodalisation of the test facility was qualified at both the steady state and transient level by systematically applying the procedures of the uncertainty methodology based on accuracy extrapolation (UMAE); uncertainty analysis was carried out using the Latin hypercube sampling (LHS) method to evaluate the uncertainty for ten input parameters. Sixteen output parameters were selected for uncertainty evaluation, and the uncertainty band between the 5th and 95th percentiles of the output parameters was evaluated. It was observed that the uncertainty band for the primary pressure during two-phase blowdown is larger than that of the remaining period. Similarly, a larger uncertainty band is observed for the accumulator injection flow during the reflood phase. Importance analysis was also carried out, and standardized rank regression coefficients were computed to quantify the effect of each individual input parameter on the output parameters. It was observed that the break discharge coefficient is the most important uncertain parameter for the prediction of all the primary side parameters and that the steam generator (SG) relief pressure setting is the most important parameter in predicting the SG secondary pressure.
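The sampling and importance-analysis steps can be sketched generically. This is not the RELAP5 setup: the toy model, sample size, and the use of rank correlation as a simple stand-in for full standardized rank regression coefficients are all illustrative assumptions.

```python
import random

def lhs(n_samples, n_params, rng):
    """Latin hypercube sample on [0, 1): one stratum per sample and
    parameter, independently shuffled across parameters."""
    cols = []
    for _ in range(n_params):
        strata = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(strata)
        cols.append(strata)
    return list(zip(*cols))          # n_samples rows of n_params values

def ranks(xs):
    """Rank of each element (0 = smallest); assumes no ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = rank
    return r

def spearman(xs, ys):
    """Rank correlation: a simple stand-in for the standardized rank
    regression coefficients used in the importance analysis."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    m = (n - 1) / 2
    cov = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    var = sum((a - m) ** 2 for a in rx)
    return cov / var

rng = random.Random(42)
samples = lhs(200, 2, rng)
# Toy model: the output is dominated by the first input parameter
outputs = [10 * x1 + x2 for x1, x2 in samples]
band = sorted(outputs)
p5, p95 = band[int(0.05 * 200)], band[int(0.95 * 200)]   # uncertainty band
```

The rank correlation of each input column with the output then ranks the parameters by importance, analogous to the break-discharge-coefficient finding in the abstract.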
Energy Technology Data Exchange (ETDEWEB)
Deflandre, T. [Electricite de France (EDF), 92 - Clamart (France). Direction des Etudes et Recherches
1996-09-01
The objective of this document is to provide an overview of the various normative documents and regulations that may be useful in the field of harmonics as references for a network user, fitter, manufacturer or operator. Several documents presently under development at Electricite de France are also presented, together with the connection regulation included in the Emeraude contract. The following points are discussed: the hierarchy of normative documents; harmonic emission standards, in particular the NF EN 61000-3-2 standard 'Limits for the emission of harmonic currents for equipment having an input current less than or equal to 16 A per phase'; document status; immunity standards; compatibility and environment standards; documents under study; and the Emeraude contract.
Kica, Evisa
2015-01-01
The core of this thesis consists of developing a comprehensive empirical assessment on the legitimacy of nanotechnology related transnational private governance arrangements (TPGAs), explored through the case study of the International Organization for Standardization (ISO) Technical Committee on
The Precautionary Principle and statistical approaches to uncertainty
DEFF Research Database (Denmark)
Keiding, Niels; Budtz-Jørgensen, Esben
2003-01-01
Bayesian model averaging; Benchmark approach to safety standards in toxicology; dose-response relationship; environmental standards; exposure measurement uncertainty; Popper falsification.
Booth, Alison L.; Francesconi, Marco
2000-01-01
This paper documents the extent of union coverage and performance-related pay (PRP) - the latter representing one aspect of pay flexibility - across standard and non-standard workers in Britain, using the first seven waves of the British Household Panel Survey, 1991-1997. We find there is no evidence of expansion of either union coverage or PRP towards any type of non-standard employment in the 1990s. Thus union rhetoric about a 'strategy of enlargement' towards non-standard workers remains j...
Uncertainty in social dilemmas
Kwaadsteniet, Erik Willem de
2007-01-01
This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size uncertainty). Several researchers have therefore asked themselves the question as to how such uncertainty influences people’s choice behavior. These researchers have repeatedly concluded that uncertainty...
Double standards: the multinational asbestos industry and asbestos-related disease in South Africa.
McCulloch, Jock; Tweedale, Geoffrey
2004-01-01
This study documents and contrasts the development of knowledge about asbestos-related disease (ARD) in South Africa and the United Kingdom. It also contributes to the globalization debate by exploring corporate decision-making in a multinational industry. Between the 1930s and 1960s, the leading U.K. asbestos companies developed a sophisticated knowledge of ARD, though in South Africa, where the leading companies such as Turner & Newall and Cape Asbestos owned mines, there was little attempt to apply this knowledge. Asbestos mines (and their environments) in South Africa were uniquely dusty and ARD was rife. Social and political factors in South Africa, especially apartheid, allowed these companies to apply double standards, even after 1960 when the much more serious hazard of mesothelioma was identified. This shows the need for greater regulation of multinationals. Because of the lack of such regulation in the early 1960s, an opportunity was lost to prevent the current high morbidity and mortality of ARD both in South Africa and worldwide.
Length standards and the Twin Paradox in the Special Theory of Relativity
Carrubba, James Gasper
In this Thesis I work towards a discussion of several resolutions of the Twin Paradox by exploring the Lorentz transformations. I begin by asking what it means for a moving length to contract, a question which obviously cannot be divorced from the propagation of length standards from one reference frame to another. I emphasize the conventionality of definitions of length. I go on to argue that it is the imposition of clock synchronization (the conventionality of one-way speeds), and not the effects of acceleration, that leads to the asymmetry of light speed observed in the Sagnac effect; and further, that this asymmetry leads to apparent paradoxes which are easily resolved when we take into account general covariance. In a subsequent discussion of light-speed conventionality, I prove that any transform which preserves synchronization consistent with Michelson-Morley must be a similarity transform, and use this to demonstrate that not all results which appear to depend on Special Relativity actually do. I conclude this Thesis with an argument that the Twin Paradox cannot be resolved consistently if we impose simultaneously all 'physical' conditions which the various resolutions impose in part.
Failure trend analysis for safety related components of Korean standard NPPs
International Nuclear Information System (INIS)
Choi, Sun Yeong; Han, Sang Hoon
2005-01-01
Component reliability data that reflect plant-specific characteristics are a necessary input to the PSA of Korean nuclear power plants. We have performed a project to develop a component reliability database (KIND, Korea Integrated Nuclear Reliability Database) and software for database management and component reliability analysis. Based on this system, we collected component operation data and failure/repair data from the start of plant operation to 2002 for the YGN 3, 4 and UCN 3, 4 plants. Recently, we provided component failure rate data from KIND for the UCN 3, 4 standard PSA model. We previously evaluated the components with the highest failure rates using reliability data from the start of plant operation to 1998 and 2000 for YGN 3, 4 and UCN 3, 4, respectively, and identified their most frequent failure modes. In this study, we analyze component failure trends and perform a site comparison against generic data, using the component reliability data extended to 2002 for UCN 3, 4 and YGN 3, 4. We focus on major safety-related rotating components such as pumps and emergency diesel generators (EDGs).
Voskresenskaya, Elena; Vorona-Slivinskaya, Lubov
2018-03-01
The article considers the issues of developing national standards for high-rise construction. The system of standards should provide industrial, operational, economic and terrorist safety of high-rise buildings and facilities. Modern standards of high-rise construction should set the rules for designing engineering systems of high-rise buildings, which will ensure the integrated security of buildings, increase their energy efficiency and reduce the consumption of resources in construction and operation.
Directory of Open Access Journals (Sweden)
Voskresenskaya Elena
2018-01-01
Full Text Available The article considers the issues of developing national standards for high-rise construction. The system of standards should provide industrial, operational, economic and terrorist safety of high-rise buildings and facilities. Modern standards of high-rise construction should set the rules for designing engineering systems of high-rise buildings, which will ensure the integrated security of buildings, increase their energy efficiency and reduce the consumption of resources in construction and operation.
Uncertainty Propagation in an Ecosystem Nutrient Budget.
New aspects and advancements in classical uncertainty propagation methods were used to develop a nutrient budget with associated error for a northern Gulf of Mexico coastal embayment. Uncertainty was calculated for budget terms by propagating the standard error and degrees of fr...
Demmer, Denise L; Beilin, Lawrence J; Hands, Beth; Burrows, Sally; Pennell, Craig E; Lye, Stephen J; Mountain, Jennifer A; Mori, Trevor A
2016-01-01
Assessment of adiposity using dual-energy X-ray absorptiometry (DXA) has been considered more advantageous than anthropometry for predicting cardio-metabolic risk in the older population, by virtue of its ability to distinguish total and regional fat. Nonetheless, there is increasing uncertainty regarding the relative superiority of DXA, and little comparative data exist in young adults. This study aimed to identify which measure of adiposity, determined by either DXA or anthropometry, is optimal within a range of cardio-metabolic risk factors in young adults. 1138 adults aged 20 years from the Western Australian Pregnancy Cohort (Raine) Study were assessed by DXA and standard anthropometry. Cross-sectional linear regression analyses were performed. Waist-to-height ratio was superior to any DXA measure in relation to HDL-C. BMI was superior to any DXA measure in relation to blood pressure. Midriff fat mass (DXA) and waist circumference were comparable in relation to glucose. For all other cardio-metabolic variables, anthropometric and DXA measures were comparable. DXA midriff fat mass was superior to BMI or waist-hip ratio for triglycerides, insulin and HOMA-IR. Although midriff fat mass (measured by DXA) was the superior measure for insulin sensitivity and triglycerides, the anthropometric measures were better than or equal to the various DXA measures for the majority of the cardio-metabolic risk factors. Our findings suggest that clinical anthropometry is generally as useful as DXA in the evaluation of individual cardio-metabolic risk factors in young adults.
An Analysis of Geography Content in Relation to Geography for Life Standards in Oman
Al-Nofli, Mohammed Abdullah
2018-01-01
Since the publication of "Geography for Life: National Geography Standards" in the United States (Geography Education Standards Project, 1994), it has been widely used to develop quality curriculum materials for what students should know and be able to do in geography. This study compared geography content taught in Omani public schools…
Tu, Xiao-Ming; Zhang, Zuo-Heng; Wan, Cheng; Zheng, Yu; Xu, Jin-Mei; Zhang, Yuan-Yuan; Luo, Jian-Ping; Wu, Hai-Wei
2012-12-01
To develop software that can be used to standardize optical density, normalizing the procedures and results of standardization in order to effectively solve several problems generated during standardization of indirect ELISA results. The software was designed based on the I-STOD method, with operation settings to address the problems that one might encounter during standardization. Matlab GUI was used as the development tool. The software was tested with the results of the detection of sera of persons from schistosomiasis japonica endemic areas. I-STOD V1.0 (Windows XP/Win 7, 0.5 GB) was successfully developed to standardize optical density. A series of serum samples from schistosomiasis japonica endemic areas was used to examine the operational effects of the I-STOD V1.0 software. The results indicated that the software successfully overcame several problems, including the reliability of the standard curve, the applicable scope of samples, and the determination of dilution for samples outside that scope, so that I-STOD was performed more conveniently and the results of standardization were more consistent. I-STOD V1.0 is professional software based on I-STOD. It can be easily operated and can effectively standardize the test results of indirect ELISA.
2012-03-01
ISO/IEC 17025 Inspection Bodies – ISO/IEC 17020 RMPs – ISO Guide 34 (Reference...certify to: ISO 9001 (QMS), ISO 14001 (EMS), TS 16949 (US Automotive) etc. DoD QSM 4.2 standard ISO/IEC 17025:2005 Each has uncertainty...IPV6, NLLAP, NEFAP TRAINING Programs Certification Bodies – ISO/IEC 17021 Accreditation for Management System
The Uncertainties of Risk Management
DEFF Research Database (Denmark)
Vinnari, Eija; Skærbæk, Peter
2014-01-01
Purpose – The purpose of this paper is to analyse the implementation of risk management as a tool for internal audit activities, focusing on unexpected effects or uncertainties generated during its application. Design/methodology/approach – Public and confidential documents as well as semi-structured interviews are analysed through the lens of actor-network theory to identify the effects of risk management devices in a Finnish municipality. Findings – The authors found that risk management, rather than reducing uncertainty, itself created unexpected uncertainties that would otherwise not have emerged... for expanding risk management. More generally, such uncertainties relate to the professional identities and responsibilities of operational managers as defined by the framing devices. Originality/value – The paper offers three contributions to the extant literature: first, it shows how risk management itself...
Limited entropic uncertainty as new principle of quantum physics
International Nuclear Information System (INIS)
Ion, D.B.; Ion, M.L.
2001-01-01
The Uncertainty Principle (UP) of quantum mechanics discovered by Heisenberg, which constitutes the corner-stone of quantum physics, asserts that there is an irreducible lower bound on the uncertainty in the result of a simultaneous measurement of non-commuting observables. In order to avoid the state-dependence of the standard quantitative formulation of the Heisenberg uncertainty principle, many authors have proposed to use the information entropy as a measure of uncertainty instead. In this paper the Principle of Limited Entropic Uncertainty (LEU-Principle), as a new principle in quantum physics, is proved. Then, consistent experimental tests of the LEU-Principle, obtained by using the available 49 sets of pion-nucleus phase shifts, are presented for both the extensive (q=1) and nonextensive (q=0.5 and q=2.0) cases. Some results obtained by applying the LEU-Principle to diffraction phenomena are also discussed. The main results and conclusions of our paper can be summarized as follows: (i) We introduced a new principle in quantum physics, namely the Principle of Limited Entropic Uncertainty (LEU-Principle). This new principle includes, in a more general and exact form, not only the old Heisenberg uncertainty principle but also introduces an upper limit on the magnitude of the uncertainty in quantum physics. The LEU-Principle asserts that 'there is an irreducible lower bound as well as an upper bound on the uncertainty in the result of a simultaneous measurement of non-commuting observables for any extensive and nonextensive (q ≥ 0) quantum systems'; (ii) Two important concrete realizations of the LEU-Principle are explicitly obtained in this paper, namely: (a) the LEU-inequalities for the quantum scattering of spinless particles and (b) the LEU-inequalities for diffraction on a single slit of width 2a. In particular, from our general results, in the limit y → +1 we recover in exact form all the results previously reported. In our paper an...
Low cost high performance uncertainty quantification
Bekas, C.; Curioni, A.; Fedulova, I.
2009-01-01
Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, that employ matrix factorizations, incur a cubic cost
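The cubic cost mentioned above comes from computing diag(A⁻¹) via a full factorization or inversion. Stochastic probing estimators avoid it by replacing the factorization with a handful of linear solves. The sketch below is a generic Rademacher-probe estimator in that spirit, not the paper's exact algorithm; the toy matrix, solver, and sample count are all illustrative assumptions.

```python
import numpy as np

def diag_inverse_stochastic(A, solve, n_samples=200, rng=None):
    """Estimate diag(A^-1) with Rademacher probe vectors.

    `solve(v)` should return A^-1 @ v, e.g. via a sparse or iterative
    solver, so that no cubic-cost factorization of A is ever formed.
    """
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    num = np.zeros(n)
    den = np.zeros(n)
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=n)  # Rademacher probe
        x = solve(v)                         # one linear solve per probe
        num += v * x                         # E[v_i * x_i] = (A^-1)_ii
        den += v * v
    return num / den

# Toy check against the exact diagonal (small dense case only)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
est = diag_inverse_stochastic(A, lambda v: np.linalg.solve(A, v),
                              n_samples=5000, rng=0)
exact = np.diag(np.linalg.inv(A))
```

In practice the estimator pays off only when `solve` is cheap (sparse, structured, or preconditioned systems); for small dense matrices the direct inverse is of course simpler.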
Influence of midsole hardness of standard cushioned shoes on running-related injury risk.
Theisen, Daniel; Malisoux, Laurent; Genin, Joakim; Delattre, Nicolas; Seil, Romain; Urhausen, Axel
2014-03-01
In this double-blind randomised controlled trial, we tested whether leisure-time runners using shoes with less compliant midsoles have a higher running-related injury (RRI) risk. We provided 247 runners with standard running shoes having either a soft (soft-SS) or a hard (hard-SS) study-shoe midsole and followed them prospectively for 5 months regarding RRI. All information about sports practice and injuries was uploaded to a dedicated internet platform and checked for consistency and completeness. RRI was defined as any first-time pain sustained during or as a result of running practice and impeding normal running activity for at least 1 day. Cox proportional hazards regressions were used to identify RRI risk factors. The type of study shoe used for running was not associated with RRIs (HR=0.92; 95% CI 0.57 to 1.48). The hard-SS had a 15% greater overall stiffness in the heel region. The two study groups were similar regarding personal and sports participation characteristics, except for years of running experience, which was higher (p<...) ... running. No between-group differences were found regarding injury location, type, severity or category. Nevertheless, the adjusted regression model revealed positive associations with RRI risk for body mass index (HR=1.126; 95% CI 1.033 to 1.227), previous injury (HR=1.735; 95% CI 1.037 to 2.902) and mean session intensity (HR=1.396; 95% CI 1.040 to 1.874). Protective factors were previous regular running activity (HR=0.422; 95% CI 0.228 to 0.779) and weekly volume of other sports activities (HR=0.702; 95% CI 0.561 to 0.879). Midsole hardness of modern cushioned running shoes does not seem to influence RRI risk.
Confronting Uncertainty in Life Cycle Assessment Used for Decision Support
DEFF Research Database (Denmark)
Herrmann, Ivan Tengbjerg; Hauschild, Michael Zwicky; Sohn, Michael D.
2014-01-01
The aim of this article is to help confront uncertainty in life cycle assessments (LCAs) used for decision support. LCAs offer a quantitative approach to assess the environmental effects of products, technologies, and services, and are conducted by an LCA practitioner or analyst (AN) to support the decision maker (DM) in making the best possible choice for the environment. At present, some DMs do not trust the LCA to be a reliable decision-support tool, often because DMs consider the uncertainty of an LCA to be too large. The standard evaluation of uncertainty in LCAs is an ex-post approach that can... regarding which type of LCA study to employ for the decision context at hand. This taxonomy enables the derivation of an LCA classification matrix to clearly identify and communicate the type of a given LCA. By relating the LCA classification matrix to statistical principles, we can also rank the different...
DEFF Research Database (Denmark)
Winkler, Ingo
2008-01-01
...and training opportunities, students' relations to other employees, and social integration. By adopting a qualitative design, I was able to emphasize the subjective perspective of students describing their very own experiences as flexible workers. The study revealed various perceptions of students working as flexible employees and related this picture to current empirical and theoretical research in the field of non-standard employment.
Household energy consumption versus income and relative standard of living: A panel approach
International Nuclear Information System (INIS)
Joyeux, Roselyne; Ripple, Ronald D.
2007-01-01
Our fundamental premise is that energy consumption at the household level is a key indicator of standard of living. We employ state-of-the-art panel cointegration techniques to evaluate the nature of the relationship between income measures and energy consumption measures for seven East Indian Ocean countries. The general finding is that income and household electricity consumption are not cointegrated. Given this finding, we conclude that standard of living measures that rely on income measures and do not include household-level energy consumption information will necessarily miss important indications of both levels and changes of standard of living
Uncertainty Characterization of Reactor Vessel Fracture Toughness
International Nuclear Information System (INIS)
Li, Fei; Modarres, Mohammad
2002-01-01
To perform fracture mechanics analysis of a reactor vessel, fracture toughness (K_Ic) at various temperatures is necessary. In a best-estimate approach, K_Ic uncertainties resulting from both lack of sufficient knowledge and randomness in some of the variables of K_Ic must be characterized. Although it may be argued that there is only one type of uncertainty, namely lack of perfect knowledge about the subject under study, as a matter of practice K_Ic uncertainties can be divided into two types: aleatory and epistemic. Aleatory uncertainty is uncertainty that is very difficult, if not impossible, to reduce; epistemic uncertainty, on the other hand, can be practically reduced. The distinction between aleatory and epistemic uncertainties facilitates decision-making under uncertainty and allows for proper propagation of uncertainties in the computation process. Typically, epistemic uncertainties representing, for example, the parameters of a model are sampled (to generate a 'snapshot', a single value of the parameters), while the totality of the aleatory uncertainties is carried through the calculation as available. In this paper an approach to account for these two types of uncertainties associated with K_Ic is described. (authors)
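The 'snapshot' sampling described above is commonly implemented as a two-loop (nested) Monte Carlo: an outer loop samples the epistemic parameters, and for each snapshot the full aleatory distribution is carried through. The sketch below is a hypothetical illustration of that structure, not the authors' model; the linear toughness curve, its parameter distributions, and the scatter term are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical model: toughness ~ Normal(mu(T), sigma) at temperature T.
# Epistemic: imperfect knowledge of the curve parameters (a, b).
# Aleatory: material scatter around the curve, kept as a full distribution.

def toughness_samples(a, b, T, sigma=10.0, n_aleatory=1000):
    mu = a + b * T  # assumed (illustrative) mean-toughness curve
    return rng.normal(mu, sigma, n_aleatory)

n_epistemic = 200
curves = []
for _ in range(n_epistemic):
    a = rng.normal(60.0, 5.0)    # epistemic 'snapshot' of parameter a
    b = rng.normal(0.5, 0.05)    # epistemic 'snapshot' of parameter b
    # carry the aleatory distribution through in full for each snapshot,
    # summarizing it here by its 5th percentile (a lower-bound toughness)
    curves.append(float(np.percentile(toughness_samples(a, b, T=100.0), 5)))

# The spread of that percentile across snapshots reflects the epistemic
# (reducible) part of the uncertainty.
lo, hi = np.percentile(curves, [5, 95])
```

Separating the loops this way is what lets one later report, e.g., an epistemic confidence band around an aleatory percentile rather than a single blended number.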
Car, Nicholas; Cox, Simon; Fitch, Peter
2015-04-01
With earth-science datasets increasingly being published to enable re-use in projects disassociated from the original data acquisition or generation, there is an urgent need for the associated metadata to be connected, in order to guide their application. In particular, provenance traces should support the evaluation of data quality and reliability. However, while standards for describing provenance are emerging (e.g. PROV-O), these do not include the necessary statistical descriptors and confidence assessments. UncertML has a mature conceptual model that may be used to record uncertainty metadata. However, by itself UncertML does not support the representation of uncertainty of multi-part datasets, and provides no direct way of associating the uncertainty information (metadata in relation to a dataset) with dataset objects. We present a method to address both these issues by combining UncertML with PROV-O, and delivering the resulting uncertainty-enriched provenance traces through the Linked Data API. UncertProv extends the PROV-O provenance ontology with an RDF formulation of the UncertML conceptual model elements, adds further elements to support uncertainty representation without a conceptual model, and supports the integration of UncertML through links to documents. The Linked Data API provides a systematic way of navigating from dataset objects to their UncertProv metadata and back again. The Linked Data API's 'views' capability enables access to UncertML and non-UncertML uncertainty metadata representations for a dataset. With this approach, it is possible to access and navigate the uncertainty metadata associated with a published dataset using standard semantic web tools, such as SPARQL queries. Where the uncertainty data follow the UncertML model they can be automatically interpreted and may also support automatic uncertainty propagation. Repositories wishing to enable uncertainty propagation for all datasets must ensure that all elements that are associated with uncertainty...
On economic resolution and uncertainty in hydrocarbon exploration assessment
International Nuclear Information System (INIS)
Lerche, I.
1998-01-01
When the parameters of a decision tree for a hydrocarbon exploration project can lie within estimated ranges, it is shown that the ensemble-average expected value has two sorts of uncertainty: the first arises because the expected value of each realization of the decision tree differs from the average; the second arises from the intrinsic variance of each decision tree. The total standard error of the average expected value combines both sorts. The use of additional statistical measures, such as standard error, volatility, and cumulative probability of making a profit, provides insight into the selection process, leading to a more appropriate decision. In addition, the use of relative contributions and relative importance for the uncertainty measures guides one to a better determination of those parameters that dominantly influence the total ensemble uncertainty. In this way one can concentrate resources on efforts to minimize the uncertainty ranges of such dominant parameters. A numerical illustration is provided to indicate how such calculations can be performed simply with a hand calculator. (author)
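The two sorts of uncertainty combine in a simple way that can be sketched with a hypothetical two-branch tree (success pays V with probability p, failure costs C). The ranges and dry-hole cost below are invented for illustration; the point is only how the spread of expected values across realizations and the mean intrinsic variance add into a total standard error.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical two-branch exploration tree: success pays V, failure costs C.
# p and V are only known within ranges, so we average over an ensemble.
n = 10_000
p = rng.uniform(0.2, 0.4, n)     # assumed range of success probability
V = rng.uniform(50.0, 150.0, n)  # assumed range of success value ($MM)
C = 10.0                         # assumed dry-hole cost ($MM)

ev = p * V - (1 - p) * C              # expected value of each realization
var_intrinsic = p * (1 - p) * (V + C) ** 2  # variance within one tree

ev_bar = ev.mean()
# Total variance = spread of EV across realizations + mean intrinsic variance
var_total = ev.var() + var_intrinsic.mean()
se_total = float(np.sqrt(var_total))
```

Reporting `se_total` alongside `ev_bar` is what distinguishes the ensemble treatment from quoting a single-tree expected value.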
Pharmacological Fingerprints of Contextual Uncertainty.
Directory of Open Access Journals (Sweden)
Louise Marshall
2016-11-01
Full Text Available Successful interaction with the environment requires flexible updating of our beliefs about the world. By estimating the likelihood of future events, it is possible to prepare appropriate actions in advance and execute fast, accurate motor responses. According to theoretical proposals, agents track the variability arising from changing environments by computing various forms of uncertainty. Several neuromodulators have been linked to uncertainty signalling, but comprehensive empirical characterisation of their relative contributions to perceptual belief updating, and to the selection of motor responses, is lacking. Here we assess the roles of noradrenaline, acetylcholine, and dopamine within a single, unified computational framework of uncertainty. Using pharmacological interventions in a sample of 128 healthy human volunteers and a hierarchical Bayesian learning model, we characterise the influences of noradrenergic, cholinergic, and dopaminergic receptor antagonism on individual computations of uncertainty during a probabilistic serial reaction time task. We propose that noradrenaline influences learning of uncertain events arising from unexpected changes in the environment. In contrast, acetylcholine balances attribution of uncertainty to chance fluctuations within an environmental context, defined by a stable set of probabilistic associations, or to gross environmental violations following a contextual switch. Dopamine supports the use of uncertainty representations to engender fast, adaptive responses.
Propagation of dynamic measurement uncertainty
International Nuclear Information System (INIS)
Hessling, J P
2011-01-01
The time-dependent measurement uncertainty has been evaluated in a number of recent publications, starting from a known uncertain dynamic model. This could be defined as the 'downward' propagation of uncertainty from the model to the targeted measurement. The propagation of uncertainty 'upward' from the calibration experiment to a dynamic model traditionally belongs to system identification. The use of different representations (time, frequency, etc) is ubiquitous in dynamic measurement analyses. An expression of uncertainty in dynamic measurements is formulated for the first time in this paper independent of representation, joining upward as well as downward propagation. For applications in metrology, the high quality of the characterization may be prohibitive for any reasonably large and robust model to pass the whiteness test. This test is therefore relaxed by not directly requiring small systematic model errors in comparison to the randomness of the characterization. Instead, the systematic error of the dynamic model is propagated to the uncertainty of the measurand, analogously but differently to how stochastic contributions are propagated. The pass criterion of the model is thereby transferred from the identification to acceptance of the total accumulated uncertainty of the measurand. This increases the relevance of the test of the model as it relates to its final use rather than the quality of the calibration. The propagation of uncertainty hence includes the propagation of systematic model errors. For illustration, the 'upward' propagation of uncertainty is applied to determine if an appliance box is damaged in an earthquake experiment. In this case, relaxation of the whiteness test was required to reach a conclusive result
Instrument uncertainty predictions
International Nuclear Information System (INIS)
Coutts, D.A.
1991-07-01
The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and phenomena understanding deficiencies. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pretest and post-test uncertainty
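Whatever the prediction methodology, the individual instrument-error contributions are ultimately combined; a common convention (GUM-style, assuming independent contributions) is the root-sum-square, with an expanded uncertainty at a coverage factor of 2. The budget values below are invented for illustration.

```python
import math

def combined_standard_uncertainty(contributions):
    """Root-sum-square combination of independent standard uncertainties
    (all expressed in the same units)."""
    return math.sqrt(sum(u * u for u in contributions))

# Hypothetical budget: sensor, calibration, and model contributions
u_c = combined_standard_uncertainty([0.3, 0.4, 1.2])  # -> 1.3
U = 2.0 * u_c  # expanded uncertainty at coverage factor k = 2 -> 2.6
```

Correlated contributions would need covariance terms added inside the square root; the plain RSS form holds only under the independence assumption stated above.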
Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures
International Nuclear Information System (INIS)
Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha
2002-04-01
The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on the results of a critical review of the existing uncertainty-analysis methodologies employed in the Level 2 PSA, especially the Accident Progression Event Tree (APET). The uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence the analyst has that a given phenomenological event or accident process will or will not occur, i.e., the analyst's subjective probabilities of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 and Level 3 PSA uncertainties. The uncertainty-analysis methodologies and implementation procedures presented in this report were prepared according to the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and in an appropriate level of detail, as mandated by the Level 2 PSA objectives'. To this end, the report deals mainly with (1) a summary of general and Level 2 PSA-specific uncertainty analysis methodologies; (2) the selection of phenomenological branch events for uncertainty analysis in the APET, a methodology for quantification of APET uncertainty inputs, and its implementation procedure; (3) statistical propagation of uncertainty inputs through the APET and its implementation procedure; and (4) a formal procedure for quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes.
Effect of Streamflow Forecast Uncertainty on Real-Time Reservoir Operation
Zhao, T.; Cai, X.; Yang, D.
2010-12-01
Various hydrological forecast products have been applied to real-time reservoir operation, including deterministic streamflow forecast (DSF), DSF-based probabilistic streamflow forecast (DPSF), and ensemble streamflow forecast (ESF), which represent forecast uncertainty in the form of deterministic forecast error, deterministic forecast error-based uncertainty distribution, and ensemble forecast errors, respectively. Compared to previous studies that treat these forecast products as ad hoc inputs for reservoir operation models, this paper attempts to model the uncertainties involved in the various forecast products and explores their effect on real-time reservoir operation decisions. In hydrology, there are various indices reflecting the magnitude of streamflow forecast uncertainty; meanwhile, few models illustrate the forecast uncertainty evolution process. This research introduces Martingale Model of Forecast Evolution (MMFE) from supply chain management and justifies its assumptions for quantifying the evolution of uncertainty in streamflow forecast as time progresses. Based on MMFE, this research simulates the evolution of forecast uncertainty in DSF, DPSF, and ESF, and applies the reservoir operation models (dynamic programming, DP; stochastic dynamic programming, SDP; and standard operation policy, SOP) to assess the effect of different forms of forecast uncertainty on real-time reservoir operation. Through a hypothetical single-objective real-time reservoir operation model, the results illustrate that forecast uncertainty exerts significant effects. Reservoir operation efficiency, as measured by a utility function, decreases as the forecast uncertainty increases. Meanwhile, these effects also depend on the type of forecast product being used. In general, the utility of reservoir operation with ESF is nearly as high as the utility obtained with a perfect forecast; the utilities of DSF and DPSF are similar to each other but not as efficient as ESF. Moreover
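The MMFE assumption, that successive forecast updates are independent zero-mean increments, can be simulated directly. The sketch below is a generic illustration with an assumed update variance, not the paper's calibrated model; it shows the property the abstract relies on, namely that forecast error shrinks as the lead time decreases.

```python
import numpy as np

rng = np.random.default_rng(7)

# f[:, t] is the forecast, held at period t, of a fixed future inflow.
n_paths, horizon = 5_000, 8
f = np.full((n_paths, horizon + 1), 100.0)  # initial forecast = 100 units
for t in range(1, horizon + 1):
    # MMFE update: independent, zero-mean increment (assumed std = 10)
    f[:, t] = f[:, t - 1] + rng.normal(0.0, 10.0, n_paths)

realized = f[:, -1]  # the inflow is fully revealed at t = horizon
err_std = [float(np.std(realized - f[:, t])) for t in range(horizon + 1)]
# err_std decreases toward 0: long-lead forecasts are the most uncertain,
# matching the sqrt(remaining lead) error growth the model implies.
```

Feeding such simulated forecast paths into DP/SDP/SOP operation models is one way to reproduce the kind of comparison the abstract describes, under whatever update variance one assumes.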
Uncertainty in geological and hydrogeological data
Directory of Open Access Journals (Sweden)
B. Nilsson
2007-09-01
Full Text Available Uncertainty in conceptual model structure and in environmental data is of essential interest when dealing with uncertainty in water resources management. To make quantification of uncertainty possible, it is necessary to identify and characterise the uncertainty in geological and hydrogeological data. This paper discusses a range of available techniques to describe the uncertainty related to geological model structure and scale of support. Literature examples on uncertainty in hydrogeological variables such as saturated hydraulic conductivity, specific yield, specific storage, effective porosity and dispersivity are given. Field data usually have a spatial and temporal scale of support that is different from the one on which numerical models for water resources management operate. Uncertainty in hydrogeological data variables is characterised and assessed within the methodological framework of the HarmoniRiB classification.
Directory of Open Access Journals (Sweden)
Gürhan UYSAL
2008-01-01
Full Text Available This study explores the impact of resource uncertainty on relational exchange between customer and supplier. Resource uncertainty involves factors such as resource concentration, resource availability uncertainty and resource interconnectedness. The necessary data were collected from 134 companies in the Marmara Region through a questionnaire. The study adopts factor, correlation and regression analyses to test the impact of resource uncertainty on relational exchange. Data analysis reveals that resource concentration and resource availability uncertainty do not have an impact on relational exchange between customer and supplier, while resource interconnectedness does influence relational exchange. Furthermore, one-way ANOVA tests demonstrate that resource concentration, resource availability uncertainty and resource interconnectedness do not differ significantly across control variables such as industry, foundation year, revenues and number of employees.
American Society for Testing and Materials. Philadelphia
1989-01-01
1.1 This practice covers the providing of guidance in converting the results of electrochemical measurements to rates of uniform corrosion. Calculation methods for converting corrosion current density values to either mass loss rates or average penetration rates are given for most engineering alloys. In addition, some guidelines for converting polarization resistance values to corrosion rates are provided. 1.2 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard.
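The conversion this practice covers rests on Faraday's law: corrosion current density, an equivalent weight, and a density give an average penetration rate. The sketch below uses the constant commonly quoted in this context (3.27×10⁻³ mm·g/(µA·cm·yr)); the iron inputs are example values, and the standard itself should be consulted for authoritative constants and equivalent weights.

```python
def corrosion_rate_mm_per_yr(i_corr_uA_cm2, equiv_weight_g, density_g_cm3):
    """Faraday's-law conversion of corrosion current density (uA/cm^2)
    to an average penetration rate (mm/yr)."""
    K1 = 3.27e-3  # mm * g / (uA * cm * yr), commonly quoted constant
    return K1 * i_corr_uA_cm2 * equiv_weight_g / density_g_cm3

# Example: iron dissolving as Fe2+ (equivalent weight 27.92 g),
# density 7.87 g/cm^3, at i_corr = 1 uA/cm^2
rate = corrosion_rate_mm_per_yr(1.0, 27.92, 7.87)  # ~0.0116 mm/yr
```

The same structure extends to mass-loss rates by swapping the constant and dropping the density term; alloys require a composition-weighted equivalent weight.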
RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY
Energy Technology Data Exchange (ETDEWEB)
Salaymeh, S.; Ashley, W.; Jeffcoat, R.
2010-06-17
It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and of how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations, because data interpretation often involves complex algorithms and logic combined in a highly intertwined way, and the methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives, and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.
Neff, Bonita Dostal
Focusing on ethics in public relations from a multicultural point of view brings together elements which are critical to international public relations. The Public Relations-Ethics-Multicultural (PREM) model illustrates that articles can be found in the literature on ethics, public relations, and multicultural as individual concepts. The…
Investment, regulation, and uncertainty
Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose
2014-01-01
As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together, and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence which ones might be expected to see an investment decline. PMID:24499745
Worry, Intolerance of Uncertainty, and Statistics Anxiety
Williams, Amanda S.
2013-01-01
Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…
Schlegel, Claudia; Bonvin, Raphael; Rethans, Jan-Joost; Van der Vleuten, Cees
2016-08-01
The use of standardized patients (SPs) in health care education has grown in the last 50 years. In addition, the requirements for SPs have increased steadily, and thus the work of SPs has become more difficult and demanding. It has been claimed that SP programs are highly contextualized, having emerged from local, institutional, professional, and national conditions, but the effects of these conditions on SPs have not been investigated. We have studied the effects of this job development on SPs and their programs. The study was conducted using a qualitative research design, with semistructured individual in-depth interviews to understand the reactions, values, and perceptions that underlie and influence SP behavior. To cover SP perspectives from more than one SP program, a total of 15 SPs from 8 different nursing schools and medical schools in Switzerland were asked to participate. Standardized patients feel motivated, engaged, and willing to invest effort in their task and do not mind demands increasing as long as the social environment in SP programs is supportive. The role of the SP trainer and the use of feedback are considered very important. Standardized patient programs require concepts in which the SP perspective has been integrated to better serve SPs' well-being. Standardized patients are valuable partners in the training of health professionals; we need to take care of them.
International Nuclear Information System (INIS)
1978-11-01
The safety evaluation for the Westinghouse Standard Reactor includes information on general reactor characteristics; design criteria for systems and components; reactor coolant system; engineered safety systems; instrumentation and controls; electric power systems; auxiliary systems; steam and power conversion system; radioactive waste management; radiation protection; conduct of operations; accident analyses; and quality assurance
76 FR 50117 - Commission Rules and Forms Related to the FASB's Accounting Standards Codification
2011-08-12
.... generally accepted accounting principles (``U.S. GAAP''). Statement No. 168 became effective for financial... Codification'' is a registered trademark of the Financial Accounting Foundation. DATES: Effective Date: August... accounting principles established by a standard-setting body that meets specified criteria. On April 25, 2003...
1985-12-01
IRDS DATA ARCHITECTURE: This section presents an overview of the framework in which IRDS data is organized and presented to the user. The system-standard schema contains twelve entity-types that conceptually can be grouped into three categories: Data, Process, and External. [Ref. 32]
Mandana Bambenongama, Norbert; Losimba Likwela, Joris
2013-01-01
The infectious risk in the healthcare setting is potentially ubiquitous. Several infectious agents may be transmitted to healthcare professionals, most of which are carried by blood and body fluids. The aim of this study was to assess knowledge, attitudes and practices of healthcare workers in delivery rooms and operating theatres about standard precautions in healthcare settings, in order to deduce the actions to be implemented to improve their security. A descriptive cross-sectional survey was conducted in September 2011. A questionnaire was administered to 96 people using the direct interview technique. Only 20% of study subjects were familiar with the main bloodborne viruses (HBV, HCV and HIV). 67.8% of them considered that standard precautions must be applied only to women in labour and suspected HIV-positive patients. Almost all respondents (91.1%) had already experienced at least one accidental blood exposure (AES) during the last 12 months. Respondents appeared to have a poor knowledge of the recommended actions following an AES. Recapping of needles after care is a practice reported by 55.6% of respondents. Routine use of protective barriers is unsatisfactory. The frequent failure of healthcare workers in the city of Isiro to systematically apply standard precautions in healthcare settings should lead the Ministry of Health to implement a process designed to increase awareness about standard precautions and to improve the equipment necessary for strict compliance with these precautions.
The Standard-Relational Theory of 'Ought' and the Oughtistic Theory of Reasons
Evers, Daan
2011-01-01
The idea that normative statements implicitly refer to standards has been around for quite some time. It is usually defended by normative antirealists, who tend to be attracted to Humean theories of reasons. But this is an awkward combination: 'A ought to X' entails that there are reasons for A to
DEFF Research Database (Denmark)
Olesen, Bjarne W.; de Carli, Michele
2011-01-01
According to the Energy Performance of Buildings Directive (EPBD), all new European buildings (residential, commercial, industrial, etc.) must since 2006 have an energy declaration based on the calculated energy performance of the building, including heating, ventilating, cooling and lighting systems. This energy declaration must refer to the primary energy or CO2 emissions. The European Organization for Standardization (CEN) has prepared a series of standards for energy performance calculations for buildings and systems. This paper presents the related standards for heating systems. The relevant […] –20% of the building energy demand. The additional loss depends on the type of heat emitter, type of control, pump and boiler. Keywords: Heating systems; CEN standards; Energy performance; Calculation methods
International Nuclear Information System (INIS)
Picel, K.C.
1995-01-01
Projected volumes of contaminated media and debris at US Department of Energy (DOE) environmental restoration sites that are potentially subject to the hazardous waste provisions of the Resource Conservation and Recovery Act are needed to support programmatic planning. Such projections have been gathered in various surveys conducted under DOE's environmental restoration and waste management programs. It is expected that reducing uncertainty in the projections through review of existing site data and process knowledge and through further site characterization will result in substantially lowered projections. If promulgated, the US Environmental Protection Agency's Hazardous Waste Identification Rule would result in potentially even greater reductions in the projections when site conditions are reviewed under the provisions of the new rule. Reducing uncertainty in projections under current and future waste identification rules may be necessary to support effective remediation planning. Further characterization efforts that may be conducted should be designed to limit uncertainty in identifying volumes of wastes to the extent needed to support alternative selection and to minimize costs of remediation
International Nuclear Information System (INIS)
Andres, T.H.
2002-05-01
This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
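Of the propagation options the guide names (exact calculation, series approximation, Monte Carlo), the Monte Carlo route is the easiest to sketch. The toy model and the input distributions below are illustrative assumptions, not taken from the AECL guide:

```python
# Monte Carlo propagation of random input uncertainties through a
# computer model. The model here is a toy decay-like expression;
# inputs k and A are drawn from assumed normal distributions.
import math
import random
import statistics

def model(k: float, A: float, t: float) -> float:
    return A * math.exp(-k * t)

random.seed(0)
N = 100_000
samples = [model(random.gauss(0.10, 0.01),   # k: 0.10 +/- 0.01
                 random.gauss(50.0, 2.0),    # A: 50 +/- 2
                 5.0)                        # t held fixed
           for _ in range(N)]

mean = statistics.fmean(samples)
sd = statistics.stdev(samples)   # random uncertainty of the output
print(f"output = {mean:.2f} +/- {sd:.2f} (1 sigma)")
```

As the guide stresses, systematic components do not emerge from such sampling; they must be estimated separately and then combined with the random part.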
Lipscomb, Hester J; Li, Leiming; Dement, John
2003-08-01
Washington State enacted a change in their fall standard for the construction industry in 1991, preceding the Safety Standard for Fall Protection in the Construction Industry promulgated by Federal OSHA in 1994. We evaluated changes in the rate of falls from elevations and measures of severity among a large cohort of union carpenters after the fall standard change in Washington State, taking into account the temporal trends in their overall injury rates. There was a significant decrease in the rate of falls from height after the standard went into effect, even after adjusting for the overall decrease in work-related injuries among this cohort. Much of the decrease was immediate, likely representing the publicity surrounding fatal falls and subsequent promulgation of the standard. The greatest decrease was seen between 3 and 3.5 years after the standard went into effect. There was a significant reduction in mean paid lost days per event after the standard change and there was a significant reduction in mean cost per fall when adjusting for age and the temporal trend for costs among non-fall injuries. Through the use of observational methods we have demonstrated significant effects of the Washington State Vertical Fall Arrest Standard among carpenters in the absence of a control or comparison group. Without controlling for the temporal trend in overall injury rates, the rate of decline in falls appeared significantly greater, but the more pronounced, but delayed, decline was not seen. The analyses demonstrate potential error in failing to account for temporal patterns or assuming that a decline after an intervention is related to the intervention. Copyright 2003 Wiley-Liss, Inc.
Uncertainty and Cognitive Control
Directory of Open Access Journals (Sweden)
Faisal eMushtaq
2011-10-01
Full Text Available A growing trend of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.
Directory of Open Access Journals (Sweden)
Mawardi Bahri
2017-01-01
Full Text Available The continuous quaternion wavelet transform (CQWT) is a generalization of the classical continuous wavelet transform within the context of quaternion algebra. First of all, we show that the directional quaternion Fourier transform (QFT) uncertainty principle can be obtained using the component-wise QFT uncertainty principle. Based on this method, the directional QFT uncertainty principle using representation of polar coordinate form is easily derived. We derive a variation on uncertainty principle related to the QFT. We state that the CQWT of a quaternion function can be written in terms of the QFT and obtain a variation on uncertainty principle related to the CQWT. Finally, we apply the extended uncertainty principles and properties of the CQWT to establish logarithmic uncertainty principles related to generalized transform.
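For orientation, the classical scalar prototype that these quaternionic uncertainty principles generalize can be stated as follows (a standard fact quoted here as background, with the convention $\hat{f}(\xi)=\int_{\mathbb{R}} f(x)\,e^{-2\pi i x\xi}\,dx$ and $\|f\|_2 = 1$):

```latex
% Heisenberg-Weyl uncertainty principle for the classical Fourier
% transform; the QFT and CQWT versions in the paper extend this
% inequality to quaternion-valued signals.
\left( \int_{\mathbb{R}} x^2 \, |f(x)|^2 \, dx \right)
\left( \int_{\mathbb{R}} \xi^2 \, |\hat{f}(\xi)|^2 \, d\xi \right)
\;\ge\; \frac{1}{16\pi^2}
```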
Directory of Open Access Journals (Sweden)
Carlos Guillermo Carreno-Bodensiek
2016-12-01
Full Text Available This work presents the results of a research process applied to a sample of companies in the steel and metalworking sector in Boyacá, Colombia. Active workers are evaluated against the Occupational Competency Standards related to their daily activities. The study also aims to highlight the priority of training human talent for business in order to build up competitiveness, and it addresses the need to train and develop skills and competencies in the workforce, taking into account expert opinions on training and on development proposals for management. This research is consistent with global trends in education and with the requirements of standardized training, which is why the diagnoses and designs focus on the functions of the companies related to the Standards of Competency.
Do Orthopaedic Surgeons Acknowledge Uncertainty?
Teunis, Teun; Janssen, Stein; Guitton, Thierry G; Ring, David; Parisien, Robert
2016-06-01
of experience. Two hundred forty-two (34%) members completed the survey. We found no differences between responders and nonresponders. Each survey item measured its own trait better than any of the other traits. Recognition of uncertainty (0.70) and confidence bias (0.75) had relatively high Cronbach alpha levels, meaning that the questions making up these traits are closely related and probably measure the same construct. This was lower for statistical understanding (0.48) and trust in the orthopaedic evidence base (0.37). Subsequently, combining each trait's individual questions, we calculated a 0 to 10 score for each trait. The mean recognition of uncertainty score was 3.2 ± 1.4. Recognition of uncertainty in daily practice did not vary by years in practice (0-5 years, 3.2 ± 1.3; 6-10 years, 2.9 ± 1.3; 11-20 years, 3.2 ± 1.4; 21-30 years, 3.3 ± 1.6 years; p = 0.51), but overconfidence bias did correlate with years in practice (0-5 years, 6.2 ± 1.4; 6-10 years, 7.1 ± 1.3; 11-20 years, 7.4 ± 1.4; 21-30 years, 7.1 ± 1.2 years; p < 0.001). Accounting for a potential interaction of variables using multivariable analysis, less recognition of uncertainty was independently but weakly associated with working in a multispecialty group compared with academic practice (β regression coefficient, -0.53; 95% confidence interval [CI], -1.0 to -0.055; partial R(2), 0.021; p = 0.029), belief in God or any other deity/deities (β, -0.57; 95% CI, -1.0 to -0.11; partial R(2), 0.026; p = 0.015), greater confidence bias (β, -0.26; 95% CI, -0.37 to -0.14; partial R(2), 0.084; p < 0.001), and greater trust in the orthopaedic evidence base (β, -0.16; 95% CI, -0.26 to -0.058; partial R(2), 0.040; p = 0.002). Better statistical understanding was independently, and more strongly, associated with greater recognition of uncertainty (β, 0.25; 95% CI, 0.17-0.34; partial R(2), 0.13; p < 0.001). Our full model accounted for 29% of the variability in recognition of uncertainty (adjusted
Watanabe, Takao; Stoorvogel, Antonie Arij
2001-01-01
A relation between the finite zero structure of the plant and the standard $H_\\infty$ controller was studied. The mechanism was also investigated using the ARE-based $H_\\infty$ controller that is represented by a free parameter. It was observed that the mechanism of the controller reduction is
International Nuclear Information System (INIS)
Pan Ziqiang
1996-01-01
The author describes the necessity of revising the existing radiation protection standard and discusses the problems that need to be studied for the revision, mainly: (1) which exposures from natural radiation sources should be regarded as part of occupational exposure; (2) control of the occupational exposure of pregnant women; (3) chronic exposure and action levels; (4) control of potential exposure; (5) health surveillance
A relation between the Barbero-Immirzi parameter and the standard model
Energy Technology Data Exchange (ETDEWEB)
Broda, Boguslaw, E-mail: bobroda@uni.lodz.p [Department of Theoretical Physics, University of Lodz, Pomorska 149/153, PL-90-236 Lodz (Poland); Szanecki, Michal, E-mail: michalszanecki@wp.p [Department of Theoretical Physics, University of Lodz, Pomorska 149/153, PL-90-236 Lodz (Poland)
2010-06-07
It has been shown that the Sakharov-induced Barbero-Immirzi parameter γ, derived from the fields entering the standard model, assumes, in the framework of the Euclidean formalism, the UV-cutoff-independent value 1/9. The calculation uses Schwinger's proper-time formalism and the Seeley-DeWitt heat-kernel expansion, and it is akin to the derivation of the ABJ chiral anomaly in space-time with torsion.
The relation between respiratory motion artifact correction and lung standardized uptake value
International Nuclear Information System (INIS)
Yin Lijie; Liu Xiaojian; Liu Jie; Xu Rui; Yan Jue
2014-01-01
PET/CT is playing an important role in disease diagnosis and therapeutic evaluation, but respiratory motion artifacts can interfere with diagnosis and therapy. There are many methods to correct the respiratory motion artifact; respiratory-gated PET/CT is the most widely applied of them. Using respiratory-gated PET/CT to correct respiratory motion artifacts can markedly increase the measured maximum standardized uptake value of lung lesions, thereby improving image quality and diagnostic accuracy. (authors)
Statistical characterization of roughness uncertainty and impact on wind resource estimation
Directory of Open Access Journals (Sweden)
M. Kelly
2017-04-01
Full Text Available In this work we relate uncertainty in background roughness length (z0) to uncertainty in wind speeds, where the latter are predicted at a wind farm location based on wind statistics observed at a different site. Sensitivity of predicted winds to roughness is derived analytically for the industry-standard European Wind Atlas method, which is based on the geostrophic drag law. We statistically consider roughness and its corresponding uncertainty, in terms of both z0 derived from measured wind speeds as well as that chosen in practice by wind engineers. We show the combined effect of roughness uncertainty arising from differing wind-observation and turbine-prediction sites; this is done for the case of roughness bias as well as for the general case. For estimation of uncertainty in annual energy production (AEP), we also develop a generalized analytical turbine power curve, from which we derive a relation between mean wind speed and AEP. Following our developments, we provide guidance on approximate roughness uncertainty magnitudes to be expected in industry practice, and we also find that sites with larger background roughness incur relatively larger uncertainties.
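The mean-wind-speed-to-AEP relation at the end of the abstract can be sketched numerically. The Rayleigh (Weibull shape k = 2) wind distribution and the generic power-curve parameters below are illustrative assumptions, not the paper's generalized analytical power curve:

```python
# Sensitivity of annual energy production (AEP) to mean wind speed,
# using a Rayleigh (Weibull k = 2) wind distribution and a generic
# 2 MW power curve (cut-in 4 m/s, rated 12 m/s, cut-out 25 m/s).
import math

def power_kw(u: float) -> float:
    if u < 4.0 or u >= 25.0:
        return 0.0
    if u >= 12.0:
        return 2000.0
    return 2000.0 * (u**3 - 4.0**3) / (12.0**3 - 4.0**3)  # cubic ramp

def aep_mwh(mean_speed: float) -> float:
    c = mean_speed / math.gamma(1.5)  # Weibull scale for shape k = 2
    du = 0.01
    # Average power = integral of power(u) * Rayleigh pdf(u) du
    p_avg = sum(power_kw(i * du) * (2 * i * du / c**2)
                * math.exp(-(i * du / c)**2) * du
                for i in range(1, 3000))
    return p_avg * 8760.0 / 1000.0  # average kW over a year -> MWh/yr

base = aep_mwh(8.0)
pert = aep_mwh(8.0 * 1.03)  # +3% wind-speed error, e.g. from roughness
print(f"+3% mean wind -> +{100.0 * (pert / base - 1.0):.1f}% AEP")
```

The amplification of a wind-speed error into a larger relative AEP error is the mechanism that makes roughness uncertainty consequential for resource estimation.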
Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)
International Nuclear Information System (INIS)
BABA, T.; ISHIGURO, K.; ISHIHARA, Y.; SAWADA, A.; UMEKI, H.; WAKASUGI, K.; WEBB, ERIK K.
1999-01-01
Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference and many alternative cases representing various groups of FEPs were defined and individual numerical simulations performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters as well as a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment
An introductory guide to uncertainty analysis in environmental and health risk assessment
International Nuclear Information System (INIS)
Hoffman, F.O.; Hammonds, J.S.
1992-10-01
To compensate for the potential for overly conservative estimates of risk using standard US Environmental Protection Agency methods, an uncertainty analysis should be performed as an integral part of each risk assessment. Uncertainty analyses allow one to obtain quantitative results in the form of confidence intervals that will aid in decision making and will provide guidance for the acquisition of additional data. To perform an uncertainty analysis, one must frequently rely on subjective judgment in the absence of data to estimate the range and a probability distribution describing the extent of uncertainty about a true but unknown value for each parameter of interest. This information is formulated from professional judgment based on an extensive review of literature, analysis of the data, and interviews with experts. Various analytical and numerical techniques are available to allow statistical propagation of the uncertainty in the model parameters to a statement of uncertainty in the risk to a potentially exposed individual. Although analytical methods may be straightforward for relatively simple models, they rapidly become complicated for more involved risk assessments. Because of the tedious efforts required to mathematically derive analytical approaches to propagate uncertainty in complicated risk assessments, numerical methods such as Monte Carlo simulation should be employed. The primary objective of this report is to provide an introductory guide for performing uncertainty analysis in risk assessments being performed for Superfund sites
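The workflow described above (subjective probability distributions for parameters, Monte Carlo propagation, confidence intervals on risk) can be sketched as follows; the lognormal parameter choices and the multiplicative risk model are illustrative assumptions, not values from the report:

```python
# Monte Carlo propagation of parameter uncertainty to a confidence
# interval on individual risk. Risk is modeled multiplicatively as
# intake * slope factor, with assumed lognormal parameter distributions.
import random

random.seed(1)
N = 50_000
risks = sorted(
    random.lognormvariate(0.0, 0.5)      # intake, mg/(kg*day), assumed
    * random.lognormvariate(-6.9, 1.0)   # slope factor, per mg/(kg*day)
    for _ in range(N)
)
# Empirical median and 95% confidence interval from the sorted sample
lo, med, hi = (risks[int(q * N)] for q in (0.025, 0.5, 0.975))
print(f"risk ~ {med:.2e} (95% interval {lo:.2e} to {hi:.2e})")
```

The resulting interval, rather than a single conservative point estimate, is what the report recommends reporting to decision makers.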
Energy Technology Data Exchange (ETDEWEB)
Kim, Myung Hyeon; Shin, Myeong Won; Rhy, Seok Jin; Cho, Dong Keon; Park, Dong Hwan [Kyunghee Univ., Seoul (Korea, Republic of); Cheong, Beom Jin [Minstry of Science and Technology, Gwacheon (Korea, Republic of)
1998-03-15
The main objective is to develop technical standards for the operation of the interim spent fuel storage facility and to develop a draft of the technical criteria to be legislated. Another objective is to define an uncertainty evaluation system for burnup credit application in criticality analysis and to investigate the applicability of this topic for future regulatory activity. Investigate the state of the art of operational criteria for interim wet storage of spent fuel. Collect relevant laws, decrees, notices and standards related to the operation of the storage facility and study the legislation system. Develop a draft of the technical standards and criteria to be legislated. Define an evaluation system for the uncertainty analysis and study the state of the art in the field of criticality safety analysis. Develop an uncertainty evaluation system for criticality analysis with burnup credit and investigate the applicability as well as the benefits of this policy.
Directory of Open Access Journals (Sweden)
Marco Fedrizzi
2015-12-01
Full Text Available This paper describes the methods used in the monitoring carried out on the farms of the MO.NA.CO. project to calculate the economic competitiveness gap faced by agricultural holdings that accede to the commitments imposed by the standards included in the project. The monitoring was performed in agricultural holdings in relation to the particular reference condition of each standard. Processing of the information acquired allowed us to define the working times of each cultivation operation by means of the recommendations of the Associazione Italiana di Genio Rurale (Italian Rural Engineering Association), which follow the official methodology of the International Commission of the Organisation Scientifique du Travail en Agriculture (C.I.O.S.T.A.). The overall costs and revenues in case of compliance or non-compliance with the commitments of the standard were calculated using Biondi's methodology and other norms that indicate the technical and economic coefficients to be used in the calculations (ASAE standards EP 496.2 and D 497.4). With the data on the unit cost of ploughing, a Partial Least Squares (PLS) model was fitted and validated, making it possible to predict the unit cost of this agricultural operation. Finally, the values of the variation of the economic competitiveness gap are reported for each standard.
Uncertainty estimation of uranium determination in urine by fluorometry
International Nuclear Information System (INIS)
Shakhashiro, A.; Al-Khateeb, S.
2003-11-01
In this study an applicable mathematical model is proposed for estimating the uncertainty of uranium determination in urine by fluorometry. The study is based on the EURACHEM guide for uncertainty estimation. The model was tested on a sample containing 0.02 μg/ml uranium, for which the calculated uncertainty was 0.007 μg/ml. The sources of uncertainty were presented on a fish-bone (cause-and-effect) diagram, and the weight of each uncertainty parameter was shown in a histogram. Finally, it was found that the uncertainty estimated by the proposed model was 3 to 4 times larger than the usually reported standard deviation. (author)
Treatment and reporting of uncertainties for environmental radiation measurements
International Nuclear Information System (INIS)
Colle, R.
1980-01-01
Recommendations for a practical and uniform method of treating and reporting uncertainties in environmental radiation measurement data are presented. The method requires that each reported measurement result include the value, a total propagated random uncertainty expressed as the standard deviation, and a combined overall uncertainty. The uncertainty assessment should be as complete as possible and should include every conceivable or likely source of inaccuracy in the result. Guidelines are given for estimating random and systematic uncertainty components, and for propagating and combining them to form an overall uncertainty
Treatment of measurement uncertainties at the power burst facility
International Nuclear Information System (INIS)
Meyer, L.C.
1980-01-01
The treatment of measurement uncertainty at the Power Burst Facility provides a means of improving data integrity as well as meeting standard practice reporting requirements. This is accomplished by performing the uncertainty analysis in two parts, test independent uncertainty analysis and test dependent uncertainty analysis. The test independent uncertainty analysis is performed on instrumentation used repeatedly from one test to the next, and does not have to be repeated for each test except for improved or new types of instruments. A test dependent uncertainty analysis is performed on each test based on the test independent uncertainties modified as required by test specifications, experiment fixture design, and historical performance of instruments on similar tests. The methodology for performing uncertainty analysis based on the National Bureau of Standards method is reviewed with examples applied to nuclear instrumentation
Lacey, Ronald E; Faulkner, William Brock
2015-07-01
This work applied a propagation of uncertainty method to typical total suspended particulate (TSP) sampling apparatus in order to estimate the overall measurement uncertainty. The objectives of this study were to estimate the uncertainty for three TSP samplers, develop an uncertainty budget, and determine the sensitivity of the total uncertainty to environmental parameters. The samplers evaluated were the TAMU High Volume TSP Sampler at a nominal volumetric flow rate of 1.42 m³ min⁻¹ (50 CFM), the TAMU Low Volume TSP Sampler at a nominal volumetric flow rate of 17 L min⁻¹ (0.6 CFM) and the EPA TSP Sampler at the nominal volumetric flow rates of 1.1 and 1.7 m³ min⁻¹ (39 and 60 CFM). Under nominal operating conditions the overall measurement uncertainty was found to vary from 6.1×10⁻⁶ g m⁻³ to 18.0×10⁻⁶ g m⁻³, which represented an uncertainty of 1.7% to 5.2% of the measurement. Analysis of the uncertainty budget determined that three of the instrument parameters contributed significantly to the overall uncertainty: the uncertainty in the pressure drop measurement across the orifice meter during both calibration and testing and the uncertainty of the airflow standard used during calibration of the orifice meter. Five environmental parameters occurring during field measurements were considered for their effect on overall uncertainty: ambient TSP concentration, volumetric airflow rate, ambient temperature, ambient pressure, and ambient relative humidity. Of these, only ambient TSP concentration and volumetric airflow rate were found to have a strong effect on the overall uncertainty. The technique described in this paper can be applied to other measurement systems and is especially useful where there are no methods available to generate these values empirically. This work addresses measurement uncertainty of TSP samplers used in ambient conditions. Estimation of uncertainty in gravimetric measurements is of particular interest, since as ambient particulate
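The propagation method applied here is, in essence, the first-order (GUM-style) approach. A minimal sketch of that technique, using a simplified gravimetric concentration model C = Δm/(Q·t) with numerical partial derivatives; the input values and uncertainties below are illustrative assumptions, not the study's figures:

```python
import math

def propagate(f, x, u):
    """First-order (GUM) propagation: u_c^2 = sum_i (df/dx_i * u_i)^2.
    Partial derivatives are approximated by central differences."""
    uc2 = 0.0
    for i, (xi, ui) in enumerate(zip(x, u)):
        h = 1e-6 * (abs(xi) or 1.0)
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        dfdx = (f(xp) - f(xm)) / (2 * h)
        uc2 += (dfdx * ui) ** 2
    return math.sqrt(uc2)

# TSP concentration C = dm / (Q * t): filter mass gain over sampled air volume
conc = lambda v: v[0] / (v[1] * v[2])     # g m^-3
x = [0.010, 1.42, 1440.0]                 # dm = 10 mg, Q = 1.42 m3/min, t = 24 h in min
u = [0.0002, 0.02, 1.0]                   # standard uncertainties (illustrative)
uc = propagate(conc, x, u)
print(uc, uc / conc(x))                   # absolute and relative combined uncertainty
```

With these made-up inputs the relative combined uncertainty comes out at a few percent, the same order as the 1.7% to 5.2% range the abstract reports.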
Directory of Open Access Journals (Sweden)
Denise L Demmer
Full Text Available Assessment of adiposity using dual energy x-ray absorptiometry (DXA) has been considered more advantageous than anthropometry for predicting cardio-metabolic risk in the older population, by virtue of its ability to distinguish total and regional fat. Nonetheless, there is increasing uncertainty regarding the relative superiority of DXA, and little comparative data exist in young adults. This study aimed to identify which measure of adiposity, determined by either DXA or anthropometry, is optimal within a range of cardio-metabolic risk factors in young adults. 1138 adults aged 20 years were assessed by DXA and standard anthropometry from the Western Australian Pregnancy Cohort (Raine) Study. Cross-sectional linear regression analyses were performed. Waist-to-height ratio was superior to any DXA measure in relation to HDL-C. BMI was superior to any DXA measure in relation to blood pressure. Midriff fat mass (DXA) and waist circumference were comparable in relation to glucose. For all the other cardio-metabolic variables, anthropometric and DXA measures were comparable. DXA midriff fat mass, compared with BMI or waist-to-hip ratio, was the superior measure for triglycerides, insulin and HOMA-IR. Although midriff fat mass (measured by DXA) was the superior measure for insulin sensitivity and triglycerides, the anthropometric measures were better than or equal to the various DXA measures for the majority of the cardio-metabolic risk factors. Our findings suggest that clinical anthropometry is generally as useful as DXA in the evaluation of individual cardio-metabolic risk factors in young adults.
International Nuclear Information System (INIS)
Okrent, D.
1993-01-01
This paper asks whether some of the fundamental bases for the 1985 USEPA standard on disposal of high level radioactive wastes (40 CFR Part 191) warrant re-examination. Similar questions also apply to the bases for the radioactive waste disposal requirements proposed by most other countries. It is suggested that the issue of intergenerational equity has been dealt with from too narrow a perspective. Not only should radioactive and nonradioactive hazardous waste disposal be regulated from a consistent philosophic basis, but the regulation of waste disposal itself should be embedded in the broader issues of intergenerational conservation of options, conservation of quality, and conservation of access. (author). 25 refs
International Nuclear Information System (INIS)
Davis, C.B.
1987-08-01
The uncertainties of calculations of loss-of-feedwater transients at Davis-Besse Unit 1 were determined to address concerns of the US Nuclear Regulatory Commission relative to the effectiveness of feed and bleed cooling. Davis-Besse Unit 1 is a pressurized water reactor of the raised-loop Babcock and Wilcox design. A detailed, quality-assured RELAP5/MOD2 model of Davis-Besse was developed at the Idaho National Engineering Laboratory. The model was used to perform an analysis of the loss-of-feedwater transient that occurred at Davis-Besse on June 9, 1985. A loss-of-feedwater transient followed by feed and bleed cooling was also calculated. The evaluation of uncertainty was based on the comparisons of calculations and data, comparisons of different calculations of the same transient, sensitivity calculations, and the propagation of the estimated uncertainty in initial and boundary conditions to the final calculated results
Academic Public Relations Curricula: How They Compare with the Bateman-Cutlip Commission Standards.
McCartney, Hunter P.
To see what effect the 1975 Bateman-Cutlip Commission's recommendations have had on improving public relations education in the United States, 173 questionnaires were sent to colleges or universities with accredited or comprehensive programs in public relations. Responding to five basic assumptions underlying the commission's recommendations,…
Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Lozano-Rubí, Raimundo; Serrano-Balazote, Pablo; Castro, Antonio L; Moreno, Oscar; Pascual, Mario
2017-08-18
The objective of this research is to compare relational and non-relational (NoSQL) database systems for storing, recovering, querying and persisting standardized medical information in the form of ISO/EN 13606 normalized Electronic Health Record XML extracts, both in isolation and concurrently. NoSQL database systems have recently attracted much attention, but few studies in the literature address their direct comparison with relational databases when applied to build the persistence layer of a standardized medical information system. One relational and two NoSQL databases (one document-based and one native XML database) of three different sizes were created in order to evaluate and compare the response times (algorithmic complexity) of six queries of growing complexity, which were performed on them. Similar appropriate results available in the literature have also been considered. Both relational and non-relational NoSQL database systems show almost linear growth in query execution time, but with very different slopes, that of the relational system being much steeper than those of the two NoSQL systems. Document-based NoSQL databases perform better in concurrency than in isolation, and also better than relational databases in concurrency. Non-relational NoSQL databases seem to be more appropriate than standard relational SQL databases when the database size is extremely high (secondary use, research applications). Document-based NoSQL databases generally perform better than native XML NoSQL databases. Visualization and editing of EHR extracts are also document-oriented tasks better suited to NoSQL database systems. However, the appropriate database solution depends very much on each particular situation and specific problem.
Integrity, standards, and QC-related issues with big data in pre-clinical drug discovery.
Brothers, John F; Ung, Matthew; Escalante-Chong, Renan; Ross, Jermaine; Zhang, Jenny; Cha, Yoonjeong; Lysaght, Andrew; Funt, Jason; Kusko, Rebecca
2018-06-01
The tremendous expansion of data analytics and public and private big datasets presents an important opportunity for pre-clinical drug discovery and development. In the field of life sciences, the growth of genetic, genomic, transcriptomic and proteomic data is partly driven by a rapid decline in experimental costs as biotechnology improves throughput, scalability, and speed. Yet far too many researchers tend to underestimate the challenges and consequences involving data integrity and quality standards. Given the effect of data integrity on scientific interpretation, these issues have significant implications during preclinical drug development. We describe standardized approaches for maximizing the utility of publicly available or privately generated biological data and address some of the common pitfalls. We also discuss the increasing interest to integrate and interpret cross-platform data. Principles outlined here should serve as a useful broad guide for existing analytical practices and pipelines and as a tool for developing additional insights into therapeutics using big data. Copyright © 2018 Elsevier Inc. All rights reserved.
International Nuclear Information System (INIS)
Hernandez Caceres, Jose Luis; Hong, Rolando; Martinez Ortiz, Carlos; Sautie Castellanos, Miguel; Valdes, Kiria; Guevara Erra, Ramon
2004-10-01
Under the assumption of uniform point-mutation pressure on the DNA strand, rates for transitions from one amino acid into another were assessed. Nearly 25% of all mutations were silent. About 48% of the mutations changed a given amino acid either into the same amino acid or into an amino acid of the same class. These results suggest a great stability of the Standard Genetic Code with respect to mutation load. Concepts from chemical equilibrium theory are applicable to this case provided that mutation rate constants are given. It was found that unequal synonymous codon usage may lead to changes in the equilibrium concentrations. Data from real biological species showed that several amino acids are close to their respective equilibrium concentrations. However, in all cases the concentration of leucine was nearly double its equilibrium concentration, whereas that of the stop command (Term) was about 10 times lower. The overall distance from equilibrium for a set of species suggests that eukaryotes are closer to equilibrium than prokaryotes, and the HIV virus was the closest to equilibrium among the 15 species studied. We found that contemporary species are closer to equilibrium than the Last Universal Common Ancestor (LUCA) was. Similarly, non-preserved regions in proteins are closer to equilibrium than the preserved ones. We suggest that this approach can be useful for exploring some aspects of biological evolution in the framework of the Standard Genetic Code properties. (author)
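The "nearly 25% silent" figure can be checked directly by enumerating all single-nucleotide substitutions over the Standard Genetic Code. A small sketch (the table is the standard NCBI translation table; counting stop-to-stop changes as silent is an assumption of this sketch):

```python
# Standard genetic code laid out in TCAG order ('*' = stop)
BASES = "TCAG"
AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODE = {a + b + c: AA[16 * i + 4 * j + k]
        for i, a in enumerate(BASES)
        for j, b in enumerate(BASES)
        for k, c in enumerate(BASES)}

silent = total = 0
for codon, aa in CODE.items():
    for pos in range(3):                      # each of the 3 codon positions
        for base in BASES:                    # each alternative base
            if base == codon[pos]:
                continue
            mutant = codon[:pos] + base + codon[pos + 1:]
            total += 1
            silent += (CODE[mutant] == aa)    # same amino acid (or stop) = silent

print(silent, total, silent / total)          # fraction close to 1/4
```

Under uniform mutation pressure, every one of the 64 × 9 = 576 single-point substitutions is equally likely, so the silent fraction is just this count.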
DEFF Research Database (Denmark)
Winkler, Ingo
2008-01-01
The article presents the results of an explorative study that aimed at exploring work related issues in students’ perceptions of their job as atypical employees. An individual picture of the experienced work reality of students is drawn according to work task, flexible working hours, instructions...... and training opportunities, students’ relations to other employees, and social integration. By adopting a qualitative design, I was able to emphasize the subjective perspective of students describing their very own experiences as flexible workers. The study revealed various perceptions of students working...... as flexible employees and related this picture to current empirical and theoretical research in the field of non-standard employment....
Addressing uncertainties in the ERICA Integrated Approach
International Nuclear Information System (INIS)
Oughton, D.H.; Agueero, A.; Avila, R.; Brown, J.E.; Copplestone, D.; Gilek, M.
2008-01-01
Like any complex environmental problem, ecological risk assessment of the impacts of ionising radiation is confounded by uncertainty. At all stages, from problem formulation through to risk characterisation, the assessment is dependent on models, scenarios, assumptions and extrapolations. These include technical uncertainties related to the data used, conceptual uncertainties associated with models and scenarios, as well as social uncertainties such as economic impacts, the interpretation of legislation, and the acceptability of the assessment results to stakeholders. The ERICA Integrated Approach has been developed to allow an assessment of the risks of ionising radiation, and includes a number of methods that are intended to make the uncertainties and assumptions inherent in the assessment more transparent to users and stakeholders. Throughout its development, ERICA has recommended that assessors deal openly with the deeper dimensions of uncertainty and acknowledge that uncertainty is intrinsic to complex systems. Since the tool is based on a tiered approach, the approaches to dealing with uncertainty vary between the tiers, ranging from a simple but highly conservative screening to a full probabilistic risk assessment including sensitivity analysis. This paper gives an overview of the types of uncertainty that are manifest in ecological risk assessment and of the ERICA Integrated Approach to dealing with some of these uncertainties
Experimental uncertainty estimation and statistics for data having interval uncertainty.
Energy Technology Data Exchange (ETDEWEB)
Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)
2007-05-01
This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
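A minimal sketch of the kind of interval descriptive statistics the report reviews, under simplifying assumptions: population variance, maximum variance found by corner enumeration (valid because the variance is convex in each observation, so its maximum over the box of intervals is attained at a vertex), and made-up interval data.

```python
from itertools import product
from statistics import mean, pvariance

# Each measurement is known only to lie within an interval (lo, hi)
data = [(3.5, 4.1), (4.0, 4.0), (2.8, 3.6), (3.9, 4.7)]

# Bounds on the mean: the mean is monotone in each endpoint
mean_lo = mean(lo for lo, hi in data)
mean_hi = mean(hi for lo, hi in data)

# Upper bound on the (population) variance: enumerate the 2^n corners
var_hi = max(pvariance(point) for point in product(*data))

print((mean_lo, mean_hi), var_hi)
```

The lower bound on the variance is harder (its minimum can lie in the interior of the box), which is one instance of the computability questions the report summarizes.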
Inflation and Inflation Uncertainty Revisited: Evidence from Egypt
Directory of Open Access Journals (Sweden)
Mesbah Fathy Sharaf
2015-07-01
Full Text Available The welfare costs of inflation and inflation uncertainty are well documented in the literature, yet empirical evidence on the link between the two is sparse in the case of Egypt. This paper investigates the causal relationship between inflation and inflation uncertainty in Egypt using monthly time series data for the period January 1974–April 2015. To endogenously control for any potential structural breaks in the inflation time series, the Zivot and Andrews (2002) and Clemente–Montanes–Reyes (1998) unit root tests are used. The inflation–inflation uncertainty relation is modeled by the standard two-step approach as well as simultaneously using various versions of the GARCH-M model to control for any potential feedback effects. The analyses explicitly control for the effect of the Economic Reform and Structural Adjustment Program (ERSAP) undertaken by the Egyptian government in the early 1990s, which affected the inflation rate and its associated volatility. Results show a high degree of inflation–volatility persistence in the response to inflationary shocks. A Granger-causality test along with symmetric and asymmetric GARCH-M models indicates a statistically significant bi-directional positive relationship between inflation and inflation uncertainty, supporting both the Friedman–Ball and the Cukierman–Meltzer hypotheses. The findings are robust to the various estimation methods and model specifications. They support the adoption of an inflation-targeting policy in Egypt, after fulfilling its preconditions, to reduce the welfare cost of inflation and its related uncertainties. Monetary authorities in Egypt should enhance the credibility of monetary policy and attempt to reduce inflation uncertainty, which will help lower inflation rates.
Model uncertainty and probability
International Nuclear Information System (INIS)
Parry, G.W.
1994-01-01
This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example
Uncertainty in artificial intelligence
Kanal, LN
1986-01-01
How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy. Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.
Uncertainties in hydrogen combustion
International Nuclear Information System (INIS)
Stamps, D.W.; Wong, C.C.; Nelson, L.S.
1988-01-01
Three important areas of hydrogen combustion with uncertainties are identified: high-temperature combustion, flame acceleration and deflagration-to-detonation transition, and aerosol resuspension during hydrogen combustion. The uncertainties associated with high-temperature combustion may affect at least three different accident scenarios: the in-cavity oxidation of combustible gases produced by core-concrete interactions, the direct containment heating hydrogen problem, and the possibility of local detonations. How these uncertainties may affect the sequence of various accident scenarios is discussed and recommendations are made to reduce these uncertainties. 40 references
Model Uncertainty for Bilinear Hysteretic Systems
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Thoft-Christensen, Palle
1984-01-01
. The statistical uncertainty due to lack of information can, e.g., be taken into account by describing the variables by predictive density functions, Veneziano [2]. In general, model uncertainty is the uncertainty connected with the mathematical modelling of the physical reality. When structural reliability analysis...... is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space, then model uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used. A more precise definition is given in section 2...
Phenomenon of Uncertainty as a Subjective Experience
Directory of Open Access Journals (Sweden)
Lifintseva A.A.
2018-04-01
Full Text Available The phenomenon of uncertainty in illness among patients is discussed and analyzed in this article. Uncertainty in illness is a condition that accompanies the patient from the moment the first somatic symptoms of the disease appear and can be strengthened or weakened by many psychosocial factors. The level of uncertainty is related to the level of stress, emotional disadaptation, affective states, coping strategies, mechanisms of psychological defense, etc. Uncertainty can perform destructive functions, acting as a trigger for stressful conditions and launching negative emotional experiences. As a positive function of uncertainty, one can note the possibility of a positive reinterpretation of the patient's disease. In addition, the state of uncertainty allows the patient to activate resources for coping with the disease, among which the leading role belongs to social support.
Visual Semiotics & Uncertainty Visualization: An Empirical Study.
MacEachren, A M; Roth, R E; O'Brien, J; Li, B; Swingley, D; Gahegan, M
2012-12-01
This paper presents two linked empirical studies focused on uncertainty visualization. The experiments are framed from two conceptual perspectives. First, a typology of uncertainty is used to delineate kinds of uncertainty matched with space, time, and attribute components of data. Second, concepts from visual semiotics are applied to characterize the kind of visual signification that is appropriate for representing those different categories of uncertainty. This framework guided the two experiments reported here. The first addresses representation intuitiveness, considering both visual variables and iconicity of representation. The second addresses relative performance of the most intuitive abstract and iconic representations of uncertainty on a map reading task. Combined results suggest initial guidelines for representing uncertainty and discussion focuses on practical applicability of results.
Technical issues related to NUREG 0800, Chapter 18: Human Factors Engineering/Standard Review Plan
International Nuclear Information System (INIS)
Savage, J.W.
1982-01-01
The revision of Chapter 18 of NUREG 0800, Human Factors Engineering Standard Review Plan (SRP) will be based on SECY 82-111 and guidance contained in NUREG 0700, NUREG 0801 and NUREG 0835, plus other references. In conducting field reviews of control rooms, the NRC has identified technical issues which can be used to enhance the development of the revised version of NUREG 0800, and to establish priorities among the list of possible Branch Technical Positions (BTP) in NUREG 0800, Rev. 0, Table 18.0-2. This report is a compilation of comments and suggestions from the people who used NUREG 0700 in the Control Room field reviews. This information was used to establish possible BTP topic priorities so that the most important BTPs could be issued first. The comments and suggestions are included for HFEB review in conjunction with the table of priorities
Energy Technology Data Exchange (ETDEWEB)
Platts-Mills, T.A.E.; Chapman, M.D.; Pollart, S.M.; Heymann, P.W.; Luczynska, C.M. (Univ. of Virginia, Charlottesville (United States))
1990-01-01
There is no doubt that a large number of individuals become allergic to foreign proteins that are predominantly or exclusively present indoors. In each case this immune response can be demonstrated either by immediate skin test responses or by measuring serum IgE antibodies. It has also been obvious for some time that patients presenting with asthma, perennial rhinitis and atopic dermatitis have an increased prevalence of IgE antibodies to these indoor allergens. More recently several epidemiological surveys have suggested that both mite exposure and IgE antibodies are important risk factors for asthma. The present situation is that assays have been developed capable of measuring the presence of mite, cockroach and cat allergens in house dust. Further clinical studies will be necessary to test the proposed standards for mite allergens and to define risk levels for other allergens.
International Nuclear Information System (INIS)
Stauffer, J.R.; Cherry, D.S.; Dickson, K.L.; Cairns, J. Jr.
1975-01-01
Temperature preferences for important fish species in the New River in the vicinity of Appalachian Power Company's Glen Lyn, Virginia plant were determined independently by both field and laboratory studies. A relationship was demonstrated between the temperature preference data generated by the two approaches. Based on the temperature preference data the responses of fish to the thermal discharges can be predicted. From these data and from other data on the fish community structure, it was possible to determine that the thermal discharge was causing no appreciable harm to the fish community. Based on these studies it was concluded that the most reasonable approach to establishing thermal standards is to couple temperature preference studies with site specific studies. (U.S.)
A Web tool for calculating k0-NAA uncertainties
International Nuclear Information System (INIS)
Younes, N.; Robouch, P.
2003-01-01
The calculation of uncertainty budgets is becoming a standard step in reporting analytical results. This gives rise to the need for simple, easily accessed tools to calculate uncertainty budgets. An example of such a tool is the Excel spreadsheet approach of Robouch et al. An internet application which calculates uncertainty budgets for k0-NAA is presented. The Web application has built-in 'literature' values for standard isotopes and accepts as inputs fixed information, such as the thermal-to-epithermal neutron flux ratio, as well as experiment-specific data, such as the mass of the sample. The application calculates and displays intermediate uncertainties as well as the final combined uncertainty of the element concentration in the sample. The interface only requires access to a standard browser and is thus easily accessible to researchers and laboratories. This may facilitate and standardize the calculation of k0-NAA uncertainty budgets. (author)
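A budget combination of the kind such a tool performs can be sketched as a quadrature sum of relative standard uncertainties. All component names and values below are hypothetical placeholders, not the Web application's actual inputs:

```python
import math

# Illustrative relative standard uncertainties for a k0-NAA result
budget = {
    "sample mass":             0.001,
    "net peak area (sample)":  0.012,
    "net peak area (monitor)": 0.008,
    "k0 factor":               0.007,
    "flux ratio f":            0.015,
    "detector efficiency":     0.020,
}

# Combined relative uncertainty: root-sum-of-squares of the components
combined = math.sqrt(sum(u ** 2 for u in budget.values()))
for name, u in sorted(budget.items(), key=lambda kv: -kv[1]):
    print(f"{name:24s} {100 * u:5.2f}%  ({(u / combined) ** 2:5.1%} of variance)")
print(f"combined (k=1): {100 * combined:.2f}%")
```

Displaying each component's share of the variance, as above, is what makes a budget useful: it shows at a glance which inputs dominate the final uncertainty.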
Fuzzy Uncertainty Evaluation for Fault Tree Analysis
Energy Technology Data Exchange (ETDEWEB)
Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)
2015-05-15
This traditional probabilistic approach can calculate relatively accurate results. However, it requires a long time because of the repetitive computation involved in the MC method. In addition, when informative data for statistical analysis are not sufficient, or when some events are mainly caused by human error, the probabilistic approach may not be possible, because the uncertainties of these events are difficult to express as probability distributions. In order to reduce the computation time, and to quantify the uncertainties of top events when there are basic events whose uncertainties are difficult to express as probability distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident following a large loss-of-coolant accident (LLOCA). The code is implemented and tested on the fault tree of a radiation release accident. We apply it to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results obtained by fuzzy uncertainty propagation can be calculated in a relatively short time and cover the results obtained by probabilistic uncertainty propagation.
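The alpha-cut form of fuzzy uncertainty propagation can be sketched for a toy fault tree (two basic events under an AND gate, OR-ed with a third). Because the AND/OR gate probability formulas are monotone increasing in each input, the alpha-cut intervals propagate exactly by evaluating at the interval endpoints. The event values are illustrative, not from the paper's fault trees:

```python
from math import prod

def cut(tfn, a):
    """Alpha-cut [lo, hi] of a triangular fuzzy probability (l, m, r)."""
    l, m, r = tfn
    return (l + a * (m - l), r - a * (r - m))

def and_gate(*cuts):
    return (prod(lo for lo, hi in cuts), prod(hi for lo, hi in cuts))

def or_gate(*cuts):
    return (1 - prod(1 - lo for lo, hi in cuts),
            1 - prod(1 - hi for lo, hi in cuts))

# Hypothetical basic-event fuzzy probabilities (left, mode, right)
e1, e2, e3 = (0.01, 0.02, 0.04), (0.05, 0.10, 0.20), (0.001, 0.002, 0.005)

for a in (0.0, 0.5, 1.0):
    top = or_gate(and_gate(cut(e1, a), cut(e2, a)), cut(e3, a))
    print(f"alpha={a}: top event in [{top[0]:.5f}, {top[1]:.5f}]")
```

At alpha = 1 the cut collapses to the modal (point) value, and the nested intervals at lower alpha levels give the fuzzy spread of the top-event probability without any Monte Carlo sampling.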
Hallett, Chris S.; Hall, Norm G.
2012-06-01
We describe a method for modelling the relative effects of seine net biases and for deriving equivalence factors to standardize fish abundance data sets collected using multiple sampling gears. Nearshore fish communities were sampled from 10 sites in each of the basin and riverine portions of the Swan-Canning Estuary, Western Australia, using beach seine nets of three different lengths (21.5, 41.5 and 133 m). The resulting data were subjected to generalized linear modelling to derive equivalence factors relating catches from the two larger net types to those from the 21.5 m net. Equivalence factors were derived on the basis of functional habitat guilds of fish (small benthic, small pelagic, demersal, pelagic). Prior to standardization, catches from the 41.5 and 133 m nets consistently underestimated fish densities relative to those from the 21.5 m net. Following standardization, the degree to which fish densities were underestimated by the two larger nets was reduced and/or eliminated for most guilds, and particularly in the case of the 133 m net. For both of the larger nets, standardized estimates of total fish density across all species were far closer to those recorded using the 21.5 m seine, thus indicating that standardization of the fish abundance data had greatly reduced the overall effects of the biases introduced by the different net types. This approach could be applied to other systems and sampling methods, to facilitate more robust comparisons of fish abundances between studies with divergent sampling methodologies.
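A minimal sketch of deriving an equivalence factor from paired catches, assuming a simple multiplicative catchability model (a log-ratio estimator rather than the full generalized linear modelling used in the study; all numbers are invented):

```python
from math import exp, log
from statistics import mean

# Hypothetical paired densities (fish per 100 m^2) from sites sampled with
# both a 21.5 m reference seine and a 133 m seine
ref   = [12.0, 8.5, 20.1, 15.3, 9.8]
large = [ 7.1, 5.0, 11.8,  9.4, 5.6]

# Multiplicative model E[large_i] = q * ref_i, so log(large/ref) estimates log q
log_q = mean(log(l / r) for l, r in zip(large, ref))
equiv = exp(-log_q)          # multiply large-net catches by this factor

standardized = [round(l * equiv, 1) for l in large]
print(f"equivalence factor = {equiv:.2f}")
print(standardized)          # now comparable to the reference-net densities
```

In the study itself, equivalence factors were estimated per functional habitat guild with generalized linear models, which also handle the error structure of count data; the log-ratio above only conveys the standardization idea.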
Uncertainty analysis for secondary energy distributions
International Nuclear Information System (INIS)
Gerstl, S.A.W.
1978-01-01
In many transport calculations the integral design parameter of interest (response) is determined mainly by secondary particles such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows one to calculate the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distribution of the generated secondary particles. For a 30-group standard cross-section set, 84% of all non-zero table positions pertain to the description of secondary energy distributions (SED's) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SED's is incomplete and neglects most of the input data. This paper describes how sensitivity profiles for SED's are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SED's. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SED's are presently nonexistent. Therefore, methods will be described that allow rough error estimates due to estimated SED uncertainties based on integral SED sensitivities