WorldWideScience

Sample records for generalised uncertainty principle

  1. Expanding Uncertainty Principle to Certainty-Uncertainty Principles with Neutrosophy and Quad-stage Method

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-03-01

    Full Text Available The most famous contribution of Heisenberg is the uncertainty principle, but the original uncertainty principle is improper. Considering all possible situations (including the case that people can create laws) and applying Neutrosophy and the Quad-stage Method, this paper presents "certainty-uncertainty principles" in a general form and a variable-dimension fractal form. According to the classification of Neutrosophy, the "certainty-uncertainty principles" can be divided into three principles holding under different conditions: the "certainty principle", namely that a particle's position and momentum can be known simultaneously; the "uncertainty principle", namely that a particle's position and momentum cannot be known simultaneously; and the neutral (fuzzy) "indeterminacy principle", namely that whether or not a particle's position and momentum can be known simultaneously is undetermined. Special cases of the "certainty-uncertainty principles" include the original uncertainty principle and the Ozawa inequality. In addition, according to the original uncertainty principle, discussing a high-speed particle's speed and track with Newtonian mechanics is unreasonable; but according to the "certainty-uncertainty principles", Newtonian mechanics can be used to discuss the gravitational deflection of a photon orbit around the Sun (giving the same deflection angle as general relativity). Finally, because principles and laws that disregard the principle (law) of conservation of energy may be invalid, the "certainty-uncertainty principles" should be restricted (constrained) by the principle (law) of conservation of energy, so that they satisfy it.

  2. A Variation on Uncertainty Principle and Logarithmic Uncertainty Principle for Continuous Quaternion Wavelet Transforms

    Directory of Open Access Journals (Sweden)

    Mawardi Bahri

    2017-01-01

    Full Text Available The continuous quaternion wavelet transform (CQWT is a generalization of the classical continuous wavelet transform within the context of quaternion algebra. First of all, we show that the directional quaternion Fourier transform (QFT uncertainty principle can be obtained using the component-wise QFT uncertainty principle. Based on this method, the directional QFT uncertainty principle using representation of polar coordinate form is easily derived. We derive a variation on uncertainty principle related to the QFT. We state that the CQWT of a quaternion function can be written in terms of the QFT and obtain a variation on uncertainty principle related to the CQWT. Finally, we apply the extended uncertainty principles and properties of the CQWT to establish logarithmic uncertainty principles related to generalized transform.

  3. Heisenberg's principle of uncertainty and the uncertainty relations

    International Nuclear Information System (INIS)

    Redei, Miklos

    1987-01-01

    The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous and different interpretations are used in the literature. Recently a renewed interest has appeared to reinterpret and reformulate the precise meaning of Heisenberg's principle and to find adequate mathematical form. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs

  4. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)
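    A concrete instance of the entropic uncertainty relations discussed in this record is the Maassen-Uffink bound H(X) + H(Z) ≥ log2(1/c), where c is the largest squared overlap between the two measurement bases (c = 1/2 for the qubit Z and X bases). A minimal numerical sketch, with an illustrative qubit state not taken from the paper:

```python
import numpy as np

def shannon(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

# Qubit state |psi> = cos(t)|0> + sin(t)|1>, measured in the Z and X bases
t = 0.3
psi = np.array([np.cos(t), np.sin(t)])

pZ = np.abs(psi) ** 2                         # Z-basis outcome probabilities
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard maps Z basis to X basis
pX = np.abs(H @ psi) ** 2                     # X-basis outcome probabilities

# Maassen-Uffink: H(Z) + H(X) >= log2(1/c) = 1 bit for these two bases
lhs = shannon(pZ) + shannon(pX)
print(lhs, ">= 1:", lhs >= 1 - 1e-9)
```

Sweeping `t` shows the sum of entropies never drops below one bit, although each entropy individually can vanish.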

  5. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    Science.gov (United States)

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are all based on either a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality with either Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, coincide with the conclusions from Heisenberg's uncertainty principle.

  6. Gamma-Ray Telescope and Uncertainty Principle

    Science.gov (United States)

    Shivalingaswamy, T.; Kagali, B. A.

    2012-01-01

    Heisenberg's Uncertainty Principle is one of the important basic principles of quantum mechanics. In most of the books on quantum mechanics, this uncertainty principle is generally illustrated with the help of a gamma ray microscope, wherein neither the image formation criterion nor the lens properties are taken into account. Thus a better…

  7. On the uncertainty principle. V

    International Nuclear Information System (INIS)

    Halpern, O.

    1976-01-01

    The treatment of ideal experiments connected with the uncertainty principle is continued. The author analyzes successively measurements of momentum and position, and discusses the common reason why the results in all cases differ from the conventional ones. A similar difference exists for the measurement of field strengths. The interpretation given by Weizsaecker, who tried to interpret Bohr's complementarity principle by introducing a multi-valued logic, is analyzed. The treatment of the uncertainty principle ΔE Δt is deferred to a later paper, as is the interpretation of the method of variation of constants. Every ideal experiment discussed shows various lower limits for the value of the uncertainty product; these limits depend on the experimental arrangement and are always (considerably) larger than h. (Auth.)

  8. Quantum Action Principle with Generalized Uncertainty Principle

    OpenAIRE

    Gu, Jie

    2013-01-01

    One of the common features in all promising candidates of quantum gravity is the existence of a minimal length scale, which naturally emerges with a generalized uncertainty principle, or equivalently a modified commutation relation. Schwinger's quantum action principle was modified to incorporate this modification, and was applied to the calculation of the kernel of a free particle, partly recovering the result previously studied using path integral.

  9. Limited entropic uncertainty as new principle of quantum physics

    International Nuclear Information System (INIS)

    Ion, D.B.; Ion, M.L.

    2001-01-01

    The Uncertainty Principle (UP) of quantum mechanics discovered by Heisenberg, which constitutes a corner-stone of quantum physics, asserts that there is an irreducible lower bound on the uncertainty in the result of a simultaneous measurement of non-commuting observables. In order to avoid the state-dependence of this standard quantitative formulation of the Heisenberg uncertainty principle, many authors have proposed to use the information entropy as a measure of the uncertainty instead. In this paper the Principle of Limited Entropic Uncertainty (LEU-Principle) is proved as a new principle in quantum physics. Consistent experimental tests of the LEU-Principle, obtained by using the available 49 sets of pion-nucleus phase shifts, are then presented for both the extensive (q=1) and nonextensive (q=0.5 and q=2.0) cases. Some results obtained by applying the LEU-Principle to diffraction phenomena are also discussed. The main results and conclusions of the paper can be summarized as follows: (i) a new principle is introduced in quantum physics, namely the Principle of Limited Entropic Uncertainty (LEU-Principle). This new principle includes in a more general and exact form not only the old Heisenberg uncertainty principle but also introduces an upper limit on the magnitude of the uncertainty in quantum physics. The LEU-Principle asserts that 'there is an irreducible lower bound as well as an upper bound on the uncertainty in the result of a simultaneous measurement of non-commuting observables for any extensive and nonextensive (q ≥ 0) quantum systems'; (ii) two important concrete realizations of the LEU-Principle are explicitly obtained in this paper, namely: (a) the LEU-inequalities for the quantum scattering of spinless particles and (b) the LEU-inequalities for diffraction on a single slit of width 2a. In particular, from the general results, in the limit y → +1 all the results previously reported are recovered in an exact form…

  10. Uncertainty principle for angular position and angular momentum

    International Nuclear Information System (INIS)

    Franke-Arnold, Sonja; Barnett, Stephen M; Yao, Eric; Leach, Jonathan; Courtial, Johannes; Padgett, Miles

    2004-01-01

    The uncertainty principle places fundamental limits on the accuracy with which we are able to measure the values of different physical quantities (Heisenberg 1949 The Physical Principles of the Quantum Theory (New York: Dover); Robertson 1929 Phys. Rev. 34 127). This has profound effects not only on the microscopic but also on the macroscopic level of physical systems. The most familiar form of the uncertainty principle relates the uncertainties in position and linear momentum. Other manifestations include those relating uncertainty in energy to uncertainty in time duration, phase of an electromagnetic field to photon number and angular position to angular momentum (Vaccaro and Pegg 1990 J. Mod. Opt. 37 17; Barnett and Pegg 1990 Phys. Rev. A 41 3427). In this paper, we report the first observation of the last of these uncertainty relations and derive the associated states that satisfy the equality in the uncertainty relation. We confirm the form of these states by detailed measurement of the angular momentum of a light beam after passage through an appropriate angular aperture. The angular uncertainty principle applies to all physical systems and is particularly important for systems with cylindrical symmetry.
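    Away from the boundary correction that arises when the state has support at φ = ±π, the angular relation reported here takes the form Δφ ΔL ≥ ħ/2. A hedged numerical sketch (the smooth Gaussian aperture, its width, and the grid are illustrative assumptions, not the hard-edged aperture of the experiment):

```python
import numpy as np

hbar = 1.0
phi = np.linspace(-np.pi, np.pi, 4001, endpoint=False)
dphi = phi[1] - phi[0]

# Smooth angular aperture: Gaussian of angular width sigma, truncated to (-pi, pi)
sigma = 0.5
psi = np.exp(-phi**2 / (4 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dphi)   # normalize

# Angular-position spread
mean_phi = np.sum(phi * np.abs(psi)**2) * dphi
dphi_spread = np.sqrt(np.sum((phi - mean_phi)**2 * np.abs(psi)**2) * dphi)

# Orbital-angular-momentum amplitudes c_l = (2*pi)^(-1/2) * integral of psi * e^{-i l phi}
ls = np.arange(-40, 41)
c = np.array([np.sum(psi * np.exp(-1j * l * phi)) * dphi for l in ls]) / np.sqrt(2 * np.pi)
p = np.abs(c) ** 2
p /= p.sum()
mean_L = np.sum(ls * p)
dL = np.sqrt(np.sum((ls - mean_L) ** 2 * p))

print(dphi_spread * dL * hbar)   # close to, and not below, hbar/2
```

Narrowing `sigma` broadens the angular-momentum spectrum, so the product stays at or above ħ/2.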

  11. What is the uncertainty principle of non-relativistic quantum mechanics?

    Science.gov (United States)

    Riggs, Peter J.

    2018-05-01

    After more than ninety years of discussions over the uncertainty principle, there is still no universal agreement on what the principle states. The Robertson uncertainty relation (incorporating standard deviations) is given as the mathematical expression of the principle in most quantum mechanics textbooks. However, the uncertainty principle is not merely a statement of what any of the several uncertainty relations affirm. It is suggested that a better approach would be to present the uncertainty principle as a statement about the probability distributions of incompatible variables and the resulting restrictions on quantum states.
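    The Robertson relation mentioned above, σ_A σ_B ≥ |⟨[A, B]⟩|/2 for standard deviations, can be checked directly for finite-dimensional observables. A small numerical sketch using a pair of Pauli matrices and randomly sampled states (the states are illustrative, not from the paper):

```python
import numpy as np

# Two incompatible (non-commuting) qubit observables
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])

def stats(A, psi):
    """Mean and standard deviation of observable A in state psi."""
    mean = float(np.real(psi.conj() @ A @ psi))
    var = float(np.real(psi.conj() @ A @ A @ psi)) - mean**2
    return mean, np.sqrt(max(var, 0.0))

rng = np.random.default_rng(0)
for _ in range(5):
    v = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi = v / np.linalg.norm(v)
    _, sx = stats(X, psi)
    _, sy = stats(Y, psi)
    comm = X @ Y - Y @ X
    bound = 0.5 * abs(psi.conj() @ comm @ psi)   # |<[X, Y]>| / 2
    assert sx * sy >= bound - 1e-12
print("Robertson bound holds for all sampled states")
```

For the eigenstate |0⟩ of Z the bound is saturated: σ_X = σ_Y = 1 and |⟨[X, Y]⟩|/2 = 1.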

  12. Human perception and the uncertainty principle

    International Nuclear Information System (INIS)

    Harney, R.C.

    1976-01-01

    The concept of the uncertainty principle, that position and momentum cannot be simultaneously specified to arbitrary accuracy, is somewhat difficult to reconcile with everyday experience. This note describes order-of-magnitude calculations which quantify the inadequacy of human perception with regard to direct observation of the breakdown of the trajectory concept implied by the uncertainty principle. Even with the best optical microscope, human vision falls short by three orders of magnitude. 1 figure
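    The style of order-of-magnitude calculation described can be sketched in a few lines. The numbers below (diffraction-limited resolution, a hypothetical 0.2 µm water droplet) are illustrative assumptions, not the figures from the note:

```python
import math

# Minimum velocity uncertainty the uncertainty principle imposes on the
# smallest object an optical microscope can resolve.
hbar = 1.054571817e-34       # J*s
dx = 2e-7                    # m, ~diffraction-limited resolution of visible light

rho = 1000.0                 # kg/m^3, a hypothetical ~0.2 um water droplet
r = 1e-7                     # m, droplet radius
m = rho * (4.0 / 3.0) * math.pi * r**3

dv = hbar / (2 * m * dx)     # m/s, from dx * (m * dv) >= hbar / 2
print(f"m = {m:.2e} kg, dv >= {dv:.2e} m/s")  # ~6e-11 m/s: utterly imperceptible
```

The resulting velocity uncertainty is roughly eleven orders of magnitude below anything a human could notice, which is the point of the note's comparison.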

  13. A revision of the generalized uncertainty principle

    International Nuclear Information System (INIS)

    Bambi, Cosimo

    2008-01-01

    The generalized uncertainty principle arises from the Heisenberg uncertainty principle when gravity is taken into account, so the leading order correction to the standard formula is expected to be proportional to the gravitational constant G_N = L_Pl^2. On the other hand, the emerging picture suggests a set of departures from the standard theory which demand a revision of all the arguments used to deduce heuristically the new rule. In particular, one can now argue that the leading order correction to the Heisenberg uncertainty principle is proportional to the first power of the Planck length L_Pl. If so, the departures from ordinary quantum mechanics would be much less suppressed than is commonly thought.
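    The quadratic form of the GUP that such heuristic arguments start from, Δx ≥ ħ/(2Δp) + β L_Pl² Δp/ħ, has a minimum achievable Δx of √(2β) L_Pl at Δp = ħ/(L_Pl √(2β)). A minimal sketch verifying this numerically (units ħ = L_Pl = 1; β = 1 is an illustrative choice):

```python
import numpy as np

# Quadratic GUP bound in units hbar = L_Pl = 1:
#   dx >= 1/(2*dp) + beta*dp
beta = 1.0
dp = np.linspace(0.01, 10, 100_000)
dx = 1.0 / (2 * dp) + beta * dp

i = np.argmin(dx)
# Analytic minimum: dx_min = sqrt(2*beta) at dp = 1/sqrt(2*beta)
print(f"minimal dx ~ {dx[i]:.4f} at dp ~ {dp[i]:.4f}")
```

The minimum of the curve is the minimal measurable length: unlike the Heisenberg case, Δx cannot be squeezed to zero by letting Δp grow.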

  14. Dilaton cosmology and the modified uncertainty principle

    International Nuclear Information System (INIS)

    Majumder, Barun

    2011-01-01

    Very recently Ali et al. (2009) proposed a new generalized uncertainty principle (with a linear term in the Planck length) which is consistent with doubly special relativity and string theory. The classical and quantum effects of this generalized uncertainty principle (termed the modified uncertainty principle, or MUP) are investigated on the phase space of a dilatonic cosmological model with an exponential dilaton potential in a flat Friedmann-Robertson-Walker background. Interestingly, as a consequence of the MUP, we find that it is possible to get a late-time acceleration for this model. For the quantum mechanical description in both the commutative and MUP frameworks, we find the analytical solutions of the Wheeler-DeWitt equation for the early universe and compare our results. We have used an approximation method in the case of the MUP.

  15. Economic uncertainty principle?

    OpenAIRE

    Alexander Harin

    2006-01-01

    The economic principle of (hidden) uncertainty is presented. New probability formulas are offered. Examples of solutions of three types of fundamental problems are reviewed.

  16. The Uncertainty Principle in the Presence of Quantum Memory

    Science.gov (United States)

    Renes, Joseph M.; Berta, Mario; Christandl, Matthias; Colbeck, Roger; Renner, Renato

    2010-03-01

    One consequence of Heisenberg's uncertainty principle is that no observer can predict the outcomes of two incompatible measurements performed on a system to arbitrary precision. However, this implication is invalid if the observer possesses a quantum memory, a distinct possibility in light of recent technological advances. Entanglement between the system and the memory is responsible for the breakdown of the uncertainty principle, as illustrated by the EPR paradox. In this work we present an improved uncertainty principle which takes this entanglement into account. By quantifying uncertainty using entropy, we show that the sum of the entropies associated with incompatible measurements must exceed a quantity which depends on the degree of incompatibility and the amount of entanglement between system and memory. Apart from its foundational significance, the uncertainty principle motivated the first proposals for quantum cryptography, though the possibility of an eavesdropper having a quantum memory rules out using the original version to argue that these proposals are secure. The uncertainty relation introduced here alleviates this problem and paves the way for its widespread use in quantum cryptography.
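    The relation described here (due to Berta et al.) reads S(X|B) + S(Z|B) ≥ log2(1/c) + S(A|B), where c is the basis overlap and S(A|B) can be negative for entangled states. A numerical sketch for the extreme case of a maximally entangled qubit pair, where both sides vanish and the memory removes all uncertainty (the two-qubit construction is illustrative, not taken from the paper):

```python
import numpy as np

def vn(rho):
    """Von Neumann entropy in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-(w * np.log2(w)).sum())

def trace_out_A(rho):
    """Trace out the first qubit of a two-qubit density matrix."""
    return np.einsum('abad->bd', rho.reshape(2, 2, 2, 2))

def cond_ent_after_meas(rho_AB, basis):
    """S(outcome | B) when qubit A is measured in the given orthonormal basis."""
    blocks = []
    for v in basis:
        M = np.kron(np.outer(v, v.conj()), np.eye(2))   # projector on A, identity on B
        blocks.append(trace_out_A(M @ rho_AB @ M))       # unnormalized p_x * sigma_x^B
    rho_XB = np.block([[blocks[0], np.zeros((2, 2))],
                       [np.zeros((2, 2)), blocks[1]]])   # classical-quantum state
    return vn(rho_XB) - vn(trace_out_A(rho_AB))

# Maximally entangled state shared between the system A and the memory B
phi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(phi, phi.conj())

Z = [np.array([1, 0]), np.array([0, 1])]
X = [np.array([1, 1]) / np.sqrt(2), np.array([1, -1]) / np.sqrt(2)]

lhs = cond_ent_after_meas(rho, Z) + cond_ent_after_meas(rho, X)
rhs = 1 + (vn(rho) - vn(trace_out_A(rho)))   # log2(1/c) + S(A|B), with c = 1/2
print(lhs, ">=", rhs)   # both sides are 0 for the maximally entangled state
```

With an unentangled memory, S(A|B) ≥ 0 and the bound reduces to the ordinary entropic uncertainty relation.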

  17. Quantum wells and the generalized uncertainty principle

    International Nuclear Information System (INIS)

    Blado, Gardo; Owens, Constance; Meyers, Vincent

    2014-01-01

    The finite and infinite square wells are potentials typically discussed in undergraduate quantum mechanics courses. In this paper, we discuss these potentials in the light of the recent studies of the modification of the Heisenberg uncertainty principle into a generalized uncertainty principle (GUP) as a consequence of attempts to formulate a quantum theory of gravity. The fundamental concepts of the minimal length scale and the GUP are discussed and the modified energy eigenvalues and transmission coefficient are derived. (paper)
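    For the infinite well the flavor of such GUP corrections can be sketched perturbatively. The effective Hamiltonian H ≈ p²/2m + (β/3m)p⁴, common for the modified commutator [x, p] = iħ(1 + βp²), and the value of β below are assumptions for illustration, not formulas taken from the record:

```python
import numpy as np

# First-order GUP shift of the infinite-square-well levels (hbar = m = L = 1).
# Well eigenstates satisfy <n|p^2|n> = (hbar*k_n)^2 and <n|p^4|n> = (hbar*k_n)^4.
hbar = m = L = 1.0
beta = 1e-3                                   # assumed small GUP parameter

n = np.arange(1, 6)
k = n * np.pi / L
E0 = (hbar * k) ** 2 / (2 * m)                # unperturbed levels
dE = (beta / (3 * m)) * (hbar * k) ** 4       # first-order shift from (beta/3m) p^4

for ni, e0, de in zip(n, E0, dE):
    print(f"n={ni}: E0={e0:9.4f}  shift={de:.5f}  relative={de / e0:.2e}")
```

The relative shift grows like n², so higher levels are the natural place to look for GUP effects in this toy setting.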

  18. The uncertainty principle

    International Nuclear Information System (INIS)

    Martens, Hans.

    1991-01-01

    The subject of this thesis is the uncertainty principle (UP). The UP is one of the most characteristic points of difference between quantum and classical mechanics. The starting point of this thesis is the work of Niels Bohr, which is both discussed and analyzed. For the discussion of the different aspects of the UP, the formalism of Davies and Ludwig is used instead of the more commonly used formalism of von Neumann and Dirac. (author). 214 refs.; 23 figs

  19. Science 101: What, Exactly, Is the Heisenberg Uncertainty Principle?

    Science.gov (United States)

    Robertson, Bill

    2016-01-01

    Bill Robertson is the author of the NSTA Press book series, "Stop Faking It! Finally Understanding Science So You Can Teach It." In this month's issue, Robertson describes and explains the Heisenberg Uncertainty Principle. The Heisenberg Uncertainty Principle was discussed on "The Big Bang Theory," the lead character in…

  20. Some Implications of Two Forms of the Generalized Uncertainty Principle

    Directory of Open Access Journals (Sweden)

    Mohammed M. Khalil

    2014-01-01

    Full Text Available Various theories of quantum gravity predict the existence of a minimum length scale, which leads to the modification of the standard uncertainty principle to the Generalized Uncertainty Principle (GUP). In this paper, we study two forms of the GUP and calculate their implications for the energy of the harmonic oscillator and the hydrogen atom more accurately than previous studies. In addition, we show how the GUP modifies the Lorentz force law and the time-energy uncertainty principle.
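    One way to see such GUP corrections to the harmonic oscillator is to diagonalize the perturbed Hamiltonian in a truncated number basis. The leading-order form H ≈ p²/2m + mω²x²/2 + (β/3m)p⁴ and the value of β are illustrative assumptions, not necessarily either of the two GUP forms studied in the paper:

```python
import numpy as np

# Harmonic oscillator with a leading-order GUP term, diagonalized numerically
# in a truncated number basis (hbar = m = omega = 1; beta assumed small).
N = 120
beta = 1e-3
a = np.diag(np.sqrt(np.arange(1, N)), 1)      # annihilation operator
x = (a + a.T) / np.sqrt(2)
p = 1j * (a.T - a) / np.sqrt(2)

H0 = p @ p / 2 + x @ x / 2
H = H0 + (beta / 3) * np.linalg.matrix_power(p, 4)

E = np.linalg.eigvalsh(H)[:5]
E0 = np.arange(5) + 0.5
# First-order estimate: (beta/3) * <n|p^4|n> = (beta/3) * (3/4) * (2n^2 + 2n + 1)
print("shifts:", E - E0)
```

For the ground state the numerical shift agrees with the first-order estimate β/4 to within the tiny second-order correction, which validates the perturbative treatment for small β.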

  1. Principles of Uncertainty

    CERN Document Server

    Kadane, Joseph B

    2011-01-01

    An intuitive and mathematical introduction to subjective probability and Bayesian statistics. An accessible, comprehensive guide to the theory of Bayesian statistics, Principles of Uncertainty presents the subjective Bayesian approach, which has played a pivotal role in game theory, economics, and the recent boom in Markov Chain Monte Carlo methods. Both rigorous and friendly, the book contains: Introductory chapters examining each new concept or assumption Just-in-time mathematics -- the presentation of ideas just before they are applied Summary and exercises at the end of each chapter Discus
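    The subjective Bayesian updating the book introduces can be shown in a few lines. The prior, data, and grid below are an illustrative example, not one from the book:

```python
import numpy as np

# Subjective prior Beta(2, 2) on a coin's bias theta, updated on 7 heads in 10 tosses.
theta = np.linspace(0.0, 1.0, 100_001)
dt = theta[1] - theta[0]

prior = theta * (1 - theta)                 # Beta(2, 2), up to normalization
likelihood = theta**7 * (1 - theta)**3      # binomial likelihood, up to a constant
post = prior * likelihood
post /= post.sum() * dt                     # normalize numerically

mean = (theta * post).sum() * dt
print(f"posterior mean = {mean:.4f}")       # conjugacy: Beta(9, 5), mean 9/14 ~ 0.6429
```

The grid computation reproduces the conjugate closed form, which is a useful sanity check when the prior is not conjugate and the grid is all one has.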

  2. Methodologies of Uncertainty Propagation Calculation

    International Nuclear Information System (INIS)

    Chojnacki, Eric

    2002-01-01

    After recalling the theoretical principle and the practical difficulties of the methodologies of uncertainty propagation calculation, the author discussed how to propagate input uncertainties. He said there were two kinds of input uncertainty: variability (uncertainty due to heterogeneity) and lack of knowledge (uncertainty due to ignorance). It was therefore necessary to use two different propagation methods. He demonstrated this in a simple example, which he generalised, treating the variability uncertainty with probability theory and the lack-of-knowledge uncertainty with fuzzy theory. He cautioned, however, against the systematic use of probability theory, which may lead to unjustifiably precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability from lack of knowledge increases as the problem becomes more complex in terms of number of parameters or time steps, and that it is necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory
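    The two propagation modes can be combined in a hybrid sketch: sample the variable input by Monte Carlo while carrying the poorly known input as an interval (a single alpha-cut of a fuzzy number). The model and numbers are illustrative assumptions, not from the talk:

```python
import numpy as np

rng = np.random.default_rng(1)

def model(a, b):
    """Toy response combining the two uncertain inputs."""
    return a * b + a**2

# Variability: 'a' fluctuates -> propagate by Monte Carlo sampling
a_samples = rng.normal(10.0, 1.0, 100_000)

# Lack of knowledge: 'b' is only known to lie in [2, 3] -> propagate as an interval
# (the model is monotone in b for a > 0, so the interval endpoints suffice)
b_lo, b_hi = 2.0, 3.0
y_lo = np.minimum(model(a_samples, b_lo), model(a_samples, b_hi))
y_hi = np.maximum(model(a_samples, b_lo), model(a_samples, b_hi))

# The output is a band of distributions rather than a single precise one
print("5th pct of lower bound :", round(float(np.percentile(y_lo, 5)), 1))
print("95th pct of upper bound:", round(float(np.percentile(y_hi, 95)), 1))
```

Collapsing `b` to a single assumed distribution would shrink this band, which is exactly the unjustified precision the talk warns against.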

  3. The action uncertainty principle and quantum gravity

    Science.gov (United States)

    Mensky, Michael B.

    1992-02-01

    Results of the path-integral approach to the quantum theory of continuous measurements were formulated in a preceding paper in the form of an inequality of the type of the uncertainty principle. The new inequality was called the action uncertainty principle (AUP). It was shown that the AUP allows one to find, in a simple way, which outputs of the continuous measurements will occur with high probability. Here a simpler form of the AUP is formulated: δS ≳ ħ. When applied to quantum gravity, it leads in a very simple way to the Rosenfeld inequality for the measurability of the average curvature.

  4. Uncertainty principle in loop quantum cosmology by Moyal formalism

    Science.gov (United States)

    Perlov, Leonid

    2018-03-01

    In this paper, we derive the uncertainty principle for the loop quantum cosmology homogeneous and isotropic Friedmann-Lemaître-Robertson-Walker model with the holonomy-flux algebra. The uncertainty principle is between the variable c, which has the meaning of the connection, and μ, which has the meaning of the physical cell volume to the power 2/3, i.e., v^(2/3), or a plaquette area. Since both μ and c are not operators, but rather random variables, the Robertson uncertainty principle derivation that works for Hermitian operators cannot be used. Instead we use the Wigner-Moyal-Groenewold phase space formalism. The Wigner-Moyal-Groenewold formalism was originally applied to the Heisenberg algebra of quantum mechanics. One can derive it from both the canonical and the path integral quantum mechanics, as well as the uncertainty principle. In this paper, we apply it to the holonomy-flux algebra in the case of the homogeneous and isotropic space. Another result is the expression for the Wigner function on the space of the cylindrical wave functions defined on R_b in the c variables rather than in the dual-space μ variables.

  5. The role of general relativity in the uncertainty principle

    International Nuclear Information System (INIS)

    Padmanabhan, T.

    1986-01-01

    The role played by general relativity in quantum mechanics (especially as regards the uncertainty principle) is investigated. It is confirmed that the validity of the time-energy uncertainty relation does depend on gravitational time dilation. It is also shown that there exists an intrinsic lower bound to the accuracy with which the acceleration due to gravity can be measured. The notion of the equivalence principle in quantum mechanics is clarified. (author)

  6. Generalized uncertainty principle as a consequence of the effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Faizal, Mir, E-mail: mirfaizalmir@gmail.com [Irving K. Barber School of Arts and Sciences, University of British Columbia – Okanagan, Kelowna, British Columbia V1V 1V7 (Canada); Department of Physics and Astronomy, University of Lethbridge, Lethbridge, Alberta T1K 3M4 (Canada); Ali, Ahmed Farag, E-mail: ahmed.ali@fsc.bu.edu.eg [Department of Physics, Faculty of Science, Benha University, Benha, 13518 (Egypt); Netherlands Institute for Advanced Study, Korte Spinhuissteeg 3, 1012 CG Amsterdam (Netherlands); Nassar, Ali, E-mail: anassar@zewailcity.edu.eg [Department of Physics, Zewail City of Science and Technology, 12588, Giza (Egypt)

    2017-02-10

    We will demonstrate that the generalized uncertainty principle exists because of the derivative expansion in the effective field theories. This is because in the framework of the effective field theories, the minimum measurable length scale has to be integrated away to obtain the low energy effective action. We will analyze the deformation of a massive free scalar field theory by the generalized uncertainty principle, and demonstrate that the minimum measurable length scale corresponds to a second more massive scale in the theory, which has been integrated away. We will also analyze CFT operators dual to this deformed scalar field theory, and observe that scaling of the new CFT operators indicates that they are dual to this more massive scale in the theory. We will use holographic renormalization to explicitly calculate the renormalized boundary action with counter terms for this scalar field theory deformed by generalized uncertainty principle, and show that the generalized uncertainty principle contributes to the matter conformal anomaly.

  7. Generalized uncertainty principle as a consequence of the effective field theory

    Directory of Open Access Journals (Sweden)

    Mir Faizal

    2017-02-01

    Full Text Available We will demonstrate that the generalized uncertainty principle exists because of the derivative expansion in the effective field theories. This is because in the framework of the effective field theories, the minimum measurable length scale has to be integrated away to obtain the low energy effective action. We will analyze the deformation of a massive free scalar field theory by the generalized uncertainty principle, and demonstrate that the minimum measurable length scale corresponds to a second more massive scale in the theory, which has been integrated away. We will also analyze CFT operators dual to this deformed scalar field theory, and observe that scaling of the new CFT operators indicates that they are dual to this more massive scale in the theory. We will use holographic renormalization to explicitly calculate the renormalized boundary action with counter terms for this scalar field theory deformed by generalized uncertainty principle, and show that the generalized uncertainty principle contributes to the matter conformal anomaly.

  8. Sequential experimental design based generalised ANOVA

    Energy Technology Data Exchange (ETDEWEB)

    Chakraborty, Souvik, E-mail: csouvik41@gmail.com; Chowdhury, Rajib, E-mail: rajibfce@iitr.ac.in

    2016-07-15

    Over the last decade, surrogate modelling technique has gained wide popularity in the field of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and regression/interpolation for generating the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component function using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been utilized in predicting probability of failure of three structural mechanics problems. It is observed that the proposed approach yields accurate and computationally efficient estimate of the failure probability.

  9. Generalized uncertainty principles, effective Newton constant and regular black holes

    OpenAIRE

    Li, Xiang; Ling, Yi; Shen, You-Gen; Liu, Cheng-Zhou; He, Hong-Sheng; Xu, Lan-Fang

    2016-01-01

    In this paper, we explore the quantum spacetimes that are potentially connected with the generalized uncertainty principles. By analyzing the gravity-induced quantum interference pattern and the Gedanken experiment for weighing a photon, we find that the generalized uncertainty principles give rise to the same effective Newton constant as our previous proposal. A characteristic momentum associated with the tidal effect is suggested, which incorporates the quantum effect with the geometric nature of gravity...

  10. A review of the generalized uncertainty principle

    International Nuclear Information System (INIS)

    Tawfik, Abdel Nasser; Diab, Abdel Magied

    2015-01-01

    Based on string theory, black hole physics, doubly special relativity and some ‘thought’ experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed. (review)

  11. Towards Thermodynamics with Generalized Uncertainty Principle

    International Nuclear Information System (INIS)

    Moussa, Mohamed; Farag Ali, Ahmed

    2014-01-01

    Various frameworks of quantum gravity predict a modification of the Heisenberg uncertainty principle to a so-called generalized uncertainty principle (GUP). Introducing the quantum gravity effect makes a considerable change in the density of states inside the volume of the phase space, which changes the statistical and thermodynamical properties of any physical system. In this paper we investigate the modification in the thermodynamic properties of ideal gases and the photon gas. The partition function is calculated, and using it we find a considerable growth in the thermodynamical functions of the considered systems. The growth may happen due to an additional repulsive force between the constituents of the gases, which may be due to the existence of the GUP, hence predicting a considerable increase in the entropy of the system. Besides, by applying the GUP to an ideal gas in a trapped potential, it is found that the GUP implies a minimum measurable value of the thermal wavelength of the particles, which agrees with the discrete nature of space derived in previous studies from the GUP

  12. Supersymmetry Breaking as a new source for the Generalized Uncertainty Principle

    OpenAIRE

    Faizal, Mir

    2016-01-01

    In this letter, we will demonstrate that the breaking of supersymmetry by a non-anticommutative deformation can be used to generate the generalized uncertainty principle. We will analyze the physical reasons for this observation, in the framework of string theory. We also discuss the relation between the generalized uncertainty principle and the Lee–Wick field theories.

  13. Supersymmetry breaking as a new source for the generalized uncertainty principle

    Energy Technology Data Exchange (ETDEWEB)

    Faizal, Mir, E-mail: mirfaizalmir@gmail.com

    2016-06-10

    In this letter, we will demonstrate that the breaking of supersymmetry by a non-anticommutative deformation can be used to generate the generalized uncertainty principle. We will analyze the physical reasons for this observation, in the framework of string theory. We also discuss the relation between the generalized uncertainty principle and the Lee–Wick field theories.

  14. “Stringy” coherent states inspired by generalized uncertainty principle

    Science.gov (United States)

    Ghosh, Subir; Roy, Pinaki

    2012-05-01

    Coherent States with Fractional Revival property, that explicitly satisfy the Generalized Uncertainty Principle (GUP), have been constructed in the context of Generalized Harmonic Oscillator. The existence of such states is essential in motivating the GUP based phenomenological results present in the literature which otherwise would be of purely academic interest. The effective phase space is Non-Canonical (or Non-Commutative in popular terminology). Our results have a smooth commutative limit, equivalent to Heisenberg Uncertainty Principle. The Fractional Revival time analysis yields an independent bound on the GUP parameter. Using this and similar bounds obtained here, we derive the largest possible value of the (GUP induced) minimum length scale. Mandel parameter analysis shows that the statistics is Sub-Poissonian. Correspondence Principle is deformed in an interesting way. Our computational scheme is very simple as it requires only first order corrected energy values and undeformed basis states.
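
    The sub-Poissonian criterion used by the authors is the sign of the Mandel parameter Q = (⟨n²⟩ − ⟨n⟩²)/⟨n⟩ − 1. A minimal sketch of the criterion on generic number distributions (standard coherent and thermal statistics, not the paper's GUP-deformed states) is:

```python
from math import exp, factorial

def mandel_Q(probs, ns):
    """Mandel parameter Q = (<n^2> - <n>^2)/<n> - 1.
    Q < 0: sub-Poissonian; Q = 0: Poissonian; Q > 0: super-Poissonian."""
    n1 = sum(p * n for p, n in zip(probs, ns))
    n2 = sum(p * n * n for p, n in zip(probs, ns))
    return (n2 - n1**2) / n1 - 1.0

ns = list(range(80))
# Ordinary coherent state: Poissonian photon statistics (Q = 0).
lam = 4.0
poisson = [exp(-lam) * lam**n / factorial(n) for n in ns]
# Thermal state: geometric statistics (Q = <n> > 0, super-Poissonian).
q = 0.5
thermal = [(1 - q) * q**n for n in ns]
Q_coh, Q_th = mandel_Q(poisson, ns), mandel_Q(thermal, ns)
```

    A deformed coherent state with Q_coh pushed below zero is what the abstract means by sub-Poissonian statistics.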

  15. “Stringy” coherent states inspired by generalized uncertainty principle

    International Nuclear Information System (INIS)

    Ghosh, Subir; Roy, Pinaki

    2012-01-01

    Coherent States with Fractional Revival property, that explicitly satisfy the Generalized Uncertainty Principle (GUP), have been constructed in the context of Generalized Harmonic Oscillator. The existence of such states is essential in motivating the GUP based phenomenological results present in the literature which otherwise would be of purely academic interest. The effective phase space is Non-Canonical (or Non-Commutative in popular terminology). Our results have a smooth commutative limit, equivalent to Heisenberg Uncertainty Principle. The Fractional Revival time analysis yields an independent bound on the GUP parameter. Using this and similar bounds obtained here, we derive the largest possible value of the (GUP induced) minimum length scale. Mandel parameter analysis shows that the statistics is Sub-Poissonian. Correspondence Principle is deformed in an interesting way. Our computational scheme is very simple as it requires only first order corrected energy values and undeformed basis states.

  16. Uncertainty Principles on Two Step Nilpotent Lie Groups

    Indian Academy of Sciences (India)

    Abstract. We extend an uncertainty principle due to Cowling and Price to two step nilpotent Lie groups, which generalizes a classical theorem of Hardy. We also prove an analogue of Heisenberg inequality on two step nilpotent Lie groups.

  17. Uncertainty principles for inverse source problems for electromagnetic and elastic waves

    Science.gov (United States)

    Griesmaier, Roland; Sylvester, John

    2018-06-01

    In isotropic homogeneous media, far fields of time-harmonic electromagnetic waves radiated by compactly supported volume currents, and elastic waves radiated by compactly supported body force densities can be modelled in very similar fashions. Both are projected restricted Fourier transforms of vector-valued source terms. In this work we generalize two types of uncertainty principles recently developed for far fields of scalar-valued time-harmonic waves in Griesmaier and Sylvester (2017 SIAM J. Appl. Math. 77 154–80) to this vector-valued setting. These uncertainty principles yield stability criteria and algorithms for splitting far fields radiated by collections of well-separated sources into the far fields radiated by individual source components, and for the restoration of missing data segments. We discuss proper regularization strategies for these inverse problems, provide stability estimates based on the new uncertainty principles, and comment on reconstruction schemes. A numerical example illustrates our theoretical findings.

  18. Gauge theories under incorporation of a generalized uncertainty principle

    International Nuclear Information System (INIS)

    Kober, Martin

    2010-01-01

    We consider an extension of gauge theories based on the assumption of a generalized uncertainty principle, which implies a minimal length scale. A modification of the usual uncertainty principle implies a modified form of matter field equations such as the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, the gauge principle is a constitutive ingredient of the standard model, and even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears to be a very important consideration.

  19. The 'Herbivory Uncertainty Principle': application in a cerrado site

    Directory of Open Access Journals (Sweden)

    CA Gadotti

    Researchers may alter the ecology of the organisms they study even when carrying out apparently benign activities; in herbivory studies, for example, they may alter herbivory damage itself. We tested whether visit frequency altered herbivory damage, as predicted by the 'Herbivory Uncertainty Principle'. In a cerrado site, we established 80 quadrats, in which we sampled all woody individuals. We used four visit frequencies (high, medium, low, and control), quantifying, at the end of three months, herbivory damage for each species in each treatment. We did not corroborate the 'Herbivory Uncertainty Principle', since visiting frequency did not alter herbivory damage, at least when the whole plant community was taken into account. However, when we analysed each species separately, four out of 11 species presented significant differences in herbivory damage, suggesting that researchers are not independent of their measurements. The principle could be tested in other ecological studies in which it may occur, such as those on animal behaviour, human ecology, population dynamics, and conservation.

  20. Universal uncertainty principle in the measurement operator formalism

    International Nuclear Information System (INIS)

    Ozawa, Masanao

    2005-01-01

    Heisenberg's uncertainty principle has been understood to set a limitation on measurements; however, the long-standing mathematical formulation established by Heisenberg, Kennard, and Robertson does not allow such an interpretation. Recently, a new relation was found that gives a universally valid relation between noise and disturbance in general quantum measurements, and it has become clear that the new relation plays the role of a first principle from which various quantum limits on measurement and information processing can be derived in a unified treatment. This paper examines the above development of the noise-disturbance uncertainty principle in the model-independent approach based on the measurement operator formalism, which is widely accepted to describe a class of generalized measurements in the field of quantum information. We obtain explicit formulae for the noise and disturbance of measurements given by measurement operators, and show that projective measurements do not satisfy the Heisenberg-type noise-disturbance relation that is typical of the gamma-ray microscope thought experiment. We also show that the disturbance on a Pauli operator caused by a projective measurement of another Pauli operator constantly equals √2, and we examine how this measurement violates the Heisenberg-type relation but satisfies the new noise-disturbance relation.

  1. Generalized uncertainty principle and entropy of three-dimensional rotating acoustic black hole

    International Nuclear Information System (INIS)

    Zhao, HuiHua; Li, GuangLiang; Zhang, LiChun

    2012-01-01

    Using the new equation of state density derived from the generalized uncertainty principle, we investigate the statistical entropy of a 3-dimensional rotating acoustic black hole. When the parameter λ introduced in the generalized uncertainty principle takes a specific value, we obtain an area entropy and a correction term associated with the acoustic black hole. In this method no divergence arises, and the small-mass approximation of the original brick-wall model is not needed. -- Highlights: ► The statistical entropy of a 3-dimensional rotating acoustic black hole is studied. ► We obtain an area entropy and a correction term associated with it. ► We let λ, introduced in the generalized uncertainty principle, take a specific value. ► No divergence arises in this method.

  2. A connection between the Uncertainty Principles on the real line and on the circle

    OpenAIRE

    Andersen, Nils Byrial

    2013-01-01

    The purpose of this short note is to exhibit a new connection between the Heisenberg Uncertainty Principle on the line and the Breitenberger Uncertainty Principle on the circle, by considering the commutator of the multiplication and difference operators on Bernstein functions.

  3. The Bertlmann-Martin Inequalities and the Uncertainty Principle

    International Nuclear Information System (INIS)

    Ighezou, F.Z.; Kerris, A.T.; Lombard, R.J.

    2008-01-01

    A lower bound to ⟨r⟩_1s is established from the Thomas-Reiche-Kuhn sum rule applied to the reduced equation for the s-states. It is linked to the average value of ⟨r²⟩_1s. We discuss, with a few examples, how the use of approximate values for ⟨r²⟩_1s, derived from the generalized Bertlmann and Martin inequalities, preserves the lower-bound character of ⟨r⟩_1s. Finally, by using the uncertainty principle and the uncertainty in the radial position, we derive a lower bound to the ground-state kinetic energy.

  4. Continuous quantum measurements and the action uncertainty principle

    Science.gov (United States)

    Mensky, Michael B.

    1992-09-01

    The path-integral approach to the quantum theory of continuous measurements has been developed in preceding works of the author. According to this approach the measurement amplitude determining probabilities of different outputs of the measurement can be evaluated in the form of a restricted path integral (a path integral “in finite limits”). With the help of the measurement amplitude, the maximum deviation of measurement outputs from the classical one can be easily determined. The aim of the present paper is to express this variance in the simpler and more transparent form of a specific uncertainty principle (called the action uncertainty principle, AUP). The simplest (but weak) form of the AUP is δS ≳ ℏ, where S is the action functional. It can be applied for a simple derivation of the Bohr-Rosenfeld inequality for the measurability of a gravitational field. A stronger form of the AUP (for ideal measurements performed in the quantum regime), with wider application, is |∫_{t'}^{t''} (δS[q]/δq(t)) Δq(t) dt| ≃ ℏ, where the paths [q] and [Δq] stand correspondingly for the measurement output and for the measurement error. It can also be presented in the symbolic form Δ(Equation) Δ(Path) ≃ ℏ. This means that the deviation of the observed (measured) motion from that obeying the classical equation of motion is reciprocally proportional to the uncertainty in the path (the latter uncertainty resulting from the measurement error). The consequence of the AUP is that improving the measurement precision beyond the threshold of the quantum regime leads to decreasing information resulting from the measurement.

  5. Theoretical formulation of finite-dimensional discrete phase spaces: I. Algebraic structures and uncertainty principles

    International Nuclear Information System (INIS)

    Marchiolli, M.A.; Ruzzi, M.

    2012-01-01

    We propose a self-consistent theoretical framework for a wide class of physical systems characterized by a finite space of states which, among several mathematical virtues, allows us to construct a discrete version of the Weyl–Wigner–Moyal (WWM) formalism for finite-dimensional discrete phase spaces with toroidal topology. As a first and important application of this ab initio approach, we initially investigate the Robertson–Schrödinger (RS) uncertainty principle related to the discrete coordinate and momentum operators, as well as its implications for physical systems with periodic boundary conditions. The second interesting application is associated with a particular uncertainty principle inherent to the unitary operators, which is based on the Wiener–Khinchin theorem for signal processing. Furthermore, we also establish a modified discrete version of the well-known Heisenberg–Kennard–Robertson (HKR) uncertainty principle, which exhibits additional terms (or corrections) that resemble the generalized uncertainty principle (GUP) in the context of quantum gravity. The results obtained from this new algebraic approach touch on some fundamental questions inherent to quantum mechanics and certainly represent an object of future investigations in physics. - Highlights: ► We construct a discrete version of the Weyl–Wigner–Moyal formalism. ► Coherent states for finite-dimensional discrete phase spaces are established. ► Discrete coordinate and momentum operators are properly defined. ► Uncertainty principles depend on the topology of finite physical systems. ► Corrections for the discrete Heisenberg uncertainty relation are also obtained.

  6. Lacunary Fourier Series and a Qualitative Uncertainty Principle for ...

    Indian Academy of Sciences (India)

    We define lacunary Fourier series on a compact connected semisimple Lie group G. If f ∈ L¹(G) has lacunary Fourier series and vanishes on a non-empty open subset of G, then we prove that f vanishes identically. This result can be viewed as a qualitative uncertainty principle.

  7. A Framework for Generalising the Newton Method and Other Iterative Methods from Euclidean Space to Manifolds

    OpenAIRE

    Manton, Jonathan H.

    2012-01-01

    The Newton iteration is a popular method for minimising a cost function on Euclidean space. Various generalisations to cost functions defined on manifolds appear in the literature. In each case, the convergence rate of the generalised Newton iteration needed to be established from first principles. The present paper presents a framework for generalising iterative methods from Euclidean space to manifolds that ensures local convergence rates are preserved. It applies to any (memoryless) iterative method...
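
    As a concrete baseline, the Euclidean Newton iteration that the paper generalises can be sketched as follows (a hypothetical 1-D example; manifold versions replace the subtraction in the update by a retraction of the Newton step along the tangent space):

```python
import math

def newton_minimize(df, d2f, x0, tol=1e-12, max_iter=50):
    """Plain Newton iteration on Euclidean space: x <- x - f'(x)/f''(x).
    Converges locally at a quadratic rate near a nondegenerate minimiser."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Minimise f(x) = cos(x) starting near x0 = 3; the minimiser is x = pi.
xmin = newton_minimize(lambda x: -math.sin(x), lambda x: -math.cos(x), 3.0)
```

    The framework's point is that such local rates survive the passage to manifolds when the update is generalised consistently.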

  8. The generalised Sylvester matrix equations over the generalised bisymmetric and skew-symmetric matrices

    Science.gov (United States)

    Dehghan, Mehdi; Hajarian, Masoud

    2012-08-01

    A matrix P is called symmetric orthogonal if P = P^T = P^{-1}. A matrix X is said to be generalised bisymmetric with respect to P if X = X^T = PXP. It is obvious that any symmetric matrix is also a generalised bisymmetric matrix with respect to I (the identity matrix). By extending the idea of the Jacobi and the Gauss-Seidel iterations, this article proposes two new iterative methods, respectively, for computing the generalised bisymmetric (containing the symmetric solution as a special case) and skew-symmetric solutions of the generalised Sylvester matrix equation (including the Sylvester and Lyapunov matrix equations as special cases), which is encountered in many systems and control applications. When the generalised Sylvester matrix equation has a unique generalised bisymmetric (skew-symmetric) solution, the first (second) iterative method converges to the generalised bisymmetric (skew-symmetric) solution of this matrix equation for any initial generalised bisymmetric (skew-symmetric) matrix. Finally, some numerical results are given to illustrate the effect of the theoretical results.
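
    For orientation, the unstructured Sylvester equation AX + XB = C can be solved directly through the vec identity vec(AX + XB) = (I ⊗ A + Bᵀ ⊗ I) vec(X). The sketch below is a small dense baseline against which structured iterations like the authors' can be checked; the matrices are invented for illustration.

```python
import numpy as np

def solve_sylvester_kron(A, B, C):
    """Solve AX + XB = C via (I (x) A + B^T (x) I) vec(X) = vec(C),
    using column-major (Fortran-order) vectorisation.  Fine for small
    dense problems; O((nm)^3) in general."""
    n, m = C.shape
    K = np.kron(np.eye(m), A) + np.kron(B.T, np.eye(n))
    x = np.linalg.solve(K, C.flatten(order="F"))
    return x.reshape((n, m), order="F")

# Example data; spec(A) and spec(-B) are disjoint, so X is unique.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 2.0]])
B = np.array([[1.0, 0.5],
              [0.0, 2.0]])
C = np.arange(6.0).reshape(3, 2)
X = solve_sylvester_kron(A, B, C)
residual = np.linalg.norm(A @ X + X @ B - C)
```

    Structured (bisymmetric or skew-symmetric) solvers aim at the same residual while preserving the structure at every iterate.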

  9. Generalized Uncertainty Principle and Black Hole Entropy of Higher-Dimensional de Sitter Spacetime

    International Nuclear Information System (INIS)

    Zhao Haixia; Hu Shuangqi; Zhao Ren; Li Huaifan

    2007-01-01

    Recently, much attention has been devoted to resolving the quantum corrections to the Bekenstein-Hawking black hole entropy. In particular, many researchers have taken a keen interest in the coefficient of the logarithmic term of the black hole entropy correction. In this paper, we calculate the correction to the black hole entropy by utilizing the generalized uncertainty principle and obtain the correction term it induces. Because our calculation assumes that the Bekenstein-Hawking area theorem remains valid after considering the generalized uncertainty principle, we derive that the coefficient of the logarithmic term of the black hole entropy correction is positive. This result differs from the presently known result. Our method is valid not only for four-dimensional spacetimes but also for higher-dimensional spacetimes. Throughout, the physical idea is clear and the calculation is simple. It offers a new way of studying the entropy correction of complicated spacetimes.

  10. Generalised Filtering

    Directory of Open Access Journals (Sweden)

    Karl Friston

    2010-01-01

    We describe a Bayesian filtering scheme for nonlinear state-space models in continuous time. This scheme is called Generalised Filtering and furnishes posterior (conditional) densities on hidden states and unknown parameters generating observed data. Crucially, the scheme operates online, assimilating data to optimize the conditional density on time-varying states and time-invariant parameters. In contrast to Kalman and Particle smoothing, Generalised Filtering does not require a backwards pass. In contrast to variational schemes, it does not assume conditional independence between the states and parameters. Generalised Filtering optimises the conditional density with respect to a free-energy bound on the model's log-evidence. This optimisation uses the generalised motion of hidden states and parameters, under the prior assumption that the motion of the parameters is small. We describe the scheme, present comparative evaluations with a fixed-form variational version, and conclude with an illustrative application to a nonlinear state-space model of brain imaging time-series.

  11. Generalized uncertainty principle and quantum gravity phenomenology

    Science.gov (United States)

    Bosso, Pasquale

    The fundamental physical description of Nature is based on two mutually incompatible theories: Quantum Mechanics and General Relativity. Their unification in a theory of Quantum Gravity (QG) remains one of the main challenges of theoretical physics. Quantum Gravity Phenomenology (QGP) studies QG effects in low-energy systems. The basis of one such phenomenological model is the Generalized Uncertainty Principle (GUP), which is a modified Heisenberg uncertainty relation and predicts a deformed canonical commutator. In this thesis, we compute Planck-scale corrections to angular momentum eigenvalues, the hydrogen atom spectrum, the Stern-Gerlach experiment, and the Clebsch-Gordan coefficients. We then rigorously analyze the GUP-perturbed harmonic oscillator and study new coherent and squeezed states. Furthermore, we introduce a scheme for increasing the sensitivity of optomechanical experiments for testing QG effects. Finally, we suggest future projects that may potentially test QG effects in the laboratory.

  12. Concurrent analysis: towards generalisable qualitative research.

    Science.gov (United States)

    Snowden, Austyn; Martin, Colin R

    2011-10-01

    This study develops an original method of qualitative analysis coherent with its interpretivist principles. The objective is to increase the likelihood of achieving generalisability and so improve the chance of the findings being translated into practice. Good qualitative research depends on coherent analysis of different types of data. The limitations of existing methodologies are first discussed to justify the need for a novel approach. To illustrate this approach, primary evidence is presented using the new methodology. The primary evidence consists of a constructivist grounded theory of how mental health nurses with prescribing authority integrate prescribing into practice. This theory is built concurrently from interviews, reflective accounts and case study data from the literature. Concurrent analysis. Ten research articles and 13 semi-structured interviews were sampled purposively and then theoretically and analysed concurrently using constructivist grounded theory. A theory of the process of becoming competent in mental health nurse prescribing was generated through this process. This theory was validated by 32 practising mental health nurse prescribers as an accurate representation of their experience. The methodology generated a coherent and generalisable theory. It is therefore claimed that concurrent analysis engenders consistent and iterative treatment of different sources of qualitative data in a manageable manner. This process supports facilitation of the highest standard of qualitative research. Concurrent analysis removes the artificial delineation of relevant literature from other forms of constructed data. This gives researchers clear direction to treat qualitative data consistently raising the chances of generalisability of the findings. Raising the generalisability of qualitative research will increase its chances of informing clinical practice. © 2010 Blackwell Publishing Ltd.

  13. Conditional uncertainty principle

    Science.gov (United States)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.
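
    Conditional majorization generalizes ordinary (unconditional) majorization, which is easy to state in code: x majorizes y when both have equal totals and every partial sum of the decreasing rearrangement of x dominates that of y. A minimal sketch of this building block (not of the paper's conditional relation itself):

```python
import numpy as np

def majorizes(x, y, tol=1e-12):
    """True if x majorizes y: equal totals, and each partial sum of the
    decreasing rearrangement of x dominates that of y."""
    xs = np.cumsum(np.sort(x)[::-1])
    ys = np.cumsum(np.sort(y)[::-1])
    return bool(abs(xs[-1] - ys[-1]) < tol and np.all(xs >= ys - tol))

# A sharper distribution majorizes a flatter one (it is "less uncertain"):
peaked = [0.7, 0.2, 0.1]
flat = [1/3, 1/3, 1/3]
```

    Monotones under (conditional) majorization are exactly the functions that respect this partial order, which is what makes them usable as uncertainty measures.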

  14. Verification of the uncertainty principle by using diffraction of light waves

    International Nuclear Information System (INIS)

    Nikolic, D; Nesic, Lj

    2011-01-01

    We describe a simple idea for the experimental verification of the uncertainty principle for light waves. We used single-slit diffraction of a laser beam to measure the angular width of the zero-order diffraction maximum and obtained the corresponding wave-number uncertainty, taking the slit width as the uncertainty in position. For the acquisition of the experimental data and their further analysis, we used a computer. Because of its simplicity this experiment is very suitable for demonstration, as well as for a quantitative exercise at universities and in the final year of high school.
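
    The numbers work out as an order-of-magnitude check of Δx·Δp ≳ ℏ/2: taking Δx as the slit width and Δp from the half-width of the central maximum (first minimum at sin θ = λ/a) gives Δx·Δp ≈ h. The laser wavelength and slit width below are assumed example values, not the ones from this experiment.

```python
import math

wavelength = 633e-9   # He-Ne laser wavelength, m (example value)
slit_width = 100e-6   # slit width a, taken as Delta-x, m (example value)
hbar = 1.054571817e-34
h = 2 * math.pi * hbar

# First diffraction minimum: sin(theta) = lambda / a.
sin_theta = wavelength / slit_width
# Momentum spread from the half-width of the zero-order maximum:
p = h / wavelength        # photon momentum
dp = p * sin_theta        # = h / a

product = slit_width * dp  # = h exactly, in this estimate
```

    The estimate lands at Δx·Δp = h, comfortably above the ℏ/2 lower bound, which is the quantitative point of the classroom exercise.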

  15. Impacts of generalized uncertainty principle on black hole thermodynamics and Salecker-Wigner inequalities

    International Nuclear Information System (INIS)

    Tawfik, A.

    2013-01-01

    We investigate the impacts of the Generalized Uncertainty Principle (GUP), proposed by approaches to quantum gravity such as String Theory and Doubly Special Relativity, on black hole thermodynamics and the Salecker-Wigner inequalities. Utilizing the Heisenberg uncertainty principle, the Hawking temperature, Bekenstein entropy, specific heat, emission rate and decay time are calculated. As the evaporation entirely consumes the black hole mass, the specific heat vanishes and the temperature approaches infinity with an infinite radiation rate. It is found that the GUP approach prevents the black hole from evaporating entirely, implying the existence of remnants at which the specific heat vanishes; the Heisenberg uncertainty principle plays the same role in stabilizing the hydrogen atom. We discuss how the linear GUP approach solves the entire-evaporation problem. Furthermore, the black hole lifetime can be estimated using another approach, the Salecker-Wigner inequalities. Assuming that the quantum position uncertainty is limited to the minimum wavelength of the measuring signal, Wigner's second inequality can be obtained. If the spread of the quantum clock is limited to some minimum value, then the modified black hole lifetime can be deduced. Based on the linear GUP approach, the resulting lifetime difference depends on the black hole's relative mass, and the difference between the black hole mass with and without the GUP is not negligible.
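
    The remnant mechanism can be sketched in Planck units with a quadratic GUP. The closed form below follows the Adler-Chen-Santiago convention, which is one common choice in the literature (prefactors differ between papers, including the linear GUP used here), so treat it as illustrative:

```python
import math

def hawking_T(M):
    """Standard Hawking temperature in Planck units (hbar = c = G = k_B = 1)."""
    return 1.0 / (8 * math.pi * M)

def gup_T(M, beta=1.0):
    """GUP-corrected temperature (Adler-Chen-Santiago form; an assumed
    convention).  Real only for M >= sqrt(beta): evaporation halts at a
    remnant of mass M_rem = sqrt(beta), where the specific heat vanishes."""
    if M * M < beta:
        return float("nan")  # no real solution below the remnant mass
    return (M / (4 * math.pi * beta)) * (1.0 - math.sqrt(1.0 - beta / (M * M)))
```

    For M ≫ √β the corrected temperature reduces to the standard 1/(8πM); near M_rem it deviates strongly, and below M_rem the evaporation stops, which is the remnant the abstract refers to.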

  16. On uncertainty in information and ignorance in knowledge

    Science.gov (United States)

    Ayyub, Bilal M.

    2010-05-01

    This paper provides an overview of working definitions of knowledge, ignorance, information and uncertainty and summarises formalised philosophical and mathematical framework for their analyses. It provides a comparative examination of the generalised information theory and the generalised theory of uncertainty. It summarises foundational bases for assessing the reliability of knowledge constructed as a collective set of justified true beliefs. It discusses system complexity for ancestor simulation potentials. It offers value-driven communication means of knowledge and contrarian knowledge using memes and memetics.

  17. On the principled assignment of probabilities for uncertainty analysis

    International Nuclear Information System (INIS)

    Unwin, S.D.; Cook, I.

    1986-01-01

    The authors sympathize with those who raise the questions of inscrutability and over-precision in connection with probabilistic techniques as currently implemented in nuclear PRA. This inscrutability also renders the probabilistic approach, as practiced, open to abuse. They believe that the appropriate remedy is not the discarding of the probabilistic representation of uncertainty in favour of a more simply structured, but logically inconsistent approach such as that of bounding analysis. This would be like forbidding the use of arithmetic in order to prevent the issuing of fraudulent company prospectuses. The remedy, in this analogy, is the enforcement of accounting standards for the valuation of inventory, rates of depreciation etc. They require an analogue of such standards in the PRA domain. What is needed is not the interdiction of probabilistic judgment, but the interdiction of private, inscrutable judgment. Some principles may be conventional in character, as are certain accounting principles. They expound a set of controlling principles which they suggest should govern the formulation of probabilities in nuclear risk analysis. A fuller derivation and consideration of these principles can be found

  18. The action uncertainty principle for continuous measurements

    Science.gov (United States)

    Mensky, Michael B.

    1996-02-01

    The action uncertainty principle (AUP) for the specification of the most probable readouts of continuous quantum measurements is proved, formulated in different forms and analyzed (for nonlinear as well as linear systems). Continuous monitoring of an observable A(p,q,t) with resolution Δa(t) is considered. The influence of the measurement process on the evolution of the measured system (quantum measurement noise) is presented by an additional term δF(t)A(p,q,t) in the Hamiltonian, where the function δF (a generalized fictitious force) is restricted by the AUP ∫|δF(t)| Δa(t) dt ≲ ℏ and arbitrary otherwise. Quantum-nondemolition (QND) measurements are analyzed with the help of the AUP. A simple uncertainty relation for continuous quantum measurements is derived. It states that the area of a certain band in the phase space should be of the order of ℏ. The width of the band depends on the measurement resolution while its length is determined by the deviation of the system, due to the measurement, from classical behavior.

  19. The action uncertainty principle for continuous measurements

    International Nuclear Information System (INIS)

    Mensky, M.B.

    1996-01-01

    The action uncertainty principle (AUP) for the specification of the most probable readouts of continuous quantum measurements is proved, formulated in different forms and analyzed (for nonlinear as well as linear systems). Continuous monitoring of an observable A(p,q,t) with resolution Δa(t) is considered. The influence of the measurement process on the evolution of the measured system (quantum measurement noise) is presented by an additional term δF(t)A(p,q,t) in the Hamiltonian, where the function δF (a generalized fictitious force) is restricted by the AUP ∫|δF(t)| Δa(t) dt ≲ ℏ and arbitrary otherwise. Quantum-nondemolition (QND) measurements are analyzed with the help of the AUP. A simple uncertainty relation for continuous quantum measurements is derived. It states that the area of a certain band in the phase space should be of the order of ℏ. The width of the band depends on the measurement resolution while its length is determined by the deviation of the system, due to the measurement, from classical behavior. (orig.)

  20. The Quark-Gluon Plasma Equation of State and the Generalized Uncertainty Principle

    Directory of Open Access Journals (Sweden)

    L. I. Abou-Salem

    2015-01-01

    The quark-gluon plasma (QGP) equation of state within a minimal length scenario, or Generalized Uncertainty Principle (GUP), is studied. The GUP is implemented in deriving the thermodynamics of an ideal QGP at vanishing chemical potential. We find a significant effect of the GUP term. The main features of the QCD lattice results were quantitatively reproduced for n_f = 0, n_f = 2, and n_f = 2+1 flavors for the energy density, the pressure, and the interaction measure. A notable point is the large value of the bag pressure, especially for n_f = 2+1 flavors, which reflects the expected strong correlation between the quarks in the bag. One can also see that the asymptotic behavior, characterized by the Stefan-Boltzmann limit, is satisfied.
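
    The Stefan-Boltzmann limit quoted at the end is fixed by counting degrees of freedom: ε/T⁴ → (π²/30)(g_gluon + (7/8) g_quark). The sketch below treats n_f = 2+1 simply as three massless flavors, which is an idealisation of the lattice setup rather than the paper's full calculation:

```python
import math

def sb_limit(n_f, n_c=3):
    """Stefan-Boltzmann limit of epsilon/T^4 for an ideal quark-gluon gas.
    Bosons: g = 2 (N_c^2 - 1) gluon polarisations times colours.
    Fermions: g = 2 (spin) * 2 (quark/antiquark) * N_c * n_f, weighted 7/8."""
    g_b = 2 * (n_c**2 - 1)
    g_f = 2 * 2 * n_c * n_f
    return math.pi**2 / 30.0 * (g_b + 7.0 / 8.0 * g_f)

pure_gauge = sb_limit(0)    # gluons only (n_f = 0)
three_flavor = sb_limit(3)  # n_f = 2+1 idealised as three massless flavors
```

    These limiting values (about 5.3 for pure gauge and 15.6 for three flavors) are the plateaus the lattice curves approach at high temperature.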

  1. The Precautionary Principle and statistical approaches to uncertainty

    DEFF Research Database (Denmark)

    Keiding, Niels; Budtz-Jørgensen, Esben

    2004-01-01

    is unhelpful, because lack of significance can be due either to uninformative data or to genuine lack of effect (the Type II error problem). Its inversion, bioequivalence testing, might sometimes be a model for the Precautionary Principle in its ability to "prove the null hypothesis". Current procedures...... for setting safe exposure levels are essentially derived from these classical statistical ideas, and we outline how uncertainties in the exposure and response measurements affect the no observed adverse effect level, the Benchmark approach and the "Hockey Stick" model. A particular problem concerns model...

  2. Using Uncertainty Principle to Find the Ground-State Energy of the Helium and a Helium-like Hookean Atom

    Science.gov (United States)

    Harbola, Varun

    2011-01-01

    In this paper, we accurately estimate the ground-state energy and the atomic radius of the helium atom and a helium-like Hookean atom by employing the uncertainty principle in conjunction with the variational approach. We show that with the use of the uncertainty principle, electrons are found to be spread over a radial region, giving an electron…
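The textbook version of this kind of estimate uses a hydrogenic trial wavefunction with an effective nuclear charge Z_eff = Z - 5/16, which gives a ground-state energy E = -2(Z - 5/16)² Ry for a helium-like ion. This is the standard variational result, not necessarily the exact scheme of the paper; the function and constant names are illustrative.

```python
RY_EV = 13.6057  # Rydberg energy in eV

def helium_like_ground_state(Z):
    """Standard variational estimate for a two-electron (helium-like) ion:
    hydrogenic trial wavefunction with screening correction 5/16."""
    z_eff = Z - 5.0 / 16.0
    return -2.0 * z_eff ** 2 * RY_EV

E_he = helium_like_ground_state(2)  # about -77.5 eV (experiment: -79.0 eV)
```

For Z = 2 this yields about -77.5 eV, within 2% of the experimental -79.0 eV, and the corresponding atomic radius estimate is a₀/Z_eff.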

  3. Generalized uncertainty principle, quantum gravity and Horava-Lifshitz gravity

    International Nuclear Information System (INIS)

    Myung, Yun Soo

    2009-01-01

    We investigate a close connection between the generalized uncertainty principle (GUP) and deformed Horava-Lifshitz (HL) gravity. The GUP commutation relations correspond to the UV quantum theory, while the canonical commutation relations represent the IR quantum theory. Inspired by this UV/IR quantum mechanics, we obtain the GUP-corrected graviton propagator by introducing the UV momentum p_i = p_{0i}(1 + β p_0^2) and compare this with tensor propagators in HL gravity. The two agree up to p_0^4 order.
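The claimed agreement up to p_0^4 order can be checked with elementary algebra: with p = p_0(1 + βp_0²), the inverse propagator becomes p² = p_0² + 2βp_0⁴ + O(p_0⁶), so 1/p² ≈ 1/p_0² - 2β with a residual that scales as p_0². A minimal numeric sketch (variable names are illustrative):

```python
def deformed_p(p0, beta):
    """GUP-corrected momentum p_i = p0_i * (1 + beta * p0^2), as in the record."""
    return p0 * (1.0 + beta * p0 * p0)

beta = 0.01
residuals = []
for p0 in (0.1, 0.05):
    p = deformed_p(p0, beta)
    exact = 1.0 / p ** 2                 # deformed propagator denominator, inverted
    approx = 1.0 / p0 ** 2 - 2.0 * beta  # expansion valid through p0^4 order
    residuals.append(abs(exact - approx))
# halving p0 shrinks the residual roughly 4x, confirming the p0^4-order agreement
```

The residual falling by a factor of about four when p_0 is halved is the numerical signature that the neglected terms start at p_0² relative order, i.e. p_0⁴ in the denominator.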

  4. Reducing Uncertainty: Implementation of Heisenberg Principle to Measure Company Performance

    Directory of Open Access Journals (Sweden)

    Anna Svirina

    2015-08-01

    Full Text Available The paper addresses the problem of uncertainty reduction in estimating future company performance, which results from the wide range of probable efficiencies of an enterprise's intangible assets. To reduce this uncertainty, the paper suggests using quantum economy principles, i.e. applying the Heisenberg principle to measure the efficiency and potential of a company's intangible assets. It is proposed that for intangibles it is not possible to estimate both potential and efficiency at a given time point. To support this thesis, data on resource potential and efficiency from mid-Russian companies was evaluated within a deterministic approach, which did not allow the probability of achieving a certain resource efficiency to be evaluated, and within a quantum approach, which allowed estimation of the central point around which the probable efficiency of resources is concentrated. Visualization of these approaches was performed by means of LabView software. It was shown that for estimating the performance of tangible assets a deterministic approach should be used, while for intangible assets the quantum approach yields better predictions of future performance. On the basis of these findings we propose a holistic approach to estimating company resource efficiency in order to reduce uncertainty in modelling company performance.

  5. Unconditional security of quantum key distribution and the uncertainty principle

    International Nuclear Information System (INIS)

    Koashi, Masato

    2006-01-01

    An approach to the unconditional security of quantum key distribution protocols is presented, which is based on the uncertainty principle. The approach applies to every case that has been treated via the argument by Shor and Preskill, but it is not necessary to find quantum error correcting codes. It can also treat the cases with uncharacterized apparatuses. The proof can be applied to cases where the secret key rate is larger than the distillable entanglement.

  6. Generalised anxiety disorder

    Directory of Open Access Journals (Sweden)

    Bojana Avguštin Avčin

    2013-10-01

    Full Text Available Generalised anxiety disorder is characterised by persistent, excessive and difficult-to-control worry, which may be accompanied by several psychic and somatic symptoms, including suicidality. Generalised anxiety disorder is the most common psychiatric disorder in primary care, although it is often underrecognised and undertreated. It is typically a chronic condition with low short- and medium-term remission rates. Clinical presentations often include depression, somatic illness, pain, fatigue and problems sleeping. The evaluation of prognosis is complicated by frequent comorbidity with other anxiety disorders and depression, which worsen the long-term outcome and the accompanying burden of disability. The two main treatments for generalised anxiety disorder are medication and psychotherapy. Selective serotonin reuptake inhibitors and serotonin-norepinephrine reuptake inhibitors represent the first-line psychopharmacologic treatment. The most extensively studied psychotherapy for anxiety is cognitive behavioural therapy, which has demonstrated efficacy in controlled studies.

  7. A Simplified Proof of Uncertainty Principle for Quaternion Linear Canonical Transform

    Directory of Open Access Journals (Sweden)

    Mawardi Bahri

    2016-01-01

    Full Text Available We provide a short and simple proof of an uncertainty principle associated with the quaternion linear canonical transform (QLCT by considering the fundamental relationship between the QLCT and the quaternion Fourier transform (QFT. We show how this relation allows us to derive the inverse transform and Parseval and Plancherel formulas associated with the QLCT. Some other properties of the QLCT are also studied.

  8. Covariant energy–momentum and an uncertainty principle for general relativity

    Energy Technology Data Exchange (ETDEWEB)

    Cooperstock, F.I., E-mail: cooperst@uvic.ca [Department of Physics and Astronomy, University of Victoria, P.O. Box 3055, Victoria, B.C. V8W 3P6 (Canada); Dupre, M.J., E-mail: mdupre@tulane.edu [Department of Mathematics, Tulane University, New Orleans, LA 70118 (United States)

    2013-12-15

    We introduce a naturally-defined totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. The extension links seamlessly to the action integral for the gravitational field. The demand that the general expression for arbitrary systems reduces to the Tolman integral in the case of stationary bounded distributions, leads to the matter-localized Ricci integral for energy–momentum in support of the energy localization hypothesis. The role of the observer is addressed and as an extension of the special relativistic case, the field of observers comoving with the matter is seen to compute the intrinsic global energy of a system. The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. It is suggested that in the extreme of strong gravity, the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum. -- Highlights: •We present a totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. •Demand for the general expression to reduce to the Tolman integral for stationary systems supports the Ricci integral as energy–momentum. •Localized energy via the Ricci integral is consistent with the energy localization hypothesis. •New localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. •Suggest the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum in strong gravity extreme.

  9. Vacuum thermalization of high intensity laser beams and the uncertainty principle

    International Nuclear Information System (INIS)

    Gupta, R.P.; Bhakar, B.S.; Panarella, E.

    1983-01-01

    This chapter phenomenologically calculates the cross section for photon-photon scattering in high intensity laser beams. The consequences of the Heisenberg uncertainty principle must be taken into account in any photon-photon scattering calculation when many photons are present within the uncertainty volume. An exact determination of the number of scattering centers present in the scattering region is precluded when high intensity laser beams are involved in the scattering. Predictions are presented which suggest an upper limit to which coherent photon densities can be increased, either during amplification or focusing, before scattering becomes predominant. The results of multiphoton ionization of gases, and of laser-induced CTR plasmas in the future, may be significantly affected by the enhancement of the photon scattering investigated

  10. Principles and applications of measurement and uncertainty analysis in research and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  12. Wagner’s theory of generalised heaps

    CERN Document Server

    Hollings, Christopher D

    2017-01-01

    The theories of V. V. Wagner (1908-1981) on abstractions of systems of binary relations are presented here within their historical and mathematical contexts. This book contains the first translation from Russian into English of a selection of Wagner’s papers, the ideas of which are connected to present-day mathematical research. Along with a translation of Wagner’s main work in this area, his 1953 paper ‘Theory of generalised heaps and generalised groups,’ the book also includes translations of three short precursor articles that provide additional context for his major work. Researchers and students interested in both algebra (in particular, heaps, semiheaps, generalised heaps, semigroups, and groups) and differential geometry will benefit from the techniques offered by these translations, owing to the natural connections between generalised heaps and generalised groups, and the role played by these concepts in differential geometry. This book gives examples from present-day mathematics where ideas r...

  13. Quantum generalisation of feedforward neural networks

    Science.gov (United States)

    Wan, Kwok Ho; Dahlsten, Oscar; Kristjánsson, Hlér; Gardner, Robert; Kim, M. S.

    2017-09-01

    We propose a quantum generalisation of a classical neural network. The classical neurons are firstly rendered reversible by adding ancillary bits. Then they are generalised to being quantum reversible, i.e., unitary (the classical networks we generalise are called feedforward, and have step-function activation functions). The quantum network can be trained efficiently using gradient descent on a cost function to perform quantum generalisations of classical tasks. We demonstrate numerically that it can: (i) compress quantum states onto a minimal number of qubits, creating a quantum autoencoder, and (ii) discover quantum communication protocols such as teleportation. Our general recipe is theoretical and implementation-independent. The quantum neuron module can naturally be implemented photonically.

  14. An ontological system for interoperable spatial generalisation in biodiversity monitoring

    Science.gov (United States)

    Nieland, Simon; Moran, Niklas; Kleinschmit, Birgit; Förster, Michael

    2015-11-01

    Semantic heterogeneity remains a barrier to data comparability and standardisation of results in different fields of spatial research. Because of its thematic complexity, differing acquisition methods and national nomenclatures, interoperability of biodiversity monitoring information is especially difficult. Since data collection methods and interpretation manuals broadly vary there is a need for automatised, objective methodologies for the generation of comparable data-sets. Ontology-based applications offer vast opportunities in data management and standardisation. This study examines two data-sets of protected heathlands in Germany and Belgium which are based on remote sensing image classification and semantically formalised in an OWL2 ontology. The proposed methodology uses semantic relations of the two data-sets, which are (semi-)automatically derived from remote sensing imagery, to generate objective and comparable information about the status of protected areas by utilising kernel-based spatial reclassification. This automatised method suggests a generalisation approach, which is able to generate delineation of Special Areas of Conservation (SAC) of the European biodiversity Natura 2000 network. Furthermore, it is able to transfer generalisation rules between areas surveyed with varying acquisition methods in different countries by taking into account automated inference of the underlying semantics. The generalisation results were compared with the manual delineation of terrestrial monitoring. For the different habitats in the two sites an accuracy of above 70% was detected. However, it has to be highlighted that the delineation of the ground-truth data inherits a high degree of uncertainty, which is discussed in this study.

  15. An environmental generalised Luenberger-Hicks-Moorsteen productivity indicator and an environmental generalised Hicks-Moorsteen productivity index.

    Science.gov (United States)

    Abad, A

    2015-09-15

    The purpose of this paper is to introduce an environmental generalised productivity indicator and its ratio-based counterpart. The innovative environmental generalised total factor productivity measures inherit the basic structure of both Hicks-Moorsteen productivity index and Luenberger-Hicks-Moorsteen productivity indicator. This methodological contribution shows that these new environmental generalised total factor productivity measures yield the earlier standard Hicks-Moorsteen index and Luenberger-Hicks-Moorsteen indicator, as well as environmental performance index, as special cases. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. A generalised Dynamic Overflow Risk Assessment (DORA) for Real Time Control of urban drainage systems

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Grum, Morten

    2014-01-01

    An innovative and generalised approach to the integrated Real Time Control of urban drainage systems is presented. The Dynamic Overflow Risk Assessment (DORA) strategy aims to minimise the expected Combined Sewer Overflow (CSO) risk by considering (i) the water volume presently stored in the drainage ...... and their uncertainty contributed to further improving the performance of drainage systems. The results of this paper will contribute to the wider usage of global RTC methods in the management of urban drainage networks....

  17. The most general form of deformation of the Heisenberg algebra from the generalized uncertainty principle

    Energy Technology Data Exchange (ETDEWEB)

    Masood, Syed [Department of Physics, International Islamic University, H-10 Sector, Islamabad (Pakistan); Faizal, Mir, E-mail: mirfaizalmir@gmail.com [Irving K. Barber School of Arts and Sciences, University of British Columbia – Okanagan, Kelowna, BC V1V 1V7 (Canada); Department of Physics and Astronomy, University of Lethbridge, Lethbridge, AB T1K 3M4 (Canada); Zaz, Zaid [Department of Electronics and Communication Engineering, University of Kashmir, Srinagar, Kashmir, 190006 (India); Ali, Ahmed Farag [Department of Physics, Faculty of Science, Benha University, Benha, 13518 (Egypt); Raza, Jamil [Department of Physics, International Islamic University, H-10 Sector, Islamabad (Pakistan); Shah, Mushtaq B. [Department of Physics, National Institute of Technology, Srinagar, Kashmir, 190006 (India)

    2016-12-10

    In this paper, we propose the most general form of the deformation of the Heisenberg algebra motivated by the generalized uncertainty principle. This deformation of the Heisenberg algebra will deform all quantum mechanical systems. The form of the generalized uncertainty principle used here is itself motivated by space-fractional quantum mechanics and by non-locality in quantum mechanical systems. We also analyse a specific limit of this generalized deformation for one-dimensional systems; in that limit, a nonlocal deformation of the momentum operator generates a local deformation of all one-dimensional quantum mechanical systems. We analyse the low energy effects of this deformation on a harmonic oscillator, Landau levels, the Lamb shift, and a potential barrier. We also demonstrate that this deformation leads to a discretization of space.
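For orientation, the simplest deformation in this family is often realized (an assumption here, following the common Kempf-style representation rather than the paper's most general form) by keeping x canonical and deforming the momentum as p = p_0(1 + βp_0²). The induced commutator follows in two lines:

```latex
\begin{align}
  [x, p_0] &= i\hbar, \qquad x = x_0, \qquad p = p_0\,(1 + \beta p_0^2),\\
  [x, p_0^3] &= p_0^2\,[x, p_0] + p_0\,[x, p_0]\,p_0 + [x, p_0]\,p_0^2 = 3 i\hbar\, p_0^2,\\
  [x, p] &= [x, p_0] + \beta\,[x, p_0^3]
          = i\hbar\left(1 + 3\beta p_0^2\right)
          \approx i\hbar\left(1 + 3\beta p^2\right) + O(\beta^2).
\end{align}
```

This reproduces the familiar minimal-length-type algebra [x, p] = iℏ(1 + β'p²) with β' = 3β, which is the leading-order case that the paper's more general (nonlocal) deformation extends.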

  18. The most general form of deformation of the Heisenberg algebra from the generalized uncertainty principle

    International Nuclear Information System (INIS)

    Masood, Syed; Faizal, Mir; Zaz, Zaid; Ali, Ahmed Farag; Raza, Jamil; Shah, Mushtaq B.

    2016-01-01

    In this paper, we propose the most general form of the deformation of the Heisenberg algebra motivated by the generalized uncertainty principle. This deformation of the Heisenberg algebra will deform all quantum mechanical systems. The form of the generalized uncertainty principle used here is itself motivated by space-fractional quantum mechanics and by non-locality in quantum mechanical systems. We also analyse a specific limit of this generalized deformation for one-dimensional systems; in that limit, a nonlocal deformation of the momentum operator generates a local deformation of all one-dimensional quantum mechanical systems. We analyse the low energy effects of this deformation on a harmonic oscillator, Landau levels, the Lamb shift, and a potential barrier. We also demonstrate that this deformation leads to a discretization of space.

  19. Generalized uncertainty principle and the maximum mass of ideal white dwarfs

    Energy Technology Data Exchange (ETDEWEB)

    Rashidi, Reza, E-mail: reza.rashidi@srttu.edu

    2016-11-15

    The effects of a generalized uncertainty principle on the structure of an ideal white dwarf star are investigated. The equation describing the equilibrium configuration of the star is a generalized form of the Lane–Emden equation. It is proved that the star always has a finite size. It is then argued that the maximum mass of such an ideal white dwarf tends to infinity, as opposed to the conventional case where it has a finite value.
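The conventional starting point that the paper generalizes is the standard Lane-Emden equation; for the relativistic polytrope n = 3 relevant to the Chandrasekhar mass, its first zero ξ₁ ≈ 6.897 fixes the (finite) stellar radius. A self-contained RK4 integration of the undeformed equation (the GUP-modified form is not reproduced here):

```python
def lane_emden_first_zero(n=3.0, h=1e-3):
    """Integrate the standard Lane-Emden equation
        theta'' + (2/xi) theta' + theta^n = 0,  theta(0)=1, theta'(0)=0,
    with RK4 and return the first zero xi_1 (the dimensionless radius)."""
    def rhs(xi, theta, dtheta):
        t = max(theta, 0.0)          # guard against tiny negative overshoot
        return dtheta, -(t ** n) - 2.0 * dtheta / xi
    # series expansion near xi = 0 to step off the coordinate singularity
    xi, theta, dtheta = h, 1.0 - h * h / 6.0, -h / 3.0
    while theta > 0.0:
        k1 = rhs(xi, theta, dtheta)
        k2 = rhs(xi + h / 2, theta + h / 2 * k1[0], dtheta + h / 2 * k1[1])
        k3 = rhs(xi + h / 2, theta + h / 2 * k2[0], dtheta + h / 2 * k2[1])
        k4 = rhs(xi + h, theta + h * k3[0], dtheta + h * k3[1])
        theta += h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        dtheta += h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
        xi += h
    return xi

xi1 = lane_emden_first_zero()  # ~6.897 for n = 3
```

In the conventional treatment this finite ξ₁, combined with the n = 3 polytropic relations, is what produces the finite Chandrasekhar maximum mass; the paper's claim is that the GUP-generalized equation changes that conclusion.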

  20. Lorentz violation and generalized uncertainty principle

    Science.gov (United States)

    Lambiase, Gaetano; Scardigli, Fabio

    2018-04-01

    Investigations of possible violations of Lorentz invariance have been widely pursued in the last decades, both from the theoretical and the experimental side. A comprehensive framework to formulate the problem is the standard model extension (SME) proposed by A. Kostelecky, where violation of Lorentz invariance is encoded into specific coefficients. Here we present a procedure to link the deformation parameter β of the generalized uncertainty principle to the SME coefficients of the gravity sector. The idea is to compute the Hawking temperature of a black hole in two different ways. The first way involves the deformation parameter β, and therefore we get a deformed Hawking temperature containing the parameter β. The second way involves a deformed Schwarzschild metric containing the Lorentz violating terms s̄_{μν} of the gravity sector of the SME. The comparison between the two different techniques yields a relation between β and s̄_{μν}. In this way bounds on β transferred from s̄_{μν} are improved by many orders of magnitude when compared with those derived in other gravitational frameworks. The opposite possibility of bounds transferred from β to s̄_{μν} is also briefly discussed.
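Both calculations in the record start from the undeformed Hawking temperature T_H = ℏc³/(8πGMk_B). The sketch below evaluates this baseline in SI units; the GUP- and SME-deformed versions multiply it by β- or s̄-dependent corrections whose conventions vary, so only the standard formula is implemented (constant and function names are illustrative).

```python
import math

# CODATA SI values
HBAR = 1.054571817e-34   # J s
C = 2.99792458e8         # m/s
G = 6.67430e-11          # m^3 kg^-1 s^-2
KB = 1.380649e-23        # J/K
M_SUN = 1.98892e30       # kg

def hawking_temperature(M):
    """Standard (undeformed) Hawking temperature of a Schwarzschild black hole."""
    return HBAR * C ** 3 / (8.0 * math.pi * G * M * KB)

T_sun = hawking_temperature(M_SUN)  # ~6.2e-8 K for a solar-mass black hole
```

The tiny value (tens of nanokelvin for a solar mass) illustrates why any β- or s̄_{μν}-dependent deformation of T_H is far beyond direct measurement, and why the bounds discussed in the record are obtained by comparing formulas rather than observations of actual black hole temperatures.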

  1. Risk analysis under uncertainty, the precautionary principle, and the new EU chemicals strategy.

    Science.gov (United States)

    Rogers, Michael D

    2003-06-01

    Three categories of uncertainty in relation to risk assessment are defined: uncertainty in effect, uncertainty in cause, and uncertainty in the relationship between a hypothesised cause and effect. The Precautionary Principle (PP) relates to the third type of uncertainty. Three broad readings of the PP are set out: uncertainty justifies action, uncertainty requires action, and uncertainty requires a reversal of the burden of proof for risk assessments. The application of the PP is controversial, but what matters in practice is the precautionary action (PA) that follows. The criteria by which PAs should be judged are detailed. This framework for risk assessment and management under uncertainty is then applied to the envisaged European system for the regulation of chemicals. A new EU regulatory system has been proposed which shifts the burden of proof concerning risk assessments from the regulator to the producer, and embodies the PP in all three of its main regulatory stages. The proposals are critically discussed in relation to three chemicals: atrazine (an endocrine disrupter), cadmium (toxic and possibly carcinogenic), and hydrogen fluoride (a toxic, high-production-volume chemical). Reversing the burden of proof will speed up the regulatory process, but the examples demonstrate that applying the PP appropriately, and balancing the countervailing risks and the socio-economic benefits, will continue to be a difficult task for the regulator. The paper concludes with a discussion of the role of precaution in the management of change and of the importance of trust in the effective regulation of uncertain risks.

  2. Blockchain to Rule the Waves - Nascent Design Principles for Reducing Risk and Uncertainty in Decentralized Environments

    DEFF Research Database (Denmark)

    Nærland, Kristoffer; Müller-Bloch, Christoph; Beck, Roman

    2017-01-01

    Many decentralized, inter-organizational environments such as supply chains are characterized by high transactional uncertainty and risk. At the same time, blockchain technology promises to mitigate these issues by introducing certainty into economic transactions. This paper discusses the findings...... of a Design Science Research project involving the construction and evaluation of an information technology artifact in collaboration with Maersk, a leading international shipping company, where central documents in shipping, such as the Bill of Lading, are turned into a smart contract on blockchain. Based...... on our insights from the project, we provide first evidence for preliminary design principles for applications that aim to mitigate the transactional risk and uncertainty in decentralized environments using blockchain. Both the artifact and the first evidence for emerging design principles are novel...

  3. Completeness, special functions and uncertainty principles over q-linear grids

    International Nuclear Information System (INIS)

    Abreu, LuIs Daniel

    2006-01-01

    We derive completeness criteria for sequences of functions of the form f(xλ_n), where λ_n is the nth zero of a suitably chosen entire function. Using these criteria, we construct complete nonorthogonal systems of Fourier-Bessel functions and their q-analogues, as well as other complete sets of q-special functions. We discuss connections with uncertainty principles over q-linear grids, and the completeness of certain sets of q-Bessel functions is used to prove that, if a function f and its q-Hankel transform both vanish at the points {q^{-n}}_{n=1}^∞, 0 < q < 1, then f vanishes on the q-linear grid {q^n}_{n=-∞}^∞

  4. On the connection between complementarity and uncertainty principles in the Mach–Zehnder interferometric setting

    International Nuclear Information System (INIS)

    Bosyk, G M; Portesi, M; Holik, F; Plastino, A

    2013-01-01

    We revisit the connection between the complementarity and uncertainty principles of quantum mechanics within the framework of Mach–Zehnder interferometry. We focus our attention on the trade-off relation between complementary path information and fringe visibility. This relation is equivalent to the uncertainty relation of Schrödinger and Robertson for a suitably chosen pair of observables. We show that it is equivalent as well to the uncertainty inequality provided by Landau and Pollak. We also study the relationship of this trade-off relation with a family of entropic uncertainty relations based on Rényi entropies. There is no equivalence in this case, but the different values of the entropic parameter do define regimes that provide us with a tool to discriminate between non-trivial states of minimum uncertainty. The existence of such regimes agrees with previous results of Luis (2011 Phys. Rev. A 84 034101), although their meaning was not sufficiently clear. We discuss the origin of these regimes with the intention of gaining a deeper understanding of entropic measures. (paper)
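The entropic-parameter dependence discussed above can be made concrete with the Rényi entropy of a two-outcome (e.g. which-path) distribution, H_α(p) = ln(p^α + (1-p)^α)/(1-α), which recovers the Shannon entropy as α → 1 and is non-increasing in α. A minimal sketch (names are illustrative, not the paper's notation):

```python
import math

def renyi_binary(p, alpha):
    """Renyi entropy (natural log) of the two-outcome distribution (p, 1-p)."""
    if abs(alpha - 1.0) < 1e-12:  # Shannon limit alpha -> 1
        return -sum(x * math.log(x) for x in (p, 1.0 - p) if x > 0)
    return math.log(p ** alpha + (1.0 - p) ** alpha) / (1.0 - alpha)

p = 0.8
h_half, h1, h2 = (renyi_binary(p, a) for a in (0.5, 1.0, 2.0))
# Renyi entropies are non-increasing in alpha: h_half >= h1 >= h2
```

Different α thus weight the distribution differently (small α emphasizes the support, large α the dominant outcome), which is what allows different parameter values to define the distinct minimum-uncertainty regimes the record refers to.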

  5. An Inconvenient Deliberation. The Precautionary Principle's Contribution to the Uncertainties Surrounding Climate Change Liability

    International Nuclear Information System (INIS)

    Haritz, M.M.

    2011-01-01

    There is increasing evidence to suggest that adaptation to the inevitable is as relevant to climate change policymaking as mitigation efforts. Both mitigation and adaptation, as well as the unavoidable damage occurring now and predicted to occur, involve costs at the expense of diverse climate change victims. The allocation of responsibilities - implicit in terms of the burden-sharing mechanisms that currently exist in public and private governance - demands recourse under liability law, especially as it has become clear that most companies will only start reducing emissions if verifiable costs of the economic consequences of climate change, including the likelihood of liability, outweigh the costs of taking precautionary measures. This vitally important book asks: Can the precautionary principle make uncertainty justiciable in the context of liability for the consequences of climate change, and, if so, to what extent? Drawing on the full range of pertinent existing literature and case law, the author examines the precautionary principle both in terms of its content and application and in the context of liability law. She analyses the indirect means offered by existing legislation being used by environmental groups and affected individuals before the courts to challenge both companies and regulators as responsible agents of climate change damage. In the process of responding to its fundamental question, the analysis explores such further questions as the following: (a) What is the role of the precautionary principle in resolving uncertainty in scientific risk assessment when faced with inconclusive evidence, and how does it affect decision-making, particularly in regulatory choices concerning climate change? To this end, what is the concrete content of the precautionary principle? (b) How does liability law generally handle scientific uncertainty? What different types of liability exist, and how are they equipped to handle a climate change

  6. Primary small bowel anastomosis in generalised peritonitis

    NARCIS (Netherlands)

    deGraaf, JS; van Goor, Harry; Bleichrodt, RP

    Objective: To find out whether primary small bowel anastomosis is safe in patients with generalised peritonitis who are treated by planned relaparotomies. Design: Retrospective study. Setting: University hospital, The Netherlands. Subjects: 10 patients with generalised purulent peritonitis

  7. When the uncertainty principle goes up to 11 or how to explain quantum physics with heavy metal

    CERN Document Server

    Moriarty, Philip

    2018-01-01

    There are deep and fascinating links between heavy metal and quantum physics. No, there are. Really. While teaching at the University of Nottingham, physicist Philip Moriarty noticed something odd--a surprising number of his students were heavily into metal music. Colleagues, too: a Venn diagram of physicists and metal fans would show a shocking amount of overlap. What's more, it turns out that heavy metal music is uniquely well-suited to explaining quantum principles. In When the Uncertainty Principle Goes Up to Eleven, Moriarty explains the mysteries of the universe's inner workings via drum beats and feedback: You'll discover how the Heisenberg uncertainty principle comes into play with every chugging guitar riff, what wave interference has to do with Iron Maiden, and why metalheads in mosh pits behave just like molecules in a gas. If you're a metal fan trying to grasp the complexities of quantum physics, a quantum physicist baffled by heavy metal, or just someone who'd like to know how the fundamental sci...

  8. Cloverleaf skull with generalised bone dysplasia

    International Nuclear Information System (INIS)

    Kozlowski, K.; Warren, P.S.; Fisher, C.C.; Royal Hospital for Women, Camperdown

    1985-01-01

    A case of cloverleaf skull with generalised bone dysplasia is reported. The authors believe that the bone dysplasia associated with cloverleaf skull is identical neither with thanatophoric dysplasia nor with achondroplasia. Until the identity of thanatophoric dysplasia and of cloverleaf skull with generalised bone dysplasia is proved, the diseases should be looked upon as separate entities and the wording ''thanatophoric dysplasia with cloverleaf skull'' should be abolished. (orig.)

  9. Cloverleaf skull with generalised bone dysplasia

    Energy Technology Data Exchange (ETDEWEB)

    Kozlowski, K.; Warren, P.S.; Fisher, C.C.

    1985-09-01

    A case of cloverleaf skull with generalised bone dysplasia is reported. The authors believe that the bone dysplasia associated with cloverleaf skull is identical neither with thanatophoric dysplasia nor with achondroplasia. Until the identity of thanatophoric dysplasia and of cloverleaf skull with generalised bone dysplasia is proved, the diseases should be looked upon as separate entities and the wording ''thanatophoric dysplasia with cloverleaf skull'' should be abolished.

  10. The energy-time uncertainty principle and the EPR paradox: Experiments involving correlated two-photon emission in parametric down-conversion

    Science.gov (United States)

    Chiao, Raymond Y.; Kwiat, Paul G.; Steinberg, Aephraim M.

    1992-01-01

    The energy-time uncertainty principle is on a different footing than the momentum-position uncertainty principle: in contrast to position, time is a c-number parameter, and not an operator. As Aharonov and Bohm have pointed out, this leads to different interpretations of the two uncertainty principles. In particular, one must distinguish between an inner and an outer time in the definition of the spread in time, delta t. It is the inner time which enters the energy-time uncertainty principle. We have checked this by means of a correlated two-photon light source in which the individual energies of the two photons are broad in spectrum, but in which their sum is sharp. In other words, the pair of photons is in an entangled state of energy. By passing one member of the photon pair through a filter with width delta E, it is observed that the other member's wave packet collapses upon coincidence detection to a duration delta t, such that delta E times delta t is approximately equal to Planck's constant divided by 2 pi, where this duration delta t is an inner time, in the sense of Aharonov and Bohm. We have measured delta t by means of a Michelson interferometer by monitoring the visibility of the fringes seen in coincidence detection. This is a nonlocal effect, in the sense that the two photons are far away from each other when the collapse occurs. We have excluded classical-wave explanations of this effect by means of triple coincidence measurements in conjunction with a beam splitter which follows the Michelson interferometer. Since Bell's inequalities are known to be violated, we believe that it is also incorrect to interpret this experimental outcome as if energy were a local hidden variable, i.e., as if each photon, viewed as a particle, possessed some definite but unknown energy before its detection.
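    As a hedged numerical illustration of the relation delta E · delta t ≈ h/2π described above (the filter bandwidth below is a hypothetical value, not the experiment's):

```python
# Illustrative estimate of the collapsed wave-packet duration delta_t implied
# by a filter bandwidth delta_E via delta_E * delta_t ≈ hbar. The 1 meV
# bandwidth is an assumed example value, not taken from the experiment.
import math

h = 6.62607015e-34          # Planck's constant, J*s
hbar = h / (2 * math.pi)    # reduced Planck constant
eV = 1.602176634e-19        # joules per electron volt

def collapse_duration(delta_E_eV):
    """Duration delta_t (seconds) such that delta_E * delta_t ≈ hbar."""
    return hbar / (delta_E_eV * eV)

# A hypothetical 1 meV bandwidth gives a sub-picosecond duration.
print(collapse_duration(1e-3))
```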

  11. Generalised structures for N=1 AdS backgrounds

    Energy Technology Data Exchange (ETDEWEB)

    Coimbra, André [Institut für Theoretische Physik & Center for Quantum Engineering and Spacetime Research,Leibniz Universität Hannover,Appelstraße 2, 30167 Hannover (Germany); Strickland-Constable, Charles [Institut de physique théorique, Université Paris Saclay, CEA, CNRS, Orme des Merisiers, F-91191 Gif-sur-Yvette (France)

    2016-11-16

    We expand upon a claim made in a recent paper [http://arxiv.org/abs/1411.5721] that generic minimally supersymmetric AdS backgrounds of warped flux compactifications of Type II and M theory can be understood as satisfying a straightforward weak integrability condition in the language of E{sub d(d)}×ℝ{sup +} generalised geometry. Namely, they are spaces admitting a generalised G-structure set by the Killing spinor and with constant singlet generalised intrinsic torsion.

  12. Data assimilation and uncertainty analysis of environmental assessment problems--an application of Stochastic Transfer Function and Generalised Likelihood Uncertainty Estimation techniques

    International Nuclear Information System (INIS)

    Romanowicz, Renata; Young, Peter C.

    2003-01-01

    Stochastic Transfer Function (STF) and Generalised Likelihood Uncertainty Estimation (GLUE) techniques are outlined and applied to an environmental problem concerned with marine dose assessment. The goal of both methods in this application is the estimation and prediction of the environmental variables, together with their associated probability distributions. In particular, they are used to estimate the amount of radionuclides transferred to marine biota from a given source: the British Nuclear Fuel Ltd (BNFL) repository plant in Sellafield, UK. The complexity of the processes involved, together with the large dispersion and scarcity of observations regarding radionuclide concentrations in the marine environment, require efficient data assimilation techniques. In this regard, the basic STF methods search for identifiable, linear model structures that capture the maximum amount of information contained in the data with a minimal parameterisation. They can be extended for on-line use, based on recursively updated Bayesian estimation and, although applicable to only constant or time-variable parameter (non-stationary) linear systems in the form used in this paper, they have the potential for application to non-linear systems using recently developed State Dependent Parameter (SDP) non-linear STF models. The GLUE-based methods, on the other hand, formulate the problem of estimation using a more general Bayesian approach, usually without prior statistical identification of the model structure. As a result, they are applicable to almost any linear or non-linear stochastic model, although they are much less efficient both computationally and in their use of the information contained in the observations. As expected in this particular environmental application, it is shown that the STF methods give much narrower confidence limits for the estimates due to their more efficient use of the information contained in the data. Exploiting Monte Carlo Simulation (MCS) analysis
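    The GLUE recipe described above can be sketched with a toy model. The observations, the one-parameter linear model, the prior range and the behavioural threshold below are all invented for illustration; only the overall procedure (Monte Carlo sampling, informal likelihood weighting, likelihood-weighted prediction limits) follows the method:

```python
# Minimal GLUE-style sketch (toy model, not the paper's marine dose model):
# sample a parameter from a prior, score each sample against observations with
# an informal likelihood, keep "behavioural" samples, and form
# likelihood-weighted prediction limits.
import random

random.seed(0)
obs = [2.1, 3.9, 6.2]               # hypothetical observations at t = 1, 2, 3

def model(k, t):                    # hypothetical one-parameter linear model
    return k * t

weighted = []
for _ in range(5000):
    k = random.uniform(0.5, 4.0)    # sample from a uniform prior
    sse = sum((model(k, t) - y) ** 2 for t, y in zip((1, 2, 3), obs))
    if sse < 2.0:                   # behavioural threshold (illustrative)
        # store (informal likelihood, prediction at t = 4)
        weighted.append((1.0 / (sse + 1e-9), model(k, 4)))

# Likelihood-weighted 5% / 95% prediction limits
weighted.sort(key=lambda p: p[1])
total = sum(L for L, _ in weighted)
cum, lower, upper = 0.0, None, None
for L, pred in weighted:
    cum += L / total
    if lower is None and cum >= 0.05:
        lower = pred
    if upper is None and cum >= 0.95:
        upper = pred
print(lower, upper)
```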

  13. Horizon Wavefunction of Generalized Uncertainty Principle Black Holes

    Directory of Open Access Journals (Sweden)

    Luciano Manfredi

    2016-01-01

    We study the Horizon Wavefunction (HWF) description of a Generalized Uncertainty Principle inspired metric that admits sub-Planckian black holes, where the black hole mass m is replaced by M = m(1 + (β/2)M_Pl²/m²). Considering the case of a wave-packet shaped by a Gaussian distribution, we compute the HWF and the probability P_BH that the source is a (quantum) black hole, that is, that it lies within its horizon radius. The case β < 0 is also studied, where a minimum in P_BH is encountered, thus meaning that every particle has some probability of decaying to a black hole. Furthermore, for sufficiently large β we find that every particle is a quantum black hole, in agreement with the intuitive effect of increasing β, which creates larger M and R_H terms. This is likely due to a “dimensional reduction” feature of the model, where the black hole characteristics for sub-Planckian black holes mimic those in (1+1) dimensions and the horizon size grows as R_H ~ M⁻¹.
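    A minimal numerical sketch of the mass relation discussed above, M = m(1 + (β/2)M_Pl²/m²), working in Planck units with the usual horizon radius R_H = 2M; the β and m values are illustrative choices, not the paper's:

```python
# GUP-corrected mass, in Planck units (M_Pl = 1). Illustrative values only.
def gup_mass(m, beta, m_pl=1.0):
    return m * (1.0 + 0.5 * beta * (m_pl / m) ** 2)

def horizon_radius(m, beta):
    # Schwarzschild-like horizon R_H = 2M in Planck units
    return 2.0 * gup_mass(m, beta)

# For beta > 0 the correction dominates at sub-Planckian m, so the horizon
# grows as m shrinks (R_H ~ 1/m), mimicking the (1+1)-dimensional behaviour
# noted in the abstract.
for m in (0.1, 1.0, 10.0):
    print(m, horizon_radius(m, beta=1.0))
```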

  14. Impulsivity modulates performance under response uncertainty in a reaching task.

    Science.gov (United States)

    Tzagarakis, C; Pellizzer, G; Rogers, R D

    2013-03-01

    We sought to explore the interaction of the impulsivity trait with response uncertainty. To this end, we used a reaching task (Pellizzer and Hedges in Exp Brain Res 150:276-289, 2003) where a motor response direction was cued at different levels of uncertainty (1 cue, i.e., no uncertainty, 2 cues or 3 cues). Data from 95 healthy adults (54 F, 41 M) were analysed. Impulsivity was measured using the Barratt Impulsiveness Scale version 11 (BIS-11). Behavioral variables recorded were reaction time (RT), errors of commission (referred to as 'early errors') and errors of precision. Data analysis employed generalised linear mixed models and generalised additive mixed models. For the early errors, there was an interaction of impulsivity with uncertainty and gender, with increased errors for high impulsivity in the one-cue condition for women and the three-cue condition for men. There was no effect of impulsivity on precision errors or RT. However, the analysis of the effect of RT and impulsivity on precision errors showed a different pattern for high versus low impulsives in the high uncertainty (3 cue) condition. In addition, there was a significant early error speed-accuracy trade-off for women, primarily in low uncertainty and a 'reverse' speed-accuracy trade-off for men in high uncertainty. These results extend those of past studies of impulsivity which help define it as a behavioural trait that modulates speed versus accuracy response styles depending on environmental constraints and highlight once more the importance of gender in the interplay of personality and behaviour.

  15. Uncertainty and Complementarity in Axiomatic Quantum Mechanics

    Science.gov (United States)

    Lahti, Pekka J.

    1980-11-01

    In this work an investigation of the uncertainty principle and the complementarity principle is carried through. A study of the physical content of these principles and their representation in the conventional Hilbert space formulation of quantum mechanics forms a natural starting point for this analysis. Thereafter a more general axiomatic framework for quantum mechanics is presented, namely, a probability function formulation of the theory. In this general framework two extra axioms are stated, reflecting the ideas of the uncertainty principle and the complementarity principle, respectively. The quantal features of these axioms are explicated. The sufficiency of the state system guarantees that the observables satisfying the uncertainty principle are unbounded and noncompatible. The complementarity principle implies a non-Boolean proposition structure for the theory. Moreover, nonconstant complementary observables are always noncompatible. The uncertainty principle and the complementarity principle, as formulated in this work, are mutually independent. Some order is thus brought into the confused discussion about the interrelations of these two important principles. A comparison of the present formulations of the uncertainty principle and the complementarity principle with the Jauch formulation of the superposition principle is also given. The mutual independence of the three fundamental principles of the quantum theory is hereby revealed.

  16. Acute generalised exanthematous pustulosis.

    Science.gov (United States)

    Criton, S; Sofia, B

    2001-01-01

    Acute generalised exanthematous pustulosis (AGEP) is a condition characterised by sudden onset of non-follicular aseptic pustules all over the body. It is distinct from pustular psoriasis, with characteristic morphology, histopathology and evolution.

  17. On the exceptional generalised Lie derivative for d≥7

    International Nuclear Information System (INIS)

    Rosabal, J.A.

    2015-01-01

    In this work we revisit the E_8×ℝ^+ generalised Lie derivative encoding the algebra of diffeomorphisms and gauge transformations of compactifications of M-theory on eight-dimensional manifolds, by extending certain features of the E_7×ℝ^+ one. Compared to its E_d×ℝ^+, d≤7 counterparts, a new term is needed for consistency. However, we find that no compensating parameters need to be introduced, but rather that the new term can be written in terms of the ordinary generalised gauge parameters by means of a connection. This implies that no further degrees of freedom, beyond those of the field content of the E_8 group, are needed to have a well defined theory. We discuss the implications of the structure of the E_8×ℝ^+ generalised transformation on the construction of the d=8 generalised geometry. Finally, we suggest how to lift the generalised Lie derivative to eleven dimensions.

  18. Application of a Bayesian/generalised least-squares method to generate correlations between independent neutron fission yield data

    International Nuclear Information System (INIS)

    Fiorito, L.; Diez, C.; Cabellos, O.; Stankovskiy, A.; Van den Eynde, G.; Labeau, P.E.

    2014-01-01

    Fission product yields are fundamental parameters for several nuclear engineering calculations and in particular for burn-up/activation problems. The impact of their uncertainties was widely studied in the past and evaluations were released, although still incomplete. Recently, the nuclear community expressed the need for full fission yield covariance matrices to produce inventory calculation results that take into account the complete uncertainty data. In this work, we studied and applied a Bayesian/generalised least-squares method for covariance generation, and compared the generated uncertainties to the original data stored in the JEFF-3.1.2 library. Then, we focused on the effect of fission yield covariance information on fission pulse decay heat results for thermal fission of 235U. Calculations were carried out using different codes (ACAB and ALEPH-2) after introducing the new covariance values. Results were compared with those obtained with the uncertainty data currently provided by the library. The uncertainty quantification was performed with the Monte Carlo sampling technique. Indeed, correlations between fission yields strongly affect the statistics of decay heat. (authors)
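    The core of a Bayesian/generalised least-squares update of the kind applied above can be sketched with toy numbers. The yields, the sum-rule constraint and the uncertainties below are invented for illustration; only the update formulas are the standard GLS ones:

```python
# Bayesian/GLS update sketch: prior values x with covariance C are updated
# against a measurement y = G x + noise (covariance V). A sum-rule constraint
# on two toy "yields" generates the off-diagonal (correlation) terms.
import numpy as np

x = np.array([0.6, 0.4])            # prior values (toy yields)
C = np.diag([0.05**2, 0.05**2])     # prior covariance (initially uncorrelated)
G = np.array([[1.0, 1.0]])          # constraint: the yields sum to y
y = np.array([1.0])                 # measured sum
V = np.array([[0.01**2]])           # measurement covariance

K = C @ G.T @ np.linalg.inv(G @ C @ G.T + V)   # GLS gain
x_post = x + K @ (y - G @ x)                   # updated values
C_post = C - K @ G @ C                         # updated covariance

print(x_post)
# The off-diagonal elements of C_post are negative: the sum rule induces
# anti-correlation between the two yields.
print(C_post)
```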

  19. Acute generalised exanthematous pustulosis

    Directory of Open Access Journals (Sweden)

    Criton S

    2001-01-01

    Acute generalised exanthematous pustulosis (AGEP) is a condition characterised by sudden onset of non-follicular aseptic pustules all over the body. It is distinct from pustular psoriasis, with characteristic morphology, histopathology and evolution.

  20. Further Generalisations of Twisted Gabidulin Codes

    DEFF Research Database (Denmark)

    Puchinger, Sven; Rosenkilde, Johan Sebastian Heesemann; Sheekey, John

    2017-01-01

    We present a new family of maximum rank distance (MRD) codes. The new class contains codes that are neither equivalent to a generalised Gabidulin nor to a twisted Gabidulin code, the only two known general constructions of linear MRD codes.

  1. Generalisability of a composite student selection programme

    DEFF Research Database (Denmark)

    O'Neill, Lotte Dyhrberg; Korsholm, Lars; Wallstedt, Birgitta

    2009-01-01

    format); general knowledge (multiple-choice test), and a semi-structured admission interview. The aim of this study was to estimate the generalisability of a composite selection. METHODS: Data from 307 applicants who participated in the admission to medicine in 2007 were available for analysis. Each admission parameter was double-scored using two random, blinded and independent raters. Variance components for applicant, rater and residual effects were estimated for a mixed model with the restricted maximum likelihood (REML) method. The reliability of obtained applicant ranks (G coefficients) was calculated for individual admission criteria and for composite admission procedures. RESULTS: A pre-selection procedure combining qualification and motivation scores showed insufficient generalisability (G = 0.45). The written motivation in particular displayed low generalisability (G = 0.10). Good...

  2. Simple regular black hole with logarithmic entropy correction

    Energy Technology Data Exchange (ETDEWEB)

    Morales-Duran, Nicolas; Vargas, Andres F.; Hoyos-Restrepo, Paulina; Bargueno, Pedro [Universidad de los Andes, Departamento de Fisica, Bogota, Distrito Capital (Colombia)

    2016-10-15

    A simple regular black hole solution satisfying the weak energy condition is obtained within Einstein-non-linear electrodynamics theory. We have computed the thermodynamic properties of this black hole by a careful analysis of the horizons and we have found that the usual Bekenstein-Hawking entropy gets corrected by a logarithmic term. Therefore, in this sense our model realises some quantum gravity predictions which add this kind of correction to the black hole entropy. In particular, we have established some similarities between our model and a quadratic generalised uncertainty principle. This similarity has been confirmed by the existence of a remnant, which prevents complete evaporation, in agreement with the quadratic generalised uncertainty principle case. (orig.)

  3. Generalised Brown Clustering and Roll-up Feature Generation

    DEFF Research Database (Denmark)

    Derczynski, Leon; Chester, Sean

    2016-01-01

    active set size. Moreover, the generalisation permits a novel approach to feature selection from Brown clusters: We show that the standard approach of shearing the Brown clustering output tree at arbitrary bitlengths is lossy and that features should be chosen instead by rolling up Generalised Brown...

  4. Generalised linear models for correlated pseudo-observations, with applications to multi-state models

    DEFF Research Database (Denmark)

    Andersen, Per Kragh; Klein, John P.; Rosthøj, Susanne

    2003-01-01

    Generalised estimating equation; Generalised linear model; Jackknife pseudo-value; Logistic regression; Markov Model; Multi-state model
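    The jackknife pseudo-value that underlies the pseudo-observation approach is θᵢ = n·θ̂ − (n−1)·θ̂₋ᵢ, where θ̂₋ᵢ is the estimate with subject i left out. A minimal sketch using the sample mean as the estimator (toy data; real applications use, e.g., state-occupation probabilities from a multi-state model):

```python
# Jackknife pseudo-values for a simple estimator (the sample mean).
# For the mean, the pseudo-values recover the individual observations exactly,
# which is what makes them usable as per-subject "observations" in a GLM/GEE.
data = [2.0, 4.0, 6.0, 8.0]
n = len(data)
theta = sum(data) / n                        # full-sample estimate

pseudo = []
for i in range(n):
    loo = [x for j, x in enumerate(data) if j != i]
    theta_i = sum(loo) / (n - 1)             # leave-one-out estimate
    pseudo.append(n * theta - (n - 1) * theta_i)

print(pseudo)
```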

  5. Supersymmetric backgrounds, the Killing superalgebra, and generalised special holonomy

    Energy Technology Data Exchange (ETDEWEB)

    Coimbra, André [Institut des Hautes Études Scientifiques, Le Bois-Marie,35 route de Chartres, F-91440 Bures-sur-Yvette (France); Strickland-Constable, Charles [Institut des Hautes Études Scientifiques, Le Bois-Marie,35 route de Chartres, F-91440 Bures-sur-Yvette (France); Institut de physique théorique, Université Paris Saclay, CEA, CNRS,Orme des Merisiers, F-91191 Gif-sur-Yvette (France)

    2016-11-10

    We prove that, for M theory or type II, generic Minkowski flux backgrounds preserving N supersymmetries in dimensions D≥4 correspond precisely to integrable generalised G{sub N} structures, where G{sub N} is the generalised structure group defined by the Killing spinors. In other words, they are the analogues of special holonomy manifolds in E{sub d(d)}×ℝ{sup +} generalised geometry. In establishing this result, we introduce the Kosmann-Dorfman bracket, a generalisation of Kosmann’s Lie derivative of spinors. This allows us to write down the internal sector of the Killing superalgebra, which takes a rather simple form and whose closure is the key step in proving the main result. In addition, we find that the eleven-dimensional Killing superalgebra of these backgrounds is necessarily the supertranslational part of the N-extended super-Poincaré algebra.

  6. Before and beyond the precautionary principle: Epistemology of uncertainty in science and law

    International Nuclear Information System (INIS)

    Tallacchini, Mariachiara

    2005-01-01

    The precautionary principle has become, in European regulation of science and technology, a general principle for the protection of the health of human beings, animals, plants, and the environment. It requires that '[w]here there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation'. By focusing on situations of scientific uncertainty where data are lacking, insufficient, or inconclusive, the principle introduced a shift from a neutral legal attitude towards science to a bias in favor of safety, and a shift from the paradigm of science certain and objective to the awareness that the legal regulation of science involves decisions about values and interests. Implementation of the precautionary principle is highly variable. A crucial question still needs to be answered regarding the assumption that scientific certainty is a 'normal' characteristic of scientific knowledge. The relationship between technoscience and society has moved into a situation where uncertain knowledge is the rule. From this perspective, a more general framework for a democratic governance of science is needed. In democratic society, science may still have a special authoritative voice, but it cannot be the ultimate word on decisions that only the broader society may make. Therefore, the precautionary model of scientific regulation needs to be informed by an 'extended participatory model' of the relationship between science and society

  7. Wavelets-Computational Aspects of Sterian Realistic Approach to Uncertainty Principle in High Energy Physics: A Transient Approach

    Directory of Open Access Journals (Sweden)

    Cristian Toma

    2013-01-01

    This study presents wavelets-computational aspects of the Sterian-realistic approach to the uncertainty principle in high energy physics. According to this approach, one cannot make a device for the simultaneous measuring of the canonical conjugate variables in reciprocal Fourier spaces. However, such aspects regarding the use of conjugate Fourier spaces can also be noticed in quantum field theory, where the position representation of a quantum wave is replaced by the momentum representation before computing the interaction in a certain point of space, at a certain moment of time. For this reason, certain properties regarding the switch from one representation to another in these conjugate Fourier spaces should be established. It is shown that the best results can be obtained using wavelets aspects and support macroscopic functions for computing (i) wave-train nonlinear relativistic transformation, (ii) reflection/refraction with a constant shift, (iii) diffraction considered as interaction with a null phase shift without annihilation of the associated wave, (iv) deflection by external electromagnetic fields without phase loss, and (v) annihilation of the associated wave-train through fast and spatially extended phenomena according to the uncertainty principle.

  8. Generalised anxiety disorder

    OpenAIRE

    Gale, Christopher K; Millichamp, Jane

    2011-01-01

    Generalised anxiety disorder is characterised by persistent, excessive and difficult-to-control worry, which may be accompanied by several psychic and somatic symptoms, including suicidality. Generalized anxiety disorder is the most common psychiatric disorder in primary care, although it is often underrecognised and undertreated. Generalized anxiety disorder is typically a chronic condition with low short- and medium-term remission rates. Clinical presentations often include depression, ...

  9. Managing uncertainty for sustainability of complex projects

    DEFF Research Database (Denmark)

    Brink, Tove

    2017-01-01

    Purpose – The purpose of this paper is to reveal how management of uncertainty can enable sustainability of complex projects. Design/methodology/approach – The research was conducted from June 2014 to May 2015 using a qualitative deductive approach among operation and maintenance actors in offshore wind farms. The research contains a focus group interview with 11 companies, 20 individual interviews and a seminar presenting preliminary findings with 60 participants. Findings – The findings reveal the need for management of uncertainty through two different paths. First, project management needs to join efforts. Research limitations/implications – Further research is needed to reveal the generalisability of the findings in other complex project contexts containing “unknown unknowns”. Practical implications – The research leads to the development of a tool for uncertainty management

  10. Improvement on generalised synchronisation of chaotic systems

    International Nuclear Information System (INIS)

    Hui-Bin, Zhu; Fang, Qiu; Bao-Tong, Cui

    2010-01-01

    In this paper, the problem of generalised synchronisation of two different chaotic systems is investigated. Conditions that are less conservative than existing results are derived using linear matrix inequalities. Furthermore, a simple adaptive control scheme is proposed to achieve the generalised synchronisation of chaotic systems. The proposed method is simple and easy to implement in practice and can be applied to secure communications. Numerical simulations are also given to demonstrate the effectiveness and feasibility of the theoretical analysis.

  11. Effect of Generalized Uncertainty Principle on Main-Sequence Stars and White Dwarfs

    Directory of Open Access Journals (Sweden)

    Mohamed Moussa

    2015-01-01

    This paper addresses the effect of the generalized uncertainty principle, which emerges from different approaches to quantum gravity within the Planck scale, on the thermodynamic properties of photons, nonrelativistic ideal gases, and degenerate fermions. Modifications in pressure, particle number, and energy density are calculated. Astrophysical objects such as main-sequence stars and white dwarfs are examined and discussed as an application. A modification in the Lane-Emden equation due to a change in the polytropic relation caused by the presence of quantum gravity is investigated. The applicable range of quantum gravity parameters is estimated. The bounds on the perturbed parameters are relatively large, but they may be considered reasonable values in the astrophysical regime.
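    The unmodified Lane-Emden equation that the paper perturbs, (1/ξ²) d/dξ(ξ² dθ/dξ) = −θⁿ, can be integrated numerically as a baseline; the quantum-gravity correction itself is not reproduced here. The step size and the n = 3 polytropic index below are illustrative choices:

```python
# Baseline integration of the standard Lane-Emden equation for a polytrope,
# locating the first zero of theta (the dimensionless stellar surface).
# Uses the substitution phi = xi^2 * dtheta/dxi to regularise xi = 0,
# and a simple forward-Euler march with a small step.
def lane_emden_first_zero(n=3.0, h=1e-4):
    xi, theta, phi = h, 1.0, 0.0
    while theta > 0.0:
        dtheta = phi / xi**2
        dphi = -xi**2 * max(theta, 0.0) ** n
        theta += h * dtheta
        phi += h * dphi
        xi += h
    return xi

print(lane_emden_first_zero())  # ≈ 6.897 for n = 3
```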

  12. Automatic map generalisation from research to production

    Science.gov (United States)

    Nyberg, Rose; Johansson, Mikael; Zhang, Yang

    2018-05-01

    The manual work of map generalisation is known to be a complex and time consuming task. With the development of technology and societies, the demands for more flexible map products with higher quality are growing. The Swedish mapping, cadastral and land registration authority Lantmäteriet has manual production lines for databases in five different scales, 1 : 10 000 (SE10), 1 : 50 000 (SE50), 1 : 100 000 (SE100), 1 : 250 000 (SE250) and 1 : 1 million (SE1M). To streamline this work, Lantmäteriet started a project to automatically generalise geographic information. The planned timespan for the project is 2015-2022. The project background and the methods for automatic generalisation are described below. The paper is completed with a description of results and conclusions.

  13. Rational first integrals of geodesic equations and generalised hidden symmetries

    International Nuclear Information System (INIS)

    Aoki, Arata; Houri, Tsuyoshi; Tomoda, Kentaro

    2016-01-01

    We discuss novel generalisations of Killing tensors, which are introduced by considering rational first integrals of geodesic equations. We introduce the notion of inconstructible generalised Killing tensors, which cannot be constructed from ordinary Killing tensors. Moreover, we introduce inconstructible rational first integrals, which are constructed from inconstructible generalised Killing tensors, and provide a method for checking the inconstructibility of a rational first integral. Using the method, we show that the rational first integral of the Collinson–O’Donnell solution is not inconstructible. We also provide several examples of metrics admitting an inconstructible rational first integral in two and four dimensions, by using the Maciejewski–Przybylska system. Furthermore, we attempt to generalise other hidden symmetries such as Killing–Yano tensors. (paper)

  14. Free Fall and the Equivalence Principle Revisited

    Science.gov (United States)

    Pendrill, Ann-Marie

    2017-01-01

    Free fall is commonly discussed as an example of the equivalence principle, in the context of a homogeneous gravitational field, which is a reasonable approximation for small test masses falling moderate distances. Newton's law of gravity provides a generalisation to larger distances, and also brings in an inhomogeneity in the gravitational field.…

  15. Quantification of uncertainty in first-principles predicted mechanical properties of solids: Application to solid ion conductors

    Science.gov (United States)

    Ahmad, Zeeshan; Viswanathan, Venkatasubramanian

    2016-08-01

    Computationally-guided material discovery is being increasingly employed using a descriptor-based screening through the calculation of a few properties of interest. A precise understanding of the uncertainty associated with first-principles density functional theory calculated property values is important for the success of descriptor-based screening. The Bayesian error estimation approach has been built into several recently developed exchange-correlation functionals, which allows an estimate of the uncertainty associated with properties related to the ground state energy, for example, adsorption energies. Here, we propose a robust and computationally efficient method for quantifying uncertainty in mechanical properties, which depend on the derivatives of the energy. The procedure involves calculating energies around the equilibrium cell volume with different strains and fitting the obtained energies to the corresponding energy-strain relationship. At each strain, we use instead of a single energy, an ensemble of energies, giving us an ensemble of fits and thereby, an ensemble of mechanical properties associated with each fit, whose spread can be used to quantify its uncertainty. The generation of the ensemble of energies is only a post-processing step involving a perturbation of parameters of the exchange-correlation functional and solving for the energy non-self-consistently. The proposed method is computationally efficient and provides a more robust uncertainty estimate compared to the approach of self-consistent calculations employing several different exchange-correlation functionals. We demonstrate the method by calculating uncertainty bounds for several materials belonging to different classes and having different structures. We show that the calculated uncertainty bounds the property values obtained using three different GGA functionals: PBE, PBEsol, and RPBE. Finally, we apply the approach to calculate the uncertainty
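    The ensemble recipe described above can be sketched as follows. The energy-volume data are synthetic stand-ins for non-self-consistently evaluated ensemble energies, and a simple quadratic fit stands in for a full equation-of-state fit; the spread of the fitted curvature plays the role of the mechanical-property uncertainty:

```python
# Ensemble-based uncertainty sketch: fit each member of an ensemble of
# energy-volume curves to a quadratic E(V) and read the spread of the implied
# curvature (proportional to the bulk modulus) as the uncertainty estimate.
import numpy as np

rng = np.random.default_rng(0)

V = np.array([0.94, 0.97, 1.00, 1.03, 1.06])   # strained cell volumes (arb. units)
E0 = 5.0 * (V - 1.0) ** 2                      # synthetic "true" curve, d2E/dV2 = 10
# Synthetic ensemble of perturbed energies (stand-in for BEEF-type ensembles)
ensemble = E0 + rng.normal(0.0, 0.005, size=(200, V.size))

# Curvature d2E/dV2 = 2c from each quadratic fit E = c V^2 + b V + a
curvatures = [2.0 * np.polyfit(V, e, 2)[0] for e in ensemble]
mean, std = np.mean(curvatures), np.std(curvatures)
print(mean, std)   # the spread across the ensemble is the uncertainty estimate
```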

  16. Fearing shades of grey: individual differences in fear responding towards generalisation stimuli.

    Science.gov (United States)

    Arnaudova, Inna; Krypotos, Angelos-Miltiadis; Effting, Marieke; Kindt, Merel; Beckers, Tom

    2017-09-01

    Individual differences in fear generalisation have been proposed to play a role in the aetiology and/or maintenance of anxiety disorders, but few data are available to directly support that claim. The research that is available has focused mostly on generalisation of peripheral and central physiological fear responses. Far less is known about the generalisation of avoidance, the behavioural component of fear. In two experiments, we evaluated how neuroticism, a known vulnerability factor for anxiety, modulates an array of fear responses, including avoidance tendencies, towards generalisation stimuli (GS). Participants underwent differential fear conditioning, in which one conditioned stimulus (CS+) was repeatedly paired with an aversive outcome (shock; unconditioned stimulus, US), whereas another was not (CS-). Fear generalisation was observed across measures in Experiment 1 (US expectancy and evaluative ratings) and Experiment 2 (US expectancy, evaluative ratings, skin conductance, startle responses, safety behaviours), with overall highest responding to the CS+, lowest to the CS- and intermediate responding to the GSs. Neuroticism had very little impact on fear generalisation (but did affect GS recognition rates in Experiment 1), in line with the idea that fear generalisation is largely an adaptive process.

  17. Exactly marginal deformations from exceptional generalised geometry

    Energy Technology Data Exchange (ETDEWEB)

    Ashmore, Anthony [Merton College, University of Oxford,Merton Street, Oxford, OX1 4JD (United Kingdom); Mathematical Institute, University of Oxford,Andrew Wiles Building, Woodstock Road, Oxford, OX2 6GG (United Kingdom); Gabella, Maxime [Institute for Advanced Study,Einstein Drive, Princeton, NJ 08540 (United States); Graña, Mariana [Institut de Physique Théorique, CEA/Saclay,91191 Gif-sur-Yvette (France); Petrini, Michela [Sorbonne Université, UPMC Paris 05, UMR 7589, LPTHE,75005 Paris (France); Waldram, Daniel [Department of Physics, Imperial College London,Prince Consort Road, London, SW7 2AZ (United Kingdom)

    2017-01-27

    We apply exceptional generalised geometry to the study of exactly marginal deformations of N=1 SCFTs that are dual to generic AdS_5 flux backgrounds in type IIB or eleven-dimensional supergravity. In the gauge theory, marginal deformations are parametrised by the space of chiral primary operators of conformal dimension three, while exactly marginal deformations correspond to quotienting this space by the complexified global symmetry group. We show how the supergravity analysis gives a geometric interpretation of the gauge theory results. The marginal deformations arise from deformations of generalised structures that solve moment maps for the generalised diffeomorphism group and have the correct charge under the generalised Reeb vector, generating the R-symmetry. If this is the only symmetry of the background, all marginal deformations are exactly marginal. If the background possesses extra isometries, there are obstructions that come from fixed points of the moment maps. The exactly marginal deformations are then given by a further quotient by these extra isometries. Our analysis holds for any N=2 AdS_5 flux background. Focussing on the particular case of type IIB Sasaki-Einstein backgrounds, we recover the result that marginal deformations correspond to perturbing the solution by three-form flux at first order. In various explicit examples, we show that our expression for the three-form flux matches those in the literature and the obstruction conditions match the one-loop beta functions of the dual SCFT.

  18. Myocardial infarction and generalised anxiety disorder : 10-year follow-up

    NARCIS (Netherlands)

    Roest, Annelieke M.; Zuidersma, Marij; de Jonge, Peter

    Background Few studies have addressed the relationship between generalised anxiety disorder and cardiovascular prognosis using a diagnostic interview. Aims To assess the association between generalised anxiety disorder and adverse outcomes in patients with myocardial infarction. Method Patients with

  19. The Precautionary Principle and statistical approaches to uncertainty

    DEFF Research Database (Denmark)

    Keiding, Niels; Budtz-Jørgensen, Esben

    2003-01-01

    Bayesian model averaging; Benchmark approach to safety standards in toxicology; dose-response relationship; environmental standards; exposure measurement uncertainty; Popper falsification

  20. The Precautionary Principle and Statistical Approaches to Uncertainty

    DEFF Research Database (Denmark)

    Keiding, Niels; Budtz-Jørgensen, Esben

    2005-01-01

    Bayesian model averaging; Benchmark approach to safety standards in toxicology; dose-response relationships; environmental standards; exposure measurement uncertainty; Popper falsification

  1. Uncertainty and complementarity in axiomatic quantum mechanics

    International Nuclear Information System (INIS)

    Lahti, P.J.

    1980-01-01

    An investigation of the uncertainty principle and the complementarity principle is carried out. The physical content of these principles and their representation in the conventional Hilbert space formulation of quantum mechanics forms a natural starting point. Thereafter, a more general axiomatic framework for quantum mechanics is presented, namely a probability function formulation of the theory. Two extra axioms are stated, reflecting the ideas of the uncertainty principle and the complementarity principle, respectively. The quantal features of these axioms are explicated. (author)

  2. Towards a 'pointless' generalisation of Yang-Mills theory

    International Nuclear Information System (INIS)

    Chan Hongmo; Tsou Sheungtsun

    1989-05-01

    We examine some generalisations in physical concepts of gauge theories, leading towards a scenario corresponding to non-commutative geometry, where the concept of locality loses its usual meaning of being associated with points on a base manifold and becomes intertwined with the concept of internal symmetry, suggesting thereby a gauge theory of extended objects. Examples are given where such generalised gauge structures can be realised, in particular that of string theory. (author)

  3. On a quaternionic generalisation of the Riccati differential equation

    OpenAIRE

    Kravchenko, Viktor; Kravchenko, Vladislav; Williams, Benjamin

    2001-01-01

    A quaternionic partial differential equation is shown to be a generalisation of the Riccati ordinary differential equation and its relationship with the Schrodinger equation is established. Various approaches to the problem of finding particular solutions are explored, and the generalisations of two theorems of Euler on the Riccati differential equation, which correspond to the quaternionic equation, are given.

  4. Thermodynamics of a class of regular black holes with a generalized uncertainty principle

    Science.gov (United States)

    Maluf, R. V.; Neves, Juliano C. S.

    2018-05-01

    In this article, we present a study on thermodynamics of a class of regular black holes. Such a class includes Bardeen and Hayward regular black holes. We obtained thermodynamic quantities like the Hawking temperature, entropy, and heat capacity for the entire class. As part of an effort to indicate some physical observable to distinguish regular black holes from singular black holes, we suggest that regular black holes are colder than singular black holes. Besides, contrary to the Schwarzschild black hole, that class of regular black holes may be thermodynamically stable. From a generalized uncertainty principle, we also obtained the quantum-corrected thermodynamics for the studied class. Such quantum corrections provide a logarithmic term for the quantum-corrected entropy.
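As background, the generalized uncertainty principle invoked in such analyses is commonly written with a deformation parameter β; the generic form below and the resulting logarithmically corrected entropy are standard illustrations of the mechanism, not expressions taken from this particular paper:

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,\Delta p^{2}\right),
\qquad
S \;=\; \frac{A}{4\ell_{p}^{2}} \;+\; c_{0}\,\ln\frac{A}{\ell_{p}^{2}} \;+\; \cdots
```

Here A is the horizon area, ℓ_p the Planck length, and c_0 a β-dependent coefficient of the logarithmic correction.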

  5. Massive vector particles tunneling from black holes influenced by the generalized uncertainty principle

    Directory of Open Access Journals (Sweden)

    Xiang-Qian Li

    2016-12-01

    This study considers the generalized uncertainty principle, which incorporates the central idea of large extra dimensions, to investigate the processes involved when massive spin-1 particles tunnel from Reissner–Nordström and Kerr black holes under the effects of quantum gravity. For the black hole, the quantum gravity correction decelerates the increase in temperature. Up to O(1/Mf^2), the corrected temperatures are affected by the mass and angular momentum of the emitted vector bosons. In addition, the temperature of the Kerr black hole becomes uneven due to rotation. When the mass of the black hole approaches the order of the higher-dimensional Planck mass Mf, it stops radiating and yields a black hole remnant.

  6. Generalised Computability and Applications to Hybrid Systems

    DEFF Research Database (Denmark)

    Korovina, Margarita V.; Kudinov, Oleg V.

    2001-01-01

    We investigate the concept of generalised computability of operators and functionals defined on the set of continuous functions, first introduced in [9]. By working in the reals, with equality and without equality, we study properties of generalised computable operators and functionals. We also propose an application to the formalisation of hybrid systems, obtaining a class of hybrid systems whose trajectories are computable in the sense of computable analysis. This research was supported in part by the RFBR (grants N 99-01-00485, N 00-01-00810) and by the Siberian Branch of RAS (a grant for young researchers, 2000).

  7. The exceptional generalised geometry of supersymmetric AdS flux backgrounds

    Energy Technology Data Exchange (ETDEWEB)

    Ashmore, Anthony [Merton College, University of Oxford,Merton Street, Oxford, OX1 4JD (United Kingdom); Mathematical Institute, University of Oxford, Andrew Wiles Building,Woodstock Road, Oxford, OX2 6GG (United Kingdom); Petrini, Michela [Sorbonne Université, UPMC Paris 06, UMR 7589,LPTHE, 75005 Paris (France); Waldram, Daniel [Department of Physics, Imperial College London,Prince Consort Road, London, SW7 2AZ (United Kingdom)

    2016-12-29

    We analyse generic AdS flux backgrounds preserving eight supercharges in D=4 and D=5 dimensions using exceptional generalised geometry. We show that they are described by a pair of globally defined, generalised structures, identical to those that appear for flat flux backgrounds but with different integrability conditions. We give a number of explicit examples of such “exceptional Sasaki-Einstein” backgrounds in type IIB supergravity and M-theory. In particular, we give the complete analysis of the generic AdS_5 M-theory backgrounds. We also briefly discuss the structure of the moduli space of solutions. In all cases, one structure defines a “generalised Reeb vector” that generates a Killing symmetry of the background corresponding to the R-symmetry of the dual field theory, and in addition encodes the generic contact structures that appear in the D=4 M-theory and D=5 type IIB cases. Finally, we investigate the relation between generalised structures and quantities in the dual field theory, showing that the central charge and R-charge of BPS wrapped-brane states are both encoded by the generalised Reeb vector, as well as discussing how volume minimisation (the dual of a- and F-maximisation) is encoded.

  8. Supersymmetry for gauged double field theory and generalised Scherk–Schwarz reductions

    International Nuclear Information System (INIS)

    Berman, David S.; Lee, Kanghoon

    2014-01-01

    Previous constructions of supersymmetry for double field theory have relied on the so-called strong constraint. In this paper, the strong constraint is relaxed and the theory is shown to possess supersymmetry once the generalised Scherk–Schwarz reduction is imposed. The equivalence between the generalised Scherk–Schwarz reduced theory and the gauged double field theory is then examined in detail for the supersymmetric theory. As a by-product, we write the generalised Killing spinor equations for the supersymmetric double field theory

  9. Quantum speed limits: from Heisenberg’s uncertainty principle to optimal quantum control

    Science.gov (United States)

    Deffner, Sebastian; Campbell, Steve

    2017-11-01

    One of the most widely known building blocks of modern physics is Heisenberg’s indeterminacy principle. Among the different statements of this fundamental property of the full quantum mechanical nature of physical reality, the uncertainty relation for energy and time has a special place. Its interpretation and its consequences have inspired continued research efforts for almost a century. In its modern formulation, the uncertainty relation is understood as setting a fundamental bound on how fast any quantum system can evolve. In this topical review we describe important milestones, such as the Mandelstam-Tamm and the Margolus-Levitin bounds on the quantum speed limit, and summarise recent applications in a variety of current research fields—including quantum information theory, quantum computing, and quantum thermodynamics amongst several others. To bring order and to provide an access point into the many different notions and concepts, we have grouped the various approaches into the minimal time approach and the geometric approach, where the former relies on quantum control theory, and the latter arises from measuring the distinguishability of quantum states. Due to the volume of the literature, this topical review can only present a snapshot of the current state-of-the-art and can never be fully comprehensive. Therefore, we highlight but a few works hoping that our selection can serve as a representative starting point for the interested reader.
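For reference, the two named bounds on the minimal time to evolve to an orthogonal state take the following standard forms (textbook statements, not excerpted from the review), with ΔE the energy variance and ⟨E⟩ the mean energy above the ground state:

```latex
\tau \;\ge\; \frac{\pi\hbar}{2\,\Delta E} \quad \text{(Mandelstam--Tamm)},
\qquad
\tau \;\ge\; \frac{\pi\hbar}{2\,\langle E \rangle} \quad \text{(Margolus--Levitin)}.
```

The unified quantum speed limit is the larger of the two lower bounds.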

  10. Quantum speed limits: from Heisenberg’s uncertainty principle to optimal quantum control

    International Nuclear Information System (INIS)

    Deffner, Sebastian; Campbell, Steve

    2017-01-01

    One of the most widely known building blocks of modern physics is Heisenberg’s indeterminacy principle. Among the different statements of this fundamental property of the full quantum mechanical nature of physical reality, the uncertainty relation for energy and time has a special place. Its interpretation and its consequences have inspired continued research efforts for almost a century. In its modern formulation, the uncertainty relation is understood as setting a fundamental bound on how fast any quantum system can evolve. In this topical review we describe important milestones, such as the Mandelstam–Tamm and the Margolus–Levitin bounds on the quantum speed limit, and summarise recent applications in a variety of current research fields—including quantum information theory, quantum computing, and quantum thermodynamics amongst several others. To bring order and to provide an access point into the many different notions and concepts, we have grouped the various approaches into the minimal time approach and the geometric approach, where the former relies on quantum control theory, and the latter arises from measuring the distinguishability of quantum states. Due to the volume of the literature, this topical review can only present a snapshot of the current state-of-the-art and can never be fully comprehensive. Therefore, we highlight but a few works hoping that our selection can serve as a representative starting point for the interested reader. (topical review)

  11. Generalised shot noise Cox processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Torrisi, Giovanni Luca

    2005-01-01

    We introduce a class of Cox cluster processes called generalised shot noise Cox processes (GSNCPs), which extends the definition of shot noise Cox processes (SNCPs) in two directions: the point process that drives the shot noise is not necessarily Poisson, and the kernel of the shot noise can...

  12. Routine internal- and external-quality control data in clinical laboratories for estimating measurement and diagnostic uncertainty using GUM principles.

    Science.gov (United States)

    Magnusson, Bertil; Ossowicki, Haakan; Rienitz, Olaf; Theodorsson, Elvar

    2012-05-01

    Healthcare laboratories are increasingly joining into larger laboratory organizations encompassing several physical laboratories. This creates important new opportunities for re-defining the concept of a 'laboratory' to encompass all laboratories and measurement methods measuring the same measurand for a population of patients. In order to make measurement results comparable, bias should be minimized or eliminated and measurement uncertainty properly evaluated for all methods used for a particular patient population. Both the measurement uncertainty and the diagnostic uncertainty can be evaluated from internal and external quality control results using GUM principles. In this paper the uncertainty evaluations are described in detail using only two main components, within-laboratory reproducibility and the uncertainty of the bias component, according to a Nordtest guideline. The evaluation is exemplified for the determination of creatinine in serum for a conglomerate of laboratories, expressed both in absolute units (μmol/L) and in relative terms (%). An expanded measurement uncertainty of 12 μmol/L was estimated for concentrations of creatinine below 120 μmol/L, and of 10% for concentrations above 120 μmol/L. The diagnostic uncertainty encompasses both measurement uncertainty and biological variation, and can be estimated for a single value and for a difference. The diagnostic uncertainty for the difference between two samples from the same patient was determined to be 14 μmol/L for concentrations of creatinine below 100 μmol/L and 14% for concentrations above 100 μmol/L.
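The two-component evaluation described above (within-laboratory reproducibility combined with a bias component, in the style of the Nordtest approach) can be sketched as follows. All numbers are invented for illustration; u_Rw would come from internal QC data and the bias terms from external quality assessment rounds.

```python
import math

# Hypothetical quality-control data for a creatinine method (invented values).
u_Rw = 4.0                    # within-lab reproducibility from internal QC, umol/L
biases = [3.0, -1.0, 2.0]     # observed biases vs reference values (EQA rounds)
u_Cref = 2.0                  # uncertainty of the reference/assigned values, umol/L

n = len(biases)
rms_bias = math.sqrt(sum(b * b for b in biases) / n)  # root-mean-square bias
u_bias = math.sqrt(rms_bias**2 + u_Cref**2)           # bias component

u_c = math.sqrt(u_Rw**2 + u_bias**2)  # combined standard uncertainty
U = 2 * u_c                           # expanded uncertainty, coverage factor k = 2
print(f"u_c = {u_c:.2f} umol/L, U = {U:.2f} umol/L")
```

The same two-component combination can be carried out in relative terms (%) for the high-concentration range.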

  13. The generalised anxiety stigma scale (GASS): psychometric properties in a community sample

    Science.gov (United States)

    2011-01-01

    Background Although there is substantial concern about negative attitudes to mental illness, little is known about the stigma associated with Generalised Anxiety Disorder (GAD) or its measurement. The aim of this study was to develop a multi-item measure of Generalised Anxiety Disorder stigma (the GASS). Methods Stigma items were developed from a thematic analysis of web-based text about the stigma associated with GAD. Six hundred and seventeen members of the public completed a survey comprising the resulting 20 stigma items and measures designed to evaluate construct validity. Follow-up data were collected for a subset of the participants (n = 212). Results The factor structure comprised two components: Personal Stigma (views about Generalised Anxiety Disorder); and Perceived Stigma (views about the beliefs of most others in the community). There was evidence of good construct validity and reliability for each of the Generalised Anxiety Stigma Scale (GASS) subscales. Conclusions The GASS is a promising brief measure of the stigma associated with Generalised Anxiety Disorder. PMID:22108099

  14. Thoracic involvement in generalised lymphatic anomaly (or lymphangiomatosis)

    Directory of Open Access Journals (Sweden)

    Francesca Luisi

    2016-06-01

    Generalised lymphatic anomaly (GLA), also known as lymphangiomatosis, is a rare disease caused by congenital abnormalities of lymphatic development. It usually presents in childhood but can also be diagnosed in adults. GLA encompasses a wide spectrum of clinical manifestations ranging from single-organ involvement to generalised disease. Given the rarity of the disease, most of the information regarding it comes from case reports. To date, no clinical trials concerning treatment are available. This review focuses on thoracic GLA and summarises possible diagnostic and therapeutic approaches.

  15. Dyads, a generalisation of monads

    NARCIS (Netherlands)

    Fokkinga, M.M.

    The concept of dyad is defined as the least common generalisation of monads and co-monads. So, taking some of the ingredients to be the identity, the concept specialises to the concept of monad, and taking other ingredients to be the identity it specialises to co-monads. Except for one axiom, all

  16. Determining the minimal length scale of the generalized uncertainty principle from the entropy-area relationship

    International Nuclear Information System (INIS)

    Kim, Wontae; Oh, John J.

    2008-01-01

    We derive the formula for the black hole entropy with a minimal length of the Planck size by counting quantum modes of scalar fields in the vicinity of the black hole horizon, taking into account the generalized uncertainty principle (GUP). This formula is applied to some intriguing examples of black holes: the Schwarzschild black hole, the Reissner–Nordström black hole, and the magnetically charged dilatonic black hole. As a result, it is shown that the GUP parameter can be determined by imposing the black hole entropy-area relationship, which has a Planck length scale and a universal form within the near-horizon expansion

  17. Generalised relativistic Ohm's laws, extended gauge transformations, and magnetic linking

    International Nuclear Information System (INIS)

    Pegoraro, F.

    2015-01-01

    Generalisations of the relativistic ideal Ohm's law are presented that include specific dynamical features of the current carrying particles in a plasma. Cases of interest for space and laboratory plasmas are identified where these generalisations allow for the definition of generalised electromagnetic fields that transform under a Lorentz boost in the same way as the real electromagnetic fields and that obey the same set of homogeneous Maxwell's equations

  18. Loop Amplitudes in Pure Yang-Mills from Generalised Unitarity

    OpenAIRE

    Brandhuber, Andreas; McNamara, Simon; Spence, Bill; Travaglini, Gabriele

    2005-01-01

    We show how generalised unitarity cuts in D = 4 - 2 epsilon dimensions can be used to calculate efficiently complete one-loop scattering amplitudes in non-supersymmetric Yang-Mills theory. This approach naturally generates the rational terms in the amplitudes, as well as the cut-constructible parts. We test the validity of our method by re-deriving the one-loop ++++, -+++, --++, -+-+ and +++++ gluon scattering amplitudes using generalised quadruple cuts and triple cuts in D dimensions.

  19. Loop amplitudes in pure Yang-Mills from generalised unitarity

    International Nuclear Information System (INIS)

    Brandhuber, Andreas; McNamara, Simon; Spence, Bill; Travaglini, Gabriele

    2005-01-01

    We show how generalised unitarity cuts in D = 4-2ε dimensions can be used to calculate efficiently complete one-loop scattering amplitudes in non-supersymmetric Yang-Mills theory. This approach naturally generates the rational terms in the amplitudes, as well as the cut-constructible parts. We test the validity of our method by re-deriving the one-loop ++++, -+++, --++, -+-+ and +++++ gluon scattering amplitudes using generalised quadruple cuts and triple cuts in D dimensions

  20. Loop amplitudes in pure Yang-Mills from generalised unitarity

    Energy Technology Data Exchange (ETDEWEB)

    Brandhuber, Andreas [Department of Physics, Queen Mary, University of London, Mile End Road, London, E1 4NS (United Kingdom); McNamara, Simon [Department of Physics, Queen Mary, University of London, Mile End Road, London, E1 4NS (United Kingdom); Spence, Bill [Department of Physics, Queen Mary, University of London, Mile End Road, London, E1 4NS (United Kingdom); Travaglini, Gabriele [Department of Physics, Queen Mary, University of London, Mile End Road, London, E1 4NS (United Kingdom)

    2005-10-15

    We show how generalised unitarity cuts in D = 4-2ε dimensions can be used to calculate efficiently complete one-loop scattering amplitudes in non-supersymmetric Yang-Mills theory. This approach naturally generates the rational terms in the amplitudes, as well as the cut-constructible parts. We test the validity of our method by re-deriving the one-loop ++++, -+++, --++, -+-+ and +++++ gluon scattering amplitudes using generalised quadruple cuts and triple cuts in D dimensions.

  1. A generalised groundwater flow equation using the concept of non ...

    African Journals Online (AJOL)

    The classical Darcy law is generalised by regarding the water flow as a function of a non-integer order derivative of the piezometric head. This generalised law and the law of conservation of mass are then used to derive a new equation for groundwater flow. Numerical solutions of this equation for various fractional orders of ...
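Purely as an illustration of the idea (the precise form used in the paper is not reproduced here), one common way to introduce a non-integer order is to replace the derivative of the piezometric head h by a Caputo-type fractional derivative of order α:

```latex
q \;=\; -K\,\frac{\partial^{\alpha} h}{\partial x^{\alpha}},
\qquad
\frac{\partial^{\alpha} h}{\partial x^{\alpha}}
\;=\; \frac{1}{\Gamma(1-\alpha)}\int_{0}^{x} \frac{h'(s)}{(x-s)^{\alpha}}\,ds,
\qquad 0 < \alpha \le 1,
```

which recovers the classical Darcy law for α = 1.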

  2. Open quantum generalisation of Hopfield neural networks

    Science.gov (United States)

    Rotondo, P.; Marcuzzi, M.; Garrahan, J. P.; Lesanovsky, I.; Müller, M.

    2018-03-01

    We propose a new framework to understand how quantum effects may impact on the dynamics of neural networks. We implement the dynamics of neural networks in terms of Markovian open quantum systems, which allows us to treat thermal and quantum coherent effects on the same footing. In particular, we propose an open quantum generalisation of the Hopfield neural network, the simplest toy model of associative memory. We determine its phase diagram and show that quantum fluctuations give rise to a qualitatively new non-equilibrium phase. This novel phase is characterised by limit cycles corresponding to high-dimensional stationary manifolds that may be regarded as a generalisation of storage patterns to the quantum domain.
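As context for the quantum generalisation, the classical Hopfield model it builds on can be sketched in a few lines: patterns stored via a Hebbian rule are recovered from corrupted inputs by a sign-threshold update. This is a minimal classical sketch, not the open-quantum-system dynamics of the paper.

```python
import numpy as np

# Store one binary (+1/-1) pattern with the Hebbian rule and recall it
# from a corrupted copy via the deterministic sign update.
pattern = np.array([1, -1, 1, 1, -1, 1, -1, -1])
N = len(pattern)

W = np.outer(pattern, pattern).astype(float) / N  # Hebbian weights
np.fill_diagonal(W, 0.0)                          # no self-coupling

probe = pattern.copy()
probe[2] *= -1                                    # flip one bit (corruption)

recalled = np.sign(W @ probe).astype(int)         # one synchronous update
print(recalled.tolist())                          # recovers the stored pattern
```

The open quantum generalisation replaces this deterministic update with Markovian (Lindblad) dynamics, so that thermal and coherent effects compete.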

  3. Control configuration selection for bilinear systems via generalised Hankel interaction index array

    DEFF Research Database (Denmark)

    Shaker, Hamid Reza; Tahavori, Maryamsadat

    2015-01-01

    Decentralised and partially decentralised control strategies are very popular in practice. To come up with a suitable decentralised or partially decentralised control structure, it is important to select the appropriate input and output pairs for control design. This procedure is called control configuration selection. It is well known that a suitable control configuration selection is an important prerequisite for successful industrial control. In this paper the problem of control configuration selection for multiple-input and multiple-output (MIMO) bilinear processes is addressed. An iterative method for solving the generalised Sylvester equation is proposed. The generalised cross-gramian is used to form the generalised Hankel interaction index array, which is then used for control configuration selection of MIMO bilinear processes.

  4. Generalised summation-by-parts operators and variable coefficients

    Science.gov (United States)

    Ranocha, Hendrik

    2018-06-01

    High-order methods for conservation laws can be highly efficient if their stability is ensured. A suitable means mimicking estimates of the continuous level is provided by summation-by-parts (SBP) operators and the weak enforcement of boundary conditions. Recently, there has been an increasing interest in generalised SBP operators both in the finite difference and the discontinuous Galerkin spectral element framework. However, if generalised SBP operators are used, the treatment of the boundaries becomes more difficult since some properties of the continuous level are no longer mimicked discretely - interpolating the product of two functions will in general result in a value different from the product of the interpolations. Thus, desired properties such as conservation and stability are more difficult to obtain. Here, new formulations are proposed, allowing the creation of discretisations using general SBP operators that are both conservative and stable. Thus, several shortcomings that might be attributed to generalised SBP operators are overcome (cf. Nordström and Ruggiu (2017) [38] and Manzanero et al. (2017) [39]).
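The summation-by-parts property itself is easy to verify numerically. The sketch below builds the classical second-order SBP first-derivative operator D = P⁻¹Q (a textbook example, not one of the generalised operators discussed in the paper) and checks the discrete integration-by-parts identity uᵀP(Dv) + (Du)ᵀPv = u_N v_N − u_0 v_0.

```python
import numpy as np

n = 11                 # grid points on [0, 1]
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)

# Diagonal norm matrix P (a quadrature rule) and almost-antisymmetric Q.
P = h * np.diag([0.5] + [1.0] * (n - 2) + [0.5])
Q = 0.5 * (np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1))
Q[0, 0], Q[-1, -1] = -0.5, 0.5      # so that Q + Q^T = diag(-1, 0, ..., 0, 1)

D = np.linalg.solve(P, Q)           # SBP derivative operator D = P^{-1} Q

u = np.sin(x)
v = np.cos(x)
lhs = u @ P @ (D @ v) + (D @ u) @ P @ v
rhs = u[-1] * v[-1] - u[0] * v[0]   # boundary terms, mimicking integration by parts
print(abs(lhs - rhs) < 1e-12, np.allclose(D @ x, 1.0))
```

For generalised SBP operators the norm matrix P need not be diagonal and the boundary terms are recovered through interpolation operators, which is where the difficulties discussed above arise.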

  5. Generalised twisted partition functions

    CERN Document Server

    Petkova, V B

    2001-01-01

    We consider the set of partition functions that result from the insertion of twist operators compatible with conformal invariance in a given 2D Conformal Field Theory (CFT). A consistency equation, which gives a classification of twists, is written and solved in particular cases. This generalises old results on twisted torus boundary conditions, gives a physical interpretation of Ocneanu's algebraic construction, and might offer a new route to the study of properties of CFT.

  6. The generalised anxiety stigma scale (GASS): psychometric properties in a community sample

    Directory of Open Access Journals (Sweden)

    Griffiths Kathleen M

    2011-11-01

    Background Although there is substantial concern about negative attitudes to mental illness, little is known about the stigma associated with Generalised Anxiety Disorder (GAD) or its measurement. The aim of this study was to develop a multi-item measure of Generalised Anxiety Disorder stigma (the GASS). Methods Stigma items were developed from a thematic analysis of web-based text about the stigma associated with GAD. Six hundred and seventeen members of the public completed a survey comprising the resulting 20 stigma items and measures designed to evaluate construct validity. Follow-up data were collected for a subset of the participants (n = 212). Results The factor structure comprised two components: Personal Stigma (views about Generalised Anxiety Disorder) and Perceived Stigma (views about the beliefs of most others in the community). There was evidence of good construct validity and reliability for each of the Generalised Anxiety Stigma Scale (GASS) subscales. Conclusions The GASS is a promising brief measure of the stigma associated with Generalised Anxiety Disorder.

  7. The certainty principle (review)

    OpenAIRE

    Arbatsky, D. A.

    2006-01-01

    The certainty principle (2005) made it possible to conceptualise, on more fundamental grounds, both the Heisenberg uncertainty principle (1927) and the Mandelshtam-Tamm relation (1945). In this review I give a detailed explanation and discussion of the certainty principle, oriented to all physicists, both theorists and experimenters.

  8. A Generalised Fault Protection Structure Proposed for Uni-grounded Low-Voltage AC Microgrids

    Science.gov (United States)

    Bui, Duong Minh; Chen, Shi-Lin; Lien, Keng-Yu; Jiang, Jheng-Lun

    2016-04-01

    This paper presents three main configurations of uni-grounded low-voltage AC microgrids. Transient situations of a uni-grounded low-voltage (LV) AC microgrid (MG) are simulated through various fault tests and operation transition tests between grid-connected and islanded modes. Based on the transient simulation results, available fault protection methods are proposed for main and back-up protection of a uni-grounded AC microgrid. In addition, the concept of a generalised fault protection structure for uni-grounded LVAC MGs is introduced. The main contributions of the paper are: (i) definition of different uni-grounded LVAC MG configurations; (ii) analysis of the transient responses of a uni-grounded LVAC microgrid through line-to-line faults, line-to-ground faults, three-phase faults and a microgrid operation transition test; (iii) proposal of available fault protection methods for uni-grounded microgrids, such as non-directional or directional overcurrent protection, under/over voltage protection, differential current protection, voltage-restrained overcurrent protection, and other fault protection principles not based on phase currents and voltages (e.g. total harmonic distortion detection of currents and voltages, or use of sequence components of current and voltage such as the 3I0 or 3V0 components); and (iv) development of a generalised fault protection structure with six individual protection zones suitable for different uni-grounded AC MG configurations.

  9. Dosimetric quantities and basic data for the evaluation of generalised derived limits

    International Nuclear Information System (INIS)

    Harrison, N.T.; Simmonds, J.R.

    1980-12-01

    The procedures, dosimetric quantities and basic data to be used for the evaluation of Generalised Derived Limits (GDLs) in environmental materials and of Generalised Derived Limits for discharges to atmosphere are described. The dosimetric considerations and the appropriate intake rates for both children and adults are discussed. In most situations in the nuclear industry and in those institutions, hospitals and laboratories which use relatively small quantities of radioactive material, the Generalised Derived Limits provide convenient reference levels against which the results of environmental monitoring can be compared, and atmospheric discharges can be assessed. They are intended for application when the environmental contamination or discharge to atmosphere is less than about 5% of the Generalised Derived Limit; above this level, it will usually be necessary to undertake a more detailed site-specific assessment. (author)

  10. Bayesian uncertainty analyses of probabilistic risk models

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1989-01-01

    Applications of Bayesian principles to uncertainty analysis are discussed in the paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so-called probabilistic structures is presented in order to develop a method for the quantitative evaluation of modelling uncertainties. The method is applied to a small example case. Ideas for application areas for the proposed method are discussed.

  11. Uncertainty in the classroom—teaching quantum physics

    International Nuclear Information System (INIS)

    Johansson, K E; Milstead, D

    2008-01-01

    The teaching of the Heisenberg uncertainty principle provides one of those rare moments when science appears to contradict everyday life experiences, sparking the curiosity of the interested student. Written at a level appropriate for an able high school student, this article provides ideas for introducing the uncertainty principle and showing how it can be used to elucidate many topics in modern physics.

  12. Uncertainties of flood frequency estimation approaches based on continuous simulation using data resampling

    Science.gov (United States)

    Arnaud, Patrick; Cantet, Philippe; Odry, Jean

    2017-11-01

    [...] the use of a statistical law with two parameters (here the generalised extreme value Type I distribution) and clearly lower than those associated with the use of a three-parameter law (here the generalised extreme value Type II distribution). For extreme flood quantiles, the uncertainties are mostly due to the rainfall generator because of the progressive saturation of the hydrological model.

  13. Generalisability of an online randomised controlled trial: an empirical analysis.

    Science.gov (United States)

    Wang, Cheng; Mollan, Katie R; Hudgens, Michael G; Tucker, Joseph D; Zheng, Heping; Tang, Weiming; Ling, Li

    2018-02-01

    Investigators increasingly use online methods to recruit participants for randomised controlled trials (RCTs). However, the extent to which participants recruited online represent populations of interest is unknown. We evaluated how generalisable an online RCT sample is to men who have sex with men in China. Inverse probability of sampling weights (IPSW) and the G-formula were used to examine the generalisability of an online RCT using model-based approaches. Online RCT data and national cross-sectional study data from China were analysed to illustrate the process of quantitatively assessing generalisability. The RCT (identifier NCT02248558) randomly assigned participants to a crowdsourced or health marketing video for promotion of HIV testing. The primary outcome was self-reported HIV testing within 4 weeks, with a non-inferiority margin of -3%. In the original online RCT analysis, the estimated difference in proportions of HIV tested between the two arms (crowdsourcing and health marketing) was 2.1% (95% CI, -5.4% to 9.7%). The hypothesis that the crowdsourced video was not inferior to the health marketing video to promote HIV testing was not demonstrated. The IPSW and G-formula estimated differences were -2.6% (95% CI, -14.2 to 8.9) and 2.7% (95% CI, -10.7 to 16.2), with both approaches also not establishing non-inferiority. Conducting generalisability analysis of an online RCT is feasible. Examining the generalisability of online RCTs is an important step before an intervention is scaled up. NCT02248558. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
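    As a toy illustration of the IPSW idea used in the record above (reweighting trial participants by the inverse of their estimated probability of being sampled, so that the trial sample mimics the target population), the following sketch uses invented data and assumed selection probabilities; it is not the authors' analysis:

```python
# Toy sketch of inverse probability of sampling weights (IPSW): reweight
# trial participants by 1 / P(sampled | covariates) so that the trial
# sample mimics the target population. All data and selection
# probabilities below are invented for illustration.

import random
random.seed(0)

# Target population: binary covariate z, with 70% having z = 1.
population = [1] * 700 + [0] * 300
p_select = {1: 0.9, 0: 0.3}        # assumed sampling probabilities by z

# Trial sample: selection depends on z; the outcome probability also depends on z.
trial = [(z, random.random() < 0.5 + 0.1 * z)
         for z in population if random.random() < p_select[z]]

# IPSW estimate of the population-level outcome proportion
weights = [1.0 / p_select[z] for z, _ in trial]
ipsw_mean = sum(w * y for w, (_, y) in zip(weights, trial)) / sum(weights)

# Unweighted trial mean, biased toward the over-sampled z = 1 group
naive_mean = sum(y for _, y in trial) / len(trial)
```

Here the population-level outcome proportion is 0.7 × 0.6 + 0.3 × 0.5 = 0.57; the weighted estimate targets this value, while the naive mean is pulled toward the over-sampled group.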

  14. Generalised fluid dynamics and quantum mechanics

    NARCIS (Netherlands)

    Broer, L.J.F.

    1974-01-01

    A generalised theory of irrotational fluid flow is developed in hamiltonian form. This allows a systematic derivation of equations for momentum, energy and the rate of work. It is shown that a nonlinear field equation for weakly interacting condensed bosons as given by Gross1) and the one-electron

  15. Generalising the logistic map through the q-product

    International Nuclear Information System (INIS)

    Pessoa, R W S; Borges, E P

    2011-01-01

    We investigate a generalisation of the logistic map, x_{n+1} = 1 - a x_n ⊗_qmap x_n (-1 ≤ x_n ≤ 1), where ⊗_q is the q-product [Borges, E.P., Physica A 340, 95 (2004)]. The usual product, and consequently the usual logistic map, is recovered in the limit q → 1; the tent map is a particular case for qmap → ∞. The generalisation of this (and other) algebraic operators has been widely used within the nonextensive statistical mechanics context (see C. Tsallis, Introduction to Nonextensive Statistical Mechanics, Springer, NY, 2009). We focus the analysis on qmap > 1 at the edge of chaos, particularly at the first critical point a_c, which depends on the value of qmap. Bifurcation diagrams, sensitivity to initial conditions, fractal dimension and the rate of entropy growth are evaluated at a_c(qmap), and connections with nonextensive statistical mechanics are explored.
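    The map can be iterated numerically. The sketch below is a minimal illustration assuming the Borges (2004) q-product u ⊗_q v = [u^(1-q) + v^(1-q) - 1]^(1/(1-q)) for u, v > 0 with the usual Tsallis cutoff; applying it to |x_n| is an assumption here, since the record does not spell out the sign convention:

```python
# Minimal sketch of the q-generalised logistic map
#   x_{n+1} = 1 - a * (x_n ⊗_q x_n),
# assuming the Borges (2004) q-product
#   u ⊗_q v = [u^(1-q) + v^(1-q) - 1]^(1/(1-q))  for u, v > 0,
# with the Tsallis cutoff [.]_+ and |x_n| as operands (an assumption,
# since the abstract does not fix the sign convention).

def q_product(u, v, q):
    if q == 1.0:
        return u * v                       # ordinary product recovered as q -> 1
    if u <= 0.0 or v <= 0.0:
        return 0.0                         # limiting value under the cutoff
    base = u ** (1.0 - q) + v ** (1.0 - q) - 1.0
    if base <= 0.0:
        return 0.0                         # Tsallis cutoff [.]_+
    return base ** (1.0 / (1.0 - q))

def q_logistic_orbit(a, q, x0=0.1, n=1000):
    x, orbit = x0, []
    for _ in range(n):
        x = 1.0 - a * q_product(abs(x), abs(x), q)
        orbit.append(x)
    return orbit

# q -> 1 reproduces the usual logistic map x_{n+1} = 1 - a * x_n**2
usual = q_logistic_orbit(2.0, 1.0)
```

For a = 2 and q = 1 the orbit stays in [-1, 1], as for the ordinary logistic map in this parametrisation.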

  16. Quantum mechanics of a generalised rigid body

    International Nuclear Information System (INIS)

    Gripaios, Ben; Sutherland, Dave

    2016-01-01

    We consider the quantum version of Arnold’s generalisation of a rigid body in classical mechanics. Thus, we quantise the motion on an arbitrary Lie group manifold of a particle whose classical trajectories correspond to the geodesics of any one-sided-invariant metric. We show how the derivation of the spectrum of energy eigenstates can be simplified by making use of automorphisms of the Lie algebra and (for groups of type I) by methods of harmonic analysis. We show how the method can be extended to cosets, generalising the linear rigid rotor. As examples, we consider all connected and simply connected Lie groups up to dimension 3. This includes the universal cover of the archetypical rigid body, along with a number of new exactly solvable models. We also discuss a possible application to the topical problem of quantising a perfect fluid. (paper)

  17. Work and entropy production in generalised Gibbs ensembles

    International Nuclear Information System (INIS)

    Perarnau-Llobet, Martí; Riera, Arnau; Gallego, Rodrigo; Wilming, Henrik; Eisert, Jens

    2016-01-01

    Recent years have seen an enormously revived interest in the study of thermodynamic notions in the quantum regime. This applies both to the study of notions of work extraction in thermal machines in the quantum regime, as well as to questions of equilibration and thermalisation of interacting quantum many-body systems as such. In this work we bring together these two lines of research by studying work extraction in a closed system that undergoes a sequence of quenches and equilibration steps concomitant with free evolutions. In this way, we incorporate an important insight from the study of the dynamics of quantum many body systems: the evolution of closed systems is expected to be well described, for relevant observables and most times, by a suitable equilibrium state. We will consider three kinds of equilibration, namely to (i) the time averaged state, (ii) the Gibbs ensemble and (iii) the generalised Gibbs ensemble, reflecting further constants of motion in integrable models. For each effective description, we investigate notions of entropy production, the validity of the minimal work principle and properties of optimal work extraction protocols. While we keep the discussion general, much room is dedicated to the discussion of paradigmatic non-interacting fermionic quantum many-body systems, for which we identify significant differences with respect to the role of the minimal work principle. Our work not only has implications for experiments with cold atoms, but also can be viewed as suggesting a mindset for quantum thermodynamics where the role of the external heat baths is instead played by the system itself, with its internal degrees of freedom bringing coarse-grained observables to equilibrium. (paper)

  18. The oculocerebral syndrome in association with generalised ...

    African Journals Online (AJOL)

    A 14-year-old girl with generalised hypopigmentation, mental retardation, abnormal movements, and ocular anomalies is described. It is suggested that she represents a further case of oculocerebral albinism, a rare autosomal recessive condition. Reference is made to previous similar cases.

  19. Kolkata Restaurant Problem as a Generalised El Farol Bar Problem

    Science.gov (United States)

    Chakrabarti, Bikas K.

    Generalisation of the El Farol bar problem to that of many bars here leads to the Kolkata restaurant problem, where the decision to go to any restaurant or not is much simpler (depending on the previous experience of course, as in the El Farol bar problem). This generalised problem can be exactly analysed in some limiting cases discussed here. The fluctuation in the restaurant service can be shown to have precisely an inverse cubic behavior, as widely seen in the stock market fluctuations.
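    The simplest limiting case, in which every agent chooses a restaurant uniformly at random, can be checked with a short simulation; the 1 - 1/e ≈ 0.632 utilisation it produces is a standard benchmark for this random-choice strategy, and the specific parameters below are illustrative:

```python
# Sketch of the random-choice baseline for the Kolkata restaurant problem:
# N agents independently pick one of N restaurants, and each restaurant
# serves exactly one of its arrivals. The fraction of restaurants serving
# anyone (equivalently, the fraction of agents served when N agents face
# N restaurants) tends to 1 - 1/e ≈ 0.632 for large N.

import random
random.seed(1)

def occupied_fraction(n_agents, n_restaurants, rounds=200):
    total = 0
    for _ in range(rounds):
        choices = [random.randrange(n_restaurants) for _ in range(n_agents)]
        total += len(set(choices))        # restaurants with >= 1 customer
    return total / (rounds * n_restaurants)

frac = occupied_fraction(500, 500)        # close to 1 - 1/e
```

Learning strategies based on past experience, as discussed in the record, aim to push this utilisation above the random-choice benchmark.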

  20. Influence of colour on acquisition and generalisation of graphic symbols.

    Science.gov (United States)

    Hetzroni, O E; Ne'eman, A

    2013-07-01

    Children with autism may benefit from using graphic symbols for their communication, language and literacy development. The purpose of this study was to investigate the influence of colour versus grey-scale displays on the identification of graphic symbols using a computer-based intervention. An alternating treatment design was employed to examine the learning and generalisation of 58 colour and grey-scale symbols by four preschool children with autism. The graphic symbols were taught via a meaning-based intervention using stories and educational games. Results demonstrate that all of the children were able to learn and maintain symbol identification over time for both symbol displays, with no apparent differences. Differences were apparent for two of the children, who exhibited better generalisation when learning grey-scale symbols first. The other two showed no noticeable difference between displays when generalising from one display to the other. Implications and further research are discussed. © 2012 The Authors. Journal of Intellectual Disability Research © 2012 John Wiley & Sons Ltd, MENCAP & IASSID.

  1. Generalised phase contrast: microscopy, manipulation and more

    DEFF Research Database (Denmark)

    Palima, Darwin; Glückstad, Jesper

    2010-01-01

    Generalised phase contrast (GPC) not only leads to more accurate phase imaging beyond thin biological samples, but serves as an enabling framework in developing tools over a wide spectrum of contemporary applications in optics and photonics, including optical trapping and micromanipulation, optic...

  2. New Inequalities and Uncertainty Relations on Linear Canonical Transform Revisit

    Directory of Open Access Journals (Sweden)

    Xu Guanlei

    2009-01-01

    The uncertainty principle plays an important role in mathematics, physics, signal processing, and so on. Firstly, based on the definition of the linear canonical transform (LCT) and the traditional Pitt's inequality, a novel Pitt's inequality in the LCT domains is obtained, which is connected with the LCT parameters a and b. A novel logarithmic uncertainty principle is then derived from this Pitt's inequality in the LCT domains, which is associated with the parameters of the two LCTs. Secondly, from the relation between the original function and its LCT, an entropic uncertainty principle and a Heisenberg uncertainty principle in the LCT domains are derived, which are associated with the LCT parameters a and b. The reason why the three lower bounds are associated only with the LCT parameters a and b, and are independent of c and d, is presented. The results show that it is possible for the bounds to tend to zero.

  3. Change and uncertainty in quantum systems

    International Nuclear Information System (INIS)

    Franson, J.D.

    1996-01-01

    A simple inequality shows that any change in the expectation value of an observable quantity must be associated with some degree of uncertainty. This inequality is often more restrictive than the Heisenberg uncertainty principle. copyright 1996 The American Physical Society

  4. Hybrid variational principles and synthesis method for finite element neutron transport calculations

    International Nuclear Information System (INIS)

    Ackroyd, R.T.; Nanneh, M.M.

    1990-01-01

    A family of hybrid variational principles is derived using a generalised least squares method. Neutron conservation is automatically satisfied for the hybrid principles employing two trial functions. No interface or reflection conditions need to be imposed on the independent even-parity trial function. For some hybrid principles a single trial function can be employed by relating one parity trial function to the other, using one of the parity transport equations in relaxed form. For other hybrid principles the trial functions can be employed sequentially. Synthesis of transport solutions, starting with the diffusion theory approximation, has been used as a way of reducing the scale of the computation that arises with established finite element methods for neutron transport. (author)

  5. Working dogs cooperate among one another by generalised reciprocity.

    Science.gov (United States)

    Gfrerer, Nastassja; Taborsky, Michael

    2017-03-06

    Cooperation by generalised reciprocity implies that individuals apply the decision rule "help anyone if helped by someone". This mechanism has been shown to generate evolutionarily stable levels of cooperation, but as yet it is unclear how widely this cooperation mechanism is applied among animals. Dogs (Canis familiaris) are highly social animals with considerable cognitive potential and the ability to differentiate between individual social partners. But although dogs can solve complex problems, they may use simple rules for behavioural decisions. Here we show that dogs trained in an instrumental cooperative task to provide food to a social partner help conspecifics more often after receiving help from a dog before. Remarkably, in so doing they show no distinction between partners that had helped them before and completely unfamiliar conspecifics. Apparently, dogs use the simple decision rule characterizing generalised reciprocity, although they are probably capable of using the more complex decision rule of direct reciprocity: "help someone who has helped you". However, generalized reciprocity involves lower information processing costs and is therefore a cheaper cooperation strategy. Our results imply that generalised reciprocity might be applied more commonly than direct reciprocity also in other mutually cooperating animals.

  6. Classical r-matrices for the generalised Chern–Simons formulation of 3d gravity

    Science.gov (United States)

    Osei, Prince K.; Schroers, Bernd J.

    2018-04-01

    We study the conditions for classical r-matrices to be compatible with the generalised Chern–Simons action for 3d gravity. Compatibility means solving the classical Yang–Baxter equations with a prescribed symmetric part for each of the real Lie algebras and bilinear pairings arising in the generalised Chern–Simons action. We give a new construction of r-matrices via a generalised complexification and derive a non-linear set of matrix equations determining the most general compatible r-matrix. We exhibit new families of solutions and show that they contain some known r-matrices for special parameter values.

  7. On Generalisation of Polynomials in Complex Plane

    Directory of Open Access Journals (Sweden)

    Maslina Darus

    2010-01-01

    The generalised Bell and Laguerre polynomials of fractional order in the complex z-plane are defined and some of their properties are studied. Moreover, we prove that these polynomials are univalent solutions of second-order differential equations. The Laguerre-type analogues of some special functions are also introduced.

  8. Support vector machines and generalisation in HEP

    Science.gov (United States)

    Bevan, Adrian; Gamboa Goñi, Rodrigo; Hays, Jon; Stevenson, Tom

    2017-10-01

    We review the concept of Support Vector Machines (SVMs) and discuss examples of their use in a number of scenarios. Several SVM implementations have been used in HEP, and we exemplify this algorithm using the Toolkit for Multivariate Analysis (TMVA) implementation. We discuss examples relevant to HEP, including background suppression for H → τ⁺τ⁻ at the LHC with several different kernel functions. Performance benchmarking leads to the issue of generalisation of hyper-parameter selection. The avoidance of fine tuning (over-training or over-fitting) in MVA hyper-parameter optimisation, i.e. the ability to ensure generalised performance of an MVA that is independent of the training, validation and test samples, is of utmost importance. We discuss this issue and compare and contrast the performance of hold-out and k-fold cross-validation. We have extended the SVM functionality and introduced tools to facilitate cross-validation in TMVA, and we present results based on these improvements.
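    The k-fold procedure discussed in the record can be illustrated with a minimal, library-free sketch; the classifier below is a stand-in (nearest class mean on invented 1-D data), not the TMVA/SVM machinery:

```python
# Minimal sketch of k-fold cross-validation for estimating generalised
# performance. The classifier is a stand-in (nearest class mean on
# invented 1-D data), not the TMVA/SVM machinery of the record.

import random
random.seed(2)

def k_fold_indices(n, k):
    idx = list(range(n))
    random.shuffle(idx)
    return [idx[i::k] for i in range(k)]      # k disjoint folds covering 0..n-1

def nearest_mean_accuracy(train, test):
    # Classify each test point by the closer class mean learned from train.
    means = {label: sum(x for x, y in train if y == label) /
                    sum(1 for _, y in train if y == label)
             for label in (0, 1)}
    correct = sum(1 for x, y in test
                  if min(means, key=lambda c: abs(x - means[c])) == y)
    return correct / len(test)

# Toy 1-D data: class 0 clustered near 0.0, class 1 near 1.0
data = [(random.gauss(y, 0.3), y) for y in (0, 1) for _ in range(100)]

folds = k_fold_indices(len(data), 5)
scores = []
for test_idx in folds:
    held_out = set(test_idx)
    test = [data[j] for j in test_idx]
    train = [data[j] for j in range(len(data)) if j not in held_out]
    scores.append(nearest_mean_accuracy(train, test))

cv_accuracy = sum(scores) / len(scores)       # average over the 5 folds
```

Unlike a single hold-out split, every event is used for both training and testing exactly once, which reduces the variance of the performance estimate.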

  9. Generalising the logistic map through the q-product

    Science.gov (United States)

    Pessoa, R. W. S.; Borges, E. P.

    2011-03-01

    We investigate a generalisation of the logistic map as x_{n+1} = 1 - a x_n ⊗_qmap x_n (-1 ≤ x_n ≤ 1), where ⊗_q is the q-product [Borges, E.P., Physica A 340, 95 (2004)]. The usual product, and consequently the usual logistic map, is recovered in the limit q → 1; the tent map is also a particular case for qmap → ∞. The generalisation of this (and other) algebraic operators has been widely used within the nonextensive statistical mechanics context (see C. Tsallis, Introduction to Nonextensive Statistical Mechanics, Springer, NY, 2009). We focus the analysis on qmap > 1 at the edge of chaos, particularly at the first critical point ac, which depends on the value of qmap. Bifurcation diagrams, sensitivity to initial conditions, fractal dimension and the rate of entropy growth are evaluated at ac(qmap), and connections with nonextensive statistical mechanics are explored.

  10. Emotion recognition training using composite faces generalises across identities but not all emotions.

    Science.gov (United States)

    Dalili, Michael N; Schofield-Toloza, Lawrence; Munafò, Marcus R; Penton-Voak, Ian S

    2017-08-01

    Many cognitive bias modification (CBM) tasks use facial expressions of emotion as stimuli. Some tasks use unique facial stimuli, while others use composite stimuli, given evidence that emotion is encoded prototypically. However, CBM using composite stimuli may be identity- or emotion-specific, and may not generalise to other stimuli. We investigated the generalisability of effects using composite faces in two experiments. Healthy adults in each study were randomised to one of four training conditions: two stimulus-congruent conditions, where the same faces were used during all phases of the task, and two stimulus-incongruent conditions, where faces of the opposite sex (Experiment 1) or faces depicting another emotion (Experiment 2) were used after the modification phase. Our results suggested that training effects generalised across identities. However, our results indicated only partial generalisation across emotions. These findings suggest effects obtained using composite stimuli may extend beyond the stimuli used in the task but remain emotion-specific.

  11. Enhancing generalisation in biofeedback intervention using the challenge point framework: A case study

    Science.gov (United States)

    HITCHCOCK, ELAINE R.; BYUN, TARA McALLISTER

    2014-01-01

    Biofeedback intervention can help children achieve correct production of a treatment-resistant error sound, but generalisation is often limited. This case study suggests that generalisation can be enhanced when biofeedback intervention is structured in accordance with a “challenge point” framework for speech-motor learning. The participant was an 11-year-old with residual /r/ misarticulation who had previously attained correct /r/ production through a structured course of ultrasound biofeedback treatment but did not generalise these gains beyond the word level. Treatment difficulty was adjusted in an adaptive manner following predetermined criteria for advancing, maintaining, or moving back a level in a multidimensional hierarchy of functional task complexity. The participant achieved and maintained virtually 100% accuracy in producing /r/ at both word and sentence levels. These preliminary results support the efficacy of a semi-structured implementation of the challenge point framework as a means of achieving generalisation and maintenance of treatment gains. PMID:25216375

  12. The Uncertainty of Measurement Results

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Factors affecting the uncertainty of measurement are explained, basic statistical formulae given, and the theoretical concept explained in the context of pesticide formulation analysis. Practical guidance is provided on how to determine individual uncertainty components within an analytical procedure. An extended and comprehensive table containing the relevant mathematical/statistical expressions elucidates the relevant underlying principles. Appendix I provides a practical elaborated example on measurement uncertainty estimation, above all utilizing experimental repeatability and reproducibility laboratory data. (author)

  13. A study of idiopathic generalised epilepsy in an Irish population.

    LENUS (Irish Health Repository)

    Mullins, G M

    2012-02-03

    Idiopathic generalised epilepsy (IGE) is subdivided into syndromes based on clinical and EEG features. PURPOSE: The aim of this study was to characterise all cases of IGE with supportive EEG abnormalities in terms of gender differences, seizure types reported, IGE syndromes, family history of epilepsy and EEG findings. We also calculated the limited-duration prevalence of IGE in our cohort. METHODS: Data on abnormal EEGs were collected retrospectively from two EEG databases at two tertiary referral centres for neurology. Clinical information was obtained from EEG request forms, standardised EEG questionnaires and the medical notes of patients. RESULTS: Two hundred and twenty-three patients met our inclusion criteria: 89 (39.9%) male and 134 (60.1%) female. Tonic-clonic seizures were the most common seizure type reported, with 162 patients (72.65%) having a generalised tonic-clonic seizure (GTCS) at some time. IGE with GTCS only (EGTCSA) was the most common syndrome in our cohort, being present in 94 patients (34 male, 60 female); 42 patients (15 male, 27 female) were diagnosed with juvenile myoclonic epilepsy (JME), 23 (9 male, 14 female) with juvenile absence epilepsy (JAE) and 20 (9 male, 11 female) with childhood absence epilepsy (CAE). EEG studies in all patients showed generalised epileptiform activity. CONCLUSIONS: More women than men were diagnosed with generalised epilepsy. Tonic-clonic seizures were the most common seizure type reported, and EGTCSA was the most frequent syndrome seen. Gender differences were evident for JAE and JME, as previously reported, and for EGTCSA, which had not been reported to date; they reached statistical significance for EGTCSA and JME.

  14. A Generalised Approach to Petri Nets and Algebraic Specifications

    International Nuclear Information System (INIS)

    Sivertsen, Terje

    1998-02-01

    The present report represents a continuation of the work on Petri nets and algebraic specifications. The reported research has focused on generalising the approach introduced in HWR-454, with the aim of facilitating the translation of a wider class of Petri nets into algebraic specifications. This includes autonomous Petri nets with increased descriptive power, as well as non-autonomous Petri nets allowing the modelling of systems (1) involving extensive data processing; (2) with transitions synchronised on external events; (3) whose evolutions are time-dependent. The generalised approach has the important property of being modular, in the sense that the translated specifications can be gradually extended to include data processing, synchronisation and timing. The report also discusses the relative merits of state-based and transition-based specifications, and includes a non-trivial case study involving automated proofs of a large number of interrelated theorems. The examples in the report illustrate the use of the new HRP Prover. Of particular importance in this context is the automatic transformation between state-based and transition-based specifications. It is expected that the approach introduced in HWR-454 and generalised in the present report will prove useful in future work on the combination of a wide variety of specification techniques.

  15. Decommissioning funding: ethics, implementation, uncertainties

    International Nuclear Information System (INIS)

    2006-01-01

    This status report on Decommissioning Funding: Ethics, Implementation, Uncertainties also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). The report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems. (authors)

  16. Object recognition and generalisation during habituation in horses

    DEFF Research Database (Denmark)

    Christensen, Janne Winther; Zharkikh, Tjatjana; Chovaux, Elodie

    2011-01-01

    The ability of horses to habituate to frightening stimuli greatly increases safety in the horse–human relationship. A recent experiment suggested, however, that habituation to frightening visual stimuli is relatively stimulus-specific in horses and that shape and colour are important factors for object generalisation (Christensen et al., 2008). In a series of experiments, we aimed to further explore the ability of horses (n = 30, 1- and 2-year-old mares) to recognise and generalise between objects during habituation. TEST horses (n = 15) were habituated to a complex object, composed of five simple objects of varying shape and colour, whereas CONTROL horses (n = 15) were habituated to the test arena, but not to the complex object. In the first experiment, we investigated whether TEST horses subsequently reacted less to i) simple objects that were previously part of the complex object (i...

  17. ''Nature is unknowable''. The idea of uncertainty

    International Nuclear Information System (INIS)

    Crozon, M.

    2000-01-01

    This paper deals with one of the great ideas of the twentieth century: the uncertainty principle of Heisenberg. Taking a philosophical approach, the author explains this principle and presents its cultural impact on our ways of thinking. (A.L.B.)

  18. Darwin without borders? Looking at 'generalised Darwinism' through the prism of the 'hourglass model'.

    Science.gov (United States)

    Levit, Georgy S; Hossfeld, Uwe

    2011-12-01

    This article critically analyses the arguments of the 'generalised Darwinism' recently proposed for the analysis of socio-economic systems. We argue that 'generalised Darwinism' is both restrictive and empty. It is restrictive because it excludes alternative (non-selectionist) evolutionary mechanisms such as orthogenesis, saltationism and mutationism without any examination of their suitability for modelling socio-economic processes, ignoring their important roles in the development of contemporary evolutionary theory. It is empty because it reduces Darwinism to an abstract three-principle scheme (variation, selection and inheritance), thus ignoring the actual structure of Darwinism as a complex and dynamic theoretical system inseparable from a very detailed set of theoretical constraints. Arguing against 'generalised Darwinism', we present our vision of the history of evolutionary biology with the help of the 'hourglass model', reflecting the internal dynamic of competing theories of evolution.

  19. Generalised discrete torsion and mirror symmetry for G2 manifolds

    International Nuclear Information System (INIS)

    Gaberdiel, Matthias R.; Kaste, Peter

    2004-01-01

    A generalisation of discrete torsion is introduced in which different discrete torsion phases are considered for the different fixed points or twist fields of a twisted sector. The constraints that arise from modular invariance are analysed carefully. As an application we show how all the different resolutions of the T⁷/Z₂³ orbifold of Joyce have an interpretation in terms of such generalised discrete torsion orbifolds. Furthermore, we show that these manifolds are pairwise identified under G₂ mirror symmetry. From a conformal field theory point of view, this mirror symmetry arises from an automorphism of the extended chiral algebra of the G₂ compactification. (author)

  20. Relativistic generalisation of the Kroll-Watson formula

    International Nuclear Information System (INIS)

    Kaminski, J.Z.

    1985-01-01

    The relativistic analogue of the space-translation method is derived. Using this method, the generalisation of the Kroll-Watson formula [Phys. Rev. A 8, 804 (1973)] is obtained for the scattering of an arbitrary charged particle (e.g. mesons, hyperons, quarks, etc.). The separation of the background and resonant parts of the scattering amplitude is predicted. (author)

  1. Hyperscaling violating solutions in generalised EMD theory

    Directory of Open Access Journals (Sweden)

    Li Li

    2017-04-01

    This short note is devoted to deriving scaling but hyperscaling violating solutions in a generalised Einstein–Maxwell-Dilaton theory with an arbitrary number of scalars and vectors. We obtain analytic solutions in some special cases and discuss the physical constraints on the allowed parameter range in order to have a well-defined holographic ground-state solution.

  2. Hyperscaling violating solutions in generalised EMD theory

    Energy Technology Data Exchange (ETDEWEB)

    Li, Li, E-mail: lil416@lehigh.edu [Crete Center for Theoretical Physics, Institute for Theoretical and Computational Physics, Department of Physics, University of Crete, 71003 Heraklion (Greece); Crete Center for Quantum Complexity and Nanotechnology, Department of Physics, University of Crete, 71003 Heraklion (Greece); Department of Physics, Lehigh University, Bethlehem, PA, 18018 (United States)

    2017-04-10

    This short note is devoted to deriving scaling but hyperscaling violating solutions in a generalised Einstein–Maxwell-Dilaton theory with an arbitrary number of scalars and vectors. We obtain analytic solutions in some special case and discuss the physical constraints on the allowed parameter range in order to have a well-defined holographic ground-state solution.

  3. Generalising the staircase models

    International Nuclear Information System (INIS)

    Dorey, P.; Ravanini, F.

    1993-01-01

    Systems of integral equations are proposed which generalise those previously encountered in connection with the so-called staircase models. Under the assumption that these equations describe the finite-size effects of relativistic field theories via the thermodynamic Bethe ansatz, analytical and numerical evidence is given for the existence of a variety of new roaming renormalisation group trajectories. For each positive integer k and s=0, ..., k-1, there is a one-parameter family of trajectories, passing close by the coset conformal field theories G(k) x G(nk+s)/G((n+1)k+s) before finally flowing to a massive theory for s=0, or to another coset model for s ≠ 0. (orig.)

  4. Gait analysis of adults with generalised joint hypermobility

    DEFF Research Database (Denmark)

    Simonsen, Erik B; Tegner, Heidi; Alkjær, Tine

    2012-01-01

    BACKGROUND: The majority of adults with Generalised Joint Hypermobility experience symptoms such as pain and joint instability, which is likely to influence their gait pattern. Accordingly, the purpose of the present project was to perform a biomechanical gait analysis on a group of patients...

  5. The diagnostic value of the alveolar lamina dura in generalised bone disease

    International Nuclear Information System (INIS)

    Kuhlencordt, J.; Kruse, H.P.; Franke, J.; Hamburg Univ.

    1981-01-01

    Changes in the alveolar lamina dura in 134 patients have been analysed. They included 32 cases with urolithiasis in whom generalised bone disease had been excluded, 37 cases of primary hyperparathyroidism, 31 cases of secondary hyperparathyroidism and 34 cases with primary osteoporosis. The state of the lamina dura was related to biochemical, radiological and histological findings in the various groups. The value of the lamina dura in the diagnosis of generalised skeletal abnormalities has been defined. (orig.) [de

  6. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  7. Generalised Chou-Yang model and recent results

    International Nuclear Information System (INIS)

    Fazal-e-Aleem; Rashid, H.

    1996-01-01

    It is shown that the most recent results of the E710 and UA4/2 collaborations for the total cross section and ρ, together with earlier measurements, give good agreement with measurements of the differential cross section at 546 and 1800 GeV within the framework of the Generalised Chou-Yang model. These results are also compared with the predictions of other models. (author)

  8. The Bohr--Einstein ''weighing-of-energy'' debate and the principle of equivalence

    International Nuclear Information System (INIS)

    Hughes, R.J.

    1990-01-01

    The Bohr--Einstein debate over the ''weighing of energy'' and the validity of the time--energy uncertainty relation is reexamined in the context of gravitation theories that do not respect the equivalence principle. Bohr's use of the equivalence principle is shown to be sufficient, but not necessary, to establish the validity of this uncertainty relation in Einstein's ''weighing-of-energy'' gedanken experiment. The uncertainty relation is shown to hold in any energy-conserving theory of gravity, and so a failure of the equivalence principle does not engender a failure of quantum mechanics. The relationship between the gravitational redshift and the equivalence principle is reviewed

  9. The Hayes principles: learning from the national pilot of information technology and core generalisable theory in informatics.

    Science.gov (United States)

    de Lusignan, Simon; Krause, Paul

    2010-01-01

    There has been much criticism of the NHS national programme for information technology (IT); it has been an expensive programme and some elements appear to have achieved little. The Hayes report was written as an independent review of health and social care IT in England. To identify key principles for health IT implementation which may have relevance beyond the critique of NHS IT. We elicit ten principles from the Hayes report, which, if followed, may result in more effective IT implementation in health care. They divide into patient-centred, subsidiarity and strategic principles. The patient-centred principles are: 1) the patient must be at the centre of all information systems; 2) the provision of patient-level operational data should form the foundation - avoid the dataset mentality; 3) store health data as close to the patient as possible; 4) enable the patient to take a more active role with their health data within a trusted doctor-patient relationship. The subsidiarity principles set out to balance the local and health-system-wide needs: 5) standardise centrally - patients must be able to benefit from interoperability; 6) provide a standard procurement package and an approved process that ensures safety standards and provision of interoperable systems; 7) authorise a range of local suppliers so that health providers can select the system best meeting local needs; 8) allow local migration from legacy systems, as and when improved functionality for patients is available. And finally the strategic principles: 9) evaluate health IT systems in terms of measurable benefits to patients; 10) strategic planning of systems should reflect strategic goals for the health of patients/the population. Had the Hayes principles been embedded within our approach to health IT, and in particular to medical record implementation, we might have avoided many of the costly mistakes with the UK national programme. However, these principles need application within the modern IT

  10. Quantum corrections to the thermodynamics of Schwarzschild-Tangherlini black hole and the generalized uncertainty principle

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Z.W.; Zu, X.T. [University of Electronic Science and Technology of China, School of Physical Electronics, Chengdu (China); Li, H.L. [University of Electronic Science and Technology of China, School of Physical Electronics, Chengdu (China); Shenyang Normal University, College of Physics Science and Technology, Shenyang (China); Yang, S.Z. [China West Normal University, Physics and Space Science College, Nanchong (China)

    2016-04-15

    We investigate the thermodynamics of the Schwarzschild-Tangherlini black hole in the context of the generalized uncertainty principle (GUP). The corrections to the Hawking temperature, entropy and the heat capacity are obtained via the modified Hamilton-Jacobi equation. These modifications show that the GUP changes the evolution of the Schwarzschild-Tangherlini black hole. In particular, the GUP effect becomes significant when the radius or mass of the black hole approaches the Planck scale: the black hole stops radiating and leaves a remnant. The Planck-scale remnant is confirmed through an analysis of the heat capacity. These phenomena imply that the GUP may offer a way to resolve the information paradox. Besides, we also investigate the possibility of observing such black holes at the Large Hadron Collider (LHC), and the results demonstrate that the black hole cannot be produced at current LHC energies. (orig.)

  11. Generalised Chou-Yang model and recent results

    International Nuclear Information System (INIS)

    Fazal-e-Aleem; Rashid, H.

    1995-09-01

    It is shown that the most recent results of the E710 and UA4/2 collaborations for the total cross section and ρ, together with earlier measurements, give good agreement with measurements of the differential cross section at 546 and 1800 GeV within the framework of the Generalised Chou-Yang model. These results are also compared with the predictions of other models. (author). 16 refs, 2 figs

  12. Generalised Chou-Yang model and recent results

    Energy Technology Data Exchange (ETDEWEB)

    Fazal-e-Aleem [International Centre for Theoretical Physics, Trieste (Italy); Rashid, H. [Punjab Univ., Lahore (Pakistan). Centre for High Energy Physics

    1996-12-31

    It is shown that the most recent results of the E710 and UA4/2 collaborations for the total cross section and ρ, together with earlier measurements, give good agreement with measurements of the differential cross section at 546 and 1800 GeV within the framework of the Generalised Chou-Yang model. These results are also compared with the predictions of other models. (author) 16 refs.

  13. Dirac equations for generalised Yang-Mills systems

    International Nuclear Information System (INIS)

    Lechtenfeld, O.; Nahm, W.; Tchrakian, D.H.

    1985-06-01

    We present Dirac equations in 4p dimensions for the generalised Yang-Mills (GYM) theories introduced earlier. These Dirac equations are related to the self-duality equations of the GYM and are checked to be elliptic in a 'BPST' background. In this background these Dirac equations are integrated exactly. The possibility of imposing supersymmetry in the GYM-Dirac system is investigated, with negative results. (orig.)

  14. Anaesthesia for caesarean section in a patient with acute generalised pustular psoriasis.

    Science.gov (United States)

    Samieh-Tucker, A; Rupasinghe, M

    2007-10-01

    We describe a 30-year-old parturient with acute generalised pustular psoriasis who presented for urgent caesarean section. A multidisciplinary team was involved and general anaesthesia was used successfully. Management of this condition is discussed and the literature reviewed. While generalised pustular psoriasis or impetigo herpetiformis is well recognised in pregnancy, it has not hitherto been reported in obstetric anaesthesia literature. The purpose of this article is to delineate the clinical picture of this disease, its treatment, and the effect on the mother and the fetus.

  15. Fundamental principles of quantum theory

    International Nuclear Information System (INIS)

    Bugajski, S.

    1980-01-01

    After introducing general versions of three fundamental quantum postulates - the superposition principle, the uncertainty principle and the complementarity principle - the question of whether the three principles are sufficiently strong to restrict the general Mackey description of quantum systems to the standard Hilbert-space quantum theory is discussed. An example which shows that the answer must be negative is constructed. An abstract version of the projection postulate is introduced and it is demonstrated that it could serve as the missing physical link between the general Mackey description and the standard quantum theory. (author)

  16. The impact of case specificity and generalisable skills on clinical performance: a correlated traits-correlated methods approach.

    Science.gov (United States)

    Wimmers, Paul F; Fung, Cha-Chi

    2008-06-01

    The finding of case or content specificity in medical problem solving moved the focus of research away from generalisable skills towards the importance of content knowledge. However, controversy about the content dependency of clinical performance and the generalisability of skills remains. This study aimed to explore the relative impact of both perspectives (case specificity and generalisable skills) on different components (history taking, physical examination, communication) of clinical performance within and across cases. Data from a clinical performance examination (CPX) taken by 350 Year 3 students were used in a correlated traits-correlated methods (CTCM) approach using confirmatory factor analysis, whereby 'traits' refers to generalisable skills and 'methods' to individual cases. The baseline CTCM model was analysed and compared with four nested models using structural equation modelling techniques. The CPX consisted of three skills components and five cases. Comparison of the four different models with the least-restricted baseline CTCM model revealed that a model with uncorrelated generalisable skills factors and correlated case-specific knowledge factors represented the data best. The generalisable processes found in history taking, physical examination and communication were responsible for half the explained variance, in comparison with the variance related to case specificity. In conclusion, pure knowledge-based and pure skill-based perspectives on clinical performance both seem too one-dimensional, and the new evidence supports the idea that a substantial amount of variance is attributable to each aspect of performance. It could be concluded that generalisable skills and specialised knowledge go hand in hand: both are essential aspects of clinical performance.

  17. Effect Displays in R for Generalised Linear Models

    Directory of Open Access Journals (Sweden)

    John Fox

    2003-07-01

    Full Text Available This paper describes the implementation in R of a method for tabular or graphical display of terms in a complex generalised linear model. By complex, I mean a model that contains terms related by marginality or hierarchy, such as polynomial terms, or main effects and interactions. I call these tables or graphs effect displays. Effect displays are constructed by identifying high-order terms in a generalised linear model. Fitted values under the model are computed for each such term. The lower-order "relatives" of a high-order term (e.g., main effects marginal to an interaction) are absorbed into the term, allowing the predictors appearing in the high-order term to range over their values. The values of other predictors are fixed at typical values: for example, a covariate could be fixed at its mean or median, a factor at its proportional distribution in the data, or at equal proportions in its several levels. Variations of effect displays are also described, including representation of terms higher-order to any appearing in the model.
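
    The construction described above can be sketched numerically. The following is an illustrative Python reimplementation of the idea (not the author's R package); the data, coefficients, and grid values are invented for the example, and a Gaussian GLM (ordinary least squares) stands in for a general link function.

    ```python
    import numpy as np

    # Hypothetical model with an interaction: y = b0 + b1*x + b2*z + b3*x*z + noise.
    # The effect display for the high-order term x:z lets x and z range over a
    # grid, absorbing the lower-order relatives (the x and z main effects).
    rng = np.random.default_rng(0)
    n = 500
    x = rng.normal(size=n)
    z = rng.normal(size=n)
    y = 1.0 + 2.0 * x - 1.0 * z + 0.5 * x * z + rng.normal(scale=0.1, size=n)

    # Fit by ordinary least squares (a Gaussian GLM with identity link).
    X = np.column_stack([np.ones(n), x, z, x * z])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Effect display: fitted values for the x:z term over a grid; any other
    # predictors would be held at typical values (here there are none).
    xg = np.linspace(-2, 2, 5)
    zg = np.array([-1.0, 0.0, 1.0])            # a few levels of the moderator
    grid_x, grid_z = np.meshgrid(xg, zg)
    G = np.column_stack([np.ones(grid_x.size), grid_x.ravel(),
                         grid_z.ravel(), (grid_x * grid_z).ravel()])
    effect = (G @ beta).reshape(grid_z.shape)  # rows: z levels, cols: x values
    print(np.round(effect, 2))
    ```

    Plotting one row of `effect` per level of z gives the usual graphical form of the display.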

  18. Generalised pollination systems for three invasive milkweeds in Australia.

    Science.gov (United States)

    Ward, M; Johnson, S D

    2013-05-01

    Because most plants require pollinator visits for seed production, the ability of an introduced plant species to establish pollinator relationships in a new ecosystem may have a central role in determining its success or failure as an invader. We investigated the pollination ecology of three milkweed species - Asclepias curassavica, Gomphocarpus fruticosus and G. physocarpus - in their invaded range in southeast Queensland, Australia. The complex floral morphology of milkweeds has often been interpreted as a general trend towards specialised pollination requirements. Based on this interpretation, invasion by milkweeds contradicts the expectation that plant species with specialised pollination systems are less likely to become invasive than those with more generalised pollination requirements. However, observations of flower visitors in natural populations of the three study species revealed that their pollination systems are essentially specialised at the taxonomic level of the order, but generalised at the species level. Specifically, pollinators of the two Gomphocarpus species included various species of Hymenoptera (particularly vespid wasps), while pollinators of A. curassavica were primarily Lepidoptera (particularly nymphalid butterflies). Pollinators of all three species are rewarded with copious amounts of highly concentrated nectar. It is likely that successful invasion by these three milkweed species is attributable, at least in part, to their generalised pollinator requirements. The results of this study are discussed in terms of how data from the native range may be useful in predicting pollination success of species in a new environment. © 2012 German Botanical Society and The Royal Botanical Society of the Netherlands.

  19. Imperfect pitch: Gabor's uncertainty principle and the pitch of extremely brief sounds.

    Science.gov (United States)

    Hsieh, I-Hui; Saberi, Kourosh

    2016-02-01

    How brief must a sound be before its pitch is no longer perceived? The uncertainty tradeoff between temporal and spectral resolution (Gabor's principle) limits the minimum duration required for accurate pitch identification or discrimination. Prior studies have reported that pitch can be extracted from sinusoidal pulses as brief as half a cycle. This finding has been used in a number of classic papers to develop models of pitch encoding. We have found that phase randomization, which eliminates timbre confounds, degrades this ability to chance, raising serious concerns over the foundation on which classic pitch models have been built. The current study investigated whether subthreshold pitch cues may still exist in partial-cycle pulses revealed through statistical integration in a time series containing multiple pulses. To this end, we measured frequency-discrimination thresholds in a two-interval forced-choice task for trains of partial-cycle random-phase tone pulses. We found that residual pitch cues exist in these pulses but discriminating them requires an order of magnitude (ten times) larger frequency difference than that reported previously, necessitating a re-evaluation of pitch models built on earlier findings. We also found that as pulse duration is decreased to less than two cycles its pitch becomes biased toward higher frequencies, consistent with predictions of an auto-correlation model of pitch extraction.
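
    The time-frequency tradeoff invoked above can be checked numerically: for a Gaussian pulse, the product of RMS duration and RMS bandwidth attains the Gabor limit of 1/(4π). The sketch below illustrates the principle only (sample rate and pulse width are assumed values, not the study's stimuli).

    ```python
    import numpy as np

    fs = 10000.0                          # sample rate in Hz (assumed)
    t = np.arange(-0.5, 0.5, 1.0 / fs)    # 1 s analysis window
    sigma = 0.01                          # Gaussian pulse width in seconds
    g = np.exp(-t**2 / (2 * sigma**2))

    # RMS duration from the normalised energy density |g|^2
    p_t = g**2 / np.sum(g**2)
    dt = np.sqrt(np.sum(t**2 * p_t))

    # RMS bandwidth from the normalised spectral density |G|^2
    G = np.fft.fftshift(np.fft.fft(g))
    f = np.fft.fftshift(np.fft.fftfreq(t.size, 1.0 / fs))
    p_f = np.abs(G)**2 / np.sum(np.abs(G)**2)
    df = np.sqrt(np.sum(f**2 * p_f))

    print(dt * df, 1.0 / (4.0 * np.pi))   # the product sits at the Gabor limit
    ```

    Any non-Gaussian envelope (e.g. a truncated half-cycle sinusoid) yields a strictly larger product, which is the sense in which very brief pulses cannot carry a well-defined pitch.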

  20. Quantum field theory in generalised Snyder spaces

    International Nuclear Information System (INIS)

    Meljanac, S.; Meljanac, D.; Mignemi, S.; Štrajn, R.

    2017-01-01

    We discuss the generalisation of the Snyder model that includes all possible deformations of the Heisenberg algebra compatible with Lorentz invariance and investigate its properties. We calculate perturbatively the law of addition of momenta and the star product in the general case. We also undertake the construction of a scalar field theory on these noncommutative spaces showing that the free theory is equivalent to the commutative one, like in other models of noncommutative QFT.

  1. Quantum field theory in generalised Snyder spaces

    Energy Technology Data Exchange (ETDEWEB)

    Meljanac, S.; Meljanac, D. [Rudjer Bošković Institute, Bijenička cesta 54, 10002 Zagreb (Croatia); Mignemi, S., E-mail: smignemi@unica.it [Dipartimento di Matematica e Informatica, Università di Cagliari, viale Merello 92, 09123 Cagliari (Italy); INFN, Sezione di Cagliari, Cittadella Universitaria, 09042 Monserrato (Italy); Štrajn, R. [Dipartimento di Matematica e Informatica, Università di Cagliari, viale Merello 92, 09123 Cagliari (Italy); INFN, Sezione di Cagliari, Cittadella Universitaria, 09042 Monserrato (Italy)

    2017-05-10

    We discuss the generalisation of the Snyder model that includes all possible deformations of the Heisenberg algebra compatible with Lorentz invariance and investigate its properties. We calculate perturbatively the law of addition of momenta and the star product in the general case. We also undertake the construction of a scalar field theory on these noncommutative spaces showing that the free theory is equivalent to the commutative one, like in other models of noncommutative QFT.

  2. W-algebra symmetries of generalised Drinfel'd-Sokolov hierarchies

    International Nuclear Information System (INIS)

    Spence, B.

    1992-01-01

    Using the zero curvature formulation, it is shown that W-algebra transformations are symmetries of the corresponding generalised Drinfel'd-Sokolov hierarchies. This result is illustrated with the examples of the KdV and Boussinesq hierarchies, and the hierarchy associated to the Polyakov-Bershadsky W-algebra. (orig.)

  3. Generalised Multi-sequence Shift-Register Synthesis using Module Minimisation

    DEFF Research Database (Denmark)

    Nielsen, Johan Sebastian Rosenkilde

    2013-01-01

    We show how to solve a generalised version of the Multi-sequence Linear Feedback Shift-Register (MLFSR) problem using minimisation of free modules over F[x]. We show how two existing algorithms for minimising such modules run particularly fast on these instances. Furthermore, we show how one...

  4. Generalised synchronisation of spatiotemporal chaos using feedback control method and phase compression

    International Nuclear Information System (INIS)

    Xing-Yuan, Wang; Na, Zhang

    2010-01-01

    Coupled map lattices are taken as examples to study the synchronisation of spatiotemporal chaotic systems. First, a generalised synchronisation of two coupled map lattices is realised through selecting an appropriate feedback function and appropriate range of feedback parameter. Based on this method we use the phase compression method to extend the range of the parameter. So, we integrate the feedback control method with the phase compression method to implement the generalised synchronisation and obtain an exact range of feedback parameter. This technique is simple to implement in practice. Numerical simulations show the effectiveness and the feasibility of the proposed program. (general)
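
    The feedback step described above can be sketched in a few lines. The following is a minimal illustration of feedback synchronisation of two diffusively coupled logistic-map lattices; the feedback function, coupling strengths, and lattice size are assumed values, and the phase-compression refinement of the paper is not implemented.

    ```python
    import numpy as np

    def lattice_step(u, eps=0.3, r=4.0):
        """One update of a diffusively coupled logistic-map lattice (periodic)."""
        f = r * u * (1.0 - u)
        return (1 - eps) * f + 0.5 * eps * (np.roll(f, 1) + np.roll(f, -1))

    rng = np.random.default_rng(1)
    x = rng.random(64)          # drive lattice
    y = rng.random(64)          # response lattice, different initial state
    k = 0.9                     # feedback strength (assumed large enough)

    for _ in range(200):
        fx, fy = lattice_step(x), lattice_step(y)
        x = fx
        y = fy + k * (fx - fy)  # feedback pulls the response toward the drive

    print(np.max(np.abs(x - y)))  # synchronisation error after 200 steps
    ```

    With this choice the error contracts by roughly (1 - k) times the Lipschitz constant of the lattice map per step, so for k close to 1 the two lattices synchronise; for small k they do not, which is why the appropriate range of the feedback parameter matters.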

  5. Logarithmic corrections to the uncertainty principle and infinitude of the number of bound states of n-particle systems

    International Nuclear Information System (INIS)

    Perez, J.F.; Coutinho, F.A.B.; Malta, C.P.

    1985-01-01

    It is shown that the critical long-distance behaviours of a two-body potential, determining whether the number of negative eigenvalues of Schrödinger operators in ν dimensions is finite or infinite, are given by v_k(r) = -[(ν-2)/(2r)]^2 - 1/(2r ln r)^2 - ... - 1/(2r ln r · ln ln r ··· ln_k r)^2, where k=0,1,... for ν ≠ 2 and k=1,2,... if ν=2. This result is a consequence of logarithmic corrections to an inequality known as the uncertainty principle. If the continuum threshold in the N-body problem is defined by a two-cluster break-up, our results generate corrections to the existing sufficient conditions for the existence of infinitely many bound states. (Author) [pt

  6. Uncertainty Relations and Possible Experience

    Directory of Open Access Journals (Sweden)

    Gregg Jaeger

    2016-06-01

    Full Text Available The uncertainty principle can be understood as a condition of joint indeterminacy of classes of properties in quantum theory. The mathematical expressions most closely associated with this principle have been the uncertainty relations, various inequalities exemplified by the well-known expression regarding position and momentum introduced by Heisenberg. Here, recent work involving a new sort of “logical” indeterminacy principle and associated relations introduced by Pitowsky, expressible directly in terms of probabilities of outcomes of measurements of sharp quantum observables, is reviewed and its quantum nature is discussed. These novel relations are derivable from Boolean “conditions of possible experience” of the quantum realm and have been considered both as fundamentally logical and as fundamentally geometrical. This work focuses on the relationship of indeterminacy to the propositions regarding the values of discrete, sharp observables of quantum systems. Here, reasons for favoring each of these two positions are considered. Finally, with an eye toward future research related to indeterminacy relations, further novel approaches grounded in category theory and intended to capture and reconceptualize the complementarity characteristics of quantum propositions are discussed in relation to the former.

  7. Large-uncertainty intelligent states for angular momentum and angle

    International Nuclear Information System (INIS)

    Goette, Joerg B; Zambrini, Roberta; Franke-Arnold, Sonja; Barnett, Stephen M

    2005-01-01

    The equality in the uncertainty principle for linear momentum and position is obtained for states which also minimize the uncertainty product. However, in the uncertainty relation for angular momentum and angular position both sides of the inequality are state dependent and therefore the intelligent states, which satisfy the equality, do not necessarily give a minimum for the uncertainty product. In this paper, we highlight the difference between intelligent states and minimum uncertainty states by investigating a class of intelligent states which obey the equality in the angular uncertainty relation while having an arbitrarily large uncertainty product. To develop an understanding for the uncertainties of angle and angular momentum for the large-uncertainty intelligent states we compare exact solutions with analytical approximations in two limiting cases

  8. Calculation of nuclear reactivity using the generalised Adams-Bashforth-Moulton predictor corrector method

    Energy Technology Data Exchange (ETDEWEB)

    Suescun-Diaz, Daniel [Surcolombiana Univ., Neiva (Colombia). Groupo de Fisica Teorica; Narvaez-Paredes, Mauricio [Javeriana Univ., Cali (Colombia). Groupo de Matematica y Estadistica Aplicada Pontificia; Lozano-Parada, Jamie H. [Univ. del Valle, Cali (Colombia). Dept. de Ingenieria

    2016-03-15

    In this paper, the generalisation of the 4th-order Adams-Bashforth-Moulton predictor-corrector method is proposed to numerically solve the point kinetic equations of the nuclear reactivity calculations without using the nuclear power history. Due to the nature of the point kinetic equations, different predictor modifiers are used in order to improve the precision of the approximations obtained. The results obtained with the prediction formulas and generalised corrections improve the precision when compared with previous methods and are valid for various forms of nuclear power and different time steps.
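
    For reference, the classical 4th-order Adams-Bashforth-Moulton scheme that the paper generalises can be sketched as below, applied to the test equation y' = -y rather than to the point kinetic equations; the predictor modifiers of the paper are not reproduced, only the standard predict-evaluate-correct structure.

    ```python
    import numpy as np

    def abm4(f, y0, t0, t1, n):
        """Classical ABM4 predictor-corrector for y' = f(t, y) on [t0, t1]."""
        h = (t1 - t0) / n
        t = t0 + h * np.arange(n + 1)
        y = np.zeros(n + 1)
        y[0] = y0
        # Bootstrap the first three steps with classical RK4.
        for i in range(3):
            k1 = f(t[i], y[i])
            k2 = f(t[i] + h / 2, y[i] + h * k1 / 2)
            k3 = f(t[i] + h / 2, y[i] + h * k2 / 2)
            k4 = f(t[i] + h, y[i] + h * k3)
            y[i + 1] = y[i] + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        for i in range(3, n):
            f3, f2, f1, f0 = (f(t[i - 3], y[i - 3]), f(t[i - 2], y[i - 2]),
                              f(t[i - 1], y[i - 1]), f(t[i], y[i]))
            # Predictor: 4th-order Adams-Bashforth.
            yp = y[i] + h * (55 * f0 - 59 * f1 + 37 * f2 - 9 * f3) / 24
            # Corrector: 4th-order Adams-Moulton using the predicted value.
            y[i + 1] = y[i] + h * (9 * f(t[i + 1], yp) + 19 * f0 - 5 * f1 + f2) / 24
        return t, y

    t, y = abm4(lambda t, y: -y, 1.0, 0.0, 1.0, 100)
    print(abs(y[-1] - np.exp(-1.0)))   # global error of the PECE scheme
    ```

    The stiffness of the point kinetic equations is what motivates the modified predictors in the paper; the plain scheme above would need very small time steps there.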

  9. First-ever generalised tonic-clonic seizures in adults in the ...

    African Journals Online (AJOL)

    First-ever generalised tonic-clonic seizures in adults in the emergency room: Review of cranial computed tomography of 76 cases in a tertiary hospital in Benin-city, Nigeria. ... Clinical and CT diagnoses agreed only in 8.4% of the cases.

  10. Uncertainty in prediction and in inference

    International Nuclear Information System (INIS)

    Hilgevoord, J.; Uffink, J.

    1991-01-01

    The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in inference can be obtained by means of the so-called statistical distance between probability distributions. When applied to quantum mechanics, this distance leads to a measure of the distinguishability of quantum states, which essentially is the absolute value of the matrix element between the states. The importance of this result to the quantum mechanical uncertainty principle is noted. The second part of the paper provides a derivation of the statistical distance on the basis of the so-called method of support
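
    One standard realisation of such a statistical distance, which the sketch below uses for illustration, is the Bhattacharyya angle between discrete probability distributions: the angle between the unit vectors of square-root probabilities. (The example distributions are invented; for distributions obtained by measuring two quantum states in a common basis, the cosine of this distance is related to the absolute value of the states' overlap.)

    ```python
    import numpy as np

    def statistical_distance(p, q):
        """Bhattacharyya angle between sqrt(p) and sqrt(q) on the unit sphere."""
        overlap = np.sum(np.sqrt(p * q))
        return np.arccos(np.clip(overlap, -1.0, 1.0))

    p = np.array([0.5, 0.5])
    q = np.array([0.9, 0.1])
    print(statistical_distance(p, q))

    # Identical distributions are at distance 0; disjoint ones at pi/2,
    # i.e. perfectly distinguishable.
    print(statistical_distance(p, p))
    print(statistical_distance(np.array([1.0, 0.0]), np.array([0.0, 1.0])))
    ```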

  11. Generalised model for anisotropic compact stars

    Energy Technology Data Exchange (ETDEWEB)

    Maurya, S.K. [University of Nizwa, Department of Mathematical and Physical Sciences College of Arts and Science, Nizwa (Oman); Gupta, Y.K. [Raj Kumar Goel Institute of Technology, Department of Mathematics, Ghaziabad, Uttar Pradesh (India); Ray, Saibal [Government College of Engineering and Ceramic Technology, Department of Physics, Kolkata, West Bengal (India); Deb, Debabrata [Indian Institute of Engineering Science and Technology, Shibpur, Department of Physics, Howrah, West Bengal (India)

    2016-12-15

    In the present investigation an exact generalised model for anisotropic compact stars of embedding class 1 is sought with a general relativistic background. The generic solutions are verified by exploring different physical aspects, viz. energy conditions, mass-radius relation, stability of the models, in connection to their validity. It is observed that the model presented here for compact stars is compatible with all these physical tests and thus physically acceptable as far as the compact star candidates RXJ 1856-37, SAX J 1808.4-3658 (SS1) and SAX J 1808.4-3658 (SS2) are concerned. (orig.)

  12. f(R) in Holographic and Agegraphic Dark Energy Models and the Generalized Uncertainty Principle

    Directory of Open Access Journals (Sweden)

    Barun Majumder

    2013-01-01

    Full Text Available We studied a unified approach with the holographic, new agegraphic, and f(R) dark energy models to construct the form of f(R) which in general is responsible for the curvature-driven explanation of the very early inflation along with the presently observed late-time acceleration. We considered the generalized uncertainty principle in our approach, which incorporated the corrections in the entropy-area relation and thereby modified the energy densities for the cosmological dark energy models considered. We found that holographic and new agegraphic f(R) gravity models can behave like phantom or quintessence models in the spatially flat FRW universe. We also found a distinct term in the form of f(R) which goes as R^(3/2) due to the consideration of the GUP-modified energy densities. Although the presence of this term in the action can be important in explaining the early inflationary scenario, Capozziello et al. recently showed that f(R) ~ R^(3/2) leads to an accelerated expansion, that is, a negative value for the deceleration parameter q, which fits well with SNeIa and WMAP data.

  13. Generalised pole-placement control of steam turbine speed

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez-del-Busto, R. [ITESM, Cuernavaca (Mexico). Div. de Ingenieria y Ciencias; Munoz, J. [ITESM, Xochimilco (Mexico). Div. de Ingenieria y Ciencias

    1996-12-31

    An application of a pole-placement self-tuning predictive control algorithm is developed to regulate the speed of a power plant steam turbine model. Two types of system representation (CARMA and CARIMA) are used to test the control algorithm. Simulation results show that the CARMA model produces better results. Two further comparisons are made using a PI controller and a generalised predictive controller. (author)

  14. An information-theoretic basis for uncertainty analysis: application to the QUASAR severe accident study

    International Nuclear Information System (INIS)

    Unwin, S.D.; Cazzoli, E.G.; Davis, R.E.; Khatib-Rahbar, M.; Lee, M.; Nourbakhsh, H.; Park, C.K.; Schmidt, E.

    1989-01-01

    The probabilistic characterization of uncertainty can be problematic in circumstances where there is a paucity of supporting data and limited experience on which to base engineering judgement. Information theory provides a framework in which to address this issue through reliance upon entropy-related principles of uncertainty maximization. We describe an application of such principles in the United States Nuclear Regulatory Commission-sponsored program QUASAR (Quantification and Uncertainty Analysis of Source Terms for Severe Accidents in Light Water Reactors). (author)
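
    The entropy-maximisation principle referred to above can be illustrated on a toy problem (the QUASAR source-term application itself is not reproduced): among all distributions on the states 0..n-1 with a prescribed mean m, the maximum-entropy one is exponential in the state index, p_i ∝ exp(-βi), with β fixed by the constraint. Here β is found by bisection; all parameter values are assumed for the example.

    ```python
    import numpy as np

    def maxent_mean(n, m, lo=-5.0, hi=5.0):
        """Max-entropy distribution on {0,...,n-1} with mean m (exponential family)."""
        i = np.arange(n)

        def mean(beta):
            w = np.exp(-beta * i)
            return np.sum(i * w) / np.sum(w)

        # mean(beta) is strictly decreasing in beta; bisect for mean(beta) = m.
        for _ in range(100):
            mid = 0.5 * (lo + hi)
            if mean(mid) > m:
                lo = mid
            else:
                hi = mid
        w = np.exp(-mid * i)
        return w / np.sum(w)

    p = maxent_mean(10, 3.0)
    print(np.round(p, 4))   # geometrically decaying weights with mean 3
    ```

    With no constraint beyond normalisation the same principle returns the uniform distribution, which is the sense in which maximum entropy encodes "no further information".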

  15. Entropic formulation of the uncertainty principle for the number and annihilation operators

    International Nuclear Information System (INIS)

    Rastegin, Alexey E

    2011-01-01

    An entropic approach to formulating uncertainty relations for the number-annihilation pair is considered. We construct some normal operator that traces the annihilation operator as well as commuting quadratures with a complete system of common eigenfunctions. Expanding the measured wave function with respect to them, one obtains a relevant probability distribution. Another distribution is naturally generated by measuring the number operator. Due to the Riesz-Thorin theorem, there exists a nontrivial inequality between corresponding functionals of the above distributions. We find the bound in this inequality and further derive uncertainty relations in terms of both the Rényi and Tsallis entropies. Entropic uncertainty relations for a continuous distribution as well as relations for a discretized one are presented. (comment)
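    The entropies named in this record are the standard generalised entropies, and Riesz–Thorin-type arguments typically yield bounds of the Maassen–Uffink form shown last, where c is the maximal overlap between the two measurements; the specific bound for the number–annihilation pair is derived in the paper itself:

```latex
% Rényi and Tsallis entropies of a distribution p = (p_1, p_2, \dots), \alpha \neq 1
H_\alpha(p) \;=\; \frac{1}{1-\alpha}\,\ln\!\sum_i p_i^{\alpha},
\qquad
T_\alpha(p) \;=\; \frac{1}{1-\alpha}\left(\sum_i p_i^{\alpha}-1\right)
% Typical entropic uncertainty relation obtained via Riesz--Thorin,
% for conjugate entropic orders 1/\alpha + 1/\beta = 2:
H_\alpha(A) \;+\; H_\beta(B) \;\ge\; -2\ln c
```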

  16. Uncertainty modeling process for semantic technology

    Directory of Open Access Journals (Sweden)

    Rommel N. Carvalho

    2016-08-01

    Full Text Available The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address the gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST, a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.

  17. Uncertainty in water resources availability in the Okavango River basin as a result of climate change

    Directory of Open Access Journals (Sweden)

    D. A. Hughes

    2011-03-01

    Full Text Available This paper assesses the hydrological response to scenarios of climate change in the Okavango River catchment in Southern Africa. Climate scenarios are constructed representing different changes in global mean temperature from an ensemble of 7 climate models assessed in the IPCC AR4. The results show a substantial change in mean flow associated with a global warming of 2 °C. However, there is considerable uncertainty in the sign and magnitude of the projected changes between different climate models, implying that the ensemble mean is not an appropriate generalised indicator of impact. The uncertainty in response between different climate model patterns is considerably greater than the range due to uncertainty in hydrological model parameterisation. There is also a clear need to evaluate the physical mechanisms associated with the model projected changes in this region. The implications for water resource management policy are considered.

  18. The Hayes principles: learning from the national pilot of information technology and core generalisable theory in informatics

    Directory of Open Access Journals (Sweden)

    Simon de Lusignan

    2010-06-01

    Conclusions Had the Hayes principles been embedded within our approach to health IT, and in particular to medical record implementation, we might have avoided many of the costly mistakes with the UK national programme. However, these principles need application within the modern IT environment. Closeness to the patient must not be interpreted as physical but instead as a virtual patient-centred space; data will be secure within the cloud and we should dump the vault and infrastructure mentality. Health IT should be developed as an adaptive ecosystem.

  19. Multi-Trial Guruswami–Sudan Decoding for Generalised Reed–Solomon Codes

    DEFF Research Database (Denmark)

    Nielsen, Johan Sebastian Rosenkilde; Zeh, Alexander

    2013-01-01

    An iterated refinement procedure for the Guruswami–Sudan list decoding algorithm for Generalised Reed–Solomon codes based on Alekhnovich’s module minimisation is proposed. The method is parametrisable and allows variants of the usual list decoding approach. In particular, finding the list...

  20. The VTTVIS line imaging spectrometer - principles, error sources, and calibration

    DEFF Research Database (Denmark)

    Jørgensen, R.N.

    2002-01-01

    Hyperspectral imaging with a spatial resolution of a few mm² has proved to have a great potential within crop and weed classification and also within nutrient diagnostics. A commonly used hyperspectral imaging system is based on the Prism-Grating-Prism (PGP) principle produced by Specim Ltd..., but there is little work describing the basic principles, potential error sources, and/or adjustment and calibration procedures. This report fulfils the need for such documentation, with special focus on the system at KVL. The PGP-based system has several severe error sources, which should be removed prior to any analysis: variations in off-axis transmission efficiencies, diffraction efficiencies, and image distortion have a significant impact on the instrument performance. Procedures removing or minimising these systematic error sources are developed and described for the system built at KVL but can be generalised to other PGP...

  1. Aspects of string theory compactifications. D-brane statistics and generalised geometry

    International Nuclear Information System (INIS)

    Gmeiner, F.

    2006-01-01

    In this thesis we investigate two different aspects of string theory compactifications. The first part deals with the issue of the huge number of possible string vacua, known as the landscape. Concretely, we investigate a specific, well-defined subset of type II orientifold compactifications. We develop the necessary tools to construct a very large set of consistent models and investigate their gauge sector on a statistical basis. In particular, we analyse the frequency distributions of gauge groups and the possible amount of chiral matter for compactifications to six and four dimensions. In the phenomenologically relevant case of four-dimensional compactifications, special attention is paid to solutions with gauge groups that include those of the standard model, as well as Pati-Salam, SU(5) and flipped SU(5) models. Additionally, we investigate the frequency distribution of coupling constants and correlations between the observables in the gauge sector. These results are compared with a recent study of Gepner models. Moreover, we elaborate on questions concerning the finiteness of the number of solutions and the computational complexity of the algorithm. In the second part of this thesis we consider a new mathematical framework, called generalised geometry, to describe the six-manifolds used in string theory compactifications. In particular, the formulation of T-duality and mirror symmetry for nonlinear topological sigma models is investigated. To this end, we provide a reformulation and extension of the known topological A- and B-models to the generalised framework. The action of mirror symmetry on topological D-branes in this setup is presented and the transformation of the boundary conditions is analysed. To extend the considerations to D-branes in type II string theory, we introduce the notion of generalised calibrations. We show that the known calibration conditions of supersymmetric branes in type IIA and IIB can be obtained as special cases.
Finally we investigate

  2. Aspects of string theory compactifications. D-brane statistics and generalised geometry

    Energy Technology Data Exchange (ETDEWEB)

    Gmeiner, F.

    2006-05-26

    In this thesis we investigate two different aspects of string theory compactifications. The first part deals with the issue of the huge number of possible string vacua, known as the landscape. Concretely, we investigate a specific, well-defined subset of type II orientifold compactifications. We develop the necessary tools to construct a very large set of consistent models and investigate their gauge sector on a statistical basis. In particular, we analyse the frequency distributions of gauge groups and the possible amount of chiral matter for compactifications to six and four dimensions. In the phenomenologically relevant case of four-dimensional compactifications, special attention is paid to solutions with gauge groups that include those of the standard model, as well as Pati-Salam, SU(5) and flipped SU(5) models. Additionally, we investigate the frequency distribution of coupling constants and correlations between the observables in the gauge sector. These results are compared with a recent study of Gepner models. Moreover, we elaborate on questions concerning the finiteness of the number of solutions and the computational complexity of the algorithm. In the second part of this thesis we consider a new mathematical framework, called generalised geometry, to describe the six-manifolds used in string theory compactifications. In particular, the formulation of T-duality and mirror symmetry for nonlinear topological sigma models is investigated. To this end, we provide a reformulation and extension of the known topological A- and B-models to the generalised framework. The action of mirror symmetry on topological D-branes in this setup is presented and the transformation of the boundary conditions is analysed. To extend the considerations to D-branes in type II string theory, we introduce the notion of generalised calibrations. We show that the known calibration conditions of supersymmetric branes in type IIA and IIB can be obtained as special cases.
Finally we investigate

  3. Objective Bayesianism and the Maximum Entropy Principle

    Directory of Open Access Journals (Sweden)

    Jon Williamson

    2013-09-01

    Full Text Available Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities; they should be calibrated to our evidence of physical probabilities; and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be a probability function, from all those that are calibrated to evidence, that has maximum entropy. However, the three norms of objective Bayesianism are usually justified in different ways. In this paper, we show that the three norms can all be subsumed under a single justification in terms of minimising worst-case expected loss. This, in turn, is equivalent to maximising a generalised notion of entropy. We suggest that requiring language invariance, in addition to minimising worst-case expected loss, motivates maximisation of standard entropy as opposed to maximisation of other instances of generalised entropy. Our argument also provides a qualified justification for updating degrees of belief by Bayesian conditionalisation. However, conditional probabilities play a less central part in the objective Bayesian account than they do under the subjective view of Bayesianism, leading to a reduced role for Bayes’ Theorem.
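    The maximum entropy principle discussed above can be made concrete with Jaynes' classic dice example: among all distributions on {1,…,6} with a prescribed mean, the entropy maximiser has the exponential-family form p_i ∝ exp(λ i), with λ chosen to match the constraint. A minimal sketch (function names are illustrative, not from the paper):

```python
import math

def maxent_die(target_mean, tol=1e-12):
    """Max-entropy distribution on die faces 1..6 subject to a mean constraint.

    The maximiser has the form p_i proportional to exp(lam * i); lam is
    found by bisection, valid because the mean is increasing in lam
    (its derivative is the variance, which is positive).
    """
    faces = range(1, 7)

    def mean_for(lam):
        w = [math.exp(lam * i) for i in faces]
        z = sum(w)
        return sum(i * wi for i, wi in zip(faces, w)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * i) for i in faces]
    z = sum(w)
    return [wi / z for wi in w]

# A mean of 3.5 recovers the uniform distribution; a mean of 4.5 tilts
# probability smoothly toward the high faces, adding no structure beyond
# what the constraint demands.
p = maxent_die(4.5)
```

    This is the "equivocation" norm in miniature: subject to the calibration constraint, the distribution stays as noncommittal as possible.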

  4. Decommissioning Funding: Ethics, Implementation, Uncertainties

    International Nuclear Information System (INIS)

    2007-01-01

    This status report on decommissioning funding: ethics, implementation, uncertainties is based on a review of recent literature and materials presented at NEA meetings in 2003 and 2004, and particularly at a topical session organised in November 2004 on funding issues associated with the decommissioning of nuclear power facilities. The report also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). This report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainty in funding systems.

  5. Propagation of nonlinear shock waves for the generalised Oskolkov equation and its dynamic motions in the presence of an external periodic perturbation

    Science.gov (United States)

    Ak, Turgut; Aydemir, Tugba; Saha, Asit; Kara, Abdul Hamid

    2018-06-01

    Propagation of nonlinear shock waves for the generalised Oskolkov equation and dynamic motions of the perturbed Oskolkov equation are investigated. Employing the unified method, a collection of exact shock wave solutions for the generalised Oskolkov equations is presented. Collocation finite element method is applied to the generalised Oskolkov equation for checking the accuracy of the proposed method by two test problems including the motion of shock wave and evolution of waves with Gaussian and undular bore initial conditions. Considering an external periodic perturbation, the dynamic motions of the perturbed generalised Oskolkov equation are studied depending on the system parameters with the help of phase portrait and time series plot. The perturbed generalised Oskolkov equation exhibits period-3, quasiperiodic and chaotic motions for some special values of the system parameters, whereas the generalised Oskolkov equation presents shock waves in the absence of external periodic perturbation.

  6. Dynamics of screw dislocations : a generalised minimising-movements scheme approach

    NARCIS (Netherlands)

    Bonaschi, G.A.; Meurs, van P.J.P.; Morandotti, M.

    2015-01-01

    The gradient flow structure of the model introduced in [CG99] for the dynamics of screw dislocations is investigated by means of a generalised minimising-movements scheme approach. The assumption of a finite number of available glide directions, together with the "maximal dissipation criterion" that

  7. The one-dimensional normalised generalised equivalence theory (NGET) for generating equivalent diffusion theory group constants for PWR reflector regions

    International Nuclear Information System (INIS)

    Mueller, E.Z.

    1991-01-01

    An equivalent diffusion theory PWR reflector model is presented, which has as its basis Smith's generalisation of Koebke's Equivalent Theory. This method is an adaptation, in one-dimensional slab geometry, of the Generalised Equivalence Theory (GET). Since the method involves the renormalisation of the GET discontinuity factors at nodal interfaces, it is called the Normalised Generalised Equivalence Theory (NGET) method. The advantages of the NGET method for modelling the ex-core nodes of a PWR are summarized. 23 refs
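    In Generalised Equivalence Theory the homogenised nodal fluxes are joined at an interface through discontinuity factors, defined as the ratio of the reference heterogeneous surface flux to the homogeneous one; schematically (the notation here is illustrative, not the paper's):

```latex
% GET discontinuity factor on a surface of node i
f^{\pm}_{i} \;=\; \left.\frac{\phi^{\mathrm{het}}}{\phi^{\mathrm{hom}}}\right|_{\partial V_i^{\pm}}
% Interface condition between nodes i and i+1:
f^{+}_{i}\,\phi^{\mathrm{hom},+}_{i} \;=\; f^{-}_{i+1}\,\phi^{\mathrm{hom},-}_{i+1}
% The renormalisation in NGET leaves only the ratio at each interface:
\tilde{f}_{i+1/2} \;=\; f^{-}_{i+1}\big/\,f^{+}_{i}
```

    The renormalisation is what lets standard nodal diffusion codes, which accept only one factor per interface, use the equivalent reflector constants.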

  8. On the Action of the Radiation Field Generated by a Traveling-Wave Element and Its Connection to the Time Energy Uncertainty Principle, Elementary Charge and the Fine Structure Constant

    Directory of Open Access Journals (Sweden)

    Vernon Cooray

    2017-02-01

    Full Text Available Recently, we published two papers in this journal. One dealt with the action of the radiation fields generated by a traveling-wave element, and the other with the momentum transferred by the same radiation fields and their connection to the time energy uncertainty principle. The traveling-wave element is defined as a conductor through which a current pulse propagates with the speed of light in free space from one end of the conductor to the other without attenuation. The goal of this letter is to combine the information provided in these two papers and make conclusive statements concerning the connection between the energy dissipated by the radiation fields, the time energy uncertainty principle and the elementary charge. As we show here, the results presented in these two papers, when combined, show that the time energy uncertainty principle can be applied to the classical radiation emitted by a traveling-wave element, and it results in the prediction that the smallest charge associated with the current that can be detected using radiated energy as a vehicle is on the order of the elementary charge. Based on the results, an expression for the fine structure constant is obtained. This is the first time that an order-of-magnitude estimation of the elementary charge based on electromagnetic radiation fields has been obtained. Even though the results obtained in this paper have to be considered as order-of-magnitude estimations, a strict interpretation of the derived equations shows that the fine structure constant or the elementary charge may change as the size or the age of the universe increases.

  9. Correction of harmonic motion and Kepler orbit based on the minimal momentum uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Won Sang, E-mail: mimip4444@hanmail.net [Department of Physics and Research Institute of Natural Science, College of Natural Science, Gyeongsang National University, Jinju 660-701 (Korea, Republic of); Hassanabadi, Hassan, E-mail: h.hasanabadi@shahroodut.ac.ir [Physics Department, Shahrood University of Technology, Shahrood (Iran, Islamic Republic of)

    2017-03-18

    In this paper we consider the deformed Heisenberg uncertainty principle with a minimal uncertainty in momentum, called the minimal momentum uncertainty principle (MMUP). We consider the MMUP in D dimensions and its classical analogue, and use these to investigate the MMUP effect on harmonic motion and the Kepler orbit. - Highlights: • We discuss the minimal momentum uncertainty relation. • We consider the MMUR in D dimensions and use the deformed Poisson bracket to find the classical mechanics based on the MMUR. • Using these, we investigate the MMUR effect on harmonic motion and the Kepler orbit. • In particular, we compute the corrected precession angle for each case. • We find that the corrected precession angle is always positive.

  10. Generalised Batho correction factor

    International Nuclear Information System (INIS)

    Siddon, R.L.

    1984-01-01

    There are various approximate algorithms available to calculate the radiation dose in the presence of a heterogeneous medium. The Webb and Fox product over layers formulation of the generalised Batho correction factor requires determination of the number of layers and the layer densities for each ray path. It has been shown that the Webb and Fox expression is inefficient for the heterogeneous medium which is expressed as regions of inhomogeneity rather than layers. The inefficiency of the layer formulation is identified as the repeated problem of determining for each ray path which inhomogeneity region corresponds to a particular layer. It has been shown that the formulation of the Batho correction factor as a product over inhomogeneity regions avoids that topological problem entirely. The formulation in terms of a product over regions simplifies the computer code and reduces the time required to calculate the Batho correction factor for the general heterogeneous medium. (U.K.)
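    The product-over-regions formulation can be sketched as follows. The toy tissue-air ratio (TAR) model and the convention that each region's exponent is the density difference across its upper boundary (with unit reference density beyond the last region) are illustrative assumptions, not the paper's implementation:

```python
import math

def tissue_air_ratio(depth_cm):
    """Toy TAR model: pure exponential attenuation (illustrative only)."""
    mu = 0.05  # assumed effective attenuation coefficient, 1/cm
    return math.exp(-mu * depth_cm)

def batho_correction(regions):
    """Generalised Batho correction factor as a product over regions.

    regions: list of (distance_cm, density) ordered from the region
    containing the dose point (first) up to the surface (last), where
    distance_cm runs from the point to the region's upper boundary and
    density is relative electron density. Unit density is assumed
    beyond the last region, so a homogeneous water stack gives 1.0.
    """
    cf = 1.0
    for m, (z, rho) in enumerate(regions):
        rho_above = regions[m + 1][1] if m + 1 < len(regions) else 1.0
        cf *= tissue_air_ratio(z) ** (rho - rho_above)
    return cf

# A low-density (lung-like) region above the point attenuates less,
# so the correction factor exceeds 1.
cf = batho_correction([(5.0, 1.0), (10.0, 0.3), (15.0, 1.0)])
```

    Iterating over regions directly, as above, avoids the per-ray bookkeeping of mapping inhomogeneity regions onto layers that the abstract identifies as the inefficiency of the layer formulation.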

  11. On entropic uncertainty relations in the presence of a minimal length

    Science.gov (United States)

    Rastegin, Alexey E.

    2017-07-01

    Entropic uncertainty relations for the position and momentum within the generalized uncertainty principle are examined. Studies of this principle are motivated by the existence of a minimal observable length. Then the position and momentum operators satisfy the modified commutation relation, for which more than one algebraic representation is known. One of them is described by auxiliary momentum so that the momentum and coordinate wave functions are connected by the Fourier transform. However, the probability density functions of the physically true and auxiliary momenta are different. As the corresponding entropies differ, known entropic uncertainty relations are changed. Using differential Shannon entropies, we give a state-dependent formulation with correction term. State-independent uncertainty relations are obtained in terms of the Rényi entropies and the Tsallis entropies with binning. Such relations allow one to take into account a finiteness of measurement resolution.
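    The modified commutation relation and the auxiliary-momentum representation referred to in this record are, in the common Kempf-type form (with β > 0 the deformation parameter):

```latex
% Modified commutator implying a minimal observable length
[\hat X,\hat P] \;=\; i\hbar\,\bigl(1+\beta \hat P^{2}\bigr),
\qquad \Delta x_{\min} \;=\; \hbar\sqrt{\beta}
% One algebraic representation via an auxiliary momentum p:
\hat P \;=\; \frac{\tan\!\bigl(\sqrt{\beta}\,p\bigr)}{\sqrt{\beta}},
\qquad \hat X \;=\; i\hbar\,\partial_p,
\qquad |p| < \frac{\pi}{2\sqrt{\beta}}
```

    Because the physical momentum P is a nonlinear function of the auxiliary momentum p, the two probability densities differ by a Jacobian factor, which is exactly why the corresponding entropies, and hence the entropic uncertainty relations, are modified.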

  12. Asymptotic Behaviour of Total Generalised Variation

    KAUST Repository

    Papafitsoros, Konstantinos; Valkonen, Tuomo

    2015-01-01

    © Springer International Publishing Switzerland 2015. The recently introduced second order total generalised variation functional TGV2 β,α has been a successful regulariser for image processing purposes. Its definition involves two positive parameters α and β whose values determine the amount and the quality of the regularisation. In this paper we report on the behaviour of TGV2 β,α in the cases where the parameters α, β as well as their ratio β/α become very large or very small. Among others, we prove that for sufficiently symmetric two-dimensional data and large ratio β/α, TGV2 β,α regularisation coincides with total variation (TV) regularisation.
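    The functional in question is, in the form introduced by Bredies, Kunisch and Pock:

```latex
% Second-order total generalised variation of u with weights (\beta, \alpha)
\mathrm{TGV}^{2}_{\beta,\alpha}(u)
  \;=\; \min_{w \in \mathrm{BD}(\Omega)}
  \;\alpha\,\bigl\|\nabla u - w\bigr\|_{\mathcal{M}}
  \;+\; \beta\,\bigl\|\mathcal{E}w\bigr\|_{\mathcal{M}}
```

    Here ℰw denotes the symmetrised gradient of w and ‖·‖_M the Radon norm. The asymptotics studied in the paper concern this minimisation as α, β, or β/α degenerate; for instance, forcing w toward zero recovers TV-like behaviour.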

  13. Uncertainty in spatial planning proceedings

    Directory of Open Access Journals (Sweden)

    Aleš Mlakar

    2009-01-01

    Full Text Available Uncertainty is distinctive of spatial planning, as it arises from the necessity to co-ordinate the various interests within an area, from the urgency of adopting spatial planning decisions, from the complexity of the environment, physical space and society, from addressing the uncertainty of the future, and from the uncertainty of actually making the right decision. The response to uncertainty is a series of measures that mitigate the effects of uncertainty itself. These measures are based on two fundamental principles – standardization and optimization – and relate to knowledge enhancement and spatial planning comprehension, to the legal regulation of changes, to the existence of spatial planning as a means of co-ordinating different interests, to active planning and the constructive resolution of current spatial problems, to the integration of spatial planning and the environmental protection process, to the implementation of analysis as the foundation of spatial planners' activities, to methods of thinking outside the parameters, to forming clear spatial concepts, to creating a transparent spatial management system, and to the enforcement of participatory processes.

  14. Maximally Localized States and Quantum Corrections of Black Hole Thermodynamics in the Framework of a New Generalized Uncertainty Principle

    International Nuclear Information System (INIS)

    Zhang, Shao-Jun; Miao, Yan-Gang; Zhao, Ying-Jie

    2015-01-01

    As a generalized uncertainty principle (GUP) leads to the effects of the minimal length of the order of the Planck scale and UV/IR mixing, some significant physical concepts and quantities are modified or corrected correspondingly. On the one hand, we derive the maximally localized states—the physical states displaying the minimal length uncertainty associated with a new GUP proposed in our previous work. On the other hand, in the framework of this new GUP we calculate quantum corrections to the thermodynamic quantities of the Schwarzschild black hole, such as the Hawking temperature, the entropy, and the heat capacity, and give a remnant mass of the black hole at the end of the evaporation process. Moreover, we compare our results with those obtained in the frameworks of several other GUPs. In particular, we observe a significant difference between the situations with and without the consideration of the UV/IR mixing effect in the quantum corrections to the evaporation rate and the decay time. That is, the decay time can be greatly prolonged in the former case, which implies that the quantum correction from the UV/IR mixing effect may exert a radical rather than a tiny influence on the Hawking radiation.
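    As a reference point for the corrected quantities discussed in this record, the uncorrected semiclassical Hawking temperature can be computed directly; the GUP corrections and the remnant mass modify this formula in a model-dependent way that the sketch below makes no attempt to reproduce:

```python
import math

# CODATA-style constants (SI units)
HBAR = 1.054571817e-34   # J s
C = 2.99792458e8         # m / s
G = 6.67430e-11          # m^3 / (kg s^2)
K_B = 1.380649e-23       # J / K

def hawking_temperature(mass_kg):
    """Semiclassical Hawking temperature T = hbar c^3 / (8 pi G M k_B)."""
    return HBAR * C ** 3 / (8 * math.pi * G * mass_kg * K_B)

M_SUN = 1.98892e30  # kg
T = hawking_temperature(M_SUN)  # ~ 6.2e-8 K: far colder than the CMB
```

    The inverse dependence on mass is what makes the late stage of evaporation sensitive to Planck-scale corrections: the semiclassical temperature diverges as M → 0, whereas GUP-corrected treatments halt evaporation at a remnant mass.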

  15. Statistical Approaches Accomodating Uncertainty in Modern Genomic Data

    DEFF Research Database (Denmark)

    Skotte, Line

    ...the potential of the technological advances. The first of the four papers included in this thesis describes a new method for association mapping that accommodates uncertain genotypes from low-coverage re-sequencing data. The method allows uncertain genotypes using a score statistic based on the joint likelihood of the observed phenotypes and the observed sequencing data. This joint likelihood accounts for the genotype uncertainties via the posterior probabilities of each genotype given the observed sequencing data, and the phenotype distributions are modelled using a generalised linear model framework, which makes the contributed method applicable to case-control studies as well as mapping of quantitative traits. The contributed method provides a needed association test for quantitative traits in the presence of uncertain genotypes, and it further allows correction for population structure in association tests for disease...
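    The central ingredient described above — weighting each genotype by its posterior probability given the sequencing data instead of making a hard call — can be sketched as follows; the Hardy–Weinberg prior and the function names are illustrative assumptions, not the thesis' actual code:

```python
def hwe_prior(allele_freq):
    """Hardy-Weinberg genotype prior P(g) for g = 0, 1, 2 allele copies."""
    f = allele_freq
    return [(1 - f) ** 2, 2 * f * (1 - f), f ** 2]

def genotype_posterior(likelihoods, prior):
    """Posterior P(g | sequencing data) from genotype likelihoods P(data | g)."""
    joint = [l * p for l, p in zip(likelihoods, prior)]
    total = sum(joint)
    return [x / total for x in joint]

def expected_dosage(likelihoods, prior):
    """Posterior-mean genotype dosage E[g | data], usable in a regression
    or score statistic in place of an uncertain hard genotype call."""
    return sum(g * p for g, p in enumerate(genotype_posterior(likelihoods, prior)))

# Uninformative reads fall back on the prior mean (2 * allele frequency);
# confident reads pin the dosage to the observed genotype.
d = expected_dosage([1.0, 1.0, 1.0], hwe_prior(0.5))  # -> 1.0
```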

  16. Uncertainty enabled Sensor Observation Services

    Science.gov (United States)

    Cornford, Dan; Williams, Matthew; Bastin, Lucy

    2010-05-01

    Almost all observations of reality are contaminated with errors, which introduce uncertainties into the actual observation result. Such uncertainty is often held to be a data quality issue, and quantification of this uncertainty is essential for the principled exploitation of the observations. Many existing systems treat data quality in a relatively ad-hoc manner, however if the observation uncertainty is a reliable estimate of the error on the observation with respect to reality then knowledge of this uncertainty enables optimal exploitation of the observations in further processes, or decision making. We would argue that the most natural formalism for expressing uncertainty is Bayesian probability theory. In this work we show how the Open Geospatial Consortium Sensor Observation Service can be implemented to enable the support of explicit uncertainty about observations. We show how the UncertML candidate standard is used to provide a rich and flexible representation of uncertainty in this context. We illustrate this on a data set of user contributed weather data where the INTAMAP interpolation Web Processing Service is used to help estimate the uncertainty on the observations of unknown quality, using observations with known uncertainty properties. We then go on to discuss the implications of uncertainty for a range of existing Open Geospatial Consortium standards including SWE common and Observations and Measurements. We discuss the difficult decisions in the design of the UncertML schema and its relation and usage within existing standards and show various options. We conclude with some indications of the likely future directions for UncertML in the context of Open Geospatial Consortium services.

  17. Generalisation of a 1:10k map from municipal data

    NARCIS (Netherlands)

    Van Altena, V.; Bakermans, J.; Lentjes, P.; Nijhuis, R.; Post, M.; Reuvers, M.; Stoter, J.E.

    2014-01-01

    This paper reports about the feasibility study carried out by the Dutch Kadaster to automatically generalise the largest scale topographical data set maintained by the Kadaster (i.e. TOP10NL) from the 1:1k topographical object oriented data set, which is currently being collected and structured by

  18. Generalised time functions and finiteness of the Lorentzian distance

    OpenAIRE

    Rennie, Adam; Whale, Ben E.

    2014-01-01

    We show that finiteness of the Lorentzian distance is equivalent to the existence of generalised time functions with gradient uniformly bounded away from light cones. To derive this result we introduce new techniques to construct and manipulate achronal sets. As a consequence of these techniques we obtain a functional description of the Lorentzian distance extending the work of Franco and Moretti.

  19. Commitment of mathematicians in medicine: a personal experience, and generalisations.

    Science.gov (United States)

    Clairambault, Jean

    2011-12-01

    I will present here a personal point of view on the commitment of mathematicians in medicine. Starting from my personal experience, I will suggest generalisations including favourable signs and caveats to show how mathematicians can be welcome and helpful in medicine, both in a theoretical and in a practical way.

  20. Optimal entropic uncertainty relation for successive measurements ...

    Indian Academy of Sciences (India)

    measurements in quantum information theory. M D SRINIVAS ... derived by Robertson in 1929 [2] from the first principles of quantum theory, does not ... systems and may hence be referred to as 'uncertainty relations for distinct measurements'.

  1. [Epileptic seizures during childbirth in a patient with idiopathic generalised epilepsy

    NARCIS (Netherlands)

    Voermans, N.C.; Zwarts, M.J.; Renier, W.O.; Bloem, B.R.

    2005-01-01

    During her first pregnancy, a 37-year-old woman with idiopathic generalised epilepsy that was adequately controlled with lamotrigine experienced a series of epileptic seizures following an elective caesarean section. The attacks were terminated with diazepam. The following day, she developed

  2. Projecting UK mortality using Bayesian generalised additive models

    OpenAIRE

    Hilton, Jason; Dodd, Erengul; Forster, Jonathan; Smith, Peter W.F.

    2018-01-01

    Forecasts of mortality provide vital information about future populations, with implications for pension and health-care policy as well as for decisions made by private companies about life insurance and annuity pricing. This paper presents a Bayesian approach to the forecasting of mortality that jointly estimates a Generalised Additive Model (GAM) for mortality for the majority of the age-range and a parametric model for older ages where the data are sparser. The GAM allows smooth components...

  3. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, C. J.

    2017-12-01

    How to derive and present uncertainty in climate data records (CDRs) has been debated within the European Space Agency Climate Change Initiative, in search of common principles applicable across a range of essential climate variables. Various points of consensus have been reached, including the importance of improving provision of uncertainty information and the benefit of adopting international norms of metrology for language around the distinct concepts of uncertainty and error. Providing an estimate of standard uncertainty per datum (or the means to readily calculate it) emerged as baseline good practice, and should be highly relevant to users of CDRs when the uncertainty in data is variable (the usual case). Given this baseline, the role of quality flags is clarified as being complementary to and not repetitive of uncertainty information. Data with high uncertainty are not poor quality if a valid estimate of the uncertainty is available. For CDRs and their applications, the error correlation properties across spatio-temporal scales present important challenges that are not fully solved. Error effects that are negligible in the uncertainty of a single pixel may dominate uncertainty in the large-scale and long-term. A further principle is that uncertainty estimates should themselves be validated. The concepts of estimating and propagating uncertainty are generally acknowledged in geophysical sciences, but less widely practised in Earth observation and development of CDRs. Uncertainty in a CDR depends in part (and usually significantly) on the error covariance of the radiances and auxiliary data used in the retrieval. Typically, error covariance information is not available in the fundamental CDR (FCDR) (i.e., with the level-1 radiances), since provision of adequate level-1 uncertainty information is not yet standard practice. Those deriving CDRs thus cannot propagate the radiance uncertainty to their geophysical products. 
The FIDUCEO project (www.fiduceo.eu) is
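The record's point that error effects negligible per pixel may dominate large-scale averages can be illustrated with a minimal sketch. All numbers below are assumed for illustration only; it propagates a per-pixel standard uncertainty through an equally weighted spatial mean under different error-correlation assumptions:

```python
import numpy as np

N = 100          # number of pixels averaged (assumed)
u = 0.5          # per-pixel standard uncertainty, e.g. in kelvin (assumed)

def mean_uncertainty(r):
    """Standard uncertainty of the N-pixel mean when every pair of
    pixel errors shares correlation coefficient r."""
    S = u**2 * ((1 - r) * np.eye(N) + r * np.ones((N, N)))  # error covariance
    w = np.full(N, 1.0 / N)                                 # equal weights
    return float(np.sqrt(w @ S @ w))

print(mean_uncertainty(0.0))  # independent errors: u/sqrt(N) = 0.05
print(mean_uncertainty(1.0))  # fully correlated errors: still u = 0.5
```

With independent errors the uncertainty of the mean shrinks as 1/sqrt(N); with fully correlated errors averaging buys nothing, which is why level-1 error covariance information matters for CDR products.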

  4. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    International Nuclear Information System (INIS)

    Wickett, A.J.; Yadigaroglu, G.

    1994-08-01

The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives for the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and like problems, to examine the underlying issues from first principles, in preference to comparing and contrasting the currently proposed methods, to reach consensus on the issues identified as far as possible while not avoiding the controversial aspects, to identify as clearly as possible unreconciled differences, and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed: the need for uncertainty analysis, identification and ranking of uncertainties, characterisation, quantification and combination of uncertainties, and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop.

  5. Quantification of uncertainties of modeling and simulation

    International Nuclear Information System (INIS)

    Ma Zhibo; Yin Jianwei

    2012-01-01

The principles of Modeling and Simulation (M and S) are interpreted by a functional relation, from which the total uncertainties of M and S are identified and sorted into three parts considered to vary along with the conceptual models' parameters. According to the idea of verification and validation, the space of the parameters is partitioned into verified and applied domains; uncertainties in the verified domain are quantified by comparison between numerical and standard results, and those in the applied domain are quantified by a newly developed extrapolation method. Examples are presented to demonstrate and qualify the ideas, with the aim of building a framework to quantify the uncertainties of M and S. (authors)

  6. Uncertainty for Part Density Determination: An Update

    Energy Technology Data Exchange (ETDEWEB)

    Valdez, Mario Orlando [Los Alamos National Laboratory

    2016-12-14

Accurate and precise density measurements by hydrostatic weighing require the use of an analytical balance, configured with a suspension system, to measure the weight of a part both in water and in air. Additionally, the densities of these fluid media (water and air) must be precisely known for the part density determination. To validate the accuracy and precision of these measurements, uncertainty statements are required. The work in this report is a revision of an original report written more than a decade ago, specifically applying principles and guidelines suggested by the Guide to the Expression of Uncertainty in Measurement (GUM) for determining the part density uncertainty through sensitivity analysis. In this work, updated derivations are provided; an original example is revised with the updated derivations; and an appendix is provided, devoted solely to uncertainty evaluations using Monte Carlo techniques, specifically using the NIST Uncertainty Machine, as a viable alternative method.
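The Monte Carlo alternative mentioned in the record can be sketched for the standard hydrostatic-weighing relation. All measurement values and standard uncertainties below are hypothetical, and the air density is treated as exact purely to keep the sketch short:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical measurement values and standard uncertainties:
m_air = rng.normal(10.0000, 0.0002, n)    # balance reading in air (g)
m_wat = rng.normal(8.7500, 0.0002, n)     # apparent reading in water (g)
rho_w = rng.normal(0.99820, 0.00002, n)   # water density (g/cm^3)
rho_a = 0.0012                            # air density (g/cm^3), assumed exact

# Standard hydrostatic-weighing relation for the part density:
rho = m_air / (m_air - m_wat) * (rho_w - rho_a) + rho_a

print(rho.mean())   # Monte Carlo estimate of the part density (~7.98 g/cm^3 here)
print(rho.std())    # Monte Carlo standard uncertainty
```

The sample standard deviation of `rho` is the Monte Carlo counterpart of the GUM sensitivity-analysis result; in practice the same inputs could be entered into the NIST Uncertainty Machine.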

  7. A note on a generalisation of Weyl's theory of gravitation

    International Nuclear Information System (INIS)

    Dereli, T.; Tucker, R.W.

    1982-01-01

    A scale-invariant gravitational theory due to Bach and Weyl is generalised by the inclusion of space-time torsion. The difference between the arbitrary and zero torsion constrained variations of the Weyl action is elucidated. Conformal rescaling properties of the gravitational fields are discussed. A new class of classical solutions with torsion is presented. (author)

  8. Generalisation of geographic information cartographic modelling and applications

    CERN Document Server

    Mackaness, William A; Sarjakoski, L Tiina

    2011-01-01

Theoretical and Applied Solutions in Multi-Scale Mapping. Users have come to expect instant access to up-to-date geographical information, with global coverage, presented at widely varying levels of detail as digital and paper products, and customisable data that can readily be combined with other geographic information. These requirements present an immense challenge to those supporting the delivery of such services (National Mapping Agencies (NMAs), government departments, and private businesses). Generalisation of Geographic Information: Cartographic Modelling and Applications provides a detailed review

  9. The stability of vacuum solutions in generalised gravity

    Energy Technology Data Exchange (ETDEWEB)

    Madsen, M.S. (Sussex Univ., Brighton (UK). Astronomy Centre); Low, R.J. (Coventry (Lanchester) Polytechnic (UK). Dept. of Mathematics)

    1990-05-10

    The stability of the Ricci-flat solutions of a large class of generalised gravity theories is examined. It is shown by use of complementary methods that all such solutions are stable in a given theory if that theory admits a truncation to a quadratic theory in which the solution is stable. In particular, this means that the exterior Schwarzschild solution is stable in any gravity theory constructed purely from the Ricci scalar, provided that it exists in that theory. (orig.).

  10. The stability of vacuum solutions in generalised gravity

    International Nuclear Information System (INIS)

    Madsen, M.S.; Low, R.J.

    1990-01-01

    The stability of the Ricci-flat solutions of a large class of generalised gravity theories is examined. It is shown by use of complementary methods that all such solutions are stable in a given theory if that theory admits a truncation to a quadratic theory in which the solution is stable. In particular, this means that the exterior Schwarzschild solution is stable in any gravity theory constructed purely from the Ricci scalar, provided that it exists in that theory. (orig.)

  11. Comments on 'On a proposed new test of Heisenberg's principle'

    International Nuclear Information System (INIS)

    Home, D.; Sengupta, S.

    1981-01-01

    A logical fallacy is pointed out in Robinson's analysis (J. Phys. A.; 13:877 (1980)) of a thought experiment purporting to show violation of Heisenberg's uncertainty principle. The real problem concerning the interpretation of Heisenberg's principle is precisely stated. (author)

  12. Position-momentum uncertainty relations in the presence of quantum memory

    DEFF Research Database (Denmark)

    Furrer, Fabian; Berta, Mario; Tomamichel, Marco

    2014-01-01

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear oper....... As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states....

  13. Uncertainty in perception and the Hierarchical Gaussian Filter

    Directory of Open Access Journals (Sweden)

    Christoph Daniel Mathys

    2014-11-01

Full Text Available In its full sense, perception rests on an agent’s model of how its sensory input comes about and the inferences it draws based on this model. These inferences are necessarily uncertain. Here, we illustrate how the hierarchical Gaussian filter (HGF offers a principled and generic way to deal with the several forms that uncertainty in perception takes. The HGF is a recent derivation of one-step update equations from Bayesian principles that rests on a hierarchical generative model of the environment and its (in)stability. It is computationally highly efficient, allows for online estimates of hidden states, and has found numerous applications to experimental data from human subjects. In this paper, we generalize previous descriptions of the HGF and its account of perceptual uncertainty. First, we explicitly formulate the extension of the HGF’s hierarchy to any number of levels; second, we discuss how various forms of uncertainty are accommodated by the minimization of variational free energy as encoded in the update equations; third, we combine the HGF with decision models and demonstrate the inversion of this combination; finally, we report a simulation study that compared four optimization methods for inverting the HGF/decision model combination at different noise levels. These four methods (Nelder-Mead simplex algorithm, Gaussian process-based global optimization, variational Bayes and Markov chain Monte Carlo sampling) all performed well even under considerable noise, with variational Bayes offering the best combination of efficiency and informativeness of inference. Our results demonstrate that the HGF provides a principled, flexible, and efficient - but at the same time intuitive - framework for the resolution of perceptual uncertainty in behaving agents.
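The core ingredient of HGF-style updates, a prediction error weighted by a precision ratio, can be illustrated with a deliberately simplified one-level filter. This is not the full set of HGF update equations from Mathys et al. (no volatility hierarchy); all parameter values are assumed:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, pi = 0.0, 0.1        # posterior mean and precision of the hidden state
pi_u = 1.0               # assumed observation precision (noise sd = 1)
q = 0.01                 # assumed process variance of the hidden random walk
x = 0.0                  # hidden state

for t in range(200):
    x += rng.normal(0.0, np.sqrt(q))     # hidden state drifts
    u_t = x + rng.normal(0.0, 1.0)       # noisy observation
    pi_hat = 1.0 / (1.0 / pi + q)        # predicted precision before seeing u_t
    delta = u_t - mu                     # prediction error
    pi = pi_hat + pi_u                   # precision update
    mu = mu + (pi_u / pi) * delta        # precision-weighted prediction-error step
```

Noisy observations move the belief only in proportion to how reliable they are relative to the current belief, which is the intuition the HGF generalises across several hierarchical levels.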

  14. Generalised shot noise Cox processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Torrisi, Giovanni Luca

    We introduce a new class of Cox cluster processes called generalised shot-noise processes (GSNCPs), which extends the definition of shot noise Cox processes (SNCPs) in two directions: the point process which drives the shot noise is not necessarily Poisson, and the kernel of the shot noise can...... be random. Thereby a very large class of models for aggregated or clustered point patterns is obtained. Due to the structure of GSNCPs, a number of useful results can be established. We focus first on deriving summary statistics for GSNCPs and next on how to make simulation for GSNCPs. Particularly, results...... for first and second order moment measures, reduced Palm distributions, the -function, simulation with or without edge effects, and conditional simulation of the intensity function driving a GSNCP are given. Our results are exemplified for special important cases of GSNCPs, and we discuss the relation...

  15. Acute generalised exanthematous pustulosis: An update

    Directory of Open Access Journals (Sweden)

    Abhishek De

    2018-01-01

Full Text Available Acute generalised exanthematous pustulosis (AGEP) is a severe cutaneous adverse reaction and is attributed to drugs in more than 90% of cases. It is a rare disease, with an estimated incidence of 1–5 patients per million per year. The clinical manifestations are characterised by the rapid development of sterile pustular lesions, fever and leucocytosis. A number of drugs have been reported to be associated with AGEP, the most common being antibiotics. Histopathologically, there are intraepidermal pustules and papillary dermal oedema with neutrophilic and eosinophilic infiltrations. Systemic involvement can be present in more severe cases. Early diagnosis with withdrawal of the causative drug is the most important step in the management. Treatment includes supportive care, avoidance of antibiotics and use of a potent topical steroid.

  16. Acute Generalised Exanthematous Pustulosis: An Update.

    Science.gov (United States)

    De, Abhishek; Das, Sudip; Sarda, Aarti; Pal, Dayamay; Biswas, Projna

    2018-01-01

Acute generalised exanthematous pustulosis (AGEP) is a severe cutaneous adverse reaction and is attributed to drugs in more than 90% of cases. It is a rare disease, with an estimated incidence of 1-5 patients per million per year. The clinical manifestations are characterised by the rapid development of sterile pustular lesions, fever and leucocytosis. A number of drugs have been reported to be associated with AGEP, the most common being antibiotics. Histopathologically, there are intraepidermal pustules and papillary dermal oedema with neutrophilic and eosinophilic infiltrations. Systemic involvement can be present in more severe cases. Early diagnosis with withdrawal of the causative drug is the most important step in the management. Treatment includes supportive care, avoidance of antibiotics and use of a potent topical steroid.

  17. Role of information theoretic uncertainty relations in quantum theory

    International Nuclear Information System (INIS)

    Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo

    2015-01-01

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed
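For a concrete feel for entropic uncertainty relations of the kind reviewed in this record, the Maassen-Uffink bound can be checked numerically for a qubit measured in two mutually unbiased bases, where the bound is exactly 1 bit. This is a standard textbook instance, not the paper's Rényi/entropy-power generalisation:

```python
import numpy as np

def shannon_bits(p):
    """Shannon entropy (bits) of a probability vector."""
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard-basis measurement

def entropy_sum(psi):
    psi = np.asarray(psi, dtype=complex)
    psi = psi / np.linalg.norm(psi)
    p_z = np.abs(psi) ** 2          # computational-basis outcome probabilities
    p_x = np.abs(H @ psi) ** 2      # Hadamard-basis outcome probabilities
    return shannon_bits(p_z) + shannon_bits(p_x)

# Maassen-Uffink: H(Z) + H(X) >= -log2 max|<z|x>|^2 = 1 bit for these bases
for psi in ([1, 0], [1, 1], [0.6, 0.8], [0.6, 0.8j]):
    assert entropy_sum(psi) >= 1.0 - 1e-9
```

The bound is saturated by basis states of either measurement (e.g. `[1, 0]` or `[1, 1]`), while generic states exceed it.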

  18. Role of information theoretic uncertainty relations in quantum theory

    Energy Technology Data Exchange (ETDEWEB)

    Jizba, Petr, E-mail: p.jizba@fjfi.cvut.cz [FNSPE, Czech Technical University in Prague, Břehová 7, 115 19 Praha 1 (Czech Republic); ITP, Freie Universität Berlin, Arnimallee 14, D-14195 Berlin (Germany); Dunningham, Jacob A., E-mail: J.Dunningham@sussex.ac.uk [Department of Physics and Astronomy, University of Sussex, Falmer, Brighton, BN1 9QH (United Kingdom); Joo, Jaewoo, E-mail: j.joo@surrey.ac.uk [Advanced Technology Institute and Department of Physics, University of Surrey, Guildford, GU2 7XH (United Kingdom)

    2015-04-15

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.

  19. Generalised BRST symmetry and gaugeon formalism for perturbative quantum gravity: Novel observation

    International Nuclear Information System (INIS)

    Upadhyay, Sudhaker

    2014-01-01

In this paper the novel features of the Yokoyama gaugeon formalism are highlighted for the theory of perturbative quantum gravity in Einstein curved spacetime. The quantum gauge transformations for the theory of perturbative gravity are demonstrated in the framework of the gaugeon formalism. These quantum gauge transformations lead to a renormalised gauge parameter. Further, we analyse the BRST-symmetric gaugeon formalism, which embeds the more acceptable Kugo–Ojima subsidiary condition. The BRST symmetry is then made finite and field-dependent. Remarkably, the Jacobian of the path integral under finite and field-dependent BRST symmetry amounts to the exact gaugeon action in the effective theory of perturbative quantum gravity. -- Highlights: •We analyse perturbative gravity in the gaugeon formalism. •The generalisation of the BRST transformation is also studied in this context. •Within the generalised BRST framework we find the exact gaugeon modes in the theory

  20. General principles of quantum mechanics

    International Nuclear Information System (INIS)

    Pauli, W.

    1980-01-01

    This book is a textbook for a course in quantum mechanics. Starting from the complementarity and the uncertainty principle Schroedingers equation is introduced together with the operator calculus. Then stationary states are treated as eigenvalue problems. Furthermore matrix mechanics are briefly discussed. Thereafter the theory of measurements is considered. Then as approximation methods perturbation theory and the WKB approximation are introduced. Then identical particles, spin, and the exclusion principle are discussed. There after the semiclassical theory of radiation and the relativistic one-particle problem are discussed. Finally an introduction is given into quantum electrodynamics. (HSI)

  1. Uncertainty Einstein, Heisenberg, Bohr, and the struggle for the soul of science

    CERN Document Server

    Lindley, David

    2007-01-01

    The uncertainty in this delightful book refers to Heisenberg's Uncertainty Principle, an idea first postulated in 1927 by physicist Werner Heisenberg in his attempt to make sense out of the developing field of quantum mechanics. As Lindley so well explains it, the concept of uncertainty shook the philosophical underpinnings of science. It was Heisenberg's work that, to a great extent, kept Einstein from accepting quantum mechanics as a full explanation for physical reality. Similarly, it was the Uncertainty Principle that demonstrated the limits of scientific investigation: if Heisenberg is correct there are some aspects of the physical universe that are to remain beyond the reach of scientists. As he has done expertly in books like Boltzmann's Atom, Lindley brings to life a critical period in the history of science, explaining complex issues to the general reader, presenting the major players in an engaging fashion, delving into the process of scientific discovery and discussing the interaction between scien...

  2. Deformations of the generalised Picard bundle

    International Nuclear Information System (INIS)

    Biswas, I.; Brambila-Paz, L.; Newstead, P.E.

    2004-08-01

Let X be a nonsingular algebraic curve of genus g ≥ 3, and let M_ξ denote the moduli space of stable vector bundles of rank n ≥ 2 and degree d with fixed determinant ξ over X such that n and d are coprime. We assume that if g = 3 then n ≥ 4 and if g = 4 then n ≥ 3, and suppose further that n_0, d_0 are integers such that n_0 ≥ 1 and nd_0 + n_0d > nn_0(2g - 2). Let E be a semistable vector bundle over X of rank n_0 and degree d_0. The generalised Picard bundle W_ξ(E) is by definition the vector bundle over M_ξ defined by the direct image p_{M_ξ*}(U_ξ ⊗ p_X^*E), where U_ξ is a universal vector bundle over X × M_ξ. We obtain an inversion formula allowing us to recover E from W_ξ(E) and show that the space of infinitesimal deformations of W_ξ(E) is isomorphic to H^1(X, End(E)). This construction gives a locally complete family of vector bundles over M_ξ parametrised by the moduli space M(n_0, d_0) of stable bundles of rank n_0 and degree d_0 over X. If (n_0, d_0) = 1 and W_ξ(E) is stable for all E ∈ M(n_0, d_0), the construction determines an isomorphism from M(n_0, d_0) to a connected component M_0 of a moduli space of stable sheaves over M_ξ. This applies in particular when n_0 = 1, in which case M_0 is isomorphic to the Jacobian J of X as a polarised variety. The paper as a whole is a generalisation of results of Kempf and Mukai on Picard bundles over J, and is also related to a paper of Tyurin on the geometry of moduli of vector bundles. (author)

  3. Effect of lamotrigine on cerebral blood flow in patients with idiopathic generalised epilepsy

    Energy Technology Data Exchange (ETDEWEB)

    Joo, Eun Yeon [Ewha Womans University, Department of Neurology, College of Medicine, Seoul (Korea); Hong, Seung Bong; Tae, Woo Suk; Han, Sun Jung; Seo, Dae Won [Sungkyunkwan University School of Medicine, Department of Neurology, Samsung Medical Center and Center for Clinical Medicine, SBRI, Gangnam-Gu, Seoul (Korea); Lee, Kyung-Han [Sungkyunkwan University School of Medicine, Department of Nuclear Medicine, Samsung Medical Center and Center for Clinical Medicine, SBRI, Gangnam-Gu, Seoul (Korea); Lee, Mann Hyung [Catholic University of Daegu, College of Pharmacy, Gyeongbuk (Korea)

    2006-06-15

The purpose of this study was to investigate the effects of the new anti-epileptic drug, lamotrigine, on cerebral blood flow by performing ⁹⁹ᵐTc-ethylcysteinate dimer (ECD) single-photon emission computed tomography (SPECT) before and after medication in patients with drug-naive idiopathic generalised epilepsy. Interictal ⁹⁹ᵐTc-ECD brain SPECT was performed before drug treatment started and then repeated after lamotrigine medication for 4-5 months in 30 patients with generalised epilepsy (M/F = 14/16, 19.3 ± 3.4 years). Seizure types were generalised tonic-clonic seizure in 23 patients and myoclonic seizures in seven. The mean lamotrigine dose used was 214.1 ± 29.1 mg/day. For SPM analysis, all SPECT images were spatially normalised to the standard SPECT template and then smoothed using a 12-mm full-width at half-maximum Gaussian kernel. The paired t test was used to compare pre- and post-lamotrigine SPECT images. SPM analysis of pre- and post-lamotrigine brain SPECT images showed decreased perfusion in bilateral dorsomedial nuclei of thalami, bilateral uncus, right amygdala, left subcallosal gyrus, right superior and inferior frontal gyri, right precentral gyrus, bilateral superior and inferior temporal gyri and brainstem (pons, medulla) after lamotrigine medication at a false discovery rate-corrected p < 0.05. No brain region showed increased perfusion after lamotrigine administration. (orig.)

  4. Passivation controller design for turbo-generators based on generalised Hamiltonian system theory

    NARCIS (Netherlands)

    Cao, M.; Shen, T.L.; Song, Y.H.

    2002-01-01

A method of pre-feedback to formulate the generalised forced Hamiltonian system model for speed governor control systems is proposed. Furthermore, passivation controllers are designed based on the scheme of Hamiltonian structure for single machine infinite bus and multimachine power systems. In

  5. Learning and Generalisation in Neural Networks with Local Preprocessing

    OpenAIRE

    Kutsia, Merab

    2007-01-01

    We study learning and generalisation ability of a specific two-layer feed-forward neural network and compare its properties to that of a simple perceptron. The input patterns are mapped nonlinearly onto a hidden layer, much larger than the input layer, and this mapping is either fixed or may result from an unsupervised learning process. Such preprocessing of initially uncorrelated random patterns results in the correlated patterns in the hidden layer. The hidden-to-output mapping of the net...

  6. Generalised pruritus as a presentation of Graves' disease

    OpenAIRE

    Tan, CE; Loh, KY

    2013-01-01

    Pruritus is a lesser known symptom of hyperthyroidism, particularly in autoimmune thyroid disorders. This is a case report of a 27-year-old woman who presented with generalised pruritus at a primary care clinic. Incidental findings of tachycardia and a goiter led to the investigations of her thyroid status. The thyroid function test revealed elevated serum free T4 and suppressed thyroid stimulating hormone levels. The anti-thyroid antibodies were positive. She was diagnosed with Graves’ disea...

  7. Resolving uncertainty in chemical speciation determinations

    Science.gov (United States)

    Smith, D. Scott; Adams, Nicholas W. H.; Kramer, James R.

    1999-10-01

    Speciation determinations involve uncertainty in system definition and experimentation. Identification of appropriate metals and ligands from basic chemical principles, analytical window considerations, types of species and checking for consistency in equilibrium calculations are considered in system definition uncertainty. A systematic approach to system definition limits uncertainty in speciation investigations. Experimental uncertainty is discussed with an example of proton interactions with Suwannee River fulvic acid (SRFA). A Monte Carlo approach was used to estimate uncertainty in experimental data, resulting from the propagation of uncertainties in electrode calibration parameters and experimental data points. Monte Carlo simulations revealed large uncertainties present at high (>9-10) and low (monoprotic ligands. Least-squares fit the data with 21 sites, whereas linear programming fit the data equally well with 9 sites. Multiresponse fitting, involving simultaneous fluorescence and pH measurements, improved model discrimination. Deconvolution of the excitation versus emission fluorescence surface for SRFA establishes a minimum of five sites. Diprotic sites are also required for the five fluorescent sites, and one non-fluorescent monoprotic site was added to accommodate the pH data. Consistent with greater complexity, the multiresponse method had broader confidence limits than the uniresponse methods, but corresponded better with the accepted total carboxylic content for SRFA. Overall there was a 40% standard deviation in total carboxylic content for the multiresponse fitting, versus 10% and 1% for least-squares and linear programming, respectively.
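The Monte Carlo propagation of electrode calibration uncertainty described in this record can be sketched in miniature. The calibration values, their standard uncertainties and the measured potential below are all hypothetical, and a simple Nernstian calibration pH = (E0 - E)/s is assumed:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Hypothetical calibration parameters with standard uncertainties:
E0 = rng.normal(400.0, 0.5, n)    # calibration intercept (mV)
s = rng.normal(59.16, 0.10, n)    # Nernstian slope (mV per pH unit)
E = -120.0                        # one measured potential (mV), assumed exact

pH = (E0 - E) / s                 # propagate calibration uncertainty to pH
print(pH.mean(), pH.std())        # Monte Carlo mean and standard uncertainty
```

Repeating such draws for every titration point, as the authors do, yields confidence envelopes on the whole speciation fit rather than on a single pH value.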

  8. Generalised Multiplicative Indices of Polycyclic Aromatic Hydrocarbons and Benzenoid Systems

    Science.gov (United States)

    Kulli, V. R.; Stone, Branden; Wang, Shaohui; Wei, Bing

    2017-05-01

Many types of topological indices, such as degree-based, distance-based, and counting-related topological indices, have been explored in recent years. Among degree-based topological indices, the Zagreb indices are the oldest and the most studied. In this paper, we define generalised multiplicative versions of these indices and compute exact formulas for polycyclic aromatic hydrocarbons and jagged-rectangle benzenoid systems.
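The ordinary multiplicative Zagreb indices that the paper generalises are easy to compute from a degree sequence. The sketch below uses a small path graph as a toy example; the generalised variant with a free exponent is only one of several conventions in the literature:

```python
from math import prod

# Edge list of a small test graph (the path P4: 0-1-2-3)
edges = [(0, 1), (1, 2), (2, 3)]

deg = {}
for u, v in edges:
    deg[u] = deg.get(u, 0) + 1
    deg[v] = deg.get(v, 0) + 1

# Ordinary multiplicative Zagreb indices:
PI1 = prod(d ** 2 for d in deg.values())          # product of squared degrees
PI2 = prod(deg[u] * deg[v] for u, v in edges)     # product over edges

# One generalised multiplicative first Zagreb index with exponent a
# (a = 2 recovers PI1); exponent conventions vary by paper.
def PI1_general(a):
    return prod(d ** a for d in deg.values())

print(PI1, PI2, PI1_general(3))  # 16 16 64 for P4
```

For P4 the degrees are 1, 2, 2, 1, so both ordinary indices equal 16; closed formulas for benzenoid families follow by counting how many vertices and edges fall into each degree class.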

  9. Modeling of uncertainties in statistical inverse problems

    International Nuclear Information System (INIS)

    Kaipio, Jari

    2008-01-01

In all real-world problems, the models that tie the measurements to the unknowns of interest are at best only approximations of reality. While moderate modeling and approximation errors can be tolerated in stable problems, inverse problems are a notorious exception. Typical modeling errors include inaccurate geometry, unknown boundary and initial data, properties of noise and other disturbances, and simply the numerical approximations of the physical models. In principle, the Bayesian approach to inverse problems, in which all uncertainties are modeled as random variables, is capable of handling these uncertainties. Depending on the type of uncertainties, however, different strategies may be adopted. In this paper we give an overview of typical modeling errors and related strategies within the Bayesian framework.
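The Bayesian treatment of an inverse problem that this record describes has a closed form in the simplest linear-Gaussian case, which makes a compact sketch possible. The forward matrix, noise level and prior below are all assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Forward model u = A x + e with Gaussian observation noise (values assumed)
A = np.array([[1.0, 0.5],
              [0.2, 1.0]])
x_true = np.array([1.0, -1.0])
noise_sd = 0.1
u = A @ x_true + rng.normal(0.0, noise_sd, 2)

# Gaussian prior x ~ N(0, 10 I); conjugacy makes the posterior Gaussian too
prior_prec = np.eye(2) / 10.0
post_cov = np.linalg.inv(prior_prec + A.T @ A / noise_sd**2)
post_mean = post_cov @ (A.T @ u / noise_sd**2)

print(post_mean)   # posterior mean, close to x_true for this well-posed A
```

Modeling errors of the kinds listed in the abstract would enter by enlarging the noise covariance or by adding further random variables to the model, rather than by changing this basic posterior computation.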

  10. Trans-Planckian Effects in Inflationary Cosmology and the Modified Uncertainty Principle

    DEFF Research Database (Denmark)

    F. Hassan, S.; Sloth, Martin Snoager

    2002-01-01

    There are good indications that fundamental physics gives rise to a modified space-momentum uncertainty relation that implies the existence of a minimum length scale. We implement this idea in the scalar field theory that describes density perturbations in flat Robertson-Walker space-time. This l...

  11. Navigation towards a goal position: from reactive to generalised learned control

    Energy Technology Data Exchange (ETDEWEB)

    Freire da Silva, Valdinei [Laboratorio de Tecnicas Inteligentes - LTI, Escola Politecnica da Universidade de Sao Paulo, Av. Prof. Luciano Gualberto, trav.3, n.158, Cidade Universitaria Sao Paulo (Brazil); Selvatici, Antonio Henrique [Universidade Nove de Julho, Rua Vergueiro, 235, Sao Paulo (Brazil); Reali Costa, Anna Helena, E-mail: valdinei.freire@gmail.com, E-mail: antoniohps@uninove.br, E-mail: anna.reali@poli.usp.br [Laboratorio de Tecnicas Inteligentes - LTI, Escola Politecnica da Universidade de Sao Paulo, Av. Prof. Luciano Gualberto, trav.3, n.158, Cidade Universitaria Sao Paulo (Brazil)

    2011-03-01

The task of navigating to a target position in space is a fairly common task for a mobile robot. It is desirable that this task is performed even in previously unknown environments. One reactive architecture explored before addresses this challenge by defining a hand-coded coordination of primitive behaviours, encoded by the Potential Fields method. Our first approach to improve the performance of this architecture adds a learning step to autonomously find the best way to coordinate primitive behaviours with respect to an arbitrary performance criterion. Because of the limitations presented by the Potential Fields method, especially in relation to non-convex obstacles, we are investigating the use of Relational Reinforcement Learning as a method to not only learn to act in the current environment, but also to generalise prior knowledge to the current environment in order to achieve the goal more quickly in a non-convex structured environment. We show the results of our previous efforts in reaching goal positions along with our current research on generalised approaches.

  12. Navigation towards a goal position: from reactive to generalised learned control

    International Nuclear Information System (INIS)

    Freire da Silva, Valdinei; Selvatici, Antonio Henrique; Reali Costa, Anna Helena

    2011-01-01

The task of navigating to a target position in space is a fairly common task for a mobile robot. It is desirable that this task is performed even in previously unknown environments. One reactive architecture explored before addresses this challenge by defining a hand-coded coordination of primitive behaviours, encoded by the Potential Fields method. Our first approach to improve the performance of this architecture adds a learning step to autonomously find the best way to coordinate primitive behaviours with respect to an arbitrary performance criterion. Because of the limitations presented by the Potential Fields method, especially in relation to non-convex obstacles, we are investigating the use of Relational Reinforcement Learning as a method to not only learn to act in the current environment, but also to generalise prior knowledge to the current environment in order to achieve the goal more quickly in a non-convex structured environment. We show the results of our previous efforts in reaching goal positions along with our current research on generalised approaches.

  13. Is the Precautionary Principle Really Incoherent?

    Science.gov (United States)

    Boyer-Kassem, Thomas

    2017-11-01

    The Precautionary Principle has been an increasingly important principle in international treaties since the 1980s. Through varying formulations, it states that when an activity can lead to a catastrophe for human health or the environment, measures should be taken to prevent it even if the cause-and-effect relationship is not fully established scientifically. The Precautionary Principle has been critically discussed from many sides. This article concentrates on a theoretical argument by Peterson (2006) according to which the Precautionary Principle is incoherent with other desiderata of rational decision making, and thus cannot be used as a decision rule that selects an action among several ones. I claim here that Peterson's argument fails to establish the incoherence of the Precautionary Principle, by attacking three of its premises. I argue (i) that Peterson's treatment of uncertainties lacks generality, (ii) that his Archimedian condition is problematic for incommensurability reasons, and (iii) that his explication of the Precautionary Principle is not adequate. This leads me to conjecture that the Precautionary Principle can be envisaged as a coherent decision rule, again. © 2017 Society for Risk Analysis.

  14. Principles for fostering the transdisciplinary development of assistive technologies.

    Science.gov (United States)

    Boger, Jennifer; Jackson, Piper; Mulvenna, Maurice; Sixsmith, Judith; Sixsmith, Andrew; Mihailidis, Alex; Kontos, Pia; Miller Polgar, Janice; Grigorovich, Alisa; Martin, Suzanne

    2017-07-01

    Developing useful and usable assistive technologies often presents complex (or "wicked") challenges that require input from multiple disciplines and sectors. Transdisciplinary collaboration can enable holistic understanding of challenges that may lead to innovative, impactful and transformative solutions. This paper presents generalised principles that are intended to foster transdisciplinary assistive technology development. The paper introduces the area of assistive technology design before discussing general aspects of transdisciplinary collaboration, followed by an overview of relevant concepts, including approaches, methodologies and frameworks for conducting and evaluating transdisciplinary working and assistive technology design. The principles for transdisciplinary development of assistive technologies are presented and applied post hoc, as an illustrative example, to the COACH project, an ambient-assisted living technology for guiding completion of activities of daily living by older adults with dementia. Future work includes the refinement and validation of these principles through their application to real-world transdisciplinary assistive technology projects. Implications for rehabilitation: Transdisciplinarity encourages a focus on real-world 'wicked' problems. A transdisciplinary approach involves transcending disciplinary boundaries and collaborating with interprofessional and community partners (including the technology's intended users) on a shared problem. Transdisciplinarity fosters new ways of thinking about and doing research, development, and implementation, expanding the scope, applicability, and commercial viability of assistive technologies.

  15. Energy and Uncertainty in General Relativity

    Science.gov (United States)

    Cooperstock, F. I.; Dupre, M. J.

    2018-03-01

    The issue of energy and its potential localizability in general relativity has challenged physicists for more than a century. Many non-invariant measures were proposed over the years but an invariant measure was never found. We discovered the invariant localized energy measure by expanding the domain of investigation from space to spacetime. We note from relativity that the finiteness of the velocity of propagation of interactions necessarily induces indefiniteness in measurements. This is because the elements of actual physical systems being measured, as well as their detectors, are characterized by entire four-velocity fields, which necessarily leads to information from a measured system being processed by the detector over a spread of time. General relativity adds additional indefiniteness because of the variation in proper time between elements. The uncertainty is encapsulated in a generalized uncertainty principle, in parallel with that of Heisenberg, which incorporates the localized contribution of gravity to energy. This naturally leads to a generalized uncertainty principle for momentum as well. These generalized forms and the gravitational contribution to localized energy would be expected to be of particular importance in the regimes of ultra-strong gravitational fields. We contrast our invariant spacetime energy measure with the standard 3-space energy measure which is familiar from special relativity, appreciating why general relativity demands a measure in spacetime as opposed to 3-space. We also address the misconceptions about our approach held by certain authors.

  16. Generalised boundary terms for higher derivative theories of gravity

    Energy Technology Data Exchange (ETDEWEB)

    Teimouri, Ali; Talaganis, Spyridon; Edholm, James [Consortium for Fundamental Physics, Lancaster University,North West Drive, Lancaster, LA1 4YB (United Kingdom); Mazumdar, Anupam [Consortium for Fundamental Physics, Lancaster University,North West Drive, Lancaster, LA1 4YB (United Kingdom); Kapteyn Astronomical Institute, University of Groningen,9700 AV Groningen (Netherlands)

    2016-08-24

    In this paper we wish to find the corresponding Gibbons-Hawking-York term for the most general quadratic-in-curvature gravity by using coframe slicing within the Arnowitt-Deser-Misner (ADM) decomposition of spacetime in four dimensions. In order to make sure that the higher derivative gravity is ghost and tachyon free at a perturbative level, one requires infinitely many covariant derivatives, which yields a generalised covariant infinite derivative theory of gravity. We will be exploring the boundary term for such a covariant infinite derivative theory of gravity.

  17. Quantitative modelling of HDPE spurt experiments using wall slip and generalised Newtonian flow

    NARCIS (Netherlands)

    Doelder, den C.F.J.; Koopmans, R.J.; Molenaar, J.

    1998-01-01

    A quantitative model to describe capillary rheometer experiments is presented. The model can generate ‘two-branched' discontinuous flow curves and the associated pressure oscillations. Polymer compressibility in the barrel, incompressible axisymmetric generalised Newtonian flow in the die, and a

  18. Modelling Problem-Solving Situations into Number Theory Tasks: The Route towards Generalisation

    Science.gov (United States)

    Papadopoulos, Ioannis; Iatridou, Maria

    2010-01-01

    This paper examines the way two 10th graders cope with a non-standard generalisation problem that involves elementary concepts of number theory (more specifically linear Diophantine equations) in the geometrical context of a rectangle's area. Emphasis is given on how the students' past experience of problem solving (expressed through interplay…

  19. The precautionary principle as a rational decision criterion

    International Nuclear Information System (INIS)

    Hovi, Jon

    2001-12-01

    The paper asks if the precautionary principle may be seen as a rational decision criterion. Six main questions are discussed. 1. Does the principle basically represent a particular set of political options or is it a genuine decision criterion? 2. If it is the latter, can it be reduced to any of the existing criteria for decision making under uncertainty? 3. In what kinds of situation is the principle applicable? 4. What is the relation between the precautionary principle and other principles for environmental regulation? 5. How plausible is the principle's claim that the burden of proof should be reversed? 6. Do the proponents of environmental regulation carry no burden of proof at all? A main conclusion is that, for now at least, the principle contains too many unclear elements to satisfy the requirements of precision and consistency that should reasonably be satisfied by a rational decision criterion. (author)

  20. Implementing the Precautionary Principle through Stakeholder Engagement for Product and Service Development

    Directory of Open Access Journals (Sweden)

    Pierre de Coninck

    2007-05-01

    Full Text Available The precautionary principle is a sustainable development principle that attempts to articulate an ethic in decision making, since it deals with the notion of uncertainty of harm. Uncertainty becomes a weakness when it has to serve as a predictor by which to take action. Since humans are responsible for their actions, and ethics is grounded in action, decisions made under uncertainty require an ethical framework. Beyond professional deontological responsibility, there is a need to consider the process of conception based on an ethic of the future, and therefore to develop a new ethical framework that is more global and fundamental. This will expose the justifications for choices, present them in debates with other stakeholders, and ultimately adopt an axiology of decision making for conception. Responsibility and participatory discourse, ensuring equal justice among actors, are the basis of such an ethic. By understanding the ethical framework of this principle and applying this knowledge to design or innovation, the precautionary principle becomes operational. This paper suggests that to move towards sustainability, stakeholders must adopt decision-making processes that are precautionary. A commitment to precaution encourages a global perspective and the search for alternatives. Methods such as alternative assessment and precautionary deliberation through stakeholder engagement can assist in this shift towards sustainability.

  1. Reformulation of a stochastic action principle for irregular dynamics

    International Nuclear Information System (INIS)

    Wang, Q.A.; Bangoup, S.; Dzangue, F.; Jeatsa, A.; Tsobnang, F.; Le Mehaute, A.

    2009-01-01

    A stochastic action principle for random dynamics is revisited. Numerical diffusion experiments are carried out to show that the diffusion path probability depends exponentially on the Lagrangian action $A=\int_a^b L\,dt$. This result is then used to derive the Shannon measure for path uncertainty. It is shown that the maximum entropy principle and the least action principle of classical mechanics can be unified into $\delta\bar{A}=0$, where the average is calculated over all possible paths of the stochastic motion between two configuration points a and b. It is argued that this action principle and the maximum entropy principle are a consequence of the mechanical equilibrium condition extended to the case of stochastic dynamics.
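The exponential path weighting described in this record can be illustrated numerically (the discretisation, perturbation scale and weighting constant below are illustrative choices, not the paper's actual experiment): among paths pinned at the same endpoints, the straight line minimises the discretised free-particle action, and exponential weights then concentrate on it.

```python
import math
import random

def action(path, dt=0.1, m=1.0):
    """Discretised free-particle Lagrangian action: sum of (m/2) v^2 dt."""
    return sum(0.5 * m * ((b - a) / dt) ** 2 * dt for a, b in zip(path, path[1:]))

random.seed(0)
n, x0, x1 = 20, 0.0, 1.0
straight = [x0 + (x1 - x0) * k / n for k in range(n + 1)]   # least-action path

def perturbed():
    """A random path pinned at the same endpoints (interior points jittered)."""
    return [x + (random.gauss(0.0, 0.1) if 0 < k < n else 0.0)
            for k, x in enumerate(straight)]

paths = [perturbed() for _ in range(200)]
a0 = action(straight)
# Exponential path weighting: lower action => exponentially larger weight.
weights = [math.exp(-(action(p) - a0)) for p in paths]
assert all(action(p) >= a0 for p in paths)   # the straight line minimises A
assert max(weights) <= 1.0
```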

  2. Precautionary Principles: General Definitions and Specific Applications to Genetically Modified Organisms

    Science.gov (United States)

    Lofstedt, Ragnar E.; Fischhoff, Baruch; Fischhoff, Ilya R.

    2002-01-01

    Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find…

  3. Moist air state above counterflow wet-cooling tower fill based on Merkel, generalised Merkel and Klimanek & Białecky models

    Science.gov (United States)

    Hyhlík, Tomáš

    2017-09-01

    The article deals with the evaluation of the moist air state above a counterflow wet-cooling tower fill. The results based on the Klimanek & Białecky model are compared with the results of the Merkel model and the generalised Merkel model. Based on the numerical simulation it is shown that the temperature is predicted correctly by the generalised Merkel model in the case of saturated or super-saturated air above the fill, but is underpredicted in the case of unsaturated moist air above the fill. The classical Merkel model always underpredicts the temperature above the fill. The density of moist air above the fill calculated using the generalised Merkel model is strongly overpredicted in the case of unsaturated moist air above the fill.

  4. Uncertainty, causality and decision: The case of social risks and nuclear risk in particular

    International Nuclear Information System (INIS)

    Lahidji, R.

    2012-01-01

    Probability and causality are two indispensable tools for addressing situations of social risk. Causal relations are the foundation for building risk assessment models and identifying risk prevention, mitigation and compensation measures. Probability enables us to quantify risk assessments and to calibrate intervention measures. It therefore seems not only natural, but also necessary, to make the role of causality and probability explicit in the definition of decision problems in situations of social risk. Such is the aim of this thesis. By reviewing the terminology of risk and the logic of public interventions in various fields of social risk, we gain a better understanding of the notion and of the issues that one faces when trying to model it. We further elaborate our analysis in the case of nuclear safety, examining in detail how methods and policies have been developed in this field and how they have evolved through time. This leads to a number of observations concerning risk and safety assessments. Generalising the concept of intervention in a Bayesian network allows us to develop a variety of causal Bayesian networks adapted to our needs. In this framework, we propose a definition of risk which seems to be relevant for a broad range of issues. We then offer simple applications of our model to specific aspects of the Fukushima accident and other nuclear safety problems. In addition to specific lessons, the analysis leads to the conclusion that a systematic approach for identifying uncertainties is needed in this area. When applied to decision theory, our tool evolves into a dynamic decision model in which acts cause consequences and are causally interconnected. The model provides a causal interpretation of Savage's conceptual framework, solves some of its paradoxes and clarifies certain aspects. It leads us to consider uncertainty with regard to a problem's causal structure as the source of ambiguity in decision-making, an interpretation which corresponds to a
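The core device of a causal Bayesian network, Pearl-style intervention via truncated factorisation, can be sketched on a three-node toy network (all structure and probabilities below are invented for illustration, not taken from the thesis):

```python
# Toy causal network: R (regulatory regime) -> S (safety system state), and
# both R and S influence A (accident).  All numbers are illustrative.
p_r = {1: 0.7, 0: 0.3}                                    # P(R)
p_s_given_r = {1: {1: 0.9, 0: 0.1}, 0: {1: 0.4, 0: 0.6}}  # P(S | R)
p_a1_given_sr = {(1, 1): 0.01, (1, 0): 0.05,              # P(A=1 | S, R)
                 (0, 1): 0.10, (0, 0): 0.30}

def p_accident_do_s(s):
    """P(A=1 | do(S=s)): the intervention severs the R -> S edge, so R keeps
    its prior distribution (truncated factorisation)."""
    return sum(p_r[r] * p_a1_given_sr[(s, r)] for r in (0, 1))

def p_accident_given_s(s):
    """Ordinary conditioning P(A=1 | S=s): observing S also tells us about R."""
    num = sum(p_r[r] * p_s_given_r[r][s] * p_a1_given_sr[(s, r)] for r in (0, 1))
    den = sum(p_r[r] * p_s_given_r[r][s] for r in (0, 1))
    return num / den

# Because R confounds S and A, intervening and merely observing differ:
print(p_accident_do_s(1), p_accident_given_s(1))
```

The gap between the two quantities is precisely why interventions, rather than plain conditional probabilities, are needed when evaluating safety measures.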

  5. Using the generalised invariant formalism: a class of conformally flat pure radiation metrics with a negative cosmological constant

    Energy Technology Data Exchange (ETDEWEB)

    Edgar, S Brian [Department of Mathematics, Linkoepings Universitet Linkoeping, S-581 83 (Sweden); Ramos, M P Machado [Departamento de Matematica para a Ciencia e Tecnologia, Azurem 4800-058 Guimaraes, Universidade do Minho (Portugal)

    2007-05-15

    We demonstrate an integration procedure for the generalised invariant formalism by obtaining a subclass of conformally flat pure radiation spacetimes with a negative cosmological constant. The method used is a development of the methods used earlier for pure radiation spacetimes of Petrov types O and N respectively. This subclass of spacetimes turns out to have one degree of isotropy freedom, so in this paper we have extended the integration procedure for the generalised invariant formalism to spacetimes with isotropy freedom.

  6. Using the generalised invariant formalism: a class of conformally flat pure radiation metrics with a negative cosmological constant

    International Nuclear Information System (INIS)

    Edgar, S Brian; Ramos, M P Machado

    2007-01-01

    We demonstrate an integration procedure for the generalised invariant formalism by obtaining a subclass of conformally flat pure radiation spacetimes with a negative cosmological constant. The method used is a development of the methods used earlier for pure radiation spacetimes of Petrov types O and N respectively. This subclass of spacetimes turns out to have one degree of isotropy freedom, so in this paper we have extended the integration procedure for the generalised invariant formalism to spacetimes with isotropy freedom.

  7. Uncertainty Regarding Waste Handling in Everyday Life

    Directory of Open Access Journals (Sweden)

    Susanne Ewert

    2010-09-01

    Full Text Available According to our study, based on interviews with households in a residential area in Sweden, uncertainty is a cultural barrier to improved recycling. Four causes of uncertainty are identified. Firstly, professional categories not matching cultural categories: people easily discriminate between certain categories (e.g., materials such as plastic and paper) but not between others (e.g., packaging and "non-packaging"); thus a frequent cause of uncertainty is that the basic categories of the waste recycling system do not coincide with the basic categories used in everyday life. Secondly, challenged habits: source separation in everyday life is habitual, but when a habit is challenged by a particular element or feature of the waste system, uncertainty can arise. Thirdly, lacking fractions: some kinds of items cannot be left for recycling, which makes waste collection incomplete from the user's point of view and in turn lowers the credibility of the system. Finally, missing or contradictory rules of thumb: the above causes seem to be particularly relevant if no motivating principle or rule of thumb (within the context of use) is successfully conveyed to the user. This paper discusses how reducing uncertainty can improve recycling.

  8. Practical application of the ALARA principle in management of the nuclear legacy: optimization under uncertainty

    International Nuclear Information System (INIS)

    Smith, Graham; Sneve, Malgorzata K.

    2008-01-01

    Full text: Radiological protection has a long and distinguished history in taking a balanced approach to optimization. Both utilitarian and individual interests and perspectives are addressed through a process of constrained optimisation, with optimisation intended to lead to the most benefit to the most people, and constraints being operative to limit the degree of inequity among the individuals exposed. At least, expressed simplistically, that is what the recommendations on protection are intended to achieve. This paper examines the difficulties in achieving that objective, based on consideration of the active role of optimisation in regulatory supervision of the historic nuclear legacy. This example is chosen because the application of the ALARA principle has important implications for some very major projects whose objective is remediation of existing legacy facilities. But it is also relevant because timely, effective and cost efficient completion of those projects has implications for confidence in the future development of nuclear power and other uses of radioactive materials. It is also an interesting example because legacy management includes mitigation of some major short and long term hazards, but those mitigating measures themselves involve operations with their own risk, cost and benefit profiles. Like any other complex activity, a legacy management project has to be broken down into logistically feasible parts. However, from a regulatory perspective, simultaneous application of ALARA to worker protection, major accident risk mitigation and long-term environmental and human health protection presents its own challenges. Major uncertainties which exacerbate the problem arise from ill-characterised source terms, estimation of the likelihood of unlikely failures in operational processes, and prospective assessment of radiological impacts over many hundreds of years and longer. The projects themselves are set to run over decades, during which time the

  9. Analysis of uncertainties of thermal hydraulic calculations

    International Nuclear Information System (INIS)

    Macek, J.; Vavrin, J.

    2002-12-01

    In 1993-1997 it was proposed, within OECD projects, that a common program should be set up for uncertainty analysis by a probabilistic method based on a non-parametric statistical approach for system computer codes such as RELAP, ATHLET and CATHARE and that a method should be developed for statistical analysis of experimental databases for the preparation of the input deck and statistical analysis of the output calculation results. Software for such statistical analyses would then have to be processed as individual tools independent of the computer codes used for the thermal hydraulic analysis and programs for uncertainty analysis. In this context, a method for estimation of a thermal hydraulic calculation is outlined and selected methods of statistical analysis of uncertainties are described, including methods for prediction accuracy assessment based on the discrete Fourier transformation principle. (author)
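The non-parametric statistical approach referred to here is commonly implemented with Wilks' order-statistics formula (as in the GRS method); the record does not name it, so that connection is an assumption. The sketch computes the minimum number of random code runs needed for a one-sided tolerance bound:

```python
from math import comb

def wilks_runs(coverage=0.95, confidence=0.95, order=1):
    """Smallest number of random code runs N such that the `order`-th largest
    output bounds the `coverage` quantile with probability >= `confidence`
    (one-sided, non-parametric Wilks tolerance limit)."""
    n = order
    while True:
        n += 1
        # P(at least `order` of the N outputs exceed the coverage quantile)
        conf = 1.0 - sum(comb(n, k) * (1 - coverage) ** k * coverage ** (n - k)
                         for k in range(order))
        if conf >= confidence:
            return n

print(wilks_runs())          # 59 runs for a first-order 95%/95% bound
print(wilks_runs(order=2))   # 93 runs for a second-order bound
```

The practical appeal, and the reason this approach suits codes like RELAP or ATHLET, is that the required number of runs is independent of how many uncertain input parameters are sampled.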

  10. Outcome and value uncertainties in global-change policy

    International Nuclear Information System (INIS)

    Hammitt, J.K.

    1995-01-01

    Choices among environmental policies can be informed by analysis of the potential physical, biological, and social outcomes of alternative choices, and analysis of social preferences among these outcomes. Frequently, however, the consequences of alternative policies cannot be accurately predicted because of substantial outcome uncertainties concerning physical, chemical, biological, and social processes linking policy choices to consequences. Similarly, assessments of social preferences among alternative outcomes are limited by value uncertainties arising from limitations of moral principles, the absence of economic markets for many environmental attributes, and other factors. Outcome and value uncertainties relevant to global-change policy are described and their magnitudes are examined for two cases: stratospheric-ozone depletion and global climate change. Analysis of information available in the mid 1980s, when international ozone regulations were adopted, suggests that contemporary uncertainties surrounding CFC emissions and the atmospheric response were so large that plausible ozone depletion, absent regulation, ranged from negligible to catastrophic, a range that exceeded the plausible effect of the regulations considered. Analysis of climate change suggests that, important as outcome uncertainties are, uncertainties about values may be even more important for policy choice. 53 refs., 3 figs., 3 tabs

  11. Uncertainty assessment of a model for biological nitrogen and phosphorus removal: Application to a large wastewater treatment plant

    Science.gov (United States)

    Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare

    In the last few years, the use of mathematical models of WasteWater Treatment Plant (WWTP) processes has become a common way to predict WWTP behaviour. However, mathematical models generally demand advanced input for their implementation that must be evaluated by an extensive data-gathering campaign, which cannot always be carried out. This fact, together with the intrinsic complexity of the model structure, leads to model results that may be very uncertain. Quantification of the uncertainty is therefore imperative. However, despite the importance of uncertainty quantification, only a few studies have been carried out in the wastewater treatment field, and those studies only included a few of the sources of model uncertainty. To help develop the area, the paper presents the uncertainty assessment of a mathematical model simulating biological nitrogen and phosphorus removal. The uncertainty assessment was conducted according to the Generalised Likelihood Uncertainty Estimation (GLUE) methodology, which has scarcely been applied in the wastewater field. The model was based on activated-sludge models 1 (ASM1) and 2 (ASM2). Different approaches can be used for uncertainty analysis. The GLUE methodology requires a large number of Monte Carlo simulations in which a random sampling of individual parameters drawn from probability distributions is used to determine a set of parameter values. Using this approach, model reliability was evaluated based on its capacity to globally limit the uncertainty. The method was applied to a large full-scale WWTP for which quantity and quality data were gathered. The analysis yielded useful insights for WWTP modelling, identifying the crucial aspects where the greatest uncertainty lies and where, therefore, more effort should be invested in both data gathering and modelling practice.
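The GLUE loop described above can be sketched on a toy one-parameter model (a deliberately simple stand-in for the far more complex ASM-type model; the likelihood measure and behavioural threshold are illustrative choices):

```python
import math
import random

random.seed(1)

def model(k, ts):
    """Toy first-order decay model, a stand-in for the full WWTP model."""
    return [100.0 * math.exp(-k * t) for t in ts]

ts = list(range(6))
obs = model(0.3, ts)                       # synthetic "observations"

def nse(k):
    """Nash-Sutcliffe efficiency, a common informal likelihood in GLUE."""
    sim = model(k, ts)
    mean = sum(obs) / len(obs)
    sse = sum((s - o) ** 2 for s, o in zip(sim, obs))
    return 1.0 - sse / sum((o - mean) ** 2 for o in obs)

# GLUE core loop: Monte Carlo sample the parameter from its prior and keep
# only "behavioural" sets whose likelihood exceeds a chosen threshold.
samples = [random.uniform(0.0, 1.0) for _ in range(5000)]
behavioural = sorted(k for k in samples if nse(k) > 0.9)
lo = behavioural[int(0.05 * len(behavioural))]
hi = behavioural[int(0.95 * len(behavioural))]
print(lo, hi)   # 5th-95th percentile band; should bracket the true k = 0.3
```

Real GLUE studies sample many parameters jointly and propagate the behavioural sets to prediction bounds, but the loop structure is the same.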

  12. Ascertaining the uncertainty relations via quantum correlations

    International Nuclear Information System (INIS)

    Li, Jun-Li; Du, Kun; Qiao, Cong-Feng

    2014-01-01

    We propose a new scheme to express the uncertainty principle in the form of an inequality of the bipartite correlation functions for a given multipartite state, which provides an experimentally feasible and model-independent way to verify various uncertainty and measurement disturbance relations. By virtue of this scheme, the implementation of experimental measurement on the measurement disturbance relation in a variety of physical systems becomes practical. The inequality, in turn, also imposes a constraint on the strength of correlation, i.e. it determines the maximum value of the correlation function for a two-body system and a monogamy relation of the bipartite correlation functions for multipartite systems. (paper)

  13. Position-momentum uncertainty relations in the presence of quantum memory

    Energy Technology Data Exchange (ETDEWEB)

    Furrer, Fabian, E-mail: furrer@eve.phys.s.u-tokyo.ac.jp [Department of Physics, Graduate School of Science, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Berta, Mario [Institute for Quantum Information and Matter, Caltech, Pasadena, California 91125 (United States); Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich (Switzerland); Tomamichel, Marco [School of Physics, The University of Sydney, Sydney 2006 (Australia); Centre for Quantum Technologies, National University of Singapore, Singapore 117543 (Singapore); Scholz, Volkher B. [Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich (Switzerland); Christandl, Matthias [Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich (Switzerland); Department of Mathematical Sciences, University of Copenhagen, Universitetsparken 5, 2100 Copenhagen (Denmark)

    2014-12-15

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes not including Heisenberg’s original setting of position and momentum observables. Here, we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.
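For the position-momentum case discussed here, the standard continuous-variable entropic relation (without a quantum memory) is the Białynicki-Birula-Mycielski bound H(x) + H(p) ≥ ln(eπℏ); a quick numerical check that Gaussian minimum-uncertainty states saturate it:

```python
import math

def gaussian_entropy(sigma):
    """Differential Shannon entropy (in nats) of a normal distribution."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

hbar = 1.0
bound = math.log(math.e * math.pi * hbar)    # entropic uncertainty bound
for sigma_x in (0.5, 1.0, 3.0):
    sigma_p = hbar / (2.0 * sigma_x)         # minimum-uncertainty state
    total = gaussian_entropy(sigma_x) + gaussian_entropy(sigma_p)
    assert total >= bound - 1e-12            # the relation holds ...
    assert abs(total - bound) < 1e-12        # ... with equality for Gaussians
```

The memory-assisted relations of the paper generalise this bound by subtracting a term for the entanglement held by the observer, which is what enables the quantum key distribution security proof mentioned in the abstract.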

  14. A retrospective study of carbamazepine therapy in the treatment of idiopathic generalised epilepsy

    LENUS (Irish Health Repository)

    O'Connor, G

    2011-05-01

    Objective: The exacerbation of idiopathic generalised epilepsy (IGE) by some anti-epileptic drugs (AEDs) such as carbamazepine (CBZ) has been well documented. However, it is unclear whether IGE is always worsened by the use of CBZ, or whether some patients with IGE benefit from its use.

  15. [A magnetoencephalographic study of generalised developmental disorders. A new proposal for their classification].

    Science.gov (United States)

    Muñoz Yunta, J A; Palau Baduell, M; Salvado Salvado, B; Amo, C; Fernandez Lucas, A; Maestu, F; Ortiz, T

    2004-02-01

    Autistic spectrum disorders (ASD) is a term that is not included in the DSM-IV or the ICD-10, the diagnostic tools most commonly used by clinical professionals, and this can pose problems in research when it comes to finding homogeneous groups. From a neuropaediatric point of view, there is a need for a classification of the generalised disorders affecting development, and for this purpose we used Wing's triad, which defines the continuum of the autistic spectrum, and the information provided by magnetoencephalography (MEG) as grouping elements. Specific generalised developmental disorders were taken to be those syndromes that partially express some autistic trait, but with a personality of their own, so that they can be considered a specific disorder. ASD were classified as primary, cryptogenic or secondary. The primary disorders, in turn, express a continuum that ranges from savant syndrome to Asperger's syndrome and the different degrees of early infantile autism. MEG is a functional neuroimaging technique that has enabled us to back up this classification.

  16. Collaborative care for panic disorder, generalised anxiety disorder and social phobia in general practice

    DEFF Research Database (Denmark)

    Curth, Nadja Kehler; Brinck-Claussen, Ursula Ødum; Davidsen, Annette Sofie

    2017-01-01

    such as cognitive behavioral therapy. A limited number of studies suggest that collaborative care has a positive effect on symptoms for people with anxiety disorders. However, most studies are carried out in the USA and none have reported results for social phobia or generalised anxiety disorder separately. Thus...... in this protocol and focus on panic disorder, generalised anxiety disorder and social phobia. The aim is to investigate whether treatment according to the Collabri model has a better effect than usual treatment on symptoms when provided to people with anxiety disorders. Methods: Three cluster-randomised, clinical...... practices located in the Capital Region of Denmark. For all trials, the primary outcome is anxiety symptoms (Beck Anxiety Inventory (BAI)) 6 months after baseline. Secondary outcomes include BAI after 15 months, depression symptoms (Beck Depression Inventory) after 6 months, level of psychosocial...

  17. Uncertainty of spatial straightness in 3D measurement

    International Nuclear Information System (INIS)

    Wang Jinxing; Jiang Xiangqian; Ma Limin; Xu Zhengao; Li Zhu

    2005-01-01

    The least-square method is commonly employed to verify the spatial straightness in actual three-dimensional measurement process, but the uncertainty of the verification result is usually not given by the coordinate measuring machines. According to the basic principle of spatial straightness least-square verification and the uncertainty propagation formula given by ISO/TS 14253-2, a calculation method for the uncertainty of spatial straightness least-square verification is proposed in this paper. By this method, the coefficients of the line equation are regarded as a statistical vector, so that the line equation, the result of the spatial straightness verification and the uncertainty of the result can be obtained after the expected value and covariance matrix of the vector are determined. The method not only assures the integrity of the verification result, but also accords with the requirement of the new generation of GPS standards, which can improve the veracity of verification
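The verification step described above can be sketched as follows: fit the axis by least squares (centroid plus dominant principal direction, here via power iteration) and evaluate deviations from it. The data and the simplified straightness value (twice the largest orthogonal deviation) are illustrative, and the full ISO/TS 14253-2 uncertainty propagation over the line coefficients is not reproduced here.

```python
import math

def fit_line_3d(points, iters=100):
    """Least-squares 3D line fit: centroid plus dominant principal direction,
    found by power iteration on the 3x3 scatter (covariance) matrix."""
    n = len(points)
    c = [sum(p[i] for p in points) / n for i in range(3)]
    d = [[p[i] - c[i] for i in range(3)] for p in points]
    cov = [[sum(r[i] * r[j] for r in d) for j in range(3)] for i in range(3)]
    v = [1.0, 1.0, 1.0]
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(3)) for i in range(3)]
        norm = math.sqrt(sum(x * x for x in w)) or 1.0
        v = [x / norm for x in w]
    return c, v

def straightness(points):
    """Simplified straightness value: twice the largest orthogonal distance
    of any measured point from the least-squares axis."""
    c, v = fit_line_3d(points)
    dmax = 0.0
    for p in points:
        r = [p[i] - c[i] for i in range(3)]
        t = sum(r[i] * v[i] for i in range(3))
        dmax = max(dmax, math.sqrt(sum((r[i] - t * v[i]) ** 2 for i in range(3))))
    return 2.0 * dmax

# A nearly straight measured axis with small sinusoidal deviations:
pts = [(float(z), 0.001 * math.sin(z), 0.0) for z in range(6)]
print(straightness(pts))
```

Propagating the measurement covariance of the points through this fit is what yields the uncertainty of the verification result that the paper computes.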

  18. Stability analysis and observational measurement in chameleonic generalised Brans-Dicke cosmology

    International Nuclear Information System (INIS)

    Farajollahi, Hossein; Salehi, Amin

    2011-01-01

    We investigate the dynamics of the chameleonic Generalised Brans-Dicke model in flat FRW cosmology. In a new approach, a framework to study stability and attractor solutions in the phase space for the model is developed by simultaneously best fitting the stability and model parameters with the observational data. The results show that for an accelerating universe the phantom crossing does not occur in the past and near future

  19. A study of the one dimensional total generalised variation regularisation problem

    KAUST Repository

    Papafitsoros, Konstantinos

    2015-03-01

    © 2015 American Institute of Mathematical Sciences. In this paper we study the one dimensional second order total generalised variation regularisation (TGV) problem with L2 data fitting term. We examine the properties of this model and we calculate exact solutions using simple piecewise affine functions as data terms. We investigate how these solutions behave with respect to the TGV parameters and we verify our results using numerical experiments.
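The model's key property, that second-order TGV (unlike first-order total variation) does not penalise affine pieces, can be checked directly on the discrete objective; the forward-difference discretisation and unit grid spacing below are illustrative choices, not the paper's exact setup.

```python
def tgv2_value(u, w, alpha1=1.0, alpha0=1.0):
    """Discrete second-order TGV objective for a *given* auxiliary field w:
    alpha1 * ||Du - w||_1 + alpha0 * ||Dw||_1 with unit-spacing forward
    differences.  TGV^2(u) itself is the minimum of this over all w."""
    du = [u[i + 1] - u[i] for i in range(len(u) - 1)]
    dw = [w[i + 1] - w[i] for i in range(len(w) - 1)]
    return (alpha1 * sum(abs(a - b) for a, b in zip(du, w)) +
            alpha0 * sum(abs(x) for x in dw))

# An affine signal: TGV^2 vanishes once w equals its (constant) slope.
u_affine = [2.0 + 0.5 * i for i in range(8)]
w = [0.5] * 7
print(tgv2_value(u_affine, w))        # 0.0: affine data is not penalised

# First-order total variation penalises the same signal:
tv = sum(abs(u_affine[i + 1] - u_affine[i]) for i in range(7))
print(tv)
```

This is why the TGV-regularised solutions studied in the paper can be piecewise affine without the staircasing typical of TV regularisation.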

  20. A study of the one dimensional total generalised variation regularisation problem

    KAUST Repository

    Papafitsoros, Konstantinos; Bredies, Kristian

    2015-01-01

    © 2015 American Institute of Mathematical Sciences. In this paper we study the one dimensional second order total generalised variation regularisation (TGV) problem with L2 data fitting term. We examine the properties of this model and we calculate exact solutions using simple piecewise affine functions as data terms. We investigate how these solutions behave with respect to the TGV parameters and we verify our results using numerical experiments.

  1. CSAU (Code Scaling, Applicability and Uncertainty)

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.

    1989-01-01

    Best Estimate computer codes have been accepted by the U.S. Nuclear Regulatory Commission as an optional tool for performing safety analysis related to the licensing and regulation of current nuclear reactors producing commercial electrical power, providing their uncertainty is quantified. In support of this policy change, the NRC and its contractors and consultants have developed and demonstrated an uncertainty quantification methodology called CSAU. The primary use of the CSAU methodology is to quantify safety margins for existing designs; however, the methodology can also serve an equally important role in advanced reactor research for plants not yet built. This paper describes the CSAU methodology, at the generic process level, and provides the general principles whereby it may be applied to evaluations of advanced reactor designs

  2. Generalised Partially Linear Regression with Misclassified Data and an Application to Labour Market Transitions

    DEFF Research Database (Denmark)

    Dlugosz, Stephan; Mammen, Enno; Wilke, Ralf

    We consider the semiparametric generalised linear regression model which has mainstream empirical models such as the (partially) linear mean regression, logistic and multinomial regression as special cases. As an extension to related literature we allow a misclassified covariate to be interacted...

  3. The equivalence principle in a quantum world

    DEFF Research Database (Denmark)

    Bjerrum-Bohr, N. Emil J.; Donoghue, John F.; El-Menoufi, Basem Kamal

    2015-01-01

    We show how modern methods can be applied to quantum gravity at low energy. We test how quantum corrections challenge the classical framework behind the equivalence principle (EP), for instance through introduction of nonlocality from quantum physics, embodied in the uncertainty principle. When the energy is small, we now have the tools to address this conflict explicitly. Despite the violation of some classical concepts, the EP continues to provide the core of the quantum gravity framework through the symmetry - general coordinate invariance - that is used to organize the effective field theory...

  4. Building Abelian Functions with Generalised Baker-Hirota Operators

    Directory of Open Access Journals (Sweden)

    Matthew England

    2012-06-01

    Full Text Available We present a new systematic method to construct Abelian functions on Jacobian varieties of plane algebraic curves. The main tool used is a symmetric generalisation of the bilinear operator defined in the work of Baker and Hirota. We give explicit formulae for the multiple applications of the operators, use them to define infinite sequences of Abelian functions of a prescribed pole structure and deduce the key properties of these functions. We apply the theory to the two canonical curves of genus three, presenting new explicit examples of vector space bases of Abelian functions. These reveal previously unseen similarities between the theories of functions associated to curves of the same genus.

  5. On quantization, the generalised Schroedinger equation and classical mechanics

    International Nuclear Information System (INIS)

    Jones, K.R.W.

    1991-01-01

    A ψ-dependent linear functional operator was defined which solves the problem of quantization in non-relativistic quantum mechanics. Weyl ordering is implemented automatically and permits derivation of many of the quantum to classical correspondences. The parameter λ provides a natural C∞ deformation of the dynamical structure of quantum mechanics via a non-linear integro-differential 'Generalised Schroedinger Equation', admitting an infinite family of soliton solutions. All these solutions are presented and it is shown that this equation gives an exact dynamic and energetic reproduction of classical mechanics with the correct measurement theoretic limit. 23 refs

  6. Sleep onset uncovers thalamic abnormalities in patients with idiopathic generalised epilepsy

    Directory of Open Access Journals (Sweden)

    Andrew P. Bagshaw

    Full Text Available The thalamus is crucial for sleep regulation and the pathophysiology of idiopathic generalised epilepsy (IGE), and may serve as the underlying basis for the links between the two. We investigated this using EEG-fMRI with a specific emphasis on the role and functional connectivity (FC) of the thalamus. We defined three types of thalamic FC: thalamocortical, inter-hemispheric thalamic, and intra-hemispheric thalamic. Patients and controls differed in all three measures, during both wakefulness and sleep, indicating disorder-dependent and state-dependent modification of thalamic FC. Inter-hemispheric thalamic FC differed between patients and controls in somatosensory regions during wakefulness, and occipital regions during sleep. Intra-hemispheric thalamic FC was significantly higher in patients than controls following sleep onset, and disorder-dependent alterations to FC were seen in several thalamic regions, always involving somatomotor and occipital regions. As interactions between thalamic sub-regions are indirect and mediated by the inhibitory thalamic reticular nucleus (TRN), the results suggest abnormal TRN function in patients with IGE, with a regional distribution which could suggest a link with the thalamocortical networks involved in the generation of alpha rhythms. Intra-thalamic FC could be a more widely applicable marker beyond patients with IGE. Keywords: Functional connectivity, Generalised epilepsy, Sleep, Thalamic reticular nucleus, Thalamus

  7. Generalised brain edema and brain infarct in ergotamine abuse: Visualization by CT, MR and angiography

    International Nuclear Information System (INIS)

    Toedt, C.; Hoetzinger, H.; Salbeck, R.; Beyer, H.K.

    1989-01-01

    Abuse of ergotamine can cause generalised brain edema and brain infarctions. These can be visualized by CT, MR and angiography. The cause, however, can only be found in the patient's history. (orig.) [de

  8. Recent Fuzzy Generalisations of Rough Sets Theory: A Systematic Review and Methodological Critique of the Literature

    Directory of Open Access Journals (Sweden)

    Abbas Mardani

    2017-01-01

    Full Text Available Rough set theory has been used extensively in fields of complexity, cognitive sciences, and artificial intelligence, especially in numerous areas such as expert systems, knowledge discovery, information systems, inductive reasoning, intelligent systems, data mining, pattern recognition, decision-making, and machine learning. Recently proposed rough set models have been developed by applying different fuzzy generalisations. Currently, there is no systematic literature review and classification of these new generalisations of rough set models. Therefore, this review study attempts to provide a comprehensive systematic review of the methodologies and applications of recent generalisations discussed in the area of fuzzy-rough set theory. For this purpose, the Web of Science database was chosen to select the relevant papers. Accordingly, the systematic review and meta-analysis approach known as “PRISMA” was followed, and the selected articles were classified based on author and year of publication, author nationalities, application field, type of study, study category, study contribution, and journal in which the articles appeared. Based on the results of this review, we found that there are many challenging issues related to the different application areas of fuzzy-rough set theory which can motivate future research studies.

  9. Investigation of Free Particle Propagator with Generalized Uncertainty Problem

    International Nuclear Information System (INIS)

    Hassanabadi, H.; Ghobakhloo, F.

    2016-01-01

    We consider the Schrödinger equation with a generalized uncertainty principle for a free particle. We then transform the problem into a second-order ordinary differential equation and thereby obtain the corresponding propagator. The result of ordinary quantum mechanics is recovered for vanishing minimal length parameter.
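    For context, a widely used form of the generalized uncertainty principle (standard in the minimal-length literature, not spelled out in the abstract) modifies the canonical commutator with a deformation parameter β:

```latex
[\hat{x},\hat{p}] = i\hbar\left(1 + \beta\,\hat{p}^{2}\right)
\quad\Longrightarrow\quad
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^{2}\right),
```

    which implies a minimal position uncertainty Δx_min = ħ√β; letting β → 0 recovers ordinary quantum mechanics, consistent with the vanishing minimal length limit mentioned in the abstract.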

  10. Processing bias in children with separation anxiety disorder, social phobia and generalised anxiety disorder

    NARCIS (Netherlands)

    Kindt, M.; Bögels, S.M.; Morren, M.

    2003-01-01

    The present study examined processing bias in children suffering from anxiety disorders. Processing bias was assessed using the emotional Stroop task in clinically referred children with separation anxiety disorder (SAD), social phobia (SP), and/or generalised anxiety disorder (GAD) and normal

  11. Uncertainty in visual processes predicts geometrical optical illusions.

    Science.gov (United States)

    Fermüller, Cornelia; Malm, Henrik

    2004-03-01

    It is proposed in this paper that many geometrical optical illusions, as well as illusory patterns due to motion signals in line drawings, are due to the statistics of visual computations. The interpretation of image patterns is preceded by a step where image features such as lines, intersections of lines, or local image movement must be derived. However, there are many sources of noise or uncertainty in the formation and processing of images, and they cause problems in the estimation of these features; in particular, they cause bias. As a result, the locations of features are perceived erroneously and the appearance of the patterns is altered. The bias occurs with any visual processing of line features; under average conditions it is not large enough to be noticeable, but illusory patterns are such that the bias is highly pronounced. Thus, the broader message of this paper is that there is a general uncertainty principle which governs the workings of vision systems, and optical illusions are an artifact of this principle.

  12. Quantum principles and particles

    CERN Document Server

    Wilcox, Walter

    2012-01-01

    QUANTUM PRINCIPLES: Perspective and Principles; Prelude to Quantum Mechanics; Stern-Gerlach Experiment; Idealized Stern-Gerlach Results; Classical Model Attempts; Wave Functions for Two Physical-Outcome Case; Process Diagrams, Operators, and Completeness; Further Properties of Operators/Modulation; Operator Reformulation; Operator Rotation; Bra-Ket Notation/Basis States; Transition Amplitudes; Three-Magnet Setup Example-Coherence; Hermitian Conjugation; Unitary Operators; A Very Special Operator; Matrix Representations; Matrix Wave Function Recovery; Expectation Values; Wrap Up; Problems. Free Particles in One Dimension: Photoelectric Effect; Compton Effect; Uncertainty Relation for Photons; Stability of Ground States; Bohr Model; Fourier Transform and Uncertainty Relations; Schrödinger Equation; Schrödinger Equation Example; Dirac Delta Functions; Wave Functions and Probability; Probability Current; Time Separable Solutions; Completeness for Particle States; Particle Operator Properties; Operator Rules; Time Evolution and Expectation Values; Wrap-Up; Problems; Some One-Dimensional So...

  13. Entropy-power uncertainty relations: towards a tight inequality for all Gaussian pure states

    International Nuclear Information System (INIS)

    Hertz, Anaelle; Jabbour, Michael G; Cerf, Nicolas J

    2017-01-01

    We show that a proper expression of the uncertainty relation for a pair of canonically conjugate continuous variables relies on entropy power, a standard notion in Shannon information theory for real-valued signals. The resulting entropy-power uncertainty relation is equivalent to the entropic formulation of the uncertainty relation due to Bialynicki-Birula and Mycielski, but can be further extended to rotated variables. Hence, based on a reasonable assumption, we give a partial proof of a tighter form of the entropy-power uncertainty relation taking correlations into account and provide extensive numerical evidence of its validity. Interestingly, it implies the generalized (rotation-invariant) Schrödinger–Robertson uncertainty relation exactly as the original entropy-power uncertainty relation implies the Heisenberg relation. It is saturated for all Gaussian pure states, in contrast with hitherto known entropic formulations of the uncertainty principle. (paper)
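    As a reference point (these are standard definitions, not taken verbatim from the abstract), the entropy power of a continuous variable with differential entropy h is N = (2πe)^{-1} e^{2h}, and the entropy-power uncertainty relation reads:

```latex
N_{x} N_{p} \;\ge\; \frac{\hbar^{2}}{4},
\qquad
N_{x} = \frac{1}{2\pi e}\,e^{2h(x)},
\quad
N_{p} = \frac{1}{2\pi e}\,e^{2h(p)},
```

    which, on taking logarithms, is equivalent to the Bialynicki-Birula–Mycielski entropic relation h(x) + h(p) ≥ ln(πeħ) mentioned in the abstract.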

  14. Ideas underlying the Quantification of Margins and Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Pilch, Martin, E-mail: mpilch@sandia.gov [Department 1514, Sandia National Laboratories, Albuquerque, NM 87185-0828 (United States); Trucano, Timothy G. [Department 1411, Sandia National Laboratories, Albuquerque, NM 87185-0370 (United States); Helton, Jon C. [Department of Mathematics and Statistics, Arizona State University, Tempe, AZ 85287-1804 (United States)

    2011-09-15

    Key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions are described. While QMU is a broad process and methodology for generating critical technical information to be used in U.S. nuclear weapon stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, the following topics are discussed: (i) the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, (ii) the need to separate aleatory and epistemic uncertainty in QMU, and (iii) the properties of risk-informed decision making (RIDM) that are best suited for effective application of QMU. The paper is written at a high level, but provides an extensive bibliography of useful papers for interested readers to deepen their understanding of the presented ideas.
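    The separation of aleatory and epistemic uncertainty mentioned in point (ii) is often illustrated with a double-loop Monte Carlo: an outer loop over epistemic samples (imperfect knowledge of a fixed quantity) and an inner loop over aleatory samples (inherent randomness). The scenario below is entirely hypothetical, chosen only to show the structure; it is not drawn from the QMU paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Epistemic: the true failure threshold is imperfectly known (outer loop).
# Aleatory: the applied load varies randomly from trial to trial (inner loop).
n_epistemic, n_aleatory = 200, 2000
thresholds = rng.uniform(2.5, 3.5, n_epistemic)       # epistemic samples

failure_probs = []
for thr in thresholds:
    loads = rng.normal(2.0, 0.5, n_aleatory)          # aleatory samples
    failure_probs.append(np.mean(loads > thr))        # P(fail | threshold)

# Epistemic uncertainty shows up as a *range* of failure probabilities,
# not a single number.
p_lo, p_hi = min(failure_probs), max(failure_probs)
```

    Reporting the interval [p_lo, p_hi] rather than one pooled probability is what keeps the two uncertainty types from being conflated.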

  15. Experimental Realization of Popper's Experiment: Violation of Uncertainty Principle?

    Science.gov (United States)

    Kim, Yoon-Ho; Yu, Rong; Shih, Yanhua

    An entangled pair of photons, 1 and 2, is emitted in opposite directions along the positive and negative x-axis. A narrow slit is placed in the path of photon 1, which provides precise knowledge about its position along the y-axis; because of the quantum entanglement this in turn provides precise knowledge of the position y of its twin, photon 2. Does photon 2 experience a greater uncertainty in its momentum, i.e., a greater Δpy, due to the precise knowledge of its position y? This is the historical thought experiment of Sir Karl Popper, which was aimed at undermining the Copenhagen interpretation in favor of a realistic viewpoint of quantum mechanics. This paper reports an experimental realization of Popper's experiment. One may not agree with Popper's position on quantum mechanics; however, it calls for a correct understanding and interpretation of the experimental results.

  16. An anisotropic elastoplastic constitutive formulation generalised for orthotropic materials

    Science.gov (United States)

    Mohd Nor, M. K.; Ma'at, N.; Ho, C. S.

    2018-03-01

    This paper presents a finite strain constitutive model to predict a complex elastoplastic deformation behaviour that involves very high pressures and shockwaves in orthotropic materials using an anisotropic Hill's yield criterion by means of the evolving structural tensors. The yield surface of this hyperelastic-plastic constitutive model is aligned uniquely within the principal stress space due to the combination of Mandel stress tensor and a new generalised orthotropic pressure. The formulation is developed in the isoclinic configuration and allows for a unique treatment for elastic and plastic orthotropy. An isotropic hardening is adopted to define the evolution of plastic orthotropy. The important feature of the proposed hyperelastic-plastic constitutive model is the introduction of anisotropic effect in the Mie-Gruneisen equation of state (EOS). The formulation is further combined with Grady spall failure model to predict spall failure in the materials. The proposed constitutive model is implemented as a new material model in the Lawrence Livermore National Laboratory (LLNL)-DYNA3D code of UTHM's version, named Material Type 92 (Mat92). The combination of the proposed stress tensor decomposition and the Mie-Gruneisen EOS requires some modifications in the code to reflect the formulation of the generalised orthotropic pressure. The validation approach is also presented in this paper for guidance purposes. The ψ tensor used to define the alignment of the adopted yield surface is first validated. This is continued with an internal validation of the elastic isotropic, elastic orthotropic and elastic-plastic orthotropic responses of the proposed formulation before a comparison against a range of plate impact test data at 234, 450 and 895 m/s impact velocities is performed. A good agreement is obtained in each test.

  17. Generalised pruritus as a presentation of Graves' disease.

    Science.gov (United States)

    Tan, Ce; Loh, Ky

    2013-01-01

    Pruritus is a lesser known symptom of hyperthyroidism, particularly in autoimmune thyroid disorders. This is a case report of a 27-year-old woman who presented with generalised pruritus at a primary care clinic. Incidental findings of tachycardia and a goiter led to the investigations of her thyroid status. The thyroid function test revealed elevated serum free T4 and suppressed thyroid stimulating hormone levels. The anti-thyroid antibodies were positive. She was diagnosed with Graves' disease and treated with carbimazole until her symptoms subsided. Graves' disease should be considered as an underlying cause for patients presenting with pruritus. A thorough history and complete physical examination are crucial in making an accurate diagnosis. Underlying causes must be determined before treating the symptoms.

  18. Coherence Generalises Duality: A Logical Explanation of Multiparty Session Types

    DEFF Research Database (Denmark)

    Carbone, Marco; Lindley, Sam; Montesi, Fabrizio

    2016-01-01

    Wadler introduced Classical Processes (CP), a calculus based on a propositions-as-types correspondence between propositions of classical linear logic and session types. Carbone et al. introduced Multiparty Classical Processes, a calculus that generalises CP to multiparty session types, by replacing the duality of classical linear logic (relating two types) with a more general notion of coherence (relating an arbitrary number of types). This paper introduces variants of CP and MCP, plus a new intermediate calculus of Globally-governed Classical Processes (GCP). We show a tight relation between...

  19. Improvement of Statistical Decisions under Parametric Uncertainty

    Science.gov (United States)

    Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Rozevskis, Uldis

    2011-10-01

    A large number of problems in production planning and scheduling, location, transportation, finance, and engineering design require that decisions be made in the presence of uncertainty. Decision-making under uncertainty is a central problem in statistical inference, and has been formally studied in virtually all approaches to inference. The aim of the present paper is to show how the invariant embedding technique, the idea of which belongs to the authors, may be employed in the particular case of finding the improved statistical decisions under parametric uncertainty. This technique represents a simple and computationally attractive statistical method based on the constructive use of the invariance principle in mathematical statistics. Unlike the Bayesian approach, an invariant embedding technique is independent of the choice of priors. It allows one to eliminate unknown parameters from the problem and to find the best invariant decision rule, which has smaller risk than any of the well-known decision rules. To illustrate the proposed technique, application examples are given.

  20. Cosmological implications of Heisenberg's principle

    CERN Document Server

    Gonzalo, Julio A

    2015-01-01

    The aim of this book is to analyze the all-important implications of Heisenberg's Uncertainty Principle for a finite universe with very large mass-energy content such as ours. The earlier and main contributors to the formulation of Quantum Mechanics are briefly reviewed regarding the formulation of Heisenberg's Principle. After discussing “indeterminacy” versus “uncertainty”, the universal constants of physics are reviewed and Planck's units are given. Next, a novel set of units, Heisenberg–Lemaitre units, are defined in terms of the large finite mass of the universe. With the help of Heisenberg's principle, the time evolution of the finite zero-point energy for the universe is investigated quantitatively. Next, taking advantage of the rigorous solutions of Einstein's cosmological equation for a flat, open and mixed universe of finite mass, the most recent and accurate data on the “age” (t₀) and the expansion rate (H₀) of the universe and their implications are reconsidered.

  1. Uncertainty and stress: Why it causes diseases and how it is mastered by the brain.

    Science.gov (United States)

    Peters, Achim; McEwen, Bruce S; Friston, Karl

    2017-09-01

    The term 'stress' - coined in 1936 - has many definitions, but until now has lacked a theoretical foundation. Here we present an information-theoretic approach - based on the 'free energy principle' - defining the essence of stress; namely, uncertainty. We address three questions: What is uncertainty? What does it do to us? What are our resources to master it? Mathematically speaking, uncertainty is entropy, or 'expected surprise'. The 'free energy principle' rests upon the fact that self-organizing biological agents resist a tendency to disorder and must therefore minimize the entropy of their sensory states. Applied to our everyday life, this means that we feel uncertain when we anticipate that outcomes will turn out to be something other than expected - and that we are unable to avoid surprise. As all cognitive systems strive to reduce their uncertainty about future outcomes, they face a critical constraint: reducing uncertainty requires cerebral energy. The characteristic of the vertebrate brain to prioritize its own high energy needs is captured by the notion of the 'selfish brain'. Accordingly, in times of uncertainty, the selfish brain demands extra energy from the body. If, despite all this, the brain cannot reduce uncertainty, a persistent cerebral energy crisis may develop, burdening the individual with 'allostatic load' that contributes to systemic and brain malfunction (impaired memory, atherogenesis, diabetes and subsequent cardio- and cerebrovascular events). Based on the basic tenet that stress originates from uncertainty, we discuss the strategies our brain uses to avoid surprise and thereby resolve uncertainty. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
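    The identification of uncertainty with entropy, "expected surprise", is just the standard Shannon definition: the surprise of an outcome x is -log p(x), and entropy is its expectation. A minimal illustration:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = E[-log p]: the average 'surprise' -log p(x)
    over outcomes drawn from the distribution p."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

h_certain = entropy([1.0, 0.0])   # fully predictable outcome: no uncertainty
h_uniform = entropy([0.5, 0.5])   # maximally uncertain binary outcome
```

    A perfectly predictable world has zero entropy and hence, on this account, generates no stress; a fifty-fifty outcome is the most uncertain a binary event can be.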

  2. Nuclear Data Uncertainties in 2004: A Perspective

    International Nuclear Information System (INIS)

    Smith, Donald L.

    2005-01-01

    Interest in nuclear data uncertainties is growing robustly after having languished for several years. Renewed attention to this topic is being motivated by the practical need for assuring that nuclear systems will be safe, reliable, and cost effective, according to the individual requirements of each specific nuclear technology. Furthermore, applications are emerging in certain areas of basic nuclear science, e.g., in astrophysics, where, until recently, attention has focused mainly on understanding basic concepts and physics principles rather than on dealing with detailed quantitative information. The availability of fast computers and the concurrent development of sophisticated software enable nuclear data uncertainty information to be used more effectively than ever before. For example, data uncertainties and associated methodologies play useful roles in advanced data measurement, analysis, and evaluation procedures. Unfortunately, the current inventory of requisite uncertainty information is rather limited when measured against these evolving demands. Consequently, there is a real need to generate more comprehensive and reasonable nuclear data uncertainty information, and to make this available relatively soon in suitable form for use in the computer codes employed for nuclear analyses and the development of advanced nuclear energy systems. This conference contribution discusses several conceptual and technical issues that need to be addressed in meeting this demand during the next few years. The role of data uncertainties in several areas of nuclear science will also be mentioned briefly. Finally, the opportunities that ultimately will be afforded by the availability of more extensive and reasonable uncertainty information, and some technical challenges to master, will also be explored in this paper

  3. Applying the conservativeness principle to REDD to deal with the uncertainties of the estimates

    International Nuclear Information System (INIS)

    Grassi, Giacomo; Monni, Suvi; Achard, Frederic; Mollicone, Danilo; Federici, Sandro

    2008-01-01

    A common paradigm, when the reduction of emissions from deforestation is estimated for the purpose of promoting it as a mitigation option in the context of the United Nations Framework Convention on Climate Change (UNFCCC), is that high uncertainties in input data (i.e., area change and C stock change per unit area) may seriously undermine the credibility of the estimates and therefore of reduced deforestation as a mitigation option. In this paper, we show how a series of concepts and methodological tools - already existing in UNFCCC decisions and IPCC guidance documents - may greatly help to deal with the uncertainties of the estimates of reduced emissions from deforestation

  4. Applying the conservativeness principle to REDD to deal with the uncertainties of the estimates

    Energy Technology Data Exchange (ETDEWEB)

    Grassi, Giacomo; Monni, Suvi; Achard, Frederic [Institute for Environment and Sustainability, Joint Research Centre of the European Commission, I-21020 Ispra (Italy); Mollicone, Danilo [Department of Geography, University of Alcala de Henares, Madrid (Spain); Federici, Sandro

    2008-07-15

    A common paradigm, when the reduction of emissions from deforestation is estimated for the purpose of promoting it as a mitigation option in the context of the United Nations Framework Convention on Climate Change (UNFCCC), is that high uncertainties in input data (i.e., area change and C stock change per unit area) may seriously undermine the credibility of the estimates and therefore of reduced deforestation as a mitigation option. In this paper, we show how a series of concepts and methodological tools - already existing in UNFCCC decisions and IPCC guidance documents - may greatly help to deal with the uncertainties of the estimates of reduced emissions from deforestation.

  5. Harvest Regulations and Implementation Uncertainty in Small Game Harvest Management

    Directory of Open Access Journals (Sweden)

    Pål F. Moa

    2017-09-01

    Full Text Available A main challenge in harvest management is to set policies that maximize the probability that management goals are met. While the management cycle includes multiple sources of uncertainty, only some of these have received considerable attention. Currently, there is a large gap in our knowledge about the implementation of harvest regulations, and the extent to which indirect control methods such as harvest regulations are actually able to regulate harvest in accordance with intended management objectives. In this perspective article, we first summarize and discuss hunting regulations currently used in management of grouse species (Tetraonidae) in Europe and North America. Management models suggested for grouse are most often based on proportional harvest or threshold harvest principles. These models are all built on theoretical principles for sustainable harvesting, and in the end provide an estimate of a total allowable catch. However, implementation uncertainty is rarely examined in empirical or theoretical harvest studies, and few general findings have been reported. Nevertheless, circumstantial evidence suggests that many of the most popular regulations act depensatorily, so that harvest bag sizes are more limited in years (or areas) where game density is high, contrary to general recommendations. A better understanding of the implementation uncertainty related to harvest regulations is crucial in order to establish sustainable management systems. We suggest that scenario tools like Management System Evaluation (MSE) should be used more frequently to examine the robustness of currently applied harvest regulations to such implementation uncertainty until more empirical evidence is available.
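    The two families of harvest rules mentioned above can be sketched in a few lines. The function names, units and numbers are illustrative only; real grouse management models add observation error and population dynamics on top of these rules.

```python
def proportional_harvest(density_est, rate):
    """Quota set as a fixed proportion of the estimated game density."""
    return rate * density_est

def threshold_harvest(density_est, threshold, rate=1.0):
    """Harvest only the estimated surplus above a threshold density;
    below the threshold the quota is zero."""
    return max(0.0, rate * (density_est - threshold))

# Proportional: the quota tracks density. Threshold: no harvest below the bar.
quota_prop = proportional_harvest(100.0, 0.15)   # birds per km^2
quota_thr  = threshold_harvest(100.0, 60.0)
quota_low  = threshold_harvest(50.0, 60.0)       # below threshold: no take
```

    Implementation uncertainty, in these terms, is the gap between the quota such a rule prescribes and the bag that hunters actually realise in the field.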

  6. Uncertainty estimation of the self-thinning process by Maximum-Entropy Principle

    Science.gov (United States)

    Shoufan Fang; George Z. Gertner

    2000-01-01

    When available information is scarce, the Maximum-Entropy Principle can estimate the distributions of parameters. In our case study, we estimated the distributions of the parameters of the forest self-thinning process based on literature information, and we derived the conditional distribution functions and estimated the 95 percent confidence interval (CI) of the self-...
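    The Maximum-Entropy Principle in its simplest form: among all distributions on a given support matching known moment information, pick the one with largest entropy. A minimal sketch for a single mean constraint (hypothetical support and target; the actual forest self-thinning application uses literature-derived constraints), where the solution has the Gibbs form p_i ∝ exp(-λ x_i) and λ is found by bisection:

```python
import numpy as np

def maxent_with_mean(xs, target_mean, lo=-5.0, hi=5.0, iters=100):
    """Maximum-entropy distribution on support xs subject to a mean
    constraint. The solution is p_i ∝ exp(-lam * x_i); lam is found by
    bisection, since the mean is strictly decreasing in lam."""
    xs = np.asarray(xs, dtype=float)
    shift = xs.mean()

    def dist(lam):
        w = np.exp(-lam * (xs - shift))   # shift keeps exponents moderate
        return w / w.sum()

    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if dist(mid) @ xs > target_mean:
            lo = mid                       # mean too high: increase lam
        else:
            hi = mid
    return dist(0.5 * (lo + hi))

# Least-biased distribution on {0, ..., 10} with mean 3:
p = maxent_with_mean(np.arange(11), 3.0)
```

    With no constraint at all (λ = 0) the principle returns the uniform distribution; the mean constraint tilts it exponentially, which is the least-committal way to encode the known information.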

  7. Information theoretic quantification of diagnostic uncertainty.

    Science.gov (United States)

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
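    The Bayes'-rule step the essay describes, combining sensitivity, specificity and pre-test probability into a post-test probability, fits in a few lines. The function name and the example numbers are illustrative, not taken from the essay:

```python
def post_test_probability(pre, sens, spec, positive=True):
    """Bayes' rule for a binary diagnostic test: update the pre-test
    disease probability given a positive or negative result."""
    if positive:
        tp = sens * pre                   # P(test positive and disease)
        fp = (1.0 - spec) * (1.0 - pre)   # P(test positive, no disease)
        return tp / (tp + fp)
    fn = (1.0 - sens) * pre               # P(test negative and disease)
    tn = spec * (1.0 - pre)               # P(test negative, no disease)
    return fn / (fn + tn)

# A 90%-sensitive, 90%-specific test moves a 50% prior to 90% on a
# positive result, but the same positive result moves a 1% prior only
# to about 8% -- the base-rate effect behind "unexpected test results".
p_pos_even = post_test_probability(0.50, 0.9, 0.9)
p_pos_rare = post_test_probability(0.01, 0.9, 0.9)
```

    The contrast between the two cases is exactly why uncertainty in the pre-test probability estimate, the essay's focus, matters so much for interpreting the post-test number.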

  8. Risk, Uncertainty and Precaution in Science: The Threshold of the Toxicological Concern Approach in Food Toxicology.

    Science.gov (United States)

    Bschir, Karim

    2017-04-01

    Environmental risk assessment is often affected by severe uncertainty. The frequently invoked precautionary principle helps to guide risk assessment and decision-making in the face of scientific uncertainty. In many contexts, however, uncertainties play a role not only in the application of scientific models but also in their development. Building on recent literature in the philosophy of science, this paper argues that precaution should be exercised at the stage when tools for risk assessment are developed as well as when they are used to inform decision-making. The relevance and consequences of this claim are discussed in the context of the threshold of the toxicological concern approach in food toxicology. I conclude that the approach does not meet the standards of an epistemic version of the precautionary principle.

  9. Uncertainty quantification in lattice QCD calculations for nuclear physics

    Energy Technology Data Exchange (ETDEWEB)

    Beane, Silas R. [Univ. of Washington, Seattle, WA (United States); Detmold, William [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Orginos, Kostas [College of William and Mary, Williamsburg, VA (United States); Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Savage, Martin J. [Institute for Nuclear Theory, Seattle, WA (United States)

    2015-02-05

    The numerical technique of Lattice QCD holds the promise of connecting the nuclear forces, nuclei, the spectrum and structure of hadrons, and the properties of matter under extreme conditions with the underlying theory of the strong interactions, quantum chromodynamics. A distinguishing, and thus far unique, feature of this formulation is that all of the associated uncertainties, both statistical and systematic, can in principle be systematically reduced to any desired precision with sufficient computational and human resources. We review the sources of uncertainty inherent in Lattice QCD calculations for nuclear physics, and discuss how each is quantified in current efforts.

  10. Threshold corrections, generalised prepotentials and Eichler integrals

    CERN Document Server

    Angelantonj, Carlo; Pioline, Boris

    2015-06-12

    We continue our study of one-loop integrals associated to BPS-saturated amplitudes in $\mathcal{N}=2$ heterotic vacua. We compute their large-volume behaviour, and express them as Fourier series in the complexified volume, with Fourier coefficients given in terms of Niebur-Poincaré series in the complex structure modulus. The closure of Niebur-Poincaré series under modular derivatives implies that such integrals derive from holomorphic prepotentials $f_n$, generalising the familiar prepotential of $\mathcal{N}=2$ supergravity. These holomorphic prepotentials transform anomalously under T-duality, in a way characteristic of Eichler integrals. We use this observation to compute their quantum monodromies under the duality group. We extend the analysis to modular integrals with respect to Hecke congruence subgroups, which naturally arise in compactifications on non-factorisable tori and freely-acting orbifolds. In this case, we derive new explicit results including closed-form expressions for integrals involv...

  11. The principle of optimisation: reasons for success and legal criticism

    International Nuclear Information System (INIS)

    Fernandez Regalado, Luis

    2008-01-01

    The International Commission on Radiological Protection (ICRP) adopted new recommendations in 2007. In broad outline, they fundamentally continue the recommendations already approved in 1990 and later. The principle of optimisation of protection, together with the principles of justification and dose limits, continues to play a key role in the ICRP recommendations, as it has for many years. This principle, somewhat reinforced in the 2007 ICRP recommendations, has been incorporated into norms and legislation that have been in force without controversy in many countries all over the world. There are three main reasons for the success of the principle of optimisation in radiological protection. First, the subjectivity of the phrase that embodies the principle, 'As low as reasonably achievable' (ALARA), which allows different valid interpretations under different circumstances. Second, the pragmatism and adaptability of ALARA to all exposure situations. And third, the scientific humility behind the principle of optimisation, in clear contrast with the old-fashioned scientific positivism that enshrined scientists' opinions. Nevertheless, from a legal point of view, some criticism has been cast over the principle of optimisation in radiological protection where it has been transformed into a compulsory norm. This criticism is based on two arguments: the lack of democratic participation in the process of elaborating the norm, and the legal uncertainty associated with its application. Both arguments are known to the ICRP which, on the one hand, has broadened the participation of experts, associations and the professional radiological protection community, increasing transparency on how decisions on recommendations have been taken, and, on the other hand, has warned about the need for authorities to specify general criteria to develop the principle of optimisation in national

  12. Understanding and applying principles of social cognition and ...

    Science.gov (United States)

    Environmental governance systems are under greater pressure to adapt and to cope with increased social and ecological uncertainty from stressors like climate change. We review principles of social cognition and decision making that shape and constrain how environmental governance systems adapt. We focus primarily on the interplay between key decision makers in society and legal systems. We argue that adaptive governance must overcome three cooperative dilemmas to facilitate adaptation: (1) encouraging collaborative problem solving, (2) garnering social acceptance and commitment, and (3) cultivating a culture of trust and tolerance for change and uncertainty. However, to do so governance systems must cope with biases in people’s decision making that cloud their judgment and create conflict. These systems must also satisfy people’s fundamental needs for self-determination, fairness, and security, ensuring that changes to environmental governance are perceived as legitimate, trustworthy, and acceptable. We discuss the implications of these principles for common governance solutions (e.g., public participation, enforcement) and conclude with methodological recommendations. We outline how scholars can investigate the social cognitive principles involved in cases of adaptive governance. Social-ecological stressors place significant pressure on major societal systems, triggering adaptive reforms in human governance and environmental law. Though potentially benefici

  13. Efficacy of oral afoxolaner for the treatment of canine generalised demodicosis

    Directory of Open Access Journals (Sweden)

    Beugnet Frédéric

    2016-01-01

    The efficacy of oral treatment with a chewable tablet containing afoxolaner 2.27% w/w (NexGard®, Merial) was assessed in eight dogs diagnosed with generalised demodicosis and compared with the efficacy of a topical combination of imidacloprid/moxidectin (Advocate®, Bayer) in eight dogs. Afoxolaner was administered at the recommended dose (at least 2.5 mg/kg) on Days 0, 14, 28 and 56. The topical combination of imidacloprid/moxidectin was given at the same intervals at the recommended concentration. Clinical examinations and deep skin scrapings were performed every month in order to evaluate the effect on mite numbers and the resolution of clinical signs. The percentage reductions of mite counts were 99.2%, 99.9% and 100% on Days 28, 56 and 84, respectively, in the afoxolaner-treated group, compared to 89.8%, 85.2% and 86.6% on Days 28, 56 and 84 in the imidacloprid/moxidectin-treated group. The skin condition of the dogs also improved significantly from Day 28 to Day 84 in the afoxolaner-treated group. Mite reductions were significantly higher on Days 28, 56 and 84 in the afoxolaner-treated group than in the imidacloprid/moxidectin-treated group. The results of this study demonstrated that afoxolaner, given orally, was effective in treating dogs with generalised demodicosis within a two-month period.

  14. Quantifying uncertainty in nuclear analytical measurements

    International Nuclear Information System (INIS)

    2004-07-01

    The lack of international consensus on the expression of uncertainty in measurements was recognised by the late 1970s and led, after the issuance of a series of rather generic recommendations, to the publication in 1993 of a general guide known as GUM, the Guide to the Expression of Uncertainty in Measurement. This publication was based on co-operation over several years by the Bureau International des Poids et Mesures, the International Electrotechnical Commission, the International Federation of Clinical Chemistry, the International Organization for Standardization (ISO), the International Union of Pure and Applied Chemistry, the International Union of Pure and Applied Physics and the Organisation internationale de metrologie legale. The purpose was to promote full information on how uncertainty statements are arrived at and to provide a basis for harmonized reporting and the international comparison of measurement results. The need to provide more specific guidance to different measurement disciplines was soon recognized, and the field of analytical chemistry was addressed by EURACHEM in 1995 in the first edition of a guidance report on Quantifying Uncertainty in Analytical Measurements, produced by a group of experts from the field. That publication translated the general concepts of the GUM into specific applications for analytical laboratories and illustrated the principles with a series of selected examples as a didactic tool. Based on feedback from actual practice, the EURACHEM publication was extensively reviewed in 1997-1999 under the auspices of the Co-operation on International Traceability in Analytical Chemistry (CITAC), and a second edition was published in 2000. Still, except for a single example on the measurement of radioactivity in GUM, the field of nuclear and radiochemical measurements was not covered. The explicit requirement of ISO standard 17025:1999, General Requirements for the Competence of Testing and Calibration

  15. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    Science.gov (United States)

    Liang, D.; Liu, X.

    2017-12-01

    3D geological models can describe complicated geological phenomena in an intuitive way, but their application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyse each source separately, ignoring the comprehensive impact of multi-source uncertainties. To evaluate the overall uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework of uncertainty integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the combined uncertainty of the geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty is a kind of simulation of the uncertainty propagation, and Bayesian inference accomplishes the uncertainty updating during modeling. The maximum entropy principle is effective for estimating the prior probability distribution, ensuring that the prior is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtain a posterior distribution that evaluates the combined uncertainty of the geological model; it represents the joint impact of all the uncertain factors on the spatial structure of the model. The framework provides a solution for evaluating the combined impact of multi-source uncertainties on a geological model and an approach to studying the uncertainty propagation mechanism in geological modeling.
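    The core mechanics of the framework, a maximum-entropy prior updated by Bayesian inference with observational error, can be sketched for a single parameter. This is an illustrative grid approximation, not the authors' model; all numbers (depth range, borehole value, error) are hypothetical.

```python
import numpy as np

# One uncertain geological parameter (e.g. a layer depth, in metres).
depths = np.linspace(90.0, 110.0, 401)   # candidate depths on a grid

# Maximum-entropy prior given only a known range: uniform on [90, 110]
prior = np.full_like(depths, 1.0 / len(depths))

# Borehole observation: 101.0 m with 2.0 m Gaussian measurement error
obs, sigma = 101.0, 2.0
likelihood = np.exp(-0.5 * ((depths - obs) / sigma) ** 2)

# Posterior is proportional to prior times likelihood (normalised)
posterior = prior * likelihood
posterior /= posterior.sum()

# Posterior mean and spread summarise the updated uncertainty
mean = float((depths * posterior).sum())
std = float(np.sqrt(((depths - mean) ** 2 * posterior).sum()))
print(round(mean, 2), round(std, 2))
```

    Further data sources (spatial randomness, expert judgement) would enter the same way, each multiplying in a further likelihood term, which is the "gradual integration" the abstract describes.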

  16. The precautionary principle and pharmaceutical risk management.

    Science.gov (United States)

    Callréus, Torbjörn

    2005-01-01

    Although it is often vigorously contested and has several different formulations, the precautionary principle has in recent decades guided environmental policy making in the face of scientific uncertainty. Originating from a criticism of traditional risk assessment, the key element of the precautionary principle is the justification for acting in the face of uncertain knowledge about risks. In the light of its growing invocation in various areas that are related to public health and recently in relation to drug safety issues, this article presents an introductory review of the main elements of the precautionary principle and some arguments conveyed by its advocates and opponents. A comparison of the characteristics of pharmaceutical risk management and environmental policy making (i.e. the setting within which the precautionary principle evolved), indicates that several important differences exist. If believed to be of relevance, in order to avoid arbitrary and unpredictable decision making, both the interpretation and possible application of the precautionary principle need to be adapted to the conditions of pharmaceutical risk management.

  17. Generalised partition functions: inferences on phase space distributions

    Directory of Open Access Journals (Sweden)

    R. A. Treumann

    2016-06-01

    It is demonstrated that the statistical mechanical partition function can be used to construct various different forms of phase space distributions. This indicates that its structure is not restricted to the Gibbs–Boltzmann factor prescription which is based on counting statistics. With the widely used replacement of the Boltzmann factor by a generalised Lorentzian (also known as the q-deformed exponential function, where κ = 1∕|q − 1|, with κ, q ∈ R), both the kappa-Bose and kappa-Fermi partition functions are obtained in quite a straightforward way, from which the conventional Bose and Fermi distributions follow for κ → ∞. For κ ≠ ∞ these are subject to the restriction that they can be used only at temperatures far from zero. They thus, as shown earlier, have little value for quantum physics. This is reasonable, because physical κ systems imply strong correlations which are absent at zero temperature where, apart from stochastics, all dynamical interactions are frozen. In the classical large-temperature limit one obtains physically reasonable κ distributions which depend on energy respectively momentum as well as on chemical potential. Looking for other functional dependencies, we examine whether Bessel functions can be used to obtain valid distributions. Again, and for the same reason, no Fermi and Bose distributions exist in the low-temperature limit. However, a classical Bessel–Boltzmann distribution can be constructed which is a Bessel-modified Lorentzian distribution. Whether it makes any physical sense remains an open question. This is not investigated here. The choice of Bessel functions is motivated solely by their convergence properties and not by reference to any physical demands. This result suggests that the Gibbs–Boltzmann partition function is fundamental not only to Gibbs–Boltzmann but also to a large class of generalised Lorentzian distributions as well as to the
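    The limit κ → ∞ mentioned above can be checked numerically. The sketch below uses one common convention for the q-deformed (generalised Lorentzian) Boltzmann factor with κ = 1/|q − 1|; the specific normalisation conventions of the paper may differ, so this is an assumption for illustration only.

```python
import math

def q_boltzmann(x, kappa):
    """Generalised Lorentzian factor (1 + x/kappa)^(-kappa).

    Reduces to the Boltzmann factor exp(-x) as kappa -> infinity.
    """
    return (1.0 + x / kappa) ** (-kappa)

x = 2.0  # dimensionless energy beta*E
for kappa in (5, 50, 500, 5000):
    print(kappa, round(q_boltzmann(x, kappa), 6))
print("exp", round(math.exp(-x), 6))
```

    The printed values decrease toward exp(−2) ≈ 0.135335 as κ grows, illustrating how the conventional distribution is recovered in the limit.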

  18. Enhancing the Therapy Experience Using Principles of Video Game Design.

    Science.gov (United States)

    Folkins, John Wm; Brackenbury, Tim; Krause, Miriam; Haviland, Allison

    2016-02-01

    This article considers the potential benefits that applying design principles from contemporary video games may have on enhancing therapy experiences. Six principles of video game design are presented, and their relevance for enriching clinical experiences is discussed. The motivational and learning benefits of each design principle have been discussed in the education literature as having positive impacts on student motivation and learning and are related here to aspects of clinical practice. The essential experience principle suggests connecting all aspects of the experience around a central emotion or cognitive connection. The discovery principle promotes indirect learning in focused environments. The risk-taking principle addresses the uncertainties clients face when attempting newly learned skills in novel situations. The generalization principle encourages multiple opportunities for skill transfer. The reward system principle directly relates to the scaffolding of frequent and varied feedback in treatment. Last, the identity principle can assist clients in using their newly learned communication skills to redefine self-perceptions. These principles highlight areas for research and interventions that may be used to reinforce or advance current practice.

  19. Birthplace Diversity, Income Inequality and Education Gradients in Generalised Trust: The Relevance of Cognitive Skills in 29 Countries. OECD Education Working Papers, No. 164

    Science.gov (United States)

    Borgonovi, Francesca; Pokropek, Artur

    2017-01-01

    The paper examines between-country differences in the mechanisms through which education could promote generalised trust using data from 29 countries participating in the OECD's Survey of Adult Skills (PIAAC). Results indicate that education is strongly associated with generalised trust and that a large part of this association is mediated by…

  20. The neurobiology of uncertainty: implications for statistical learning.

    Science.gov (United States)

    Hasson, Uri

    2017-01-05

    The capacity for assessing the degree of uncertainty in the environment relies on estimating statistics of temporally unfolding inputs. This, in turn, allows calibration of predictive and bottom-up processing, and signalling changes in temporally unfolding environmental features. In the last decade, several studies have examined how the brain codes for and responds to input uncertainty. Initial neurobiological experiments implicated frontoparietal and hippocampal systems, based largely on paradigms that manipulated distributional features of visual stimuli. However, later work in the auditory domain pointed to different systems, whose activation profiles have interesting implications for computational and neurobiological models of statistical learning (SL). This review begins by briefly recapping the historical development of ideas pertaining to the sensitivity to uncertainty in temporally unfolding inputs. It then discusses several issues at the interface of studies of uncertainty and SL. Following, it presents several current treatments of the neurobiology of uncertainty and reviews recent findings that point to principles that serve as important constraints on future neurobiological theories of uncertainty, and relatedly, SL. This review suggests it may be useful to establish closer links between neurobiological research on uncertainty and SL, considering particularly mechanisms sensitive to local and global structure in inputs, the degree of input uncertainty, the complexity of the system generating the input, learning mechanisms that operate on different temporal scales and the use of learnt information for online prediction. This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).

  1. Safety and efficacy of eculizumab in anti-acetylcholine receptor antibody-positive refractory generalised myasthenia gravis (REGAIN)

    DEFF Research Database (Denmark)

    Howard, James F; Utsugisawa, Kimiaki; Benatar, Michael

    2017-01-01

    BACKGROUND: Complement is likely to have a role in refractory generalised myasthenia gravis, but no approved therapies specifically target this system. Results from a phase 2 study suggested that eculizumab, a terminal complement inhibitor, produced clinically meaningful improvements in patients...... with anti-acetylcholine receptor antibody-positive refractory generalised myasthenia gravis. We further assessed the efficacy and safety of eculizumab in this patient population in a phase 3 trial. METHODS: We did a phase 3, randomised, double-blind, placebo-controlled, multicentre study (REGAIN) in 76...... hospitals and specialised clinics in 17 countries across North America, Latin America, Europe, and Asia. Eligible patients were aged at least 18 years, with a Myasthenia Gravis-Activities of Daily Living (MG-ADL) score of 6 or more, Myasthenia Gravis Foundation of America (MGFA) class II-IV disease...

  2. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab
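    The deterministic uncertainty analysis described above propagates parameter probability distributions through model sensitivities (derivatives). The sketch below does this first-order propagation for a toy model with hand-written derivatives; GRESS and ADGEN obtain such derivatives automatically by computer calculus, so the model, parameter values and standard deviations here are purely hypothetical.

```python
import math

t = 2.0  # fixed evaluation point for the toy model

def model(p1, p2):
    """Toy response R(p1, p2) = p1 * exp(-p2 * t)."""
    return p1 * math.exp(-p2 * t)

def sensitivities(p1, p2):
    """Analytic derivatives dR/dp1 and dR/dp2 (written by hand here)."""
    dR_dp1 = math.exp(-p2 * t)
    dR_dp2 = -p1 * t * math.exp(-p2 * t)
    return dR_dp1, dR_dp2

# First-order deterministic uncertainty analysis: combine parameter
# standard deviations through the sensitivities (linear propagation).
p1, p2 = 10.0, 0.3
s1, s2 = 0.5, 0.05   # assumed parameter standard deviations
d1, d2 = sensitivities(p1, p2)
var_R = (d1 * s1) ** 2 + (d2 * s2) ** 2
print(round(model(p1, p2), 4), round(math.sqrt(var_R), 4))
```

    The point of the analytic (deterministic) route is that `sensitivities` comes from the model equations themselves rather than from repeated sampling of the model.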

  3. Quantum Uncertainty and Fundamental Interactions

    Directory of Open Access Journals (Sweden)

    Tosto S.

    2013-04-01

    The paper proposes a simplified theoretical approach to infer some essential concepts on the fundamental interactions between charged particles and their relative strengths at comparable energies by exploiting the quantum uncertainty only. The worth of the present approach relies on the way of obtaining the results, rather than on the results themselves: concepts today acknowledged as fingerprints of the electroweak and strong interactions appear indeed rooted in the same theoretical frame including also the basic principles of special and general relativity along with the gravity force.

  4. Vacancies and a generalised melting curve of metals

    International Nuclear Information System (INIS)

    Gorecki, T.

    1979-01-01

    The vacancy mechanism of the melting process is used as a starting point for deriving an expression for the pressure dependence of the melting temperature of metals. The results obtained for the initial slope of the melting curve are compared with experimental data for 45 metals and in most cases the agreement is very good. The nonlinearity of the melting curve and the appearance of a maximum on the melting curve at a pressure approximately equal to the bulk modulus are also predicted, in qualitative agreement with experimental data. A relation between bonding energy, atomic volume, and bulk modulus of metals is established. On the basis of this relation and the proposed vacancy mechanism, a generalised equation for the pressure dependence of the melting temperature of metals is derived. (author)

  5. Multiplicative quiver varieties and generalised Ruijsenaars-Schneider models

    Science.gov (United States)

    Chalykh, Oleg; Fairon, Maxime

    2017-11-01

    We study some classical integrable systems naturally associated with multiplicative quiver varieties for the (extended) cyclic quiver with m vertices. The phase space of our integrable systems is obtained by quasi-Hamiltonian reduction from the space of representations of the quiver. Three families of Poisson-commuting functions are constructed and written explicitly in suitable Darboux coordinates. The case m = 1 corresponds to the tadpole quiver and the Ruijsenaars-Schneider system and its variants, while for m > 1 we obtain new integrable systems that generalise the Ruijsenaars-Schneider system. These systems and their quantum versions also appeared recently in the context of supersymmetric gauge theory and cyclotomic DAHAs (Braverman et al. [32,34,35] and Kodera and Nakajima [36]), as well as in the context of the Macdonald theory (Chalykh and Etingof, 2013).

  6. Beyond the relativistic point particle: A reciprocally invariant system and its generalisation

    International Nuclear Information System (INIS)

    Pavsic, Matej

    2009-01-01

    We investigate a reciprocally invariant system proposed by Low and Govaerts et al., whose action contains both the orthogonal and the symplectic forms and is invariant under global O(2,4) ∩ Sp(2,4) transformations. We find that the general solution to the classical equations of motion has no linear term in the evolution parameter, τ, but only the oscillatory terms, and therefore cannot represent a particle propagating in spacetime. As a remedy, we consider a generalisation of the action by adopting a procedure similar to that of Bars et al., who introduced the concept of a τ derivative that is covariant under local Sp(2) transformations between the phase space variables x^μ(τ) and p^μ(τ). This system, in particular, is similar to a rigid particle whose action contains the extrinsic curvature of the world line, which turns out to be helical in spacetime. Another possible generalisation is the introduction of a symplectic potential proposed by Montesinos. We show how the latter approach is related to Kaluza-Klein theories and to the concept of Clifford space, a manifold whose tangent space at any point is the Clifford algebra Cl(8), a promising framework for the unification of particles and forces.

  7. Top-down instead of bottom-up estimates of uncertainty in INAA results?

    International Nuclear Information System (INIS)

    Bode, P.; De Nadai Fernandes, E.A.

    2005-01-01

    The initial publication of the ISO Guide to the Expression of Uncertainty in Measurement (GUM) and many related documents has resulted in a worldwide awareness of the importance of a realistic estimate of the value reported after the +/- sign. The evaluation of uncertainty in measurement, as introduced by the GUM, is derived from the principles applied in physical measurements. Many testing laboratories have experienced large problems in applying these principles in, e.g., (bio)chemical measurements, resulting in time-consuming evaluations and costly additional experiments. Other, more pragmatic and less costly approaches have been proposed to obtain a realistic estimate of the range in which the true value of the measurement may be found with a certain degree of probability. One of these approaches, the 'top-down method', is based on the standard deviation of the results of intercomparison data. This approach is much easier for tests for which it is difficult to establish a full measurement equation, or for which matrix-matching reference materials are absent. It has been demonstrated that the GUM 'bottom-up' approach of evaluating uncertainty in measurement can easily be applied in instrumental neutron activation analysis (INAA), as all significant sources of uncertainty can be evaluated. INAA is therefore a valuable technique for testing the validity of the top-down approach. In this contribution, examples of the top-down evaluation of uncertainty in INAA derived from participation in intercomparison rounds and proficiency testing schemes will be presented. The results will be compared with the bottom-up evaluation of uncertainty, and the ease of applicability, validity and usefulness of both approaches will be discussed.
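    The two estimation routes compared above can be sketched side by side: a bottom-up combination of individual uncertainty components in quadrature versus a top-down estimate from the spread of intercomparison results. All component values and laboratory results below are hypothetical, chosen only to show the arithmetic.

```python
import math

# Bottom-up: combine relative standard uncertainty components of one
# result in quadrature (e.g. counting statistics, flux, geometry,
# comparator purity; values are illustrative).
components = [0.012, 0.008, 0.005, 0.003]
bottom_up = math.sqrt(sum(u * u for u in components))

# Top-down: estimate the uncertainty from the spread of results
# reported by laboratories in an intercomparison (hypothetical, mg/kg).
results = [4.05, 3.98, 4.12, 4.01, 3.95, 4.08]
n = len(results)
mean = sum(results) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in results) / (n - 1))
top_down = sd / mean   # relative standard deviation between labs

print(round(bottom_up, 4), round(top_down, 4))
```

    When the bottom-up budget is complete, the two relative uncertainties should be of comparable size; a top-down value much larger than the bottom-up one signals a missing component in the budget.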

  8. Managing Measurement Uncertainty in Building Acoustics

    Directory of Open Access Journals (Sweden)

    Chiara Scrosati

    2015-12-01

    In general, uncertainties should preferably be determined following the principles laid down in ISO/IEC Guide 98-3, the Guide to the expression of uncertainty in measurement (GUM:1995). According to current knowledge, it seems impossible to formulate these models for the different quantities in building acoustics. Therefore, the concepts of repeatability and reproducibility are necessary to determine the uncertainty of building acoustics measurements. This study shows the uncertainty of field measurements of a lightweight wall, a heavyweight floor, a façade with a single glazing window and a façade with a double glazing window that were analyzed by a Round Robin Test (RRT), conducted in a full-scale experimental building at ITC-CNR (Construction Technologies Institute of the National Research Council of Italy). The single number quantities and their uncertainties were evaluated in both narrow and enlarged ranges, and it was shown that including or excluding the low frequencies leads to very significant differences, except in the case of the sound insulation of façades with a single glazing window. The results obtained in these RRTs were compared with other results from the literature, which confirm the increase of the uncertainty of single number quantities due to the low-frequency extension. Having stated the measurement uncertainty for a single measurement, in building acoustics it is also very important to deal with sampling for the purposes of classification of buildings or building units. Therefore, this study also shows an application of the sampling included in the Italian Standard on the acoustic classification of building units on a serial-type building consisting of 47 building units. It was found that the greatest variability is observed in the façade, and it depends both on the great variability of window typologies and on workmanship. Finally, it is suggested how to manage the uncertainty in building acoustics, both for one single
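    The repeatability/reproducibility route mentioned above can be sketched with a simplified calculation in the spirit of ISO 5725: pool the within-laboratory variances from a Round Robin Test for repeatability, then combine them with the between-laboratory spread for reproducibility. The laboratories and measured values below are hypothetical.

```python
import statistics

# Hypothetical RRT data: each lab measures the same wall three times
# (weighted sound reduction index, dB).
labs = {
    "lab_A": [52.1, 52.4, 52.0],
    "lab_B": [51.2, 51.5, 51.4],
    "lab_C": [53.0, 52.7, 52.9],
}

# Repeatability: pooled within-lab standard deviation
within = [statistics.variance(v) for v in labs.values()]
s_r = (sum(within) / len(within)) ** 0.5

# Between-lab spread of the laboratory means
means = [statistics.mean(v) for v in labs.values()]
s_L = statistics.stdev(means)

# Simplified reproducibility: within- plus between-lab components
s_R = (s_r ** 2 + s_L ** 2) ** 0.5
print(round(s_r, 2), round(s_R, 2))
```

    This is a deliberately simplified estimator (the full ISO 5725 treatment subtracts part of the repeatability variance from the between-lab term), but it shows why reproducibility is always at least as large as repeatability.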

  9. Generalised and abdominal adiposity are important risk factors for chronic disease in older people: results from a nationally representative survey.

    Science.gov (United States)

    Hirani, V

    2011-06-01

    To look at the trends in prevalence of generalised (body mass index (BMI) ≥ 25 kg/m2) and abdominal obesity (waist circumference (WC) > 102 cm, men; > 88 cm, women) among older people from 1993 to 2008, the prevalence of chronic disease by overweight/obesity and WC categories in England 2005, and to evaluate the association of these measures with chronic diseases. Analyses of nationally representative cross-sectional population surveys, the Health Survey for England (HSE). Non-institutionalised men and women aged ≥ 65 years (in HSE 2005, 1512 men and 1747 women). Height, weight, waist circumference and blood pressure measurements were taken according to standardised HSE protocols. Information was collected on socio-demographic factors, health behaviour and doctor-diagnosed health conditions. Generalised obesity and abdominal obesity increased among men and women from 1993 to 2008. The HSE 2005 focussed on older people: 72% of men and 68% of women aged over 65 were either overweight or obese. The prevalence of raised WC was higher in women (58%) than in men (46%). The prevalence of diabetes and arthritis was higher in people with generalised obesity in both sexes. Men were more likely to have had a joint replacement and had a higher prevalence of stroke if they were overweight only, whereas women were more likely to have had a joint replacement only if they were obese (13%) and had a higher risk of falls with generalised obesity. The pattern was similar for the prevalence of chronic diseases by raised WC. Multivariate analysis showed that generalised and abdominal obesity were independently associated with risk of hypertension, diabetes and arthritis in both men and women. In women only, there was an association between generalised obesity and having a fall in the last year (OR: 1.5), and between abdominal obesity and having a joint replacement (OR: 1.9, p=0.01). Complications of obesity, such as diabetes, hypertension and arthritis, are more common in men and women aged over 65 who are
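    The classification cut-offs stated in the abstract translate directly into code. A minimal sketch using exactly those thresholds (BMI ≥ 25 kg/m2; WC > 102 cm for men, > 88 cm for women); the example measurements are hypothetical.

```python
def generalised_overweight(weight_kg, height_m):
    """BMI >= 25 kg/m2: generalised overweight/obesity."""
    bmi = weight_kg / height_m ** 2
    return bmi >= 25.0

def abdominal_obesity(waist_cm, sex):
    """Raised waist circumference: > 102 cm (men), > 88 cm (women)."""
    threshold = 102.0 if sex == "male" else 88.0
    return waist_cm > threshold

print(generalised_overweight(82.0, 1.70))  # BMI ~28.4 -> True
print(abdominal_obesity(95.0, "female"))   # > 88 cm   -> True
print(abdominal_obesity(95.0, "male"))     # <= 102 cm -> False
```

    Note that the same waist measurement is classified differently by sex, which is why the survey reports raised WC prevalence separately for men and women.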

  10. Do horses generalise between objects during habituation?

    DEFF Research Database (Denmark)

    Christensen, Janne Winther; Zharkikh, Tatjana; Ladevig, Jan

    2008-01-01

    Habituation to frightening stimuli plays an important role in horse training. To investigate the extent to which horses generalise between different visual objects, 2-year-old stallions were habituated to feeding from a container placed inside a test arena and assigned as TEST (n = 12) or REFERENCE...... horses (n = 12). In Experiment 1, TEST horses were habituated to six objects (ball, barrel, board, box, cone, cylinder) presented in sequence in a balanced order. The objects were of similar size but different colour. Each object was placed 0.5 m in front of the feed container, forcing the horses to pass...... the object to get to the food. TEST horses received as many 2 min exposures to each object as required to meet a habituation criterion. We recorded behavioural reactions to the object, latency to feed, total eating time, and heart rate (HR) during all exposures. There was no significant decrease in initial...

  11. Generalised Category Attack—Improving Histogram-Based Attack on JPEG LSB Embedding

    Science.gov (United States)

    Lee, Kwangsoo; Westfeld, Andreas; Lee, Sangjin

    We present a generalised and improved version of the category attack on LSB steganography in JPEG images with a straddled embedding path. It detects low embedding rates more reliably and is less disturbed by double-compressed images. The proposed methods are evaluated on several thousand images, and the results are compared to both recent blind and specific attacks for JPEG embedding. The proposed attack permits more reliable detection, although it is based on first-order statistics only. Its simple structure makes it very fast.

  12. A generalised Green-Julg theorem for proper groupoids and Banach algebras

    OpenAIRE

    Paravicini, Walther

    2009-01-01

    The Green-Julg theorem states that K_0^G(B) is isomorphic to K_0(L^1(G,B)) for every compact group G and every G-C*-algebra B. We formulate a generalisation of this result to proper groupoids and Banach algebras and deduce that the Bost assembly map is surjective for proper Banach algebras. On the way, we show that the spectral radius of an element in a C_0(X)-Banach algebra can be calculated from the spectral radius in the fibres.

  13. Uncertainty analysis and validation of environmental models. The empirically based uncertainty analysis

    International Nuclear Information System (INIS)

    Monte, Luigi; Hakanson, Lars; Bergstroem, Ulla; Brittain, John; Heling, Rudie

    1996-01-01

    The principles of Empirically Based Uncertainty Analysis (EBUA) are described. EBUA is based on the evaluation of 'performance indices' that express the level of agreement between the model and sets of empirical independent data collected in different experimental circumstances. Some of these indices may be used to evaluate the confidence limits of the model output. The method is based on the statistical analysis of the distribution of the index values and on the quantitative relationship of these values with the ratio 'experimental data/model output'. Some performance indices are described in the present paper. Among these, the so-called 'functional distance' (d) between the logarithm of the model output and the logarithm of the experimental data, defined as d^2 = Σ_{i=1}^{n} (ln M_i - ln O_i)^2 / n, where M_i is the i-th experimental value, O_i the corresponding model evaluation and n the number of pairs 'experimental value, predicted value', is an important tool for the EBUA method. From the statistical distribution of this performance index, it is possible to infer the characteristics of the distribution of the ratio 'experimental data/model output' and, consequently, to evaluate the confidence limits for the model predictions. This method was applied to calculate the uncertainty level of a model developed to predict the migration of radiocaesium in lacustrine systems. Unfortunately, performance indices are affected by the uncertainty of the experimental data used in validation. Indeed, measurement results of environmental levels of contamination are generally associated with large uncertainty due to the measurement and sampling techniques and to the large variability in space and time of the measured quantities. It is demonstrated that this non-desired effect, in some circumstances, may be corrected by means of simple formulae.
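
    A sketch of how the 'functional distance' index can be computed in practice; the measured and predicted values below are hypothetical, and only the formula is taken from the text:

```python
import math

# Sketch of the EBUA 'functional distance' index with hypothetical data.

def functional_distance(data, model):
    # d^2 = sum_{i=1}^{n} (ln M_i - ln O_i)^2 / n, with M_i the i-th
    # experimental value and O_i the corresponding model evaluation.
    n = len(data)
    d2 = sum((math.log(m) - math.log(o)) ** 2 for m, o in zip(data, model)) / n
    return math.sqrt(d2)

def multiplicative_band(d):
    # If the ratio 'experimental data/model output' is roughly lognormal,
    # exp(+/- d) gives an approximate one-sigma multiplicative band.
    return math.exp(-d), math.exp(d)

measured = [1.2, 0.8, 2.5, 1.9]    # hypothetical radiocaesium observations
predicted = [1.0, 1.0, 2.0, 2.0]   # corresponding model output
d = functional_distance(measured, predicted)
lo, hi = multiplicative_band(d)
```

The lognormal band is one simple way such an index can be turned into confidence limits on model predictions, as the abstract describes.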

  14. Modelling of extreme minimum rainfall using generalised extreme value distribution for Zimbabwe

    Directory of Open Access Journals (Sweden)

    Delson Chikobvu

    2015-09-01

    We modelled the mean annual rainfall for data recorded in Zimbabwe from 1901 to 2009. Extreme value theory was used to estimate the probabilities of meteorological droughts. Droughts can be viewed as extreme events which go beyond and/or below normal rainfall occurrences, such as exceptionally low mean annual rainfall. The duality between the distribution of the minima and the maxima was exploited and used to fit the generalised extreme value distribution (GEVD) to the data and hence find probabilities of extremely low levels of mean annual rainfall. The augmented Dickey-Fuller test confirmed that the rainfall data were stationary, while the normal quantile-quantile plot indicated that the rainfall data deviated from the normality assumption at both tails of the distribution. The maximum likelihood estimation method and the Bayesian approach were used to find the parameters of the GEVD. The Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests showed that the Weibull class of distributions was a good fit to the minima of mean annual rainfall using the maximum likelihood estimation method. The mean return period estimate of a meteorological drought using the threshold value of mean annual rainfall of 473 mm was 8 years. This implies that if there is a meteorological drought in a given year, another drought of the same or greater intensity is expected after 8 years on average. It is expected that the use of Bayesian inference may better quantify the level of uncertainty associated with the GEVD parameter estimates than the maximum likelihood estimation method. The Markov chain Monte Carlo algorithm for the GEVD was applied to construct the model parameter estimates using the Bayesian approach. These findings are significant because results based on non-informative priors (Bayesian method) and the maximum likelihood approach are expected to be similar.
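
    The minima/maxima duality can be sketched numerically with a hypothetical rainfall series. For brevity the sketch uses the Gumbel member of the GEV family (shape parameter zero) with moment estimators; the study itself fits the full GEVD by maximum likelihood and Bayesian MCMC:

```python
import math

# Duality: droughts (minima of rainfall X) are maxima of -X, so an
# extreme-value model fitted to -X describes extremely low rainfall.

def gumbel_fit(sample):
    # Method-of-moments estimates for the Gumbel location/scale parameters.
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - 0.5772156649 * beta   # Euler-Mascheroni constant
    return mu, beta

def gumbel_cdf(x, mu, beta):
    return math.exp(-math.exp(-(x - mu) / beta))

# Hypothetical mean annual rainfall series (mm); the 473 mm drought
# threshold is the one quoted in the text.
rain = [610, 540, 700, 450, 820, 590, 475, 630, 705, 520, 660, 585]
mu, beta = gumbel_fit([-x for x in rain])       # fit the maxima of -X
p_drought = 1.0 - gumbel_cdf(-473.0, mu, beta)  # P(rainfall <= 473 mm)
return_period = 1.0 / p_drought                 # mean years between droughts
```

The return period is simply the reciprocal of the annual drought probability, which is how the 8-year figure in the abstract should be read.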

  15. Principles of classical statistical mechanics: A perspective from the notion of complementarity

    International Nuclear Information System (INIS)

    Velazquez Abad, Luisberis

    2012-01-01

    Quantum mechanics and classical statistical mechanics are two physical theories that share several analogies in their mathematical apparatus and physical foundations. In particular, classical statistical mechanics is hallmarked by the complementarity between two descriptions that are unified in thermodynamics: (i) the parametrization of the system macrostate in terms of mechanical macroscopic observables I = (I_i), and (ii) the dynamical description that explains the evolution of a system towards the thermodynamic equilibrium. As expected, such a complementarity is related to the uncertainty relations of classical statistical mechanics, ΔI_i Δη_i ≥ k. Here, k is the Boltzmann constant and η_i = ∂S(I|θ)/∂I_i are the restituting generalized forces derived from the entropy S(I|θ) of a closed system, which is found in an equilibrium situation driven by certain control parameters θ = (θ_α). These arguments constitute the central ingredients of a reformulation of classical statistical mechanics from the notion of complementarity. In this new framework, Einstein's postulate of classical fluctuation theory, dp(I|θ) ∼ exp[S(I|θ)/k] dI, appears as the correspondence principle between classical statistical mechanics and thermodynamics in the limit k → 0, while the existence of uncertainty relations can be associated with the non-commuting character of certain operators. - Highlights: ► There exists a direct analogy between quantum and classical statistical mechanics. ► The statistical form of the Le Chatelier principle leads to the uncertainty principle. ► Einstein's postulate is simply the correspondence principle. ► Complementary quantities are associated with non-commuting operators.

  16. Spatial and temporal patterns of land surface fluxes from remotely sensed surface temperatures within an uncertainty modelling framework

    Directory of Open Access Journals (Sweden)

    M. F. McCabe

    2005-01-01

    Characterising the development of evapotranspiration through time is a difficult task, particularly when utilising remote sensing data, because the retrieved information is often spatially dense but temporally sparse. Techniques to expand these essentially instantaneous measures are not only limited, they are restricted by the general paucity of information describing the spatial distribution and temporal evolution of evaporative patterns. In a novel approach, temporal changes in land surface temperatures, derived from NOAA-AVHRR imagery and a generalised split-window algorithm, are used as a calibration variable in a simple land surface scheme (TOPUP) and combined within the Generalised Likelihood Uncertainty Estimation (GLUE) methodology to provide estimates of areal evapotranspiration at the pixel scale. Such an approach offers an innovative means of transcending the patch or landscape scale of SVAT-type models, to spatially distributed estimates of model output. The resulting spatial and temporal patterns of land surface fluxes and surface resistance are used to more fully understand the hydro-ecological trends observed across a study catchment in eastern Australia. The modelling approach is assessed by comparing predicted cumulative evapotranspiration values with surface fluxes determined from Bowen ratio systems and using auxiliary information such as in-situ soil moisture measurements and depth to groundwater to corroborate observed responses.
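
    A minimal sketch of the GLUE idea described above, with a hypothetical stand-in for the land surface scheme and invented calibration data; TOPUP itself and the AVHRR-derived temperatures are not reproduced here:

```python
import random

# GLUE sketch: sample parameter sets, score each against calibration data
# with an informal likelihood, keep the "behavioural" sets, and form
# likelihood-weighted predictions.
random.seed(1)

def model_et(params, forcing):
    a, b = params                      # two illustrative model parameters
    return [a * f + b for f in forcing]

forcing = [0.2, 0.5, 0.9, 1.4]         # e.g. a net-radiation proxy
observed = [0.5, 1.1, 1.9, 2.9]        # e.g. temperature-derived calibration data

# 1. Sample many parameter sets from broad prior ranges.
samples = [(random.uniform(0.0, 4.0), random.uniform(-1.0, 1.0))
           for _ in range(2000)]

# 2. Score each set with an informal likelihood measure.
def likelihood(sim, obs):
    sse = sum((s - o) ** 2 for s, o in zip(sim, obs))
    return 1.0 / (sse + 1e-9)

scored = sorted(((likelihood(model_et(p, forcing), observed), p)
                 for p in samples), reverse=True)

# 3. Keep the best-scoring ("behavioural") sets and normalise their weights.
behavioural = scored[:200]
total = sum(w for w, _ in behavioural)
weights = [w / total for w, _ in behavioural]

# 4. Likelihood-weighted prediction at a new forcing value.
pred = sum(wn * model_et(p, [1.0])[0]
           for wn, (_, p) in zip(weights, behavioural))
```

The spread of the behavioural predictions, not just their weighted mean, is what GLUE uses as the uncertainty estimate for each pixel.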

  17. Efficacy and safety of pregabalin in generalised anxiety disorder : A critical review of the literature

    NARCIS (Netherlands)

    Baldwin, David S.; den Boer, Johan A.; Lyndon, Gavin; Emir, Birol; Schweizer, Edward; Haswell, Hannah

    2015-01-01

    The aim of this review is to summarise the literature on the efficacy and safety of pregabalin for the treatment of generalised anxiety disorder (GAD). Of 241 literature citations, 13 clinical trials were identified that were specifically designed to evaluate the efficacy and safety of pregabalin in

  18. Cortical feedback signals generalise across different spatial frequencies of feedforward inputs.

    Science.gov (United States)

    Revina, Yulia; Petro, Lucy S; Muckli, Lars

    2017-09-22

    Visual processing in cortex relies on feedback projections contextualising feedforward information flow. Primary visual cortex (V1) has small receptive fields and processes feedforward information at a fine-grained spatial scale, whereas higher visual areas have larger, spatially invariant receptive fields. Therefore, feedback could provide coarse information about the global scene structure or alternatively recover fine-grained structure by targeting small receptive fields in V1. We tested if feedback signals generalise across different spatial frequencies of feedforward inputs, or if they are tuned to the spatial scale of the visual scene. Using a partial occlusion paradigm, functional magnetic resonance imaging (fMRI) and multivoxel pattern analysis (MVPA) we investigated whether feedback to V1 contains coarse or fine-grained information by manipulating the spatial frequency of the scene surround outside an occluded image portion. We show that feedback transmits both coarse and fine-grained information as it carries information about both low (LSF) and high spatial frequencies (HSF). Further, feedback signals containing LSF information are similar to feedback signals containing HSF information, even without a large overlap in spatial frequency bands of the HSF and LSF scenes. Lastly, we found that feedback carries similar information about the spatial frequency band across different scenes. We conclude that cortical feedback signals contain information which generalises across different spatial frequencies of feedforward inputs. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  19. Asymmetric Uncertainty Expression for High Gradient Aerodynamics

    Science.gov (United States)

    Pinier, Jeremy T

    2012-01-01

    When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, this paper proposes a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase uncertainty, the magnitude of which is emphasized in the high-gradient aerodynamic regions. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.
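
    The dispersion idea, an asymmetric magnitude term plus a "phase" term for the location of the abrupt change, might be sketched as follows. The step curve, widths and sampling scheme are assumptions for the sketch, not the paper's exact expression:

```python
import random

# Illustration: dispersing data with asymmetric magnitude uncertainty plus
# a phase (location) uncertainty on an abrupt physics change.
random.seed(0)

def cp(x, shock=0.6):
    # Idealised pressure coefficient with a sharp jump at the shock location.
    return -1.0 if x < shock else -0.3

def disperse(x, sigma_plus=0.05, sigma_minus=0.02, sigma_phase=0.03):
    # Phase term: uncertainty in *where* the abrupt change occurs.
    shock = 0.6 + random.gauss(0.0, sigma_phase)
    base = cp(x, shock)
    # Asymmetric magnitude term: two half-normals with different widths.
    u = abs(random.gauss(0.0, 1.0))
    return base + (u * sigma_plus if random.random() < 0.5 else -u * sigma_minus)

# Near the nominal shock (x = 0.6) the phase term makes the dispersed values
# bimodal, i.e. the uncertainty is inflated exactly in the high-gradient region.
samples = [disperse(0.6) for _ in range(5000)]
```

Away from the gradient the phase term has no effect and only the asymmetric magnitude bounds remain, which is the behaviour the abstract argues for.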

  20. Extracting drug mechanism and pharmacodynamic information from clinical electroencephalographic data using generalised semi-linear canonical correlation analysis

    International Nuclear Information System (INIS)

    Brain, P; Strimenopoulou, F; Ivarsson, M; Wilson, F J; Diukova, A; Wise, R G; Berry, E; Jolly, A; Hall, J E

    2014-01-01

    Conventional analysis of clinical resting electroencephalography (EEG) recordings typically involves assessment of spectral power in pre-defined frequency bands at specific electrodes. EEG is a potentially useful technique in drug development for measuring the pharmacodynamic (PD) effects of a centrally acting compound and hence to assess the likelihood of success of a novel drug based on pharmacokinetic–pharmacodynamic (PK–PD) principles. However, the need to define the electrodes and spectral bands to be analysed a priori is limiting where the nature of the drug-induced EEG effects is initially not known. We describe the extension to human EEG data of a generalised semi-linear canonical correlation analysis (GSLCCA), developed for small animal data. GSLCCA uses data from the whole spectrum, the entire recording duration and multiple electrodes. It provides interpretable information on the mechanism of drug action and a PD measure suitable for use in PK–PD modelling. Data from a study with low (analgesic) doses of the μ-opioid agonist, remifentanil, in 12 healthy subjects were analysed using conventional spectral edge analysis and GSLCCA. At this low dose, the conventional analysis was unsuccessful but plausible results consistent with previous observations were obtained using GSLCCA, confirming that GSLCCA can be successfully applied to clinical EEG data. (paper)
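
    The core of the correlation step can be sketched for the special case of a single pharmacodynamic effect variable, where the first canonical variate against one target reduces to a least-squares projection. The EEG spectra and effect course below are simulated; no remifentanil data are reproduced:

```python
import numpy as np

# Sketch: find spectral weights whose projection of the EEG data is maximally
# correlated with a PK-driven effect course (single-target special case).
rng = np.random.default_rng(0)

t = np.linspace(0.0, 6.0, 120)
y = t * np.exp(-t)                        # hypothetical PK-driven effect course
X = rng.normal(scale=0.5, size=(120, 6))  # EEG power in 6 spectral bins
X[:, 2] += 5.0 * y                        # drug effect concentrated in bin 2

Xc = X - X.mean(axis=0)
yc = y - y.mean()
w, *_ = np.linalg.lstsq(Xc, yc, rcond=None)  # spectral weights (up to scale)
score = Xc @ w                                # canonical variate of the EEG
r = float(np.corrcoef(score, yc)[0, 1])       # canonical correlation
```

The recovered weight vector plays the role of the interpretable "mechanism" profile across the spectrum, while the scored variate is the pharmacodynamic measure used for PK-PD modelling.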

  1. Issues in the Analysis of Focus Groups: Generalisability, Quantifiability, Treatment of Context and Quotations

    Science.gov (United States)

    Vicsek, Lilla

    2010-01-01

    In this paper I discuss some concerns related to the analysis of focus groups: (a) the issue of generalisation; (b) the problems of using numbers and quantifying in the analysis; (c) how the concrete situation of the focus groups could be included in the analysis, and (d) what formats can be used when quoting from focus groups. Problems with…

  2. A weak equivalence principle test on a suborbital rocket

    Energy Technology Data Exchange (ETDEWEB)

    Reasenberg, Robert D; Phillips, James D, E-mail: reasenberg@cfa.harvard.ed [Smithsonian Astrophysical Observatory, Harvard-Smithsonian Center for Astrophysics, Cambridge, MA 02138 (United States)

    2010-05-07

    We describe a Galilean test of the weak equivalence principle, to be conducted during the free fall portion of a sounding rocket flight. The test of a single pair of substances is aimed at a measurement uncertainty of σ(η) < 10^-16 after averaging the results of eight separate drops. The weak equivalence principle measurement is made with a set of four laser gauges that are expected to achieve 0.1 pm Hz^-1/2. The discovery of a violation (η ≠ 0) would have profound implications for physics, astrophysics and cosmology.
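
    The drop-averaging arithmetic implied here, assuming eight statistically independent drops with a common per-drop uncertainty:

```python
import math

# If the eight drops are independent with a common per-drop uncertainty s,
# the averaged eta has uncertainty s / sqrt(8). The per-drop requirement
# implied by the quoted target sigma(eta) < 1e-16 is therefore:
target = 1e-16
per_drop = target * math.sqrt(8.0)   # roughly 2.8e-16 per drop
```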

  3. Generalised pruritus as a presentation of Graves’ disease

    Directory of Open Access Journals (Sweden)

    Tan CE

    2013-05-01

    Pruritus is a lesser known symptom of hyperthyroidism, particularly in autoimmune thyroid disorders. This is a case report of a 27-year-old woman who presented with generalised pruritus at a primary care clinic. Incidental findings of tachycardia and a goiter led to the investigations of her thyroid status. The thyroid function test revealed elevated serum free T4 and suppressed thyroid stimulating hormone levels. The anti-thyroid antibodies were positive. She was diagnosed with Graves’ disease and treated with carbimazole until her symptoms subsided. Graves’ disease should be considered as an underlying cause for patients presenting with pruritus. A thorough history and complete physical examination are crucial in making an accurate diagnosis. Underlying causes must be determined before treating the symptoms.

  4. Decomposition of almost-Poisson structure of generalised Chaplygin's nonholonomic systems

    International Nuclear Information System (INIS)

    Chang, Liu; Peng, Chang; Shi-Xing, Liu; Yong-Xin, Guo

    2010-01-01

    This paper constructs an almost-Poisson structure for non-self-adjoint dynamical systems, which can be decomposed into the sum of a Poisson bracket and another almost-Poisson bracket. The necessary and sufficient condition for the decomposition of the almost-Poisson bracket into two Poisson ones is obtained. As an application, the almost-Poisson structure for generalised Chaplygin's systems is discussed in the framework of the decomposition theory. It is proved that the almost-Poisson bracket for these systems can be decomposed into the sum of a canonical Poisson bracket and two other noncanonical Poisson brackets in some special cases, which is useful for integrating the equations of motion.

  5. Wave Energy Converter Annual Energy Production Uncertainty Using Simulations

    Directory of Open Access Journals (Sweden)

    Clayton E. Hiles

    2016-09-01

    Critical to evaluating the economic viability of a wave energy project are: (1) a robust estimate of the electricity production throughout the project lifetime and (2) an understanding of the uncertainty associated with said estimate. Standardization efforts have established mean annual energy production (MAEP) as the metric for quantification of wave energy converter (WEC) electricity production and the performance matrix approach as the appropriate method for calculation. General acceptance of a method for calculating the MAEP uncertainty has not yet been achieved. Several authors have proposed methods based on the standard engineering approach to error propagation; however, a lack of available WEC deployment data has restricted testing of these methods. In this work the magnitude and sensitivity of MAEP uncertainty is investigated. The analysis is driven by data from simulated deployments of 2 WECs of different operating principles at 4 different locations. A Monte Carlo simulation approach is proposed for calculating the variability of MAEP estimates and is used to explore the sensitivity of the calculation. The uncertainty of MAEP ranged from 2% to 20% of the mean value. Of the contributing uncertainties studied, the variability in the wave climate was found responsible for most of the uncertainty in MAEP. Uncertainty in MAEP differs considerably between WEC types and between deployment locations and is sensitive to the length of the input data-sets. This implies that if a certain maximum level of uncertainty in MAEP is targeted, the minimum required lengths of the input data-sets will be different for every WEC-location combination.
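
    The performance-matrix calculation of MAEP and a Monte Carlo estimate of its variability can be sketched with invented numbers; the power matrix, sea-state bins and climate perturbation are all hypothetical, not data from the study:

```python
import random

# Performance-matrix sketch of MAEP and its Monte Carlo uncertainty.
random.seed(42)

power_kw = {("low", "short"): 20.0, ("low", "long"): 35.0,
            ("high", "short"): 60.0, ("high", "long"): 110.0}

base_climate = {("low", "short"): 0.4, ("low", "long"): 0.3,
                ("high", "short"): 0.2, ("high", "long"): 0.1}

def sample_annual_climate():
    # Perturb the bin occurrence probabilities to mimic inter-annual
    # variability in the wave climate, then renormalise.
    noisy = {k: max(v + random.gauss(0.0, 0.05), 0.0)
             for k, v in base_climate.items()}
    total = sum(noisy.values())
    return {k: v / total for k, v in noisy.items()}

def aep_mwh(climate):
    # AEP = hours per year * sum over sea states of P(state) * power(state).
    return 8766.0 * sum(climate[k] * power_kw[k] for k in power_kw) / 1000.0

annual = [aep_mwh(sample_annual_climate()) for _ in range(2000)]
maep = sum(annual) / len(annual)
sd = (sum((a - maep) ** 2 for a in annual) / (len(annual) - 1)) ** 0.5
```

Here the spread of the simulated annual values is driven entirely by the wave-climate perturbation, mirroring the paper's finding that climate variability dominates the MAEP uncertainty.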

  6. Specificity of dysfunctional thinking in children with symptoms of social anxiety, separation anxiety and generalised anxiety

    NARCIS (Netherlands)

    Bogels, S.M.; Snieder, N.; Kindt, M.

    2003-01-01

    The present study investigated whether children with high symptom levels of either social phobia (SP), separation anxiety disorder (SAD), or generalised anxiety disorder (GAD) are characterised by a specific set of dysfunctional interpretations that are consistent with the cognitive model of their

  7. Brief Report: Generalisation of Word-Picture Relations in Children with Autism and Typically Developing Children

    Science.gov (United States)

    Hartley, Calum; Allen, Melissa L.

    2014-01-01

    We investigated whether low-functioning children with autism generalise labels from colour photographs based on sameness of shape, colour, or both. Children with autism and language-matched controls were taught novel words paired with photographs of unfamiliar objects, and then sorted pictures and objects into two buckets according to whether or…

  8. A Game Theoretical Study of Generalised Trust and Reciprocation in Poland : I. Theory and Experimental Design

    Directory of Open Access Journals (Sweden)

    Urszula Markowska-Przybyła

    2014-01-01

    Although studies using experimental game theory have been carried out in various countries, no such major study has occurred in Poland. The study described here aims to investigate generalised trust and reciprocation among Polish students. In the literature, these traits are seen to be positively correlated with economic growth. Poland is regarded as the most successful post-Soviet bloc country in transforming to a market economy, but its level of generalised trust compared to other post-communist countries is reported to be low. This study aims to see to what degree this reported level of generalised trust is visible amongst young Poles via experimental game theory, along with a questionnaire. The three games to be played are described. Bayesian equilibria illustrating behaviour observed in previous studies have been derived for two of these games, and the experimental procedure is described. (original abstract)

  9. Controlling principles for prior probability assignments in nuclear risk assessment

    International Nuclear Information System (INIS)

    Cook, I.; Unwin, S.D.

    1986-01-01

    As performed conventionally, nuclear probabilistic risk assessment (PRA) may be criticized as utilizing inscrutable and unjustifiably 'precise' quantitative informed judgment or extrapolation from that judgment. To meet this criticism, controlling principles that govern the formulation of probability densities are proposed, given only the informed input that would be required for a simple bounding analysis. These principles are founded upon information-theoretic ideas of maximum uncertainty and cover both cases in which there exists a stochastic model of the phenomenon of interest and cases in which there is no such model. In part, the principles are conventional, and such an approach is justified by appealing to certain analogies in accounting practice and judicial decision making. Examples are given. Appropriate employment of these principles is expected to facilitate substantial progress toward PRA scrutability and transparency.
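
    The maximum-uncertainty (maximum-entropy) idea can be illustrated for two typical bounding inputs; the densities below are standard results, and the numerical entropy check is only a sanity test, not part of the paper:

```python
import math

# Maximum-entropy densities under two typical "bounding analysis" inputs:
#  - only a support [a, b] is known   -> uniform density
#  - only a mean m > 0 on [0, inf)   -> exponential density

def maxent_uniform(a, b):
    return lambda x: 1.0 / (b - a) if a <= x <= b else 0.0

def maxent_exponential(mean):
    lam = 1.0 / mean
    return lambda x: lam * math.exp(-lam * x) if x >= 0 else 0.0

def entropy(pdf, lo, hi, n=200000):
    # Differential entropy -integral p ln p via midpoint rule.
    h, s = (hi - lo) / n, 0.0
    for i in range(n):
        p = pdf(lo + (i + 0.5) * h)
        if p > 0:
            s -= p * math.log(p) * h
    return s

H_uniform = entropy(maxent_uniform(0.0, 2.0), 0.0, 2.0)   # analytic: ln(b - a)
H_expo = entropy(maxent_exponential(1.0), 0.0, 40.0)      # analytic: 1 - ln(lam)
```

Any other density satisfying the same constraint has strictly lower entropy, which is the formal sense in which these priors add nothing beyond the bounding input.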

  10. Climate Justice: A Constitutional Approach to Unify the Lex Specialis Principles of International Climate Law

    Directory of Open Access Journals (Sweden)

    Teresa Thorp

    2012-11-01

    Legal principles legitimise ubiquitous social values. They make certain social norms lawful and legitimate. Legal principles may act as governing vectors. They may give effect to a unified and legitimate constitutional framework insofar as a constitution unifies the fundamental principles on which a state or competent authority is governed. Concerning international climate law, however, there is a certain shortcoming. The failure to comprehend a unified constitutional framework of lex specialis principles could debilitate intra- and inter-regime governance and lead to uncertainties. At one time, uncertainties incite the law-making process; at another, they constrain it. Such a shortcoming may lead to inconsistencies in interpreting consequential climate norms. It may thwart dispute resolution and it may impede climate negotiations. To traverse this abyss, the inquiry uses instruments of legal philosophy (the philosophy of language), legal systematics (the study of legal systems) and legal hermeneutics (the legal practice of interpretation) to delineate, distinguish and unify lex specialis principles that could form the foundations of a universal constitutional framework of international climate law. In doing so, it shows that climate justice is a function of the quality of the legal system.

  11. Calculation uncertainty of distribution-like parameters in NPP of PAKS

    International Nuclear Information System (INIS)

    Szecsenyi, Zsolt; Korpas, Layos

    2000-01-01

    From the reactor-physics point of view there were two important events in the Nuclear Power Plant of PAKS this year: Russian-type profiled assemblies were loaded into PAKS Unit 3, and a new limitation system was introduced on the same unit. Both events required a number of problems to be solved. One of these was the determination of the uncertainty of the quantities of the new limitation, considering the fabrication uncertainties of the profiled assembly. Determining this uncertainty is important in order to guarantee the avoidance of fuel failure at the 99.9% level. In this paper the principles of determination of calculation accuracy, the applied methods and the obtained results are presented for distribution-like parameters. A few elements of the method have been presented at earlier symposia, so here the whole method is just outlined. For example, the GPT method was presented in the following paper: Uncertainty analysis of pin-wise power distribution of WWER-440 assembly considering fabrication uncertainties. Finally, in the summary of this paper, additional intrinsic opportunities in the method are presented. (Authors)

  12. Generalised derived limits for radioisotopes of iodine

    International Nuclear Information System (INIS)

    Hughes, J.S.; Haywood, S.M.; Simmonds, J.R.

    1984-04-01

    Generalised Derived Limits (GDLs) are evaluated for iodine-125, -129, -131, -132, -133, -134 and -135 in selected materials from the terrestrial and aquatic environments and for discharge to atmosphere. They are intended for use as convenient reference levels against which the results of environmental monitoring can be compared and atmospheric discharges assessed. GDLs are intended for use when the environmental contamination or discharge to atmosphere is less than about 5% of the GDL. If the level of environmental contamination or discharge to the atmosphere exceeds this percentage of the GDL, it does not necessarily mean that the dose equivalents to members of the public are approaching the dose equivalent limit. It is rather an indication that it may be appropriate to obtain a more specific derived limit for the particular situation by reviewing the values of the parameters involved in the calculation. GDL values are specified for iodine radionuclides in water, soil, grass, sediments and various foodstuffs derived from the terrestrial and aquatic environments. GDLs are also given for iodine radionuclides on terrestrial surfaces and for their discharge to atmosphere. (author)

  13. Threshold corrections, generalised prepotentials and Eichler integrals

    Directory of Open Access Journals (Sweden)

    Carlo Angelantonj

    2015-08-01

    We continue our study of one-loop integrals associated to BPS-saturated amplitudes in N=2 heterotic vacua. We compute their large-volume behaviour, and express them as Fourier series in the complexified volume, with Fourier coefficients given in terms of Niebur–Poincaré series in the complex structure modulus. The closure of Niebur–Poincaré series under modular derivatives implies that such integrals derive from holomorphic prepotentials f_n, generalising the familiar prepotential of N=2 supergravity. These holomorphic prepotentials transform anomalously under T-duality, in a way characteristic of Eichler integrals. We use this observation to compute their quantum monodromies under the duality group. We extend the analysis to modular integrals with respect to Hecke congruence subgroups, which naturally arise in compactifications on non-factorisable tori and freely-acting orbifolds. In this case, we derive new explicit results including closed-form expressions for integrals involving the Γ0(N) Hauptmodul, a full characterisation of holomorphic prepotentials including their quantum monodromies, as well as concrete formulæ for holomorphic Yukawa couplings.

  14. Evaluation of cross-section uncertainties using physical constraints for 238U, 239Pu

    International Nuclear Information System (INIS)

    De Saint Jean, Cyrille; Privas, Edwin; Archier, Pascal; Noguere, Gilles; Litaize, Olivier; Leconte, Pierre; Bernard, David

    2014-01-01

    Evaluations of neutron-induced reactions between 0 eV and 20 MeV are based on various inputs such as nuclear reaction models and microscopic and integral measurements. Most of the time, the evaluation work is done independently in the resolved resonance range and the continuum, giving rise to mismatches in the cross-sections, larger uncertainties at the boundary, and no cross-correlation between the high-energy domain and the resonance range. In addition, the use of integral experiments is sometimes only related to central values (the evaluation is 'working fine' on a dedicated set of benchmarks), and reductions of uncertainties are not straightforward on the cross-sections themselves: 'working fine' should be mathematically reflected by a reduced uncertainty. As the CIELO initiative is to bring together experts in each field to propose and discuss these matters, after having presented the status of the 238U and 239Pu cross-section covariance evaluations (for JEFF-3.2 as well as the WPEC SG34 subgroup), this paper presents several methodologies that may be used to avoid such effects on covariances. A first idea, based on the use of experiments overlapping two energy domains, appeared in the recent past. It was reviewed and extended to the use of systematic uncertainties (normalisation, for example) and to integral experiments as well. In addition, we propose a methodology taking into account physical constraints on an overlapping energy domain where both nuclear reaction models are used (continuity of both cross-sections and derivatives, for example). The use of Lagrange multipliers (related to these constraints) in a classical generalised least-squares procedure is exposed. Some academic examples are then presented for both point-wise and multi-group cross-sections to illustrate the methodologies. In addition, new results for 239Pu are presented for the resonance range and higher energies to reduce capture and fission cross-section uncertainties by using integral experiments (JEZEBEL experiment as
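
    The Lagrange-multiplier step can be sketched as a constrained generalised least-squares adjustment: minimise (x - x0)^T C^-1 (x - x0) subject to a linear continuity constraint A x = b. The numbers below are illustrative only, not the evaluated 238U/239Pu data:

```python
import numpy as np

# Sketch: force the cross-sections predicted at the domain boundary by the
# resonance-range model (x[0]) and the continuum model (x[1]) to agree.
x0 = np.array([2.10, 1.95])          # prior values from the two models (barns)
C = np.diag([0.04 ** 2, 0.05 ** 2])  # prior covariance matrix
A = np.array([[1.0, -1.0]])          # constraint A x = b: x[0] - x[1] = 0
b = np.array([0.0])

# KKT system for the constrained minimisation:
# [[C^-1, A^T], [A, 0]] [x, lambda]^T = [C^-1 x0, b]^T
Cinv = np.linalg.inv(C)
K = np.block([[Cinv, A.T], [A, np.zeros((1, 1))]])
rhs = np.concatenate([Cinv @ x0, b])
sol = np.linalg.solve(K, rhs)
x, lam = sol[:2], sol[2:]            # adjusted cross-sections and multiplier
```

The adjusted values collapse onto the inverse-variance-weighted mean of the two priors, which is what "continuity of the cross-sections at the boundary" demands in this toy setting.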

  15. Realistic Approach of the Relations of Uncertainty of Heisenberg

    Directory of Open Access Journals (Sweden)

    Paul E. Sterian

    2013-01-01

    Full Text Available Due to the requirements of the principle of causality in the theory of relativity, one cannot make a device for the simultaneous measuring of the canonical conjugate variables in the conjugate Fourier spaces. Instead of admitting that a particle’s position and its conjugate momentum cannot be accurately measured at the same time, we consider the only probabilities which can be determined when working at subatomic level to be valid. On the other hand, based on Schwinger's action principle and using the quadridimensional form of the unitary transformation generator function of the quantum operators in the paper, the general form of the evolution equation for these operators is established. In the nonrelativistic case one obtains the Heisenberg's type evolution equations which can be particularized to derive Heisenberg's uncertainty relations. The analysis of the uncertainty relations as implicit evolution equations allows us to put into evidence the intrinsic nature of the correlation expressed by these equations in straight relations with the measuring process. The independence of the quantisation postulate from the causal evolution postulate of quantum mechanics is also put into discussion.

  16. Living with uncertainty: from the precautionary principle to the methodology of ongoing normative assessment

    International Nuclear Information System (INIS)

    Dupuy, J.P.; Grinbaum, A.

    2005-01-01

    The analysis of our epistemic situation regarding singular events, such as abrupt climate change, shows essential limitations in the traditional modes of dealing with uncertainty. Typical cognitive barriers lead to the paralysis of action. What is needed is taking seriously the reality of the future. We argue for the application of the methodology of ongoing normative assessment. We show that it is, paradoxically, a matter of forming a project on the basis of a fixed future which one does not want, and this in a coordinated way at the level of social institutions. Ongoing assessment may be viewed as a prescription to live with uncertainty, in a particular sense of the term, in order for a future catastrophe not to occur. The assessment is necessarily normative in that it must include the anticipation of a retrospective ethical judgment on present choices (notion of moral luck). (authors)

  17. The uncertainties in estimating measurement uncertainties

    International Nuclear Information System (INIS)

    Clark, J.P.; Shull, A.H.

    1994-01-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper will discuss the concepts of measurement, measurement errors (accuracy or bias, and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample having a matrix that is significantly different from the calibration standards' matrix). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement itself. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements will be reviewed in this report and recommendations made for improving measurement uncertainties
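The record's point that uncertainty estimates themselves carry uncertainty can be made concrete: for normally distributed errors, a standard deviation estimated from n replicates is itself only known to a relative precision of roughly 1/sqrt(2(n-1)). A minimal sketch (the replicate values are simulated, not taken from the report):

```python
import math, random

def std_and_its_uncertainty(samples):
    """Sample standard deviation of replicate measurements, plus the
    approximate relative uncertainty of that estimate itself,
    u_rel(s) ~ 1/sqrt(2(n-1)) for normally distributed errors."""
    n = len(samples)
    mean = sum(samples) / n
    s = math.sqrt(sum((x - mean)**2 for x in samples) / (n - 1))
    u_rel = 1.0 / math.sqrt(2 * (n - 1))
    return s, u_rel

random.seed(0)
replicates = [100.0 + random.gauss(0, 0.5) for _ in range(10)]
s, u_rel = std_and_its_uncertainty(replicates)
# With n = 10, the standard deviation itself is only known to ~24 %.
```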

  18. Mapping shape to visuomotor mapping: learning and generalisation of sensorimotor behaviour based on contextual information.

    Directory of Open Access Journals (Sweden)

    Loes C J van Dam

    2015-03-01

    Full Text Available Humans can learn and store multiple visuomotor mappings (dual-adaptation) when feedback for each is provided alternately. Moreover, learned context cues associated with each mapping can be used to switch between the stored mappings. However, little is known about the associative learning between cue and required visuomotor mapping, and how learning generalises to novel but similar conditions. To investigate these questions, participants performed a rapid target-pointing task while we manipulated the offset between visual feedback and movement end-points. The visual feedback was presented with horizontal offsets of different amounts, dependent on the target's shape. Participants thus needed to use different visuomotor mappings between target location and required motor response, depending on the target shape, in order to "hit" it. The target shapes were taken from a continuous set of shapes, morphed between spiky and circular shapes. After training we tested participants' performance, without feedback, on different target shapes that had not been learned previously. We compared two hypotheses. First, we hypothesised that participants could (explicitly) extract the linear relationship between target shape and visuomotor mapping and generalise accordingly. Second, building on previous findings of visuomotor learning, we developed an (implicit) Bayesian learning model that predicts generalisation more consistent with categorisation (i.e. use one mapping or the other). The experimental results show that, although learning the associations requires explicit awareness of the cues' role, participants apply the mapping corresponding to the trained shape that is most similar to the current one, consistent with the Bayesian learning model. Furthermore, the Bayesian learning model predicts that learning should slow down with increased numbers of training pairs, which was confirmed by the present results. In short, we found a good correspondence between the

  19. Ideas underlying quantification of margins and uncertainties (QMU): a white paper.

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon Craig; Trucano, Timothy Guy; Pilch, Martin M.

    2006-09-01

    This report describes key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions at Sandia National Laboratories. While QMU is a broad process and methodology for generating critical technical information to be used in stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, we discuss the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, the need to separate aleatory and epistemic uncertainty in QMU, and the risk-informed decision making that is best suited for decisive application of QMU. The paper is written at a high level, but provides a systematic bibliography of useful papers for the interested reader to deepen their understanding of these ideas.

  20. Isotropic LQC and LQC-inspired models with a massless scalar field as generalised Brans-Dicke theories

    Science.gov (United States)

    Rama, S. Kalyana

    2018-06-01

    We explore whether generalised Brans-Dicke theories, which have a scalar field Φ and a function ω(Φ), can be the effective actions leading to the effective equations of motion of the LQC and the LQC-inspired models, which have a massless scalar field σ and a function f(m). We find that this is possible for isotropic cosmology. We relate the pairs (σ, f) and (Φ, ω) and, using examples, illustrate these relations. We find that near the bounce of the LQC evolutions for which f(m) = sin m, the corresponding field Φ → 0 and the function ω(Φ) ∝ Φ^2. We also find that the class of generalised Brans-Dicke theories, which we had found earlier to lead to non-singular isotropic evolutions, may be written as an LQC-inspired model. The relations found here in the isotropic cases do not apply to the anisotropic cases, which perhaps require more general effective actions.

  1. The normative basis of the Precautionary Principle

    Energy Technology Data Exchange (ETDEWEB)

    Schomberg, Rene von [European Commission, Directorate General for Research, Brussels (Belgium)]

    2006-09-15

    Precautionary measures are provisional measures by nature, and need to be regularly reviewed when scientific information either calls for relaxation or strengthening of those measures. Within the EU context, these provisional measures do not have a prefixed 'expiry' date: one can only lift precautionary measures if scientific knowledge has progressed to a point that one would be able to translate (former) uncertainties in terms of risk and adverse effects in terms of defined, consensual levels of harm/damage. Precautionary frameworks facilitate in particular deliberation at the science/policy/society interfaces to which risk management is fully connected. Applying the precautionary principle is to be seen as a normative risk management exercise which builds upon scientific risk assessments. An ongoing scientific and normative deliberation at the science/policy interface involves a shift from science-centred debates on the probability of risks towards a science-informed debate on uncertainties and plausible adverse effects: this means that decisions should not only be based on available data but on a broad scientific knowledge base including a variety of scientific disciplines. The invocation, implementation and application of the precautionary principle follows a progressive line of different levels of deliberation (which obviously can be interconnected to each other but are distinguished here for analytical purposes). I have listed these levels of deliberation in a table. The table provides a model for guiding all the relevant normative levels of deliberation, which are all needed in order to eventually reach legitimate conclusions on the acceptability of products or processes. The table presents a progressive line of those levels of deliberation from the initial invocation of the precautionary principle at the political level down to the level of risk management decisions, but at the same time shows their interrelatedness. Although the table may suggest a

  2. The normative basis of the Precautionary Principle

    International Nuclear Information System (INIS)

    Schomberg, Rene von

    2006-01-01

    Precautionary measures are provisional measures by nature, and need to be regularly reviewed when scientific information either calls for relaxation or strengthening of those measures. Within the EU context, these provisional measures do not have a prefixed 'expiry' date: one can only lift precautionary measures if scientific knowledge has progressed to a point that one would be able to translate (former) uncertainties in terms of risk and adverse effects in terms of defined, consensual levels of harm/damage. Precautionary frameworks facilitate in particular deliberation at the science/policy/society interfaces to which risk management is fully connected. Applying the precautionary principle is to be seen as a normative risk management exercise which builds upon scientific risk assessments. An ongoing scientific and normative deliberation at the science/policy interface involves a shift from science-centred debates on the probability of risks towards a science-informed debate on uncertainties and plausible adverse effects: this means that decisions should not only be based on available data but on a broad scientific knowledge base including a variety of scientific disciplines. The invocation, implementation and application of the precautionary principle follows a progressive line of different levels of deliberation (which obviously can be interconnected to each other but are distinguished here for analytical purposes). I have listed these levels of deliberation in a table. The table provides a model for guiding all the relevant normative levels of deliberation, which are all needed in order to eventually reach legitimate conclusions on the acceptability of products or processes. The table presents a progressive line of those levels of deliberation from the initial invocation of the precautionary principle at the political level down to the level of risk management decisions, but at the same time shows their interrelatedness. Although the table may suggest a particular

  3. Utility of natural generalised inverse technique in the interpretation of dyke structures

    Digital Repository Service at National Institute of Oceanography (India)

    Rao, M.M.M.; Murty, T.V.R.; Rao, P.R.; Lakshminarayana, S.; Subrahmanyam, A.S.; Murthy, K.S.R.

    ... environs along the central west coast of India: analysis using EOF, J. Geophys. Res., 91 (1986) 8523-8526. 9. Marquardt D W, An algorithm for least-squares estimation of non-linear parameters, J. Soc. Indust. Appl. Math., 11 (1963) 431-441. ... technique in reconstruction of gravity anomalies due to a fault, Indian J. Pure Appl. Math., 34 (2003) 31-47. 16. Ramana Murty T V, Somayajulu Y K & Murty C S, Reconstruction of sound speed profile through natural generalised inverse technique, Indian J...

  4. Cosmic rays and tests of fundamental principles

    Science.gov (United States)

    Gonzalez-Mestres, Luis

    2011-03-01

    It is now widely acknowledged that cosmic-ray experiments can test possible new physics directly generated at the Planck scale or at some other fundamental scale. By studying particle properties at energies far beyond the reach of any man-made accelerator, they can yield unique checks of basic principles. A well-known example is provided by possible tests of special relativity at the highest cosmic-ray energies. But other essential ingredients of standard theories can in principle be tested: quantum mechanics, the uncertainty principle, energy and momentum conservation, effective space-time dimensions, hamiltonian and lagrangian formalisms, postulates of cosmology, vacuum dynamics and particle propagation, quark and gluon confinement, elementariness of particles… Standard particle physics or string-like patterns may have a composite origin able to manifest itself through specific cosmic-ray signatures. Ultra-high energy cosmic rays, but also cosmic rays at lower energies, are probes of both "conventional" and new Physics. Status, prospects, new ideas, and open questions in the field are discussed.

  5. Cosmic rays and tests of fundamental principles

    International Nuclear Information System (INIS)

    Gonzalez-Mestres, Luis

    2011-01-01

    It is now widely acknowledged that cosmic-ray experiments can test possible new physics directly generated at the Planck scale or at some other fundamental scale. By studying particle properties at energies far beyond the reach of any man-made accelerator, they can yield unique checks of basic principles. A well-known example is provided by possible tests of special relativity at the highest cosmic-ray energies. But other essential ingredients of standard theories can in principle be tested: quantum mechanics, the uncertainty principle, energy and momentum conservation, effective space-time dimensions, hamiltonian and lagrangian formalisms, postulates of cosmology, vacuum dynamics and particle propagation, quark and gluon confinement, elementariness of particles... Standard particle physics or string-like patterns may have a composite origin able to manifest itself through specific cosmic-ray signatures. Ultra-high energy cosmic rays, but also cosmic rays at lower energies, are probes of both 'conventional' and new Physics. Status, prospects, new ideas, and open questions in the field are discussed.

  6. Semantic 3d City Model to Raster Generalisation for Water Run-Off Modelling

    Science.gov (United States)

    Verbree, E.; de Vries, M.; Gorte, B.; Oude Elberink, S.; Karimlou, G.

    2013-09-01

    Water run-off modelling applied within urban areas requires an appropriately detailed surface model represented by a raster height grid. Accurate simulations at this scale level have to take into account small but important water barriers and flow channels given by the large-scale map definitions of buildings, street infrastructure, and other terrain objects. Thus, these 3D features have to be rasterised such that each cell represents the height of the object class as well as possible given the cell size limitations. Small grid cells will result in realistic run-off modelling but with unacceptable computation times; larger grid cells with averaged height values will result in less realistic run-off modelling but fast computation times. This paper introduces a height grid generalisation approach in which the surface characteristics that most influence the water run-off flow are preserved. The first step is to create a detailed surface model (1:1.000), combining high-density laser data with a detailed topographic base map. The topographic map objects are triangulated to a set of TIN-objects by taking into account the semantics of the different map object classes. These TIN objects are then rasterised to two grids with a 0.5m cell-spacing: one grid for the object class labels and the other for the TIN-interpolated height values. The next step is to generalise both raster grids to a lower resolution using a procedure that considers the class label of each cell and that of its neighbours. The results of this approach are tested and validated by water run-off model runs for different cell-spaced height grids at a pilot area in Amersfoort (the Netherlands). Two national datasets were used in this study: the large scale Topographic Base map (BGT, map scale 1:1.000), and the National height model of the Netherlands AHN2 (10 points per square meter on average). 
Comparison between the original AHN2 height grid and the semantically enriched and then generalised height grids shows
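A minimal sketch of the class-aware generalisation idea described in this record: when coarsening the paired class/height grids, keep the class that matters most for run-off (an assumed priority ranking here) and average heights only over cells of that class, so thin barriers such as walls are not smoothed away. The class names, priorities and the tiny example grid are illustrative, not the paper's actual procedure:

```python
import numpy as np

# Assumed ranking: barriers that block water flow must survive
# generalisation even if they occupy few fine cells.
PRIORITY = {"building": 3, "road": 2, "terrain": 1}

def generalise(class_grid, height_grid, factor):
    """Coarsen aligned class/height grids by `factor`, keeping in each
    window the highest-priority class and the mean height of the cells
    of that class (so thin walls are not averaged away)."""
    rows, cols = class_grid.shape
    out_c = np.empty((rows // factor, cols // factor), dtype=object)
    out_h = np.zeros_like(out_c, dtype=float)
    for i in range(out_c.shape[0]):
        for j in range(out_c.shape[1]):
            cw = class_grid[i*factor:(i+1)*factor, j*factor:(j+1)*factor]
            hw = height_grid[i*factor:(i+1)*factor, j*factor:(j+1)*factor]
            best = max(set(cw.ravel()), key=lambda c: PRIORITY[c])
            out_c[i, j] = best
            out_h[i, j] = hw[cw == best].mean()
    return out_c, out_h

# A 2x2 window containing a thin building edge collapses to one
# 'building' cell at the building height, not an averaged height.
classes = np.array([["terrain", "building"], ["terrain", "building"]], dtype=object)
heights = np.array([[1.0, 5.0], [1.0, 5.0]])
gc, gh = generalise(classes, heights, 2)
# gc[0, 0] == "building", gh[0, 0] == 5.0
```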

  7. Spatial generalised linear mixed models based on distances.

    Science.gov (United States)

    Melo, Oscar O; Mateu, Jorge; Melo, Carlos E

    2016-10-01

    Risk models derived from environmental data have been widely shown to be effective in delineating geographical areas of risk because they are intuitively easy to understand. We present a new method based on distances, which allows the modelling of continuous and non-continuous random variables through distance-based spatial generalised linear mixed models. The parameters are estimated using Markov chain Monte Carlo maximum likelihood, which is a feasible and useful technique. The proposed method depends on a detrending step built from continuous or categorical explanatory variables, or a mixture among them, by using an appropriate Euclidean distance. The method is illustrated through the analysis of the variation in the prevalence of Loa loa among a sample of village residents in Cameroon, where the explanatory variables included elevation, together with maximum normalised-difference vegetation index and the standard deviation of normalised-difference vegetation index calculated from repeated satellite scans over time. © The Author(s) 2013.
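The distance-based detrending step can be sketched with classical scaling: build a Euclidean distance matrix from the standardised covariates, double-centre it, and use the leading principal coordinates as fixed-effect regressors in the spatial model. This is a generic sketch of the distance-based idea, not the authors' exact implementation; the covariates are simulated stand-ins for elevation and the two NDVI summaries:

```python
import numpy as np

def distance_based_coords(X, k=2):
    """From standardised covariates, build a Euclidean distance matrix,
    double-centre it (classical scaling) and return the leading k
    principal coordinates, which then enter the GLMM as regressors."""
    Z = (X - X.mean(0)) / X.std(0)
    D2 = ((Z[:, None, :] - Z[None, :, :])**2).sum(-1)   # squared distances
    n = len(Z)
    J = np.eye(n) - np.ones((n, n)) / n                 # centring matrix
    G = -0.5 * J @ D2 @ J                               # Gram matrix
    w, V = np.linalg.eigh(G)
    idx = np.argsort(w)[::-1][:k]                       # largest eigenvalues
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))    # e.g. elevation, NDVI max, NDVI sd
coords = distance_based_coords(X, k=2)
```

Because the Gram matrix is double-centred, the returned coordinates are automatically mean-zero, which keeps them orthogonal to the model intercept.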

  8. Optimising, generalising and integrating educational practice using neuroscience

    Science.gov (United States)

    Colvin, Robert

    2016-07-01

    Practical collaboration at the intersection of education and neuroscience research is difficult because the combined discipline encompasses both the activity of microscopic neurons and the complex social interactions of teachers and students in a classroom. Taking a pragmatic view, this paper discusses three education objectives to which neuroscience can be effectively applied: optimising, generalising and integrating instructional techniques. These objectives are characterised by: (1) being of practical importance; (2) building on existing education and cognitive research; and (3) being infeasible to address based on behavioural experiments alone. The focus of the neuroscientific aspect of collaborative research should be on the activity of the brain before, during and after learning a task, as opposed to performance of a task. The objectives are informed by literature that highlights possible pitfalls with educational neuroscience research, and are described with respect to the static and dynamic aspects of brain physiology that can be measured by current technology.

  9. Generalised two target localisation using passive monopulse radar

    KAUST Repository

    Jardak, Seifallah

    2017-04-07

    The simultaneous lobing technique, also known as monopulse technique, has been widely used for fast target localisation and tracking purposes. Many works focused on accurately localising one or two targets lying within a narrow beam centred around the monopulse antenna boresight. In this study, a new approach is proposed, which uses the outputs of four antennas to rapidly localise two point targets present in the hemisphere. If both targets have the same elevation angle, the proposed scheme cannot detect them. To detect such targets, a second set of antennas is required. In this study, to detect two targets at generalised locations, the antenna array is divided into multiple overlapping sets each of four antennas. Two algorithms are proposed to combine the outputs from multiple sets and improve the detection performance. Simulation results show that the algorithm is able to localise both targets with <2° mean square error in azimuth and elevation.
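A four-antenna monopulse direction estimate, the single-target building block of schemes like the one in this record, can be sketched by phase comparison on a 2x2 square of elements. The element layout, half-wavelength spacing and noise-free signal model below are assumptions for illustration, not the paper's algorithm:

```python
import numpy as np

def monopulse_angles(s_a, s_b, s_c, s_d, d_over_lambda=0.5):
    """Phase-comparison monopulse sketch for a 2x2 antenna square:
    azimuth from the left/right pair phase difference, elevation from
    the top/bottom pair (a: top-left, b: top-right, c: bottom-left,
    d: bottom-right)."""
    phi_az = np.angle((s_b + s_d) * np.conj(s_a + s_c))  # right vs left
    phi_el = np.angle((s_a + s_b) * np.conj(s_c + s_d))  # top vs bottom
    az = np.arcsin(phi_az / (2 * np.pi * d_over_lambda))
    el = np.arcsin(phi_el / (2 * np.pi * d_over_lambda))
    return np.degrees(az), np.degrees(el)

# Simulate one target at 20 deg azimuth, 10 deg elevation.
u, v = np.sin(np.radians(20)), np.sin(np.radians(10))
alpha = 2 * np.pi * 0.5                      # phase per unit spacing
pos = {"a": (0, 1), "b": (1, 1), "c": (0, 0), "d": (1, 0)}
s = {k: np.exp(1j * alpha * (x * u + y * v)) for k, (x, y) in pos.items()}
az, el = monopulse_angles(s["a"], s["b"], s["c"], s["d"])
# az ≈ 20°, el ≈ 10°
```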

  10. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... to supplement Optimism Bias and the associated Reference Class Forecasting (RCF) technique with a new technique that makes use of a scenario-grid. We tentatively introduce and refer to this as Reference Scenario Forecasting (RSF). The final RSF output from the CBA-DK model consists of a set of scenario......-based graphs which function as risk-related decision support for the appraised transport infrastructure project....

  11. Confronting Uncertainty in Life Cycle Assessment Used for Decision Support

    DEFF Research Database (Denmark)

    Herrmann, Ivan Tengbjerg; Hauschild, Michael Zwicky; Sohn, Michael D.

    2014-01-01

    the decision maker (DM) in making the best possible choice for the environment. At present, some DMs do not trust the LCA to be a reliable decisionsupport tool—often because DMs consider the uncertainty of an LCA to be too large. The standard evaluation of uncertainty in LCAs is an ex-post approach that can...... regarding which type of LCA study to employ for the decision context at hand. This taxonomy enables the derivation of an LCA classification matrix to clearly identify and communicate the type of a given LCA. By relating the LCA classification matrix to statistical principles, we can also rank the different......The aim of this article is to help confront uncertainty in life cycle assessments (LCAs) used for decision support. LCAs offer a quantitative approach to assess environmental effects of products, technologies, and services and are conducted by an LCA practitioner or analyst (AN) to support...

  12. Generalised pustular psoriasis, psoriatic arthritis and nephrotic syndrome associated with systemic amyloidosis.

    Science.gov (United States)

    David, M; Abraham, D; Weinberger, A; Feuerman, E J

    1982-09-01

    The case report is presented of a psoriatic patient with arthropathy, generalised pustular psoriasis and nephrotic syndrome, in whom systemic amyloidosis developed. The literature reports 13 cases of psoriasis associated with amyloidosis, 3 of whom suffered from pustular psoriasis as does our case. With the addition of our case, 12 of these 14 had concomitant arthropathy. This seems to suggest that arthritis is an important factor in the appearance of amyloidosis. Rectal biopsy and/or renal biopsy may be helpful in establishing the diagnosis of amyloidosis relatively early in patients with psoriatic arthritis.

  13. Situational and Generalised Conduct Problems and Later Life Outcomes: Evidence from a New Zealand Birth Cohort

    Science.gov (United States)

    Fergusson, David M.; Boden, Joseph M.; Horwood, L. John

    2009-01-01

    Background: There is considerable evidence suggesting that many children show conduct problems that are specific to a given context (home; school). What is less well understood is the extent to which children with situation-specific conduct problems show similar outcomes to those with generalised conduct problems. Methods: Data were gathered as…

  14. Methodology for the assessment of measuring uncertainties of articulated arm coordinate measuring machines

    International Nuclear Information System (INIS)

    Romdhani, Fekria; Hennebelle, François; Ge, Min; Juillion, Patrick; Fontaine, Jean François; Coquet, Richard

    2014-01-01

    Articulated Arm Coordinate Measuring Machines (AACMMs) have gradually evolved and are increasingly used in mechanical industry. At present, measurement uncertainties relating to the use of these devices are not yet well quantified. The work carried out consists of determining the measurement uncertainties of a mechanical part measured by an AACMM. The studies aiming to develop a model of measurement uncertainty are based on the Monte Carlo method developed in Supplement 1 of the Guide to the Expression of Uncertainty in Measurement [1], but also on identifying and characterizing the main sources of uncertainty. A multi-level Monte Carlo approach has been developed which allows for characterizing the possible evolution of the AACMM during the measurement and, at a second level, for quantifying the uncertainty of the considered measurand. The first Monte Carlo level is the most complex and is thus divided into three sub-levels, namely characterization of the positioning error of a point, estimation of calibration errors and evaluation of fluctuations of the ‘localization point’. The global method is thus presented and the results of the first sub-level are particularly developed. The main sources of uncertainty, including AACMM deformations, are exposed. (paper)

  15. Application of fuzzy system theory in addressing the presence of uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Yusmye, A. Y. N. [Institute of Engineering Mathematics, Universiti Malaysia Perlis Kampus Pauh Putra, 02600, Arau, Perlis (Malaysia); Goh, B. Y.; Adnan, N. F.; Ariffin, A. K. [Department of Mechanical and Materials, Faculty of Engineering and Built Environment Universiti Kebangsaan Malaysia 43600 UKM Bangi, Selangor (Malaysia)

    2015-02-03

    In this paper, combinations of fuzzy system theory with finite element methods are presented and discussed as a way to deal with uncertainties. Addressing the presence of uncertainties is necessary to prevent failure of the material in engineering. There are three types of uncertainty: stochastic, epistemic and error uncertainties. In this paper, epistemic uncertainties have been considered. Epistemic uncertainty exists as a result of incomplete information and lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when dealing with a lack of data. Fuzzy system theory contains a number of processes, starting from converting the crisp input to a fuzzy input through a fuzzification process, followed by the main process, known as the mapping process. The term mapping here means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, the defuzzification process is implemented. Defuzzification is an important process that allows the conversion of fuzzy outputs to crisp outputs. Several illustrative examples are given, and the simulation results showed that the proposed method produces more conservative results compared with the conventional finite element method.
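The extension-principle propagation mentioned in this record can be illustrated with alpha-cut interval arithmetic on a triangular fuzzy number, followed by a simple centroid-style defuzzification. The triangular input, the linear response function and the weighting rule are illustrative choices, not the paper's model:

```python
import numpy as np

def alpha_cuts(a, b, c, levels):
    """Alpha-cut intervals of a triangular fuzzy number (a, b, c)."""
    return [(a + al * (b - a), c - al * (c - b)) for al in levels]

def extend(f, cuts):
    """Extension principle via interval evaluation: propagate each
    alpha-cut through f (assumed monotone on each interval)."""
    return [tuple(sorted((f(lo), f(hi)))) for lo, hi in cuts]

def defuzzify(cuts, levels):
    """Centroid-style defuzzification from alpha-cut midpoints,
    weighted by membership level (one simple rule among several)."""
    mids = np.array([(lo + hi) / 2 for lo, hi in cuts])
    w = np.array(levels)
    return float((mids * w).sum() / w.sum())

levels = np.linspace(0.01, 1.0, 50)
cuts = alpha_cuts(9.0, 10.0, 12.0, levels)      # fuzzy input, e.g. a load
out = extend(lambda x: 2.0 * x + 1.0, cuts)     # fuzzy response
crisp = defuzzify(out, levels)
```

For a linear response, defuzzifying the output is the same as applying the response to the defuzzified input, which is a useful sanity check on any implementation.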

  16. Application of fuzzy system theory in addressing the presence of uncertainties

    International Nuclear Information System (INIS)

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.

    2015-01-01

    In this paper, combinations of fuzzy system theory with finite element methods are presented and discussed as a way to deal with uncertainties. Addressing the presence of uncertainties is necessary to prevent failure of the material in engineering. There are three types of uncertainty: stochastic, epistemic and error uncertainties. In this paper, epistemic uncertainties have been considered. Epistemic uncertainty exists as a result of incomplete information and lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when dealing with a lack of data. Fuzzy system theory contains a number of processes, starting from converting the crisp input to a fuzzy input through a fuzzification process, followed by the main process, known as the mapping process. The term mapping here means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, the defuzzification process is implemented. Defuzzification is an important process that allows the conversion of fuzzy outputs to crisp outputs. Several illustrative examples are given, and the simulation results showed that the proposed method produces more conservative results compared with the conventional finite element method.

  17. Monte Carlo approaches for uncertainty quantification of criticality for system dimensions

    International Nuclear Information System (INIS)

    Kiedrowski, B.C.; Brown, F.B.

    2013-01-01

    One of the current challenges in nuclear engineering computations is the issue of performing uncertainty analysis for either calculations or experimental measurements. This paper specifically focuses on the issue of estimating the uncertainties arising from geometric tolerances. For this paper, two techniques for uncertainty quantification are studied. The first is the forward propagation technique, which can be thought of as a 'brute force' approach; uncertain system parameters are randomly sampled, the calculation is run, and uncertainties are found from the empirically obtained distribution of results. This approach need make no approximations in principle, but is very computationally expensive. The other approach investigated is the adjoint-based approach; system sensitivities are computed via a single Monte Carlo calculation and those are used with a covariance matrix to provide a linear estimate of the uncertainty. Demonstration calculations are performed with the MCNP6 code for both techniques. The two techniques are tested on two cases: the first case is a solid, bare cylinder of Pu metal while the second case is a can of plutonium nitrate solution. The results show that the forward and adjoint approaches appear to agree in some cases where the responses are not non-linearly correlated. In other cases, the uncertainties in the effective multiplication factor k disagree for reasons not yet known
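The 'brute force' forward-propagation technique is easy to sketch: sample the uncertain dimension, evaluate the response each time, and read the mean and spread off the results. The response function below is a deliberately linear stand-in, not a criticality code; for such a linear response the adjoint-style linear estimate |dk/dr| * sigma_r gives the same spread, which is exactly the agreement the record describes:

```python
import random, statistics

def k_surrogate(radius_cm):
    """Stand-in response for illustration only: a smooth toy function
    of radius, NOT a real criticality model."""
    return 0.95 + 0.01 * (radius_cm - 6.0)

def forward_uq(nominal, tol_sd, n=10000, seed=0):
    """Brute-force forward propagation: sample the uncertain dimension,
    run the 'calculation' each time, summarise the spread of results."""
    rng = random.Random(seed)
    ks = [k_surrogate(rng.gauss(nominal, tol_sd)) for _ in range(n)]
    return statistics.mean(ks), statistics.stdev(ks)

mean_k, sd_k = forward_uq(nominal=6.0, tol_sd=0.05)
# Linear response, so sd_k ~ |dk/dr| * sd_r = 0.01 * 0.05 = 5e-4
```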

  18. Investigation of the cognitive variables associated with worry in children with Generalised Anxiety Disorder and their parents.

    Science.gov (United States)

    Donovan, Caroline L; Holmes, Monique C; Farrell, Lara J

    2016-03-01

    Intolerance of uncertainty (IU), negative beliefs about worry (NBW), positive beliefs about worry (PBW), negative problem orientation (NPO) and cognitive avoidance (CA) have been found to be integral in the conceptualisation of Generalised Anxiety Disorder (GAD) in adults, yet they have rarely been investigated in children with GAD. This study sought (a) to determine whether IU, NBW, PBW, NPO and CA differ between children diagnosed with GAD and non-anxious children and (b) to examine whether IU, NBW, PBW, NPO and CA differ between parents of children diagnosed with GAD and parents of children without an anxiety disorder. Participants were 50 children (aged 7-12 years), plus one of their parents. The 25 GAD children and 25 non-anxious children were matched on age and gender. Parents and children completed clinical diagnostic interviews, as well as a battery of questionnaires measuring worry, IU, NBW, PBW, NPO and CA. Children with GAD endorsed significantly higher levels of worry, IU, NBW, NPO and CA, but not PBW, compared to non-anxious children. Parents of children with GAD did not differ from parents of non-anxious children on any of the variables. The study was limited by its use of modified adult measures for some variables and a lack of heterogeneity in the sample. The cognitive variables of IU, NBW, NPO and CA may be as important in the conceptualisation and treatment of GAD in children as they are in adults. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Estimating incidence from prevalence in generalised HIV epidemics: methods and validation.

    Directory of Open Access Journals (Sweden)

    Timothy B Hallett

    2008-04-01

    Full Text Available HIV surveillance of generalised epidemics in Africa primarily relies on prevalence at antenatal clinics, but estimates of incidence in the general population would be more useful. Repeated cross-sectional measures of HIV prevalence are now becoming available for general populations in many countries, and we aim to develop and validate methods that use these data to estimate HIV incidence. Two methods were developed that decompose observed changes in prevalence between two serosurveys into the contributions of new infections and mortality. Method 1 uses cohort mortality rates, and method 2 uses information on survival after infection. The performance of these two methods was assessed using simulated data from a mathematical model and actual data from three community-based cohort studies in Africa. Comparison with simulated data indicated that these methods can accurately estimate incidence rates and changes in incidence in a variety of epidemic conditions. Method 1 is simple to implement but relies on locally appropriate mortality data, whilst method 2 can make use of the same survival distribution in a wide range of scenarios. The estimates from both methods are within the 95% confidence intervals of almost all actual measurements of HIV incidence in adults and young people, and the patterns of incidence over age are correctly captured. It is possible to estimate incidence from cross-sectional prevalence data with sufficient accuracy to monitor the HIV epidemic. Although these methods will theoretically work in any context, we have been able to test them only in southern and eastern Africa, where HIV epidemics are mature and generalised. The choice of method will depend on the local availability of HIV mortality data.
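
The decomposition idea (prevalence change = new infections minus deaths among the infected) can be written as simple bookkeeping. This is a crude sketch of the method-1-style logic, not the validated estimator from the paper; the function name and the example rates are invented.

```python
def incidence_from_prevalence(p1, p2, dt, mort_infected):
    """Crude incidence estimate between two serosurveys.

    The observed prevalence change p2 - p1 understates new infections,
    because some infected people died during the interval; add those
    deaths back, then divide by susceptible person-years.
    """
    deaths = p1 * mort_infected * dt              # infected lost to mortality
    new_infections = (p2 - p1) + deaths
    susceptible_py = (1.0 - p1) * dt              # approx. person-years at risk
    return new_infections / susceptible_py

# Invented numbers: prevalence 20% -> 22% over 2 years, 5%/yr mortality
print(incidence_from_prevalence(0.20, 0.22, 2.0, 0.05))
```

The sketch makes the paper's central point visible: ignoring mortality (setting `mort_infected` to zero) would halve the estimated incidence in this example.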

  20. Adapting Metacognitive Therapy to Children with Generalised Anxiety Disorder

    DEFF Research Database (Denmark)

    Esbjørn, Barbara Hoff; Normann, Nicoline; Reinholdt-Dunne, Marie Louise

    2015-01-01

    The metacognitive model and therapy have proven to be a promising theory and intervention for emotional disorders in adults. The model has also received empirical support in normal and clinical child samples. The purpose of the present study was to adapt metacognitive therapy to children (MCT-c) with generalised anxiety disorder (GAD) and create suggestions for an adapted manual. The adaptation was based on the structure and techniques used in MCT for adults with GAD. However, the developmental limitations of children were taken into account. For instance, therapy was aided with worksheets and practical exercises and delivered in a group format. Overall, the intervention relied heavily on practising MCT techniques in vivo with therapist assistance. A detailed description of how the manual was adapted for this age group is given, and examples from a group of four children are presented in a case series...

  1. Uncertainties in gas dispersion at the Bruce heavy water plant

    International Nuclear Information System (INIS)

    Alp, E.; Ciccone, A.

    1995-07-01

    There have been concerns regarding the uncertainties in atmospheric dispersion of gases released from the Bruce Heavy Water Plant (BHWP). The concern arises due to the toxic nature of H2S, and its combustion product SO2. In this study, factors that contribute to the uncertainties, such as the effect of the shoreline setting, the potentially heavy gas nature of H2S releases, and concentration fluctuations, have been investigated. The basic physics of each of these issues has been described along with fundamental modelling principles. Recommendations have been provided on available computer models that would be suitable for modelling gas dispersion in the vicinity of the BHWP. (author). 96 refs., 4 tabs., 25 figs

  2. Uncertainties in gas dispersion at the Bruce heavy water plant

    Energy Technology Data Exchange (ETDEWEB)

    Alp, E; Ciccone, A [Concord Environmental Corp., Downsview, ON (Canada)

    1995-07-01

    There have been concerns regarding the uncertainties in atmospheric dispersion of gases released from the Bruce Heavy Water Plant (BHWP). The concern arises due to the toxic nature of H2S, and its combustion product SO2. In this study, factors that contribute to the uncertainties, such as the effect of the shoreline setting, the potentially heavy gas nature of H2S releases, and concentration fluctuations, have been investigated. The basic physics of each of these issues has been described along with fundamental modelling principles. Recommendations have been provided on available computer models that would be suitable for modelling gas dispersion in the vicinity of the BHWP. (author). 96 refs., 4 tabs., 25 figs.

  3. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    error distribution, and the propagation of the uncertainty to the geophysical variable in the CDR accounting for its error correlation properties. Uncertainty estimates can and should be validated as part of CDR validation when possible. These principles are quite general, but the approach to providing uncertainty information appropriate to different ECVs is varied, as confirmed by a brief review across different ECVs in the CCI. User requirements for uncertainty information can conflict with each other, and a variety of solutions and compromises are possible. The concept of an ensemble CDR as a simple means of communicating rigorous uncertainty information to users is discussed. Our review concludes by providing eight concrete recommendations for good practice in providing and communicating uncertainty in EO-based climate data records.
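
One concrete reason error-correlation properties matter to users of a climate data record: the uncertainty of a spatial or temporal average depends strongly on whether per-datum errors are correlated. The sketch below uses the standard formula for the variance of a mean of equicorrelated errors; the numbers are illustrative, not taken from any ECV.

```python
import numpy as np

def mean_uncertainty(u, n, rho):
    """Standard uncertainty of the mean of n values whose errors share
    per-value uncertainty u and pairwise correlation rho:
    var = u**2 * (1 + (n - 1) * rho) / n."""
    return u * np.sqrt((1 + (n - 1) * rho) / n)

u = 0.5  # per-pixel uncertainty, arbitrary units
print(mean_uncertainty(u, 100, 0.0))  # independent errors average down
print(mean_uncertainty(u, 100, 1.0))  # fully correlated: no reduction
```

With independent errors the 100-pixel average is ten times more certain than a single pixel; with fully correlated errors averaging buys nothing, which is why a CDR that reports only per-pixel uncertainties without correlation information can badly mislead users who aggregate.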

  4. Uncertainty evaluation of thickness and warp of a silicon wafer measured by a spectrally resolved interferometer

    Science.gov (United States)

    Praba Drijarkara, Agustinus; Gergiso Gebrie, Tadesse; Lee, Jae Yong; Kang, Chu-Shik

    2018-06-01

    Evaluation of uncertainty of thickness and gravity-compensated warp of a silicon wafer measured by a spectrally resolved interferometer is presented. The evaluation is performed in a rigorous manner, by analysing the propagation of uncertainty from the input quantities through all the steps of measurement functions, in accordance with the ISO Guide to the Expression of Uncertainty in Measurement. In the evaluation, correlation between input quantities as well as uncertainty attributed to thermal effect, which were not included in earlier publications, are taken into account. The temperature dependence of the group refractive index of silicon was found to be nonlinear and varies widely within a wafer and also between different wafers. The uncertainty evaluation described here can be applied to other spectral interferometry applications based on similar principles.
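
The GUM propagation step, including correlation between input quantities, can be sketched generically. The sensitivity coefficients and standard uncertainties below are placeholders, not the paper's wafer budget; the point is the law of propagation itself.

```python
import numpy as np

def combined_uncertainty(c, u, corr):
    """GUM law of propagation of uncertainty:
    u_c = sqrt(c^T V c) with V_ij = corr_ij * u_i * u_j,
    where c holds the sensitivity coefficients of the measurand
    with respect to each input quantity."""
    c = np.asarray(c, float)
    u = np.asarray(u, float)
    V = np.asarray(corr, float) * np.outer(u, u)
    return float(np.sqrt(c @ V @ c))

c = [1.0, 0.5]      # placeholder sensitivities
u = [0.02, 0.01]    # placeholder standard uncertainties
print(combined_uncertainty(c, u, np.eye(2)))          # uncorrelated inputs
print(combined_uncertainty(c, u, [[1, 1], [1, 1]]))   # fully correlated inputs
```

Comparing the two calls shows why the paper's inclusion of input correlations matters: with positive correlation the combined uncertainty is larger than the uncorrelated quadrature sum.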

  5. Decoherence effect on quantum-memory-assisted entropic uncertainty relations

    Science.gov (United States)

    Ming, Fei; Wang, Dong; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2018-01-01

    The uncertainty principle provides a bound on the precision with which any two incompatible observables can be predicted, and thereby plays a nontrivial role in quantum precision measurement. In this work, we observe the dynamical features of the quantum-memory-assisted entropic uncertainty relations (EUR) for a pair of incompatible measurements in an open system characterized by local generalized amplitude damping (GAD) noises. We derive the dynamical evolution of the entropic uncertainty with respect to the measurement affected by the canonical GAD noises when particle A is initially entangled with quantum memory B. Specifically, we examine the dynamics of the EUR in three realistic scenarios: particle A is affected by the environmental (GAD) noise while particle B, as quantum memory, is free from any noise; particle B is affected by the external noise while particle A is not; and both particles suffer from the noises. By analytical methods, it turns out that the uncertainty is not fully determined by the quantum-correlation evolution of the composite system consisting of A and B, but rather by the minimal conditional entropy of the measured subsystem. Furthermore, we present a possible physical interpretation of the behavior of the uncertainty evolution in terms of the mixedness of the observed system; we argue that the uncertainty may be strongly correlated with the systematic mixedness. Finally, we put forward a simple and effective strategy to reduce the measurement uncertainty of interest using quantum partially collapsed measurements. Our explorations might offer insight into the dynamics of the entropic uncertainty relation in a realistic system, and be of importance for quantum precision measurement during quantum information processing.
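
The quantum-memory-assisted EUR (the Berta et al. bound S(Q|B) + S(R|B) >= log2(1/c) + S(A|B)) can be checked numerically for a noiseless two-qubit example. This is a generic sketch, not the paper's GAD computation: for a maximally entangled pair and complementary Pauli measurements both sides of the bound vanish, illustrating how quantum memory removes the uncertainty.

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-(w * np.log2(w)).sum())

def cond_entropy_after_meas(rho_ab, basis):
    """S(X|B) for a projective measurement on qubit A in `basis`
    (columns are basis vectors), keeping the classical outcome and
    B's conditional state as a classical-quantum state."""
    d = 2
    rho_xb = np.zeros((2 * d, 2 * d), dtype=complex)
    for k in range(2):
        v = basis[:, k:k + 1]
        P = np.kron(v @ v.conj().T, np.eye(d))
        block = P @ rho_ab @ P                  # unnormalised post-measurement state
        rb = block.reshape(2, d, 2, d).trace(axis1=0, axis2=2)  # trace out A
        rho_xb[k * d:(k + 1) * d, k * d:(k + 1) * d] = rb
    rho_b = rho_ab.reshape(2, d, 2, d).trace(axis1=0, axis2=2)
    return entropy(rho_xb) - entropy(rho_b)

# Maximally entangled pair; complementary Z and X measurements on A
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho = np.outer(phi, phi)
Z = np.eye(2)                                 # computational basis
X = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard basis
lhs = cond_entropy_after_meas(rho, Z) + cond_entropy_after_meas(rho, X)
rho_b = rho.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)
rhs = 1.0 + (entropy(rho) - entropy(rho_b))   # log2(1/c) = 1 for Z and X
print(lhs, rhs)
```

Adding GAD noise to either qubit (a Kraus-map step not shown here) raises the left-hand side, which is the dynamical effect the paper tracks.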

  6. Investment, regulation, and uncertainty

    Science.gov (United States)

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together, and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of uncertainty regarding regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence which ones might be expected to see an investment decline. PMID:24499745

  7. Visceral obesity and psychosocial stress: a generalised control theory model

    Science.gov (United States)

    Wallace, Rodrick

    2016-07-01

    The linking of control theory and information theory via the Data Rate Theorem and its generalisations allows the construction of necessary-conditions statistical models of body mass regulation in the context of interaction with a complex dynamic environment. By focusing on the stress-related induction of central obesity via failure of HPA axis regulation, we explore the implications for strategies of prevention and treatment. It rapidly becomes evident that individual-centred biomedical reductionism is an inadequate paradigm. Without mitigation of HPA axis or related dysfunctions arising from social pathologies of power imbalance, economic insecurity, and so on, it is unlikely that permanent changes in visceral obesity for individuals can be maintained without constant therapeutic effort - an expensive, and likely unsustainable, public policy.
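
The Data Rate Theorem invoked above has a compact quantitative core: a feedback channel can stabilise a discrete-time linear system x_{k+1} = A x_k + control only if its rate exceeds the sum of log2|lambda| over the unstable eigenvalues of A. The sketch below simply evaluates that threshold; it is the standard statement of the theorem, not the paper's regulatory model.

```python
import numpy as np

def min_data_rate(A):
    """Data Rate Theorem threshold (bits per step): stabilisation
    requires a channel rate R exceeding the sum of log2|lambda|
    over eigenvalues of A with |lambda| > 1."""
    lam = np.linalg.eigvals(np.asarray(A, float))
    return float(sum(np.log2(abs(l)) for l in lam if abs(l) > 1))

A = [[2.0, 0.0], [0.0, 0.5]]   # one unstable mode with eigenvalue 2
print(min_data_rate(A))
```

The paper's argument generalises this: when the "channel" (regulatory feedback, here the HPA axis and its social context) cannot supply enough corrective information per unit time, the controlled variable escapes regulation.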

  8. Study of quantum hadronic states using new optimum principles and new coherent production mechanisms

    International Nuclear Information System (INIS)

    Ion, D. B.; Ion, M. L.; Ion-Mihai, R.

    2002-01-01

    We introduced a new kind of quantum entropy for quantum scattering: the conjugated nonextensivity entropy S_{Jθ̄}(p,q). Using this new kind of nonextensive entropy, we studied the nonextensive quantum scattering states of hadronic interactions. We proved that the probability distributions produced at quantum equilibrium coincide with the optimal distributions given by the principle of minimum distance in the space of quantum scattering states. Using optimal states, we proved new uncertainty relations and new entropic bands. For experimental tests we used the available phase shifts for pion-nucleus and pion-nucleon scattering. Experimental tests of the entropic bands and of the principle of maximum entropy for the conjugated nonextensivity entropy are compared with the entropic bands for the usual entropy of the joint probability S_{Jθ̄}(p), both for pion-nucleus and for pion-nucleon scattering. Our experimental tests proved the existence of a principle of limited entropic uncertainty in hadronic scattering, and showed clearly that quantum elastic scattering is well described by the principle of minimum distance in the space of quantum states. These results provide strong evidence for the nonextensivity of hadronic scattering statistics. (authors)

  9. Uncertainties and quantification of common cause failure rates and probabilities for system analyses

    International Nuclear Information System (INIS)

    Vaurio, Jussi K.

    2005-01-01

    Simultaneous failures of multiple components due to common causes at random times are modelled by constant multiple-failure rates. A procedure is described for quantification of common cause failure (CCF) basic event probabilities for system models using plant-specific and multiple-plant failure-event data. Methodology is presented for estimating CCF-rates from event data contaminated with assessment uncertainties. Generalised impact vectors determine the moments for the rates of individual systems or plants. These moments determine the effective numbers of events and observation times to be input to a Bayesian formalism to obtain plant-specific posterior CCF-rates. The rates are used to determine plant-specific common cause event probabilities for the basic events of explicit fault tree models depending on test intervals, test schedules and repair policies. Three methods are presented to determine these probabilities such that the correct time-average system unavailability can be obtained with single fault tree quantification. Recommended numerical values are given and examples illustrate different aspects of the methodology
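
The Bayesian update behind the plant-specific posterior rates can be illustrated with the standard conjugate gamma-Poisson pair commonly used for constant failure rates. The prior parameters and event counts below are invented, and the paper's actual formalism (impact vectors, moment matching across plants) is considerably more elaborate; this shows only the final conjugate step.

```python
def posterior_ccf_rate(prior_a, prior_b, n_events, obs_time):
    """Gamma(a, b) prior on a constant CCF rate with a Poisson
    likelihood: observing n events in time T yields a
    Gamma(a + n, b + T) posterior. Returns (posterior mean,
    posterior standard deviation) of the rate."""
    a, b = prior_a + n_events, prior_b + obs_time
    return a / b, a ** 0.5 / b

# Invented: vague Gamma(0.5, 10 yr) prior, 2 CCF events in 40 reactor-years
mean, std = posterior_ccf_rate(0.5, 10.0, 2, 40.0)
print(mean, std)
```

The "effective numbers of events and observation times" mentioned in the abstract play exactly the role of `n_events` and `obs_time` here, except that they are moment-matched from generalised impact vectors rather than counted directly.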

  10. Test the principle of maximum entropy in constant sum 2×2 game: Evidence in experimental economics

    International Nuclear Information System (INIS)

    Xu, Bin; Zhang, Hongen; Wang, Zhijian; Zhang, Jianbo

    2012-01-01

    By using laboratory experimental data, we test the uncertainty of strategy type in various competing environments with two-person constant sum 2×2 games in the social system. We first show that, in these competing game environments, the outcome of human decision-making obeys the principle of maximum entropy. -- Highlights: ► Test the uncertainty in two-person constant sum games with experimental data. ► At the game level, the constant sum game fits the principle of maximum entropy. ► At the group level, all empirical entropy values are close to the theoretical maxima. ► The results can differ for games that are not constant sum games.
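
The group-level test amounts to comparing the empirical entropy of observed strategy frequencies against the theoretical maximum, log2(2) for a binary strategy set. The counts below are invented for illustration, not the experimental data.

```python
import numpy as np

def empirical_entropy(counts):
    """Shannon entropy (bits) of an observed strategy distribution."""
    p = np.asarray(counts, float)
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Matching-pennies-like play: near-uniform strategy counts (invented)
counts = [248, 252]
H = empirical_entropy(counts)
H_max = np.log2(2)   # maximum entropy over two pure strategies
print(H, H_max)
```

An empirical entropy close to `H_max` is what the abstract means by the outcome "obeying the principle of maximum entropy": no strategy is systematically favoured in the constant sum setting.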

  11. Uncertainty Analysis of the Temperature–Resistance Relationship of Temperature Sensing Fabric

    Directory of Open Access Journals (Sweden)

    Muhammad Dawood Husain

    2016-11-01

    Full Text Available This paper reports the uncertainty analysis of the temperature–resistance (TR) data of the newly developed temperature sensing fabric (TSF), which is a double-layer knitted structure fabricated on an electronic flat-bed knitting machine, made of polyester as a basal yarn and embedded with fine metallic wire as the sensing element. The measurement principle of the TSF is identical to that of a resistance temperature detector (RTD); that is, change in resistance due to change in temperature. The regression uncertainty (uncertainty within repeats) and repeatability uncertainty (uncertainty among repeats) were estimated by analysing more than 300 TR experimental repeats of 50 TSF samples. The experiments were performed under dynamic heating and cooling environments on a purpose-built test rig within the temperature range of 20–50 °C. The continuous experimental data were recorded through a LabVIEW-based graphical user interface. The results showed that temperature and resistance values were not only repeatable but reproducible, with only minor variations. The regression uncertainty was found to be less than ±0.3 °C; the TSF samples made of Ni and W wires showed regression uncertainty of <±0.13 °C in comparison to Cu-based TSF samples (>±0.18 °C). The cooling TR data showed considerably reduced values (±0.07 °C) of uncertainty in comparison with the heating TR data (±0.24 °C). The repeatability uncertainty was found to be less than ±0.5 °C. By increasing the number of samples and repeats, the uncertainties may be reduced further. The TSF could be used for continuous measurement of the temperature profile on the surface of the human body.
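
The RTD-style measurement principle (resistance rising linearly with temperature) and the notion of regression uncertainty can be sketched with synthetic data. `R0`, `alpha` and the noise level below are invented stand-ins, not the fabric's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic T-R repeats for an RTD-like sensor: R = R0 * (1 + alpha * T)
R0, alpha = 100.0, 0.004                      # invented sensor constants
T = np.tile(np.linspace(20.0, 50.0, 31), 5)   # five repeated sweeps, 20-50 degC
R = R0 * (1 + alpha * T) + rng.normal(0, 0.005, T.size)

# Inverse calibration: predict temperature from measured resistance
slope, intercept = np.polyfit(R, T, 1)
residuals = T - (slope * R + intercept)
u_regression = residuals.std(ddof=2)          # regression uncertainty, degC

print(slope, u_regression)  # slope ~ 1/(R0*alpha) = 2.5 degC/ohm
```

The residual spread of the inverse calibration is the within-repeat "regression uncertainty" of the abstract; repeating the whole sweep and comparing calibrations would give the among-repeat "repeatability uncertainty".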

  12. Lepton mixing predictions including Majorana phases from Δ(6n2) flavour symmetry and generalised CP

    Directory of Open Access Journals (Sweden)

    Stephen F. King

    2014-09-01

    Full Text Available Generalised CP transformations are the only known framework which allows one to predict Majorana phases in a flavour model purely from symmetry. For the first time, generalised CP transformations are investigated for an infinite series of finite groups, Δ(6n2)=(Zn×Zn)⋊S3. In direct models the mixing angles and Dirac CP phase are solely predicted from symmetry. The Δ(6n2) flavour symmetry provides many examples of viable predictions for mixing angles. For all groups the mixing matrix has a trimaximal middle column and the Dirac CP phase is 0 or π. The Majorana phases are predicted from residual flavour and CP symmetries, where α21 can take several discrete values for each n and the Majorana phase α31 is a multiple of π. We discuss constraints on the groups and CP transformations from measurements of the neutrino mixing angles and from neutrinoless double-beta decay, and find that predictions for mixing angles and all phases are accessible to experiments in the near future.

  13. Lepton mixing predictions including Majorana phases from Δ(6n2) flavour symmetry and generalised CP

    International Nuclear Information System (INIS)

    King, Stephen F.; Neder, Thomas

    2014-01-01

    Generalised CP transformations are the only known framework which allows to predict Majorana phases in a flavour model purely from symmetry. For the first time generalised CP transformations are investigated for an infinite series of finite groups, Δ(6n2)=(Zn×Zn)⋊S3. In direct models the mixing angles and Dirac CP phase are solely predicted from symmetry. The Δ(6n2) flavour symmetry provides many examples of viable predictions for mixing angles. For all groups the mixing matrix has a trimaximal middle column and the Dirac CP phase is 0 or π. The Majorana phases are predicted from residual flavour and CP symmetries where α21 can take several discrete values for each n and the Majorana phase α31 is a multiple of π. We discuss constraints on the groups and CP transformations from measurements of the neutrino mixing angles and from neutrinoless double-beta decay and find that predictions for mixing angles and all phases are accessible to experiments in the near future

  14. FROM PHENOMENA AND LAWS OF NATURE TO INITIAL DATA SYMMETRY PRINCIPLES (EXPERIENCE OF RELATIONSHIP OF NATURAL SCIENCE AND THEOLOGY)

    Directory of Open Access Journals (Sweden)

    Victor Nikolaevich Pervushin

    2015-01-01

    Full Text Available The aim of the investigation is to show the role of symmetry principles of the initial data in the formation of a consistent physical theory, in the context of the newest advances in cosmology and elementary particle physics. Methods. Methodological problems of modernity are considered on the basis of a retrospective analysis of physical theories, the history of theology, and the comparison and generalisation of knowledge, facts and positions from the scientific, philosophical and religious spheres. Results and scientific novelty. The problems of consistency and completeness of scientific knowledge, and of the convergence of the content of religious texts with observational scientific data in physics and cosmology, are discussed using the example of modern cosmological models of the Universe. It is argued that such convergence is relevant not only for the classification of physical processes in the Universe, including its origin from vacuum, but also in the area of ontology and in forming the logic of scientific research. Earlier and more recent scientific achievements in physics and cosmology are reinterpreted in the context of Hilbert geometrodynamics, supplemented with a choice of relative standards of length and principles of conformal symmetry. Practical significance. The author sees the further prospect of development of the scientific theory in a priority of conformal symmetry of the totality of the initial research data. In accordance with conformal symmetry, the elementary objects of space-time are twistors, which are mathematically equivalent to qubits, the quantum generalisations of bits as information units. The general theory of knowledge eventually leads to a fundamental theory of information which will probably take the name of quantum informodynamics, by analogy with quantum chromodynamics.

  15. Understanding and applying principles of social cognition and decision making in adaptive environmental governance

    Directory of Open Access Journals (Sweden)

    Daniel A. DeCaro

    2017-03-01

    Full Text Available Environmental governance systems are under greater pressure to adapt and to cope with increased social and ecological uncertainty from stressors like climate change. We review principles of social cognition and decision making that shape and constrain how environmental governance systems adapt. We focus primarily on the interplay between key decision makers in society and legal systems. We argue that adaptive governance must overcome three cooperative dilemmas to facilitate adaptation: (1) encouraging collaborative problem solving, (2) garnering social acceptance and commitment, and (3) cultivating a culture of trust and tolerance for change and uncertainty. However, to do so, governance systems must cope with biases in people's decision making that cloud their judgment and create conflict. These systems must also satisfy people's fundamental needs for self-determination, fairness, and security, ensuring that changes to environmental governance are perceived as legitimate, trustworthy, and acceptable. We discuss the implications of these principles for common governance solutions (e.g., public participation, enforcement) and conclude with methodological recommendations. We outline how scholars can investigate the social cognitive principles involved in cases of adaptive governance.

  16. Multiple periodic-soliton solutions of the (3+1)-dimensional generalised shallow water equation

    Science.gov (United States)

    Li, Ye-Zhou; Liu, Jian-Guo

    2018-06-01

    Based on the extended variable-coefficient homogeneous balance method and two new ansätz functions, we construct auto-Bäcklund transformations and multiple periodic-soliton solutions of the (3+1)-dimensional generalised shallow water equations. Completely new periodic-soliton solutions including periodic cross-kink wave, periodic two-solitary wave and breather type of two-solitary wave are obtained. In addition, cross-kink three-soliton and cross-kink four-soliton solutions are derived. Furthermore, propagation characteristics and interactions of the obtained solutions are discussed and illustrated in figures.

  17. New exact travelling wave solutions of generalised sinh-Gordon and (2+1)-dimensional ZK-BBM equations

    Directory of Open Access Journals (Sweden)

    Sachin Kumar

    2012-10-01

    Full Text Available Exact travelling wave solutions have been established for the generalised sinh-Gordon and generalised (2+1)-dimensional ZK-BBM equations by using the (G′/G)-expansion method, where G = G(ξ) satisfies a second-order linear ordinary differential equation. The travelling wave solutions are expressed by hyperbolic, trigonometric and rational functions.
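
The auxiliary equation at the heart of the (G′/G)-expansion method is the linear ODE G″ + λG′ + μG = 0; its hyperbolic-branch solutions are what produce the hyperbolic travelling waves mentioned above. A quick symbolic check of one branch (assuming SymPy is available):

```python
import sympy as sp

xi, lam, mu = sp.symbols('xi lambda mu', positive=True)

# For lambda**2 - 4*mu > 0 the auxiliary ODE has hyperbolic solutions;
# verify that one such branch satisfies G'' + lam*G' + mu*G = 0 exactly.
r = sp.sqrt(lam**2 - 4*mu)
G = sp.exp(-lam*xi/2) * sp.cosh(r*xi/2)
residual = sp.diff(G, xi, 2) + lam*sp.diff(G, xi) + mu*G
print(sp.simplify(residual))  # -> 0
```

Substituting the ratio G′/G built from such solutions into the target PDE and balancing powers is what yields the hyperbolic, trigonometric (for λ² − 4μ < 0) and rational (for λ² − 4μ = 0) families of travelling waves.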

  18. Marine environmental protection, sustainability and the precautionary principle

    International Nuclear Information System (INIS)

    Johnston, P.; Santillo, D.; Stringer, R.

    1999-01-01

    The global oceans provide a diverse array of ecosystem services which cannot be replaced by technological means and are therefore of potentially infinite value. While valuation of ecosystem services is a useful qualitative metric, unresolved uncertainties limit its application in the regulatory and policy domain. This paper evaluates current human activities in terms of their conformity to four principles of sustainability. Violation of any one of the principles indicates that a given activity is unsustainable and that controlling measures are required. Examples of human uses of the oceans can be evaluated using these principles, taking into account also the transgenerational obligations of the current global population. When three major issues concerning the oceans: Land based activities, fisheries and climatic change are examined in this way, they may easily be shown to be globally unsustainable. It is argued that effective environmental protection can best be achieved through the application of a precautionary approach. (author)

  19. Proceedings of a workshop on dealing with uncertainties in the hydroelectric energy business. CD-ROM ed.

    International Nuclear Information System (INIS)

    2004-01-01

    This workshop was attended by experts in Canadian and international hydroelectric utilities to exchange information on current practices and opportunities for improvement or future cooperation. The discussions focused on reducing the uncertainties associated with hydroelectric power production. Although significant improvements have been made in the efficiency, reliability and safety of hydroelectric power production, the sector is still challenged by the uncertainty of water supply which depends greatly on weather conditions. Energy markets pose another challenge to power producers in terms of energy supply, energy demand and energy prices. The workshop focused on 3 themes: (1) weather and hydrologic uncertainty, (2) market uncertainty, and (3) decision making models using uncertainty principles surrounding water resource planning and operation. The workshop featured 22 presentations of which 11 have been indexed separately for inclusion in this database. refs., tabs., figs

  20. Uncertainty analysis of 137Cs and 90Sr activity in borehole water from a waste disposal site

    International Nuclear Information System (INIS)

    Dafauti, Sunita; Pulhani, Vandana; Datta, D.; Hegde, A.G.

    2005-01-01

    Uncertainty quantification (UQ) is the quantitative characterization and use of uncertainty in experimental applications. There are two distinct types of uncertainty: variability, which can be quantified in principle using classical probability theory, and lack of knowledge, which requires more than classical probability theory for its quantification. Fuzzy set theory was applied to quantify the second type of uncertainty associated with the measurement of activity due to 137Cs and 90Sr present in bore-well water samples from a waste disposal site. The upper and lower limits of concentration were computed, and it may be concluded from the analysis that the alpha-cut technique of fuzzy set theory is a good nonprecise estimator of these types of bounds. (author)
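
The alpha-cut technique referred to above is easy to state: each membership level alpha maps a fuzzy number to a crisp interval whose endpoints are the lower and upper limits at that level. The triangular activity values below are invented for illustration, not the measured 137Cs data.

```python
def alpha_cut(tfn, alpha):
    """Alpha-cut of a triangular fuzzy number (a, b, c): the crisp
    interval of values with membership at least alpha."""
    a, b, c = tfn
    return a + alpha * (b - a), c - alpha * (c - b)

activity = (0.8, 1.0, 1.3)   # invented fuzzy activity concentration, Bq/L
for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(activity, alpha)
    print(f"alpha={alpha:.1f}: [{lo:.2f}, {hi:.2f}] Bq/L")
```

At alpha = 0 the cut returns the full support (the widest, most conservative bounds), and at alpha = 1 it collapses to the single most plausible value, which is how the upper and lower concentration limits in the abstract are read off.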

  1. The precautionary principle as a rational decision criterion; Foere var-prinsippet som rasjonelt beslutningsgrunnlag

    Energy Technology Data Exchange (ETDEWEB)

    Hovi, Jon

    2001-12-01

    The paper asks if the precautionary principle may be seen as a rational decision criterion. Six main questions are discussed. 1. Does the principle basically represent a particular set of political options or is it a genuine decision criterion? 2. If it is the latter, can it be reduced to any of the existing criteria for decision making under uncertainty? 3. In what kinds of situation is the principle applicable? 4. What is the relation between the precautionary principle and other principles for environmental regulation? 5. How plausible is the principle's claim that the burden of proof should be reversed? 6. Do the proponents of environmental regulation carry no burden of proof at all? A main conclusion is that, for now at least, the principle contains too many unclear elements to satisfy the requirements of precision and consistency that should reasonably be satisfied by a rational decision criterion. (author)

  2. Strengthen forensic entomology in court--the need for data exploration and the validation of a generalised additive mixed model.

    Science.gov (United States)

    Baqué, Michèle; Amendt, Jens

    2013-01-01

    Developmental data of juvenile blow flies (Diptera: Calliphoridae) are typically used to calculate the age of immature stages found on or around a corpse and thus to estimate a minimum post-mortem interval (PMI(min)). However, many of those data sets do not take into account that immature blow flies grow in a non-linear fashion. Linear models do not provide sufficiently reliable age estimates and may even lead to an erroneous determination of the PMI(min). In line with the Daubert standard and the need for improvements in forensic science, newer statistical tools such as smoothing methods and mixed models allow the modelling of non-linear relationships and expand the field of statistical analyses. The present study introduces the background and application of these statistical techniques by analysing a model which describes the development of the forensically important blow fly Calliphora vicina at different temperatures. The comparison of three statistical methods (linear regression, generalised additive modelling and generalised additive mixed modelling) clearly demonstrates that only the latter provided regression parameters that reflect the data adequately. We focus explicitly both on the exploration of the data - to assure their quality and to show the importance of checking it carefully prior to conducting the statistical tests - and on the validation of the resulting models. Hence, we present a common method for evaluating and testing forensic entomological data sets by using, for the first time, generalised additive mixed models.
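
The core statistical point, that a straight line systematically misfits non-linear growth, can be seen with a toy comparison. This is not a GAMM (which adds penalised smooths and random effects for repeated measures); it only contrasts a linear fit with a more flexible one on synthetic logistic-growth data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic non-linear development data: larval length vs age (invented)
age = np.linspace(0.0, 10.0, 80)
length = 17.0 / (1.0 + np.exp(-(age - 4.0))) + rng.normal(0, 0.3, age.size)

# Straight line vs a flexible polynomial (a crude stand-in for a smooth)
lin_fit = np.polyval(np.polyfit(age, length, 1), age)
smooth_fit = np.polyval(np.polyfit(age, length, 5), age)

rss_lin = ((length - lin_fit) ** 2).sum()
rss_smooth = ((length - smooth_fit) ** 2).sum()
print(rss_lin, rss_smooth)  # the flexible fit tracks the growth curve far better
```

In a forensic setting the systematic residuals of the linear fit translate directly into biased age, and hence PMI(min), estimates, which is why the paper argues for smooth (and mixed) models instead.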

  3. Test the principle of maximum entropy in constant sum 2×2 game: Evidence in experimental economics

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Bin, E-mail: xubin211@zju.edu.cn [Experimental Social Science Laboratory, Zhejiang University, Hangzhou, 310058 (China); Public Administration College, Zhejiang Gongshang University, Hangzhou, 310018 (China); Zhang, Hongen, E-mail: hongen777@163.com [Department of Physics, Zhejiang University, Hangzhou, 310027 (China); Wang, Zhijian, E-mail: wangzj@zju.edu.cn [Experimental Social Science Laboratory, Zhejiang University, Hangzhou, 310058 (China); Zhang, Jianbo, E-mail: jbzhang08@zju.edu.cn [Department of Physics, Zhejiang University, Hangzhou, 310027 (China)

    2012-03-19

    By using laboratory experimental data, we test the uncertainty of strategy type in various competing environments with two-person constant sum 2×2 games in the social system. The results show that, in these competing game environments, the outcome of human decision-making obeys the principle of maximum entropy. -- Highlights: ► Test the uncertainty in two-person constant sum games with experimental data. ► On the game level, the constant sum game fits the principle of maximum entropy. ► On the group level, all empirical entropy values are close to the theoretical maxima. ► The results can differ for games that are not constant sum.
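The entropy comparison in the highlights can be sketched as follows (with hypothetical choice counts, not the paper's data): compute the empirical Shannon entropy of a player's binary strategy choices and compare it with the theoretical maximum log(2) for a 2×2 game.

```python
# Sketch of the entropy check: empirical entropy of binary strategy
# choices versus the theoretical maximum. Counts are hypothetical.
import math

counts = {"left": 48, "right": 52}       # hypothetical choice frequencies
n = sum(counts.values())
entropy = -sum((c / n) * math.log(c / n) for c in counts.values())
max_entropy = math.log(2)                # uniform mixing over two strategies
# Near-uniform play yields an empirical entropy close to the maximum,
# which is the pattern the abstract reports at the group level.
```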

  4. Schrodinger's Uncertainty Principle?

    Indian Academy of Sciences (India)

    Research Institute, mainly on applications of optical and statistical ... deserves to be better known in the classroom. Let us recall the basic algebraic steps in the textbook proof. We consider the wave function (which has a free real parameter a): (x̂ + iap̂)ψ ≡ xψ(x) + ia(−iℏ∂ψ/∂x) ≡ φ(x). The hat sign over x and p reminds ...

  5. PIV uncertainty quantification by image matching

    International Nuclear Information System (INIS)

    Sciacchitano, Andrea; Scarano, Fulvio; Wieneke, Bernhard

    2013-01-01

    A novel method is presented to quantify the uncertainty of PIV data. The approach is a posteriori, i.e. the unknown actual error of the measured velocity field is estimated using the velocity field itself as input along with the original images. The principle of the method relies on the concept of super-resolution: the image pair is matched according to the cross-correlation analysis and the residual distance between matched particle image pairs (particle disparity vector) due to incomplete match between the two exposures is measured. The ensemble of disparity vectors within the interrogation window is analyzed statistically. The dispersion of the disparity vector returns the estimate of the random error, whereas the mean value of the disparity indicates the occurrence of a systematic error. The validity of the working principle is first demonstrated via Monte Carlo simulations. Two different interrogation algorithms are considered, namely the cross-correlation with discrete window offset and the multi-pass with window deformation. In the simulated recordings, the effects of particle image displacement, its gradient, out-of-plane motion, seeding density and particle image diameter are considered. In all cases good agreement is retrieved, indicating that the error estimator is able to follow the trend of the actual error with satisfactory precision. Experiments where time-resolved PIV data are available are used to prove the concept under realistic measurement conditions. In this case the ‘exact’ velocity field is unknown; however a high accuracy estimate is obtained with an advanced interrogation algorithm that exploits the redundant information of highly temporally oversampled data (pyramid correlation, Sciacchitano et al (2012 Exp. Fluids 53 1087–105)). The image-matching estimator returns the instantaneous distribution of the estimated velocity measurement error. The spatial distribution compares very well with that of the actual error with maxima in the
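The disparity-vector statistics described above can be sketched in a few lines (synthetic disparity values, not the authors' code): within an interrogation window, the mean of the particle disparity vectors estimates the systematic error, while their dispersion quantifies the random error.

```python
# Minimal sketch of the disparity-statistics principle. The bias and
# noise levels below are assumed for illustration only.
import numpy as np

rng = np.random.default_rng(1)
n_pairs = 200                                   # matched particle images
true_bias, true_sigma = 0.05, 0.10              # pixels (assumed)
disparity_x = true_bias + rng.normal(0, true_sigma, n_pairs)

bias_estimate = disparity_x.mean()              # systematic error estimate
# Dispersion of the disparity ensemble -> random error of the mean
random_error = disparity_x.std(ddof=1) / np.sqrt(n_pairs)
```

The actual estimator additionally handles the full 2-D disparity field and the particle-matching step itself; this sketch only shows the statistical reduction within one window.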

  6. Analytical Solution of the Schrödinger Equation with Spatially Varying Effective Mass for Generalised Hylleraas Potential

    International Nuclear Information System (INIS)

    Debnath, S.; Maji, Smarajit; Meyur, Sanjib

    2014-01-01

    We have obtained an exact solution of the effective mass Schrödinger equation for the generalised Hylleraas potential. The exact bound state energy eigenvalues and corresponding eigenfunctions are presented. The bound state eigenfunctions are obtained in terms of hypergeometric functions. Results are also given for special cases of the potential parameters.

  7. Mutations in THAP1 (DYT6) and generalised dystonia with prominent spasmodic dysphonia: a genetic screening study

    DEFF Research Database (Denmark)

    Djarmati, Ana; Schneider, Susanne A; Lohmann, Katja

    2009-01-01

    -onset generalised dystonia with spasmodic dysphonia. This combination of symptoms might be a characteristic feature of DYT6 dystonia and could be useful in the differential diagnosis of DYT1, DYT4, DYT12, and DYT17 dystonia. In addition to the identified mutations, a rare non-coding substitution in THAP1 might...

  8. Optimism in the face of uncertainty supported by a statistically-designed multi-armed bandit algorithm.

    Science.gov (United States)

    Kamiura, Moto; Sano, Kohei

    2017-10-01

    The principle of optimism in the face of uncertainty is known as a heuristic in sequential decision-making problems. The Overtaking method, which is based on this principle, is an effective algorithm for solving multi-armed bandit problems; in previous work it was defined through a set of heuristic formulation patterns. The objective of the present paper is to redefine the value functions of the Overtaking method and to unify their formulation. The unified Overtaking method is associated with upper bounds of confidence intervals of expected rewards on statistics. This unification enhances the universality of the method. As a consequence, we obtain an Overtaking method for exponentially distributed rewards, analyse it numerically, and show that it outperforms the UCB algorithm on average. The present study suggests that, in the context of multi-armed bandit problems, the principle of optimism in the face of uncertainty should be regarded not as a heuristic but as a statistics-based consequence of the law of large numbers for the sample mean of rewards, combined with the estimation of upper bounds of expected rewards. Copyright © 2017 Elsevier B.V. All rights reserved.
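As a point of comparison, the UCB algorithm mentioned above can be sketched in its standard UCB1 form (this is the textbook baseline, not the authors' Overtaking method; arm reward probabilities are hypothetical): each round, play the arm whose estimated mean plus confidence bonus is largest.

```python
# Standard UCB1 sketch: "optimism in the face of uncertainty" via upper
# confidence bounds on estimated mean rewards. Bernoulli arms, assumed.
import math
import random

random.seed(0)
p_arms = [0.3, 0.7]                    # hypothetical reward probabilities
counts = [0] * len(p_arms)
means = [0.0] * len(p_arms)

for t in range(1, 2001):
    if 0 in counts:                    # play each arm once first
        arm = counts.index(0)
    else:                              # optimistic index: mean + bonus
        arm = max(range(len(p_arms)),
                  key=lambda a: means[a] + math.sqrt(2 * math.log(t) / counts[a]))
    reward = 1.0 if random.random() < p_arms[arm] else 0.0
    counts[arm] += 1
    means[arm] += (reward - means[arm]) / counts[arm]
# The better arm (index 1) ends up being played far more often.
```

The paper's point is that such confidence-bound indices follow from the law of large numbers rather than being mere heuristics; the Overtaking method refines how the upper bounds are constructed.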

  9. The quantum moment: how Planck, Bohr, Einstein, and Heisenberg taught us to love uncertainty

    CERN Document Server

    Crease, Robert P

    2014-01-01

    The discovery of the quantum—the idea, born in the early 1900s in a remote corner of physics, that energy comes in finite packets instead of infinitely divisible quantities—planted a rich set of metaphors in the popular imagination. Quantum imagery and language now bombard us like an endless stream of photons. Phrases such as multiverses, quantum leaps, alternate universes, the uncertainty principle, and Schrödinger's cat get reinvented continually in cartoons and movies, coffee mugs and T-shirts, and fiction and philosophy, reinterpreted by each new generation of artists and writers. Is a "quantum leap" big or small? How uncertain is the uncertainty principle? Is this barrage of quantum vocabulary pretentious and wacky, or a fundamental shift in the way we think? All the above, say Robert P. Crease and Alfred Scharff Goldhaber in this pathbreaking book. The authors—one a philosopher, the other a physicist—draw on their training and six years of co-teaching to dramatize the quantum’s rocky path f...

  10. Understanding and applying principles of social cognition and decision making in adaptive environmental governance

    Science.gov (United States)

    Environmental governance systems are under greater pressure to adapt and to cope with increased social and ecological uncertainty from stressors like climate change. We review principles of social cognition and decision making that shape and constrain how environmental governance...

  11. Superstatistical generalised Langevin equation: non-Gaussian viscoelastic anomalous diffusion

    Science.gov (United States)

    Ślęzak, Jakub; Metzler, Ralf; Magdziarz, Marcin

    2018-02-01

    Recent advances in single particle tracking and supercomputing techniques demonstrate the emergence of normal or anomalous, viscoelastic diffusion in conjunction with non-Gaussian distributions in soft, biological, and active matter systems. We here formulate a stochastic model based on a generalised Langevin equation in which non-Gaussian shapes of the probability density function and normal or anomalous diffusion have a common origin, namely a random parametrisation of the stochastic force. We perform a detailed analysis demonstrating how various types of parameter distributions for the memory kernel result in exponential, power law, or power-log law tails of the memory functions. The studied system is also shown to exhibit a further unusual property: the velocity has a Gaussian one point probability density but non-Gaussian joint distributions. This behaviour is reflected in the relaxation from a Gaussian to a non-Gaussian distribution observed for the position variable. We show that our theoretical results are in excellent agreement with stochastic simulations.

  12. Principle of minimum distance in space of states as new principle in quantum physics

    International Nuclear Information System (INIS)

    Ion, D. B.; Ion, M. L. D.

    2007-01-01

    The mathematician Leonhard Euler (1707-1783) appears to have been a philosophical optimist, having written: 'Since the fabric of the universe is most perfect and is the work of a most wise Creator, nothing whatsoever takes place in the universe in which some relation of maximum and minimum does not appear. Wherefore there is absolutely no doubt that every effect in the universe can be explained as satisfactorily from final causes, by the aid of the method of maxima and minima, as it can from the effective causes themselves.' With this kind of optimism in mind, in the papers mentioned in this work we introduced and investigated the possibility of constructing a predictive analytic theory of elementary particle interactions based on the principle of minimum distance in the space of quantum states (PMD-SQS). Choosing the partial transition amplitudes as the system's variational variables and the distance in the space of quantum states as a measure of the system's effectiveness, we obtained the results presented in this paper. These results prove that the principle of minimum distance in the space of quantum states (PMD-SQS) can be chosen as a variational principle from which the analytic expressions of the partial transition amplitudes can be found. In this paper we present a description of hadron-hadron scattering via the PMD-SQS principle, where the distance in the space of states is minimised subject to two directional constraints: dσ/dΩ(±1) = fixed. Then, by using the available experimental (pion-nucleon and kaon-nucleon) phase shifts, we obtained not only consistent experimental tests of the PMD-SQS optimality, but also strong experimental evidence for new principles in hadronic physics, such as a principle of nonextensivity conjugation via the Riesz-Thorin relation (1/2p + 1/2q = 1) and a new principle of limited uncertainty in nonextensive quantum physics. The strong experimental evidence obtained here for the nonextensive statistical behavior of the [J,

  13. Generalised universality of gauge thresholds in heterotic vacua with and without supersymmetry

    CERN Document Server

    Angelantonj, Carlo; Tsulaia, Mirian

    2015-01-01

    We study one-loop quantum corrections to gauge couplings in heterotic vacua with spontaneous supersymmetry breaking. Although in non-supersymmetric constructions these corrections are not protected and are typically model dependent, we show how a universal behaviour of threshold differences, typical of supersymmetric vacua, may still persist. We formulate specific conditions on the way supersymmetry should be broken for this to occur. Our analysis implies a generalised notion of threshold universality even in the case of unbroken supersymmetry, whenever extra charged massless states appear at enhancement points in the bulk of moduli space. Several examples with universality, including non-supersymmetric chiral models in four dimensions, are presented.

  14. H∞ state estimation of generalised neural networks with interval time-varying delays

    Science.gov (United States)

    Saravanakumar, R.; Syed Ali, M.; Cao, Jinde; Huang, He

    2016-12-01

    This paper focuses on studying the H∞ state estimation of generalised neural networks with interval time-varying delays. The integral terms in the time derivative of the Lyapunov-Krasovskii functional are handled by the Jensen's inequality, reciprocally convex combination approach and a new Wirtinger-based double integral inequality. A delay-dependent criterion is derived under which the estimation error system is globally asymptotically stable with H∞ performance. The proposed conditions are represented by linear matrix inequalities. Optimal H∞ norm bounds are obtained easily by solving convex problems in terms of linear matrix inequalities. The advantage of employing the proposed inequalities is illustrated by numerical examples.

  15. Generalisation of the test theory of special relativity to non-inertial frames

    International Nuclear Information System (INIS)

    Abolghasem, G.H.; Khajehpour, M.R.H.; Mansouri, R.

    1989-01-01

    We present a generalised test theory of special relativity, using a non-inertial frame. Within the framework of the special theory of relativity the transport and Einstein synchronisations are equivalent on a rigidly rotating disc. But in any theory with a preferred frame, such an equivalence does not hold. The time difference resulting from the two synchronisation procedures is a measurable quantity within the reach of existing clock systems on the Earth. The final result contains a term which depends on the angular velocity of the rotating system, and hence measures an absolute effect. This term is of crucial importance in our test theory of special relativity. (Author)

  16. Principle of accelerator mass spectrometry

    International Nuclear Information System (INIS)

    Matsuzaki, Hiroyuki

    2007-01-01

    The principle of accelerator mass spectrometry (AMS) is described, mainly in its technical aspects: the hardware construction of AMS, the measurement of isotope ratios, the sensitivity (detection limit) and accuracy of measurement, and the application of the data. The content may be summarized as follows: a rare isotope (often a long-lived radioactive isotope) can be detected through various uses of the ion energy gained by accelerating the ions; the measurable quantity is the ratio of the rare isotope to the abundant isotopes; and a measured isotope ratio carries uncertainty relative to the true value. These facts must be kept in mind when applying AMS data to research. (M.H.)

  17. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2012-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... to supplement Optimism Bias and the associated Reference Class Forecasting (RCF) technique with a new technique that makes use of a scenario-grid. We tentatively introduce and refer to this as Reference Scenario Forecasting (RSF). The final RSF output from the CBA-DK model consists of a set of scenario......-based graphs which functions as risk-related decision support for the appraised transport infrastructure project. The presentation of RSF is demonstrated by using an appraisal case concerning a new airfield in the capital of Greenland, Nuuk....
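The Optimism Bias idea above can be illustrated with a toy Monte Carlo sketch (hypothetical numbers, not the CBA-DK model or its scenario grid): uplift the appraised cost to correct for historical underestimation, then propagate benefit uncertainty into a distribution of benefit-cost ratios.

```python
# Illustrative sketch only: optimism-bias cost uplift plus Monte Carlo
# benefit uncertainty. All figures and distributions are hypothetical.
import random

random.seed(2)
base_cost = 100.0                  # appraised cost, hypothetical units
uplift = 0.40                      # assumed optimism-bias uplift (40%)
cost = base_cost * (1 + uplift)    # corrected cost estimate

ratios = []
for _ in range(10_000):
    benefit = random.gauss(160.0, 25.0)   # assumed benefit distribution
    ratios.append(benefit / cost)

mean_bcr = sum(ratios) / len(ratios)                   # mean benefit-cost ratio
p_viable = sum(r > 1.0 for r in ratios) / len(ratios)  # P(project pays off)
```

A scenario-grid approach like the RSF described above would replace the single benefit distribution with several discrete scenarios, each with its own distributional assumptions.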

  18. Arthropathy in long-term cured acromegaly is characterised by osteophytes without joint space narrowing: a comparison with generalised osteoarthritis

    NARCIS (Netherlands)

    Wassenaar, M. J. E.; Biermasz, N. R.; Bijsterbosch, J.; Pereira, A. M.; Meulenbelt, I.; Smit, J. W. A.; Roelfsema, F.; Kroon, H. M.; Romijn, J. A.; Kloppenburg, M.

    2011-01-01

    To compare the distribution of osteophytes and joint space narrowing (JSN) between patients with acromegaly and primary generalised osteoarthritis to gain insight into the pathophysiological process of growth hormone (GH) and insulin-like growth factor type I (IGF-I)-mediated osteoarthritis. We

  19. A precautionary principle for dual use research in the life sciences.

    Science.gov (United States)

    Kuhlau, Frida; Höglund, Anna T; Evers, Kathinka; Eriksson, Stefan

    2011-01-01

    Most life science research entails dual-use complexity and may be misused for harmful purposes, e.g. biological weapons. The Precautionary Principle applies to special problems characterized by complexity in the relationship between human activities and their consequences. This article examines whether the principle, so far mainly used in environmental and public health issues, is applicable and suitable to the field of dual-use life science research. Four central elements of the principle are examined: threat, uncertainty, prescription and action. Although charges against the principle exist - for example that it stifles scientific development, lacks practical applicability and is poorly defined and vague - the analysis concludes that a Precautionary Principle is applicable to the field. Certain factors such as credibility of the threat, availability of information, clear prescriptive demands on responsibility and directives on how to act, determine the suitability and success of a Precautionary Principle. Moreover, policy-makers and researchers share a responsibility for providing and seeking information about potential sources of harm. A central conclusion is that the principle is meaningful and useful if applied as a context-dependent moral principle and allowed flexibility in its practical use. The principle may then inspire awareness-raising and the establishment of practical routines which appropriately reflect the fact that life science research may be misused for harmful purposes. © 2009 Blackwell Publishing Ltd.

  20. QCD amplitudes with 2 initial spacelike legs via generalised BCFW recursion

    Energy Technology Data Exchange (ETDEWEB)

    Kutak, Krzysztof; Hameren, Andreas van; Serino, Mirko [The H. Niewodniczański Institute of Nuclear Physics, Polish Academy of Sciences, ul. Radzikowskiego 152, 31-342, Cracow (Poland)

    2017-02-02

    We complete the generalisation of the BCFW recursion relation to the off-shell case, allowing for the computation of tree level scattering amplitudes for full High Energy Factorisation (HEF), i.e. with both incoming partons having a non-vanishing transverse momentum. We provide explicit results for color-ordered amplitudes with two off-shell legs in massless QCD up to 4 point, continuing the program begun in two previous papers. For the 4-fermion amplitudes, which are not BCFW-recursible, we perform a diagrammatic computation, so as to offer a complete set of expressions. We explicitly show and discuss some plots of the squared 2→2 matrix elements as functions of the differences in rapidity and azimuthal angle of the final state particles.

  1. Time-Varying Uncertainty in Shock and Vibration Applications Using the Impulse Response

    Directory of Open Access Journals (Sweden)

    J.B. Weathers

    2012-01-01

    Design of mechanical systems often necessitates the use of dynamic simulations to calculate the displacements (and their derivatives) of the bodies in a system as a function of time in response to dynamic inputs. These types of simulations are especially prevalent in the shock and vibration community, where simulations associated with models having complex inputs are routine. If the forcing functions as well as the parameters used in these simulations are subject to uncertainties, then these uncertainties will propagate through the models, resulting in uncertainties in the outputs of interest. The uncertainty analysis procedure for these kinds of time-varying problems can be challenging, and in many instances explicit data reduction equations (DREs), i.e., analytical formulas, are not available because the outputs of interest are obtained from complex simulation software, e.g. FEA programs. Moreover, uncertainty propagation in systems modeled using nonlinear differential equations can prove difficult to analyze. However, if (1) the uncertainties propagate through the models in a linear manner, obeying the principle of superposition, then the complexity of the problem can be significantly simplified. If, in addition, (2) the uncertainty in the model parameters does not change during the simulation and the manner in which the outputs of interest respond to small perturbations in the external input forces does not depend on when the perturbations are applied, then the number of calculations required can be greatly reduced. Conditions (1) and (2) characterize a Linear Time Invariant (LTI) uncertainty model. This paper seeks to explain one possible approach to obtain the uncertainty results based on these assumptions.
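The superposition property that conditions (1) and (2) rely on can be demonstrated numerically (hypothetical system, not the paper's model): for an LTI system, the response to a perturbed input equals the nominal response plus the response to the perturbation alone.

```python
# Sketch of the LTI superposition idea. The impulse response and input
# below are assumed for illustration only.
import numpy as np

rng = np.random.default_rng(3)
n = 200
t = np.arange(n)
h = np.exp(-t / 20.0)                       # assumed impulse response

def respond(u):
    """Discrete convolution of input u with impulse response h."""
    return np.convolve(u, h)[:n]

u_nominal = np.sin(2 * np.pi * t / 50)      # nominal forcing
du = rng.normal(0, 0.05, n)                 # input perturbation/uncertainty

y_direct = respond(u_nominal + du)
y_superposed = respond(u_nominal) + respond(du)
# For an LTI model the two agree to machine precision, so the effect of
# input uncertainty can be computed once from the impulse response.
```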

  2. libmpdata++ 1.0: a library of parallel MPDATA solvers for systems of generalised transport equations

    Science.gov (United States)

    Jaruga, A.; Arabas, S.; Jarecka, D.; Pawlowska, H.; Smolarkiewicz, P. K.; Waruszewski, M.

    2015-04-01

    This paper accompanies the first release of libmpdata++, a C++ library implementing the multi-dimensional positive-definite advection transport algorithm (MPDATA) on regular structured grid. The library offers basic numerical solvers for systems of generalised transport equations. The solvers are forward-in-time, conservative and non-linearly stable. The libmpdata++ library covers the basic second-order-accurate formulation of MPDATA, its third-order variant, the infinite-gauge option for variable-sign fields and a flux-corrected transport extension to guarantee non-oscillatory solutions. The library is equipped with a non-symmetric variational elliptic solver for implicit evaluation of pressure gradient terms. All solvers offer parallelisation through domain decomposition using shared-memory parallelisation. The paper describes the library programming interface, and serves as a user guide. Supported options are illustrated with benchmarks discussed in the MPDATA literature. Benchmark descriptions include code snippets as well as quantitative representations of simulation results. Examples of applications include homogeneous transport in one, two and three dimensions in Cartesian and spherical domains; a shallow-water system compared with analytical solution (originally derived for a 2-D case); and a buoyant convection problem in an incompressible Boussinesq fluid with interfacial instability. All the examples are implemented out of the library tree. Regardless of the differences in the problem dimensionality, right-hand-side terms, boundary conditions and parallelisation approach, all the examples use the same unmodified library, which is a key goal of libmpdata++ design. The design, based on the principle of separation of concerns, prioritises the user and developer productivity. The libmpdata++ library is implemented in C++, making use of the Blitz++ multi-dimensional array containers, and is released as free/libre and open-source software.
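libmpdata++ itself is C++; as a hedged illustration of the class of schemes it implements, here is a minimal first-order upwind (donor-cell) step in Python, the building block that MPDATA iteratively corrects, for 1-D constant-velocity transport on a periodic grid. Grid size and Courant number are hypothetical.

```python
# Donor-cell upwind advection sketch (NOT MPDATA's corrective iterations):
# conservative and positive-definite for 0 < C <= 1, though diffusive.
import numpy as np

nx, courant = 50, 0.5                 # cells, C = u*dt/dx (assumed)
psi = np.zeros(nx)
psi[10:20] = 1.0                      # initial rectangular signal

def upwind_step(psi, c):
    """One donor-cell step for velocity > 0, periodic boundaries."""
    return psi - c * (psi - np.roll(psi, 1))

for _ in range(40):
    psi = upwind_step(psi, courant)
# The signal drifts rightwards; total mass is conserved exactly and no
# negative values appear, the properties MPDATA builds on.
```

MPDATA improves on this first-order step by re-advecting with antidiffusive pseudo-velocities to recover second-order accuracy while keeping positivity.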

  3. libmpdata++ 0.1: a library of parallel MPDATA solvers for systems of generalised transport equations

    Science.gov (United States)

    Jaruga, A.; Arabas, S.; Jarecka, D.; Pawlowska, H.; Smolarkiewicz, P. K.; Waruszewski, M.

    2014-11-01

    This paper accompanies the first release of libmpdata++, a C++ library implementing the Multidimensional Positive-Definite Advection Transport Algorithm (MPDATA). The library offers basic numerical solvers for systems of generalised transport equations. The solvers are forward-in-time, conservative and non-linearly stable. The libmpdata++ library covers the basic second-order-accurate formulation of MPDATA, its third-order variant, the infinite-gauge option for variable-sign fields and a flux-corrected transport extension to guarantee non-oscillatory solutions. The library is equipped with a non-symmetric variational elliptic solver for implicit evaluation of pressure gradient terms. All solvers offer parallelisation through domain decomposition using shared-memory parallelisation. The paper describes the library programming interface, and serves as a user guide. Supported options are illustrated with benchmarks discussed in the MPDATA literature. Benchmark descriptions include code snippets as well as quantitative representations of simulation results. Examples of applications include: homogeneous transport in one, two and three dimensions in Cartesian and spherical domains; a shallow-water system compared with analytical solution (originally derived for a 2-D case); and a buoyant convection problem in an incompressible Boussinesq fluid with interfacial instability. All the examples are implemented out of the library tree. Regardless of the differences in the problem dimensionality, right-hand-side terms, boundary conditions and parallelisation approach, all the examples use the same unmodified library, which is a key goal of libmpdata++ design. The design, based on the principle of separation of concerns, prioritises the user and developer productivity. The libmpdata++ library is implemented in C++, making use of the Blitz++ multi-dimensional array containers, and is released as free/libre and open-source software.

  4. Precautionary principle, economic and energy systems and social equity

    International Nuclear Information System (INIS)

    Carvalho, Joaquim Francisco de; Mercedes, Sonia Seger P.; Sauer, Ildo L.

    2010-01-01

    In this paper the precautionary principle is reviewed alongside the process of international implementation. Adoption of the precautionary principle is advocated to deal with energy choices as a mechanism to account for potential climate change impacts, notwithstanding the debate on scientific uncertainty on the links between solar activity, greenhouse gas concentration and climate. However, it is also recognized that the widespread application of the precautionary principle to energy choices does not seem to be taking place in the real world. Relevant concrete barriers are identified stemming from the intrinsic logic governing the hegemonic economic system, driving the energy choices by economic surplus and rent generation potential, the existence of social asymmetries inside and among societies as well as by the absence of democratic global governance mechanisms, capable of dealing with climate change issues. Such perception seems to have been reinforced by the outcome of the United Nations Climate Change Conference, held in Copenhagen in December 2009.

  5. Non-intrusive uncertainty quantification of computational fluid dynamics simulations: notes on the accuracy and efficiency

    Science.gov (United States)

    Zimoń, Małgorzata; Sawko, Robert; Emerson, David; Thompson, Christopher

    2017-11-01

    Uncertainty quantification (UQ) is increasingly becoming an indispensable tool for assessing the reliability of computational modelling. Efficient handling of stochastic inputs, such as boundary conditions, physical properties or geometry, increases the utility of model results significantly. We discuss the application of non-intrusive generalised polynomial chaos techniques in the context of fluid engineering simulations. Deterministic and Monte Carlo integration rules are applied to a set of problems, including ordinary differential equations and the computation of aerodynamic parameters subject to random perturbations. In particular, we analyse acoustic wave propagation in a heterogeneous medium to study the effects of mesh resolution, transients, number and variability of stochastic inputs. We consider variants of multi-level Monte Carlo and perform a novel comparison of the methods with respect to numerical and parametric errors, as well as computational cost. The results provide a comprehensive view of the necessary steps in UQ analysis and demonstrate some key features of stochastic fluid flow systems.
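The non-intrusive methods compared above can be sketched on a toy problem (a hypothetical stand-in model, not the paper's CFD cases): estimate the mean of a model output f(X) with X ~ N(0, 1) both by Monte Carlo sampling and by deterministic Gauss-Hermite quadrature, the rule underlying non-intrusive polynomial chaos for Gaussian inputs.

```python
# Sketch of non-intrusive UQ: Monte Carlo versus Gauss-Hermite
# quadrature for E[f(X)], X ~ N(0, 1). The model f is a stand-in.
import numpy as np

f = lambda x: np.exp(0.3 * x)          # hypothetical "expensive" model

# Monte Carlo estimate: many random model evaluations
rng = np.random.default_rng(4)
mc_mean = f(rng.normal(0.0, 1.0, 100_000)).mean()

# Gauss-Hermite quadrature: few deterministic evaluations
# (change of variables x = sqrt(2) * y for the Gaussian weight)
nodes, weights = np.polynomial.hermite.hermgauss(10)
quad_mean = (weights * f(np.sqrt(2) * nodes)).sum() / np.sqrt(np.pi)

exact = np.exp(0.3**2 / 2)             # E[e^{aX}] = e^{a^2/2}
```

For smooth responses the 10-point quadrature is far more accurate per model evaluation than Monte Carlo, which is the efficiency trade-off the abstract discusses.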

  6. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." -Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  7. Uncertainty estimation of a complex water quality model: The influence of Box-Cox transformation on Bayesian approaches and comparison with a non-Bayesian method

    Science.gov (United States)

    Freni, Gabriele; Mannina, Giorgio

    In urban drainage modelling, uncertainty analysis is of undoubted necessity. However, uncertainty analysis in urban water-quality modelling is still in its infancy and only a few studies have been carried out. Therefore, several methodological aspects still need to be examined and clarified, especially regarding water quality modelling. The use of the Bayesian approach for uncertainty analysis has been stimulated by its rigorous theoretical framework and by the possibility of evaluating the impact of new knowledge on the modelling predictions. Nevertheless, the Bayesian approach relies on some restrictive hypotheses that are not present in less formal methods like the Generalised Likelihood Uncertainty Estimation (GLUE). One crucial point in the application of the Bayesian method is the formulation of a likelihood function, which is conditioned by the hypotheses made regarding the model residuals. Statistical transformations, such as the Box-Cox equation, are generally used to ensure the homoscedasticity of residuals. However, this practice may affect the reliability of the analysis, leading to a wrong uncertainty estimation. The present paper aims to explore the influence of the Box-Cox equation for environmental water quality models. To this end, five cases were considered, one of which used the "real" residual distribution (i.e. drawn from available data). The analysis was applied to the Nocella experimental catchment (Italy), an agricultural and semi-urbanised basin where two sewer systems, two wastewater treatment plants and a river reach were monitored during both dry and wet weather periods. The results show that the uncertainty estimation is greatly affected by the residual transformation, and a wrong assumption may also affect the evaluation of model uncertainty. The use of less formal methods always provides an overestimation of modelling uncertainty with respect to the Bayesian method, but this effect is reduced if a wrong assumption is made regarding the
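The Box-Cox transformation at the centre of the abstract is a one-parameter power transform; a minimal sketch (hypothetical data, not the study's residuals) with its inverse:

```python
# Box-Cox power transform, used to stabilise residual variance before
# formulating a likelihood; reduces to log(y) in the limit lam -> 0.
import math

def boxcox(y, lam):
    """Box-Cox transform of y > 0 with parameter lam."""
    return math.log(y) if lam == 0 else (y**lam - 1.0) / lam

def boxcox_inverse(z, lam):
    """Inverse transform, recovering y from z."""
    return math.exp(z) if lam == 0 else (lam * z + 1.0) ** (1.0 / lam)

y = 4.0
z = boxcox(y, 0.5)          # lam = 0.5 gives 2*(sqrt(y) - 1) = 2.0
y_back = boxcox_inverse(z, 0.5)
```

The paper's concern is precisely how the choice of lam (and the homoscedasticity assumption it encodes) propagates into the Bayesian uncertainty bounds.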

  8. The linear stability of the Schwarzschild solution to gravitational perturbations in the generalised wave gauge

    OpenAIRE

    Johnson, Thomas

    2018-01-01

    In a recent seminal paper \\cite{D--H--R} of Dafermos, Holzegel and Rodnianski the linear stability of the Schwarzschild family of black hole solutions to the Einstein vacuum equations was established by imposing a double null gauge. In this paper we shall prove that the Schwarzschild family is linearly stable as solutions to the Einstein vacuum equations by imposing instead a generalised wave gauge: all sufficiently regular solutions to the system of equations that result from linearising the...

  9. Can Caring Create Prejudice? An Investigation of Positive and Negative Intergenerational Contact in Care Settings and the Generalisation of Blatant and Subtle Age Prejudice to Other Older People.

    Science.gov (United States)

    Drury, Lisbeth; Abrams, Dominic; Swift, Hannah J; Lamont, Ruth A; Gerocova, Katarina

    2017-01-01

    Caring is a positive social act, but can it result in negative attitudes towards those cared for, and towards others from their wider social group? Based on intergroup contact theory, we tested whether care workers' (CWs) positive and negative contact with old-age care home residents (CHRs) predicts prejudiced attitudes towards that group, and whether this generalises to other older people. Fifty-six CWs were surveyed about their positive and negative contact with CHRs and their blatant and subtle attitudes (humanness attributions) towards CHRs and older adults. We tested indirect paths from contact with CHRs to attitudes towards older adults via attitudes towards CHRs. Results showed that neither positive nor negative contact generalised to blatant ageism. However, the effect of negative, but not positive, contact on the denial of humanness to CHRs generalised to subtle ageism towards older adults. This evidence has practical implications for management of CWs' work experiences and theoretical implications, suggesting that negative contact with a subgroup generalises the attribution of humanness to superordinate groups. Because it is difficult to identify and challenge subtle prejudices such as dehumanisation, it may be especially important to reduce negative contact. © 2016 The Authors. Journal of Community & Applied Social Psychology Published by John Wiley & Sons Ltd.

  10. Space-time uncertainty and approaches to D-brane field theory

    International Nuclear Information System (INIS)

    Yoneya, Tamiaki

    2008-01-01

    In connection with the space-time uncertainty principle which gives a simple qualitative characterization of non-local or non-commutative nature of short-distance space-time structure in string theory, the author's recent approaches toward field theories for D-branes are briefly outlined, putting emphasis on some key ideas lying in the background. The final section of the present report is devoted partially to a tribute to Yukawa on the occasion of the centennial of his birth. (author)

  11. Regulatory decision making in the presence of uncertainty in the context of the disposal of long lived radioactive wastes. Third report of the Working group on principles and criteria for radioactive waste disposal

    International Nuclear Information System (INIS)

    1997-10-01

    Plans for disposing of radioactive wastes have raised a number of unique and mostly philosophical problems, mainly due to the very long time-scales which have to be considered. While there is general agreement on disposal concepts and on many aspects of a safety philosophy, consensus on a number of issues remains to be achieved. The IAEA established a subgroup under the International Radioactive Waste Management Advisory Committee (INWAC). The subgroup started its work in 1991 as the ''INWAC Subgroup on Principles and Criteria for Radioactive Waste Disposal''. With the reorganization in 1995 of IAEA senior advisory committees in the nuclear safety area, the title of the group was changed to ''Working Group on Principles and Criteria for Radioactive Waste Disposal''. The working group is intended to provide an open forum for: (1) the discussion and resolution of contentious issues, especially those with an international component, in the area of principles and criteria for safe disposal of waste; (2) the review and analysis of new ideas and concepts in the subject area; (3) establishing areas of consensus; (4) the consideration of issues related to safety principles and criteria in the IAEA's Radioactive Waste Safety Standards (RADWASS) programme; (5) the exchange of information on national safety criteria and policies for radioactive waste disposal. This is the third report of the working group and it deals with the subject of regulatory decision making under conditions of uncertainty which is a matter of concern with respect to disposal of radioactive wastes underground. 14 refs

  12. Few group collapsing of covariance matrix data based on a conservation principle

    International Nuclear Information System (INIS)

    Hiruta, H.; Palmiotti, G.; Salvatores, M.; Arcilla, R. Jr.; Oblozinsky, P.; McKnight, R.D.

    2008-01-01

    A new algorithm for a rigorous collapsing of covariance data is proposed, derived, implemented, and tested. The method is based on a conservation principle that preserves, at a broad energy group structure, the uncertainty calculated in a fine energy group structure for a specific integral parameter, using the associated sensitivity coefficients as weights.
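A toy numerical sketch of such a sensitivity-weighted collapse (the group structure, covariance values and sensitivities below are invented, not taken from the paper): weighting each fine-group contribution by its sensitivity coefficient makes the broad-group variance of the integral parameter equal the fine-group one by construction.

```python
# Fine-group relative covariance matrix (4 groups) and sensitivity
# coefficients for one integral parameter; all numbers are made up.
cov_fine = [
    [0.04, 0.01, 0.00, 0.00],
    [0.01, 0.03, 0.01, 0.00],
    [0.00, 0.01, 0.05, 0.02],
    [0.00, 0.00, 0.02, 0.06],
]
sens_fine = [0.8, 1.2, 0.5, 0.9]
# Collapse 4 fine groups into 2 broad groups: I = {0,1}, J = {2,3}.
broad = [[0, 1], [2, 3]]

def collapse(cov, sens, groups):
    """Sensitivity-weighted collapse conserving the integral-parameter
    variance sens^T . cov . sens at the broad-group level."""
    s_broad = [sum(sens[g] for g in G) for G in groups]
    c_broad = [[sum(sens[g] * cov[g][h] * sens[h] for g in G for h in H)
                / (s_broad[i] * s_broad[j])
                for j, H in enumerate(groups)]
               for i, G in enumerate(groups)]
    return c_broad, s_broad

cov_broad, sens_broad = collapse(cov_fine, sens_fine, broad)

var_fine = sum(sens_fine[g] * cov_fine[g][h] * sens_fine[h]
               for g in range(4) for h in range(4))
var_broad = sum(sens_broad[i] * cov_broad[i][j] * sens_broad[j]
                for i in range(2) for j in range(2))
print(abs(var_fine - var_broad) < 1e-12)  # conservation holds exactly
```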

  13. Interpretation of the extreme physical information principle in terms of shift information

    International Nuclear Information System (INIS)

    Vstovsky, G.V.

    1995-01-01

    It is shown that Fisher information (FI) can be considered as a limiting case of a related form of Kullback information---a shift information (SI). The compatibility of the use of SI with a basic physical principle of uncertainty is demonstrated. The scope of FI based theory is extended to the nonlinear Klein-Gordon equation

  14. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)]; and others

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  15. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  16. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore, PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  17. Risk newsboy: approach for addressing uncertainty in developing action levels and cleanup limits

    International Nuclear Information System (INIS)

    Cooke, Roger; MacDonell, Margaret

    2007-01-01

    Site cleanup decisions involve developing action levels and residual limits for key contaminants, to assure health protection during the cleanup period and into the long term. Uncertainty is inherent in the toxicity information used to define these levels, based on incomplete scientific knowledge regarding dose-response relationships across various hazards and exposures at environmentally relevant levels. This problem can be addressed by applying principles used to manage uncertainty in operations research, as illustrated by the newsboy dilemma. Each day a newsboy must balance the risk of buying more papers than he can sell against the risk of not buying enough. Setting action levels and cleanup limits involves a similar concept of balancing and distributing risks and benefits in the face of uncertainty. The newsboy approach can be applied to develop health-based target concentrations for both radiological and chemical contaminants, with stakeholder input being crucial to assessing 'regret' levels. Associated tools include structured expert judgment elicitation to quantify uncertainty in the dose-response relationship, and mathematical techniques such as probabilistic inversion and iterative proportional fitting. (authors)
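The newsboy (newsvendor) balancing act mentioned above has a standard closed-form answer: order up to the critical fractile of the demand distribution, cu/(cu + co), where cu and co are the unit costs of under- and over-stocking. A small sketch with an invented discrete demand distribution (not data from the paper):

```python
# Unit cost of under-stocking (lost sale) and over-stocking (unsold paper).
c_under, c_over = 1.00, 0.25
critical_fractile = c_under / (c_under + c_over)  # 0.8

# Discrete demand distribution (papers per day -> probability); made up.
demand = {20: 0.1, 30: 0.2, 40: 0.4, 50: 0.2, 60: 0.1}

# Optimal order: smallest q with cumulative P(demand <= q) >= fractile.
cum = 0.0
for q in sorted(demand):
    cum += demand[q]
    if cum >= critical_fractile:
        order = q
        break
print(order)  # -> 50
```

In the cleanup-limit analogy, the demand distribution is replaced by the elicited uncertainty on the dose-response relationship, and the two unit costs encode stakeholders' "regret" for over- and under-protection.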

  18. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    Science.gov (United States)

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory recognizing and responding to uncertainty characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage uncertainty. The theory adds to our understanding of the processes involved in recognizing uncertainty, strategies and outcomes of managing uncertainty, and influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test and refine the theory of recognizing and responding to uncertainty to develop strategies for managing uncertainty. This theory advances the nursing perspective of uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  19. Cosmological principles. II. Physical principles

    International Nuclear Information System (INIS)

    Harrison, E.R.

    1974-01-01

    The discussion of cosmological principle covers the uniformity principle of the laws of physics, the gravitation and cognizability principles, and the Dirac creation, chaos, and bootstrap principles. (U.S.)

  20. Electricity restructuring : acting on principles

    International Nuclear Information System (INIS)

    Down, E.; Hoover, G.; Howatson, A.; Rheaume, G.

    2003-01-01

    In the second briefing of this series, the authors explored public policy decisions and political intervention, and their effect on electricity restructuring. Continuous and vigilant regulatory oversight of the electricity industry in Canada is required. The need for improved public policy to reduce uncertainty for private investors who wish to enter the market was made clear using case studies from the United Kingdom, California, Alberta, and Ontario. Clarity and consistency must be the two guiding principles for public policy decisions and political intervention in the sector. By clarity, the authors meant that rules, objectives, and timelines of the restructuring process are clear to all market participants. Market rules, implementation, and consumer expectations must be consistent. refs., 3 figs

  1. Entropic uncertainty relations in the Heisenberg XXZ model and its controlling via filtering operations

    Science.gov (United States)

    Ming, Fei; Wang, Dong; Shi, Wei-Nan; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2018-04-01

    The uncertainty principle is recognized as an elementary ingredient of quantum theory and sets a significant bound on predicting the outcomes of measurements of a pair of incompatible observables. In this work, we develop dynamical features of quantum memory-assisted entropic uncertainty relations (QMA-EUR) in a two-qubit Heisenberg XXZ spin chain with an inhomogeneous magnetic field. We specifically derive the dynamical evolutions of the entropic uncertainty with respect to the measurement in the Heisenberg XXZ model when spin A is initially correlated with quantum memory B. It has been found that the larger coupling strength J of the ferromagnetism (J < 0) and antiferromagnetism (J > 0) chains can effectively degrade the measuring uncertainty. Besides, it turns out that the higher temperature can induce the inflation of the uncertainty because the thermal entanglement becomes relatively weak in this scenario, and there exists a distinct dynamical behavior of the uncertainty when an inhomogeneous magnetic field emerges. With the growing magnetic field | B |, the variation of the entropic uncertainty will be non-monotonic. Meanwhile, we compare several existing optimized bounds with the initial bound proposed by Berta et al. and conclude that the result of Adabi et al. is optimal. Moreover, we also investigate the mixedness of the system of interest, which is closely associated with the uncertainty. Remarkably, we put forward a possible physical interpretation to explain the evolutionary phenomenon of the uncertainty. Finally, we take advantage of a local filtering operation to steer the magnitude of the uncertainty. Therefore, our explorations may shed light on the entropic uncertainty under the Heisenberg XXZ model and hence be of importance to quantum precision measurement over solid state-based quantum information processing.
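As a minimal numerical sketch of the kind of bound discussed here (not the paper's own computation), the complementarity term log2(1/c) that enters the Berta et al. relation S(Q|B) + S(R|B) >= log2(1/c) + S(A|B) can be evaluated for the mutually unbiased qubit observables Z and X:

```python
import math

# Eigenvectors of the Pauli Z and X observables on a single qubit.
z_basis = [(1.0, 0.0), (0.0, 1.0)]
s = 1.0 / math.sqrt(2.0)
x_basis = [(s, s), (s, -s)]

def overlap_sq(u, v):
    """|<u|v>|^2 for real two-component vectors."""
    return (u[0] * v[0] + u[1] * v[1]) ** 2

# Maximum overlap c between the two measurement bases.
c = max(overlap_sq(u, v) for u in z_basis for v in x_basis)

# Complementarity term of the entropic uncertainty bound, in bits.
bound = math.log2(1.0 / c)
print(bound)  # -> 1.0 for mutually unbiased qubit bases
```

The memory term S(A|B) can be negative for entangled states, which is exactly how a quantum memory lowers the total bound in the QMA-EUR setting the abstract studies.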

  2. The extent and risk of knee injuries in children aged 9-14 with Generalised Joint Hypermobility and knee joint hypermobility

    DEFF Research Database (Denmark)

    Junge, Tina; Runge, Lisbeth; Juul-Kristensen, Birgit

    2015-01-01

    BACKGROUND: Generalised Joint Hypermobility (GJH) is suggested as an aetiological factor for knee injuries in adolescents and adults. It is presumed that GJH causes decreased joint stability, thereby increasing the risk of knee injuries during challenging situations like jumping and landing. The ...

  3. The creep analysis of shell structures using generalised models

    International Nuclear Information System (INIS)

    Boyle, J.T.; Spence, J.

    1981-01-01

    In this paper a new, more complete estimate of the accuracy of the stationary creep model is given for the general case through the evaluation of exact and approximate energy surfaces. In addition, the stationary model is extended to include more general non-stationary (combined elastic-creep) behaviour and to include the possibility of material deterioration through damage. The resulting models are then compared to existing exact solutions for several shell structures - e.g. a thin pressurised cylinder, a curved pipe in bending and an S-bellows under axial extension with large deflections. In each case very good agreement is obtained. Although the generalised approach requires similar computing effort, so that the same solution techniques can be utilised, calculation times are shown to be significantly reduced. In conclusion, it has been demonstrated that a new simple mechanical model of a thin shell in creep, with or without material deterioration, can be constructed; the model is assessed in detail and successfully compared to existing solutions. (orig./HP)

  4. The angle-angular momentum and entropic uncertainty relations for quantum scattering

    International Nuclear Information System (INIS)

    Ion, D.B.; Ion, M.L.

    1999-01-01

    Recently, entropic uncertainty relations were obtained in a more general form by using Tsallis-like entropies for quantum scattering. Hence, using the Riesz theorem, the state-independent entropic angle-angular momentum uncertainty relations are proved for the Tsallis-like scattering entropies of spinless particles. The generalized entropic inequalities for the Tsallis-like entropies are presented. The two upper bounds are optimal bounds and can be obtained via Lagrange multipliers by extremizing the Tsallis-like entropies subject to the normalization constraints, respectively. The proof of the lower bound is provided by considering the condition that the angular probability distribution P(x) has a finite magnitude everywhere. Next, by using the Riesz theorem, a general result was obtained in the form of inequalities valid for hadron-hadron scattering. An important entropic uncertainty relation for the scattering of spinless particles was thus obtained. For σ el and dσ/dΩ fixed from experiment, we proved that the optimal scattering entropies are the maximum possible entropies in the scattering process. In a previous paper it was shown that the experimental values of the entropies for pion-nucleus scattering are systematically described by the optimal entropies at all available pion kinetic energies. In this sense the obtained results can also be considered as new experimental signatures for the validity of the principle of minimum distance in the space of scattering states. The extension of the optimal state analysis to the generalized non-extensive statistics case, as well as a test of the entropic inequalities, can be obtained in a similar way by using non-extensive optimal entropies. Since this kind of analysis is more involved, the numerical examples will be given in a following, more extended paper. Finally, we believe that the results obtained here are encouraging for further investigations of the entropic uncertainty relations as well

  5. A Subjective Logic Formalisation of the Principle of Polyrepresentation for Information Needs

    DEFF Research Database (Denmark)

    Lioma, Christina; Larsen, Birger; Schütze, Hinrich

    2010-01-01

    of the principle of Polyrepresentation, a theoretical framework for reasoning about different representations arising from interactive information retrieval in a given context. Although the principle of Polyrepresentation has received attention from many researchers, not much empirical work has been done based on it. One reason may be that it has not yet been formalised mathematically. In this paper we propose an up-to-date and flexible mathematical formalisation of the principle of Polyrepresentation for information needs. Specifically, we apply Subjective Logic to model different representations of information needs as beliefs marked by degrees of uncertainty. We combine such beliefs using different logical operators, and we discuss these combinations with respect to different retrieval scenarios and situations. A formal model is introduced and discussed, with illustrative applications to the modelling...

  6. Scientific principles for the identification of endocrine-disrupting chemicals: a consensus statement

    DEFF Research Database (Denmark)

    Solecki, Roland; Kortenkamp, Andreas; Bergman, Åke

    2017-01-01

    from different disciplines discussed principles and open questions on ED identification as outlined in a draft consensus paper at an expert meeting hosted by the German Federal Institute for Risk Assessment (BfR) in Berlin, Germany on 11-12 April 2016. Participants reached a consensus regarding scientific principles for the identification of EDs. The paper discusses the consensus reached on background, definition of an ED and related concepts, sources of uncertainty, scientific principles important for ED identification, and research needs. It highlights the difficulty in retrospectively reconstructing ED exposure, insufficient range of validated test systems for EDs, and some issues impacting on the evaluation of the risk from EDs, such as non-monotonic dose-response and thresholds, modes of action, and exposure assessment. This report provides the consensus statement on EDs agreed among all

  7. Issues of validity and generalisability in the Grade 12 English Home Language examination

    Directory of Open Access Journals (Sweden)

    du Plessis, Colleen Lynne

    2014-12-01

    Full Text Available Very little research has been devoted to evaluating the national English Home Language (HL curriculum and assessment system. Not only is there a lack of clarity on whether the language subject is being offered at an adequately high level to meet the declared objectives of the curriculum, but the reliability of the results obtained by Grade 12 learners in the exit-level examination has been placed under suspicion. To shed some light on the issue, this study takes a close look at the language component of the school-leaving examination covering the period 2008-2012, to see whether evidence of high language ability can be generated through the current selection of task types and whether the inferred ability can be generalised to non-examination contexts. Of primary interest here are the validity of the construct on which the examination is built and the sub-abilities that are being measured, as well as the validity of the scoring. One of the key findings of the study is that the language papers cannot be considered indicators of advanced and differential language ability, only of basic and general proficiency. The lack of specifications in the design of the examination items and the construction of the marking memoranda undermines the validity and reliability of the assessment. As a consequence, the inferences made on the basis of the scores obtained by examinees are highly subjective and cannot be generalised to other domains of language use. The study hopes to draw attention to the importance of the format and design of the examination papers in maintaining educational standards.

  8. Evaluating the uncertainty of input quantities in measurement models

    Science.gov (United States)

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    uncertainty propagation exercises. In this we deviate markedly and emphatically from the GUM Supplement 1, which gives pride of place to the Principle of Maximum Entropy as a means to assign probability distributions to input quantities.
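A toy illustration (not taken from the GUM Supplement itself) of why the Principle of Maximum Entropy singles out particular assignments: among all distributions on a fixed support, the uniform one has maximal Shannon entropy, which is why knowledge of bounds alone is conventionally encoded as a rectangular distribution.

```python
import math

def entropy(p):
    """Discrete Shannon entropy in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

n = 8  # support points standing in for an interval known only by its bounds
uniform = [1.0 / n] * n
peaked = [0.02, 0.05, 0.13, 0.30, 0.30, 0.13, 0.05, 0.02]  # same support

# The uniform assignment maximises entropy on this support, i.e. it adds
# no information beyond the bounds themselves.
print(entropy(uniform), entropy(peaked))
```

The debate the abstract alludes to is whether such maximum-entropy assignments should take precedence over distributions fitted to whatever evidence is actually available about the input quantity.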

  9. Uncertainties in risk assessment at USDOE facilities

    Energy Technology Data Exchange (ETDEWEB)

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms 'risk assessment' and 'risk management' are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of '... the most significant data and uncertainties...' in an assessment. Significant data and uncertainties are '...those that define and explain the main risk conclusions'. Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.

  10. Uncertainties in risk assessment at USDOE facilities

    International Nuclear Information System (INIS)

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms 'risk assessment' and 'risk management' are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of '... the most significant data and uncertainties...' in an assessment. Significant data and uncertainties are '...those that define and explain the main risk conclusions'. Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.

  11. Generalised additive modelling approach to the fermentation process of glutamate.

    Science.gov (United States)

    Liu, Chun-Bo; Li, Yun; Pan, Feng; Shi, Zhong-Ping

    2011-03-01

    In this work, generalised additive models (GAMs) were used for the first time to model the fermentation of glutamate (Glu). It was found that three fermentation parameters, fermentation time (T), dissolved oxygen (DO) and oxygen uptake rate (OUR), could capture 97% of the variance of the production of Glu during the fermentation process through a GAM model calibrated using online data from 15 fermentation experiments. This model was applied to investigate the individual and combined effects of T, DO and OUR on the production of Glu. The conditions to optimize the fermentation process were proposed based on the simulation study from this model. Results suggested that the production of Glu can reach a high level by controlling concentration levels of DO and OUR to the proposed optimization conditions during the fermentation process. The GAM approach therefore provides an alternative way to model and optimize the fermentation process of Glu. Crown Copyright © 2010. Published by Elsevier Ltd. All rights reserved.
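A rough sketch of the additive-model idea (the data, variable names and the crude bin-average smoother below are illustrative stand-ins, not the paper's spline-based GAM or its fermentation data): each predictor contributes its own smooth term, and the terms are estimated by backfitting on partial residuals.

```python
import math

# Synthetic additive data: response = f1(time) + f2(DO), noise-free, so the
# additive fit should recover nearly all of the variance.
n = 200
t = [i / (n - 1) for i in range(n)]                 # "fermentation time"
do = [((i * 37) % n) / (n - 1) for i in range(n)]   # "dissolved oxygen"
y = [math.sin(2 * math.pi * ti) + (di - 0.5) ** 2 for ti, di in zip(t, do)]

def bin_smooth(x, r, bins=20):
    """Crude bin-average smoother standing in for a spline smoother."""
    sums, counts = [0.0] * bins, [0] * bins
    for xi, ri in zip(x, r):
        b = min(int(xi * bins), bins - 1)
        sums[b] += ri
        counts[b] += 1
    means = [sums[b] / counts[b] if counts[b] else 0.0 for b in range(bins)]
    return [means[min(int(xi * bins), bins - 1)] for xi in x]

# Backfitting: alternately smooth the partial residuals of each term.
alpha = sum(y) / n
f1 = [0.0] * n
f2 = [0.0] * n
for _ in range(25):
    f1 = bin_smooth(t, [yi - alpha - f2i for yi, f2i in zip(y, f2)])
    m1 = sum(f1) / n
    f1 = [v - m1 for v in f1]     # centre each term for identifiability
    f2 = bin_smooth(do, [yi - alpha - f1i for yi, f1i in zip(y, f1)])
    m2 = sum(f2) / n
    f2 = [v - m2 for v in f2]

resid = [yi - alpha - a - b for yi, a, b in zip(y, f1, f2)]
r2 = 1.0 - sum(r * r for r in resid) / sum((yi - alpha) ** 2 for yi in y)
print(r2 > 0.9)  # the additive fit explains most of the variance
```

A production GAM would use penalised splines and select smoothness by cross-validation or REML; the backfitting loop above only conveys the structure of the estimate.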

  12. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.
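
    For readers unfamiliar with the first of these alternatives, here is a generic sketch of interval arithmetic applied to a QMU-style margin check (illustrative only; not one of the report's notional examples, and the numbers are invented):

```python
# Interval arithmetic as one representation of epistemic uncertainty:
# quantities known only to within bounds, propagated without probabilities.
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o):
        return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        p = [self.lo*o.lo, self.lo*o.hi, self.hi*o.lo, self.hi*o.hi]
        return Interval(min(p), max(p))

# Epistemically uncertain capacity and load, known only to within bounds:
capacity = Interval(9.0, 11.0)
load     = Interval(5.0, 7.0)
margin   = capacity - load           # worst case: 9 - 7 = 2
print(margin)                        # Interval(lo=2.0, hi=6.0)
# QMU-style check: is the margin positive even in the worst case?
print(margin.lo > 0)                 # True
```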

  13. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
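
    A minimal sketch of the Monte Carlo route mentioned above, with an invented model and invented input uncertainties:

```python
# Monte Carlo propagation of random input uncertainties through a model:
# sample the inputs, run the model, summarise the spread of the output.
import random, statistics

def model(k, c):                 # toy calculated quantity (invented)
    return k * c ** 2

random.seed(42)
N = 20000
outputs = [model(random.gauss(2.0, 0.1),    # k: mean 2.0, sd 0.1
                 random.gauss(5.0, 0.2))    # c: mean 5.0, sd 0.2
           for _ in range(N)]

mean = statistics.fmean(outputs)
u    = statistics.stdev(outputs)            # random uncertainty of the output
print(f"output = {mean:.1f} +/- {u:.1f}")
```

    The same spread can be checked analytically here (first-order: u² ≈ c⁴u_k² + 4k²c²u_c² ≈ 22.3), which is a useful sanity check on any Monte Carlo propagation.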

  14. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  15. The Principle of Pooled Calibrations and Outlier Retainment Elucidates Optimum Performance of Ion Chromatography

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Mikolajczak, Maria; Wojtachnio-Zawada, Katarzyna Olga

    A new principle of statistical data treatment is presented. Since the majority of scientists and customers are interested in determination of the true amount of analyte in real samples, the focus of attention should be directed towards the concept of accuracy rather than precision. By exploiting...... that the principle of pooled calibrations provides a more realistic picture of the analytical performance with the drawback however, that generally higher levels of uncertainties should be accepted, as compared to contemporary literature values. The implications to the science of analytical chemistry in general...

  16. The principle of pooled calibrations and outlier retainment elucidates optimum performance of ion chromatography

    DEFF Research Database (Denmark)

    Andersen, Jens; Mikolajczak, Maria; Wojtachnio-Zawada, Katarzyna Olga

    2012-01-01

    A principle with quality assurance of ion chromatography (IC) is presented. Since the majority of scientists and customers are interested in the determination of the true amount of analyte in real samples, the focus of attention should be directed towards the concept of accuracy rather than...... investigations of method validations where it was found that the principle of pooled calibrations provides a more realistic picture of the analytical performance with the drawback, however, that generally higher levels of uncertainties should be accepted, as compared to contemporary literature values...

  17. The GUP and quantum Raychaudhuri equation

    Science.gov (United States)

    Vagenas, Elias C.; Alasfar, Lina; Alsaleh, Salwa M.; Ali, Ahmed Farag

    2018-06-01

    In this paper, we compare the quantum corrections to the Schwarzschild black hole temperature due to the quadratic and linear-quadratic generalised uncertainty principles with the corrections from the quantum Raychaudhuri equation. The reason for this comparison is to connect the deformation parameters β0 and α0 with η, which is the parameter that characterises the quantum Raychaudhuri equation. The derived relation between the parameters appears to depend on the relative scale of the system (black hole), which could be read as a beta function equation for the quadratic deformation parameter β0. This study shows a correspondence between the two phenomenological approaches and indicates that the quantum Raychaudhuri equation implies the existence of a crystal-like structure of spacetime.
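
    For reference, the quadratic generalised uncertainty principle referred to in the abstract is commonly written in the literature in a form such as the following (conventions for the dimensionless parameter β₀ vary between papers; this is not quoted from the paper itself):

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left[\,1 + \beta_0\,\ell_p^{2}\,\frac{(\Delta p)^{2}}{\hbar^{2}}\,\right]
```

    The linear-quadratic variant adds a term linear in Δp governed by α₀. Applying such a relation to the momentum uncertainty of quanta near the horizon (Δx of the order of the Schwarzschild radius) is what yields the corrected black-hole temperature that the paper compares against the quantum Raychaudhuri result.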

  18. Generalisation of the method of images for the calculation of inviscid potential flow past several arbitrarily moving parallel circular cylinders

    Czech Academy of Sciences Publication Activity Database

    Kharlamov, Alexander A.; Filip, Petr

    2012-01-01

    Roč. 77, č. 1 (2012), s. 77-85 ISSN 0022-0833 Institutional research plan: CEZ:AV0Z20600510 Keywords : circular cylinders * cylinder between two walls * generalised method of images * ideal fluid * potential flow Subject RIV: BK - Fluid Dynamics Impact factor: 1.075, year: 2012

  19. The generalised Marchenko equation and the canonical structure of the A.K.N.S.-Z.S. inverse method

    International Nuclear Information System (INIS)

    Dodd, R.K.; Bullough, R.K.

    1979-01-01

    A generalised Marchenko equation is derived for a 2 X 2 matrix inverse method and it is used to show that, for the subset of equations solvable by the method which can be constructed as defining the flows of Hamiltonians, the inverse transform is a canonical (homogeneous contact) transformation. Baecklund transformations are re-examined from this point of view. (Auth.)

  20. Sensitivity and uncertainty analysis for functionals of the time-dependent nuclide density field

    International Nuclear Information System (INIS)

    Williams, M.L.; Weisbin, C.R.

    1978-04-01

    An approach to extend the present ORNL sensitivity program to include functionals of the time-dependent nuclide density field is developed. An adjoint equation for the nuclide field was derived previously by using generalized perturbation theory; the present derivation makes use of a variational principle and results in the same equation. The physical significance of this equation is discussed and compared to that of the time-dependent neutron adjoint equation. Computational requirements for determining sensitivity profiles and uncertainties for functionals of the time-dependent nuclide density vector are developed within the framework of the existing FORSS system; in this way the current capability is significantly extended. The development, testing, and use of an adjoint version of the ORIGEN isotope generation and depletion code are documented. Finally, a sample calculation is given which estimates the uncertainty in the plutonium inventory at shutdown of a PWR due to assumed uncertainties in uranium and plutonium cross sections. 8 figures, 4 tables
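
    The last step of such an analysis, folding sensitivity profiles and cross-section covariances into a response uncertainty, is conventionally done with the "sandwich rule"; a toy sketch with invented numbers:

```python
# "Sandwich rule" sketch: relative uncertainty of a response R (e.g. a
# nuclide inventory) from relative sensitivities S_i = dlnR/dlnσ_i and a
# relative covariance matrix C of the cross sections: var(R)/R² = Sᵀ C S.
# All values below are invented for illustration.
import numpy as np

S = np.array([0.8, -0.3, 0.5])           # relative sensitivity coefficients
C = np.array([[0.0025, 0.0005, 0.0],     # relative covariance of the σ_i
              [0.0005, 0.0016, 0.0],
              [0.0,    0.0,    0.0036]])
rel_var = S @ C @ S
print(f"relative uncertainty in R: {rel_var**0.5:.1%}")
```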

  1. Dealing with uncertainties in the nanotech workplace practice: making the precautionary approach operational.

    Science.gov (United States)

    van Broekhuizen, Pieter

    2011-02-01

    When risk management for the professional use of dispersive nanomaterials is hampered by a lack of reliable information, managers and policy makers have to choose to make the precautionary principle operational for the nanotech workplace. This study presents some tools that can be useful for the health & safety manager and for nanotech workers to deal with uncertainties in the nano-workplace.

  2. The precautionary principle as a provisional instrument in environmental policy: The Montreal Protocol case study

    International Nuclear Information System (INIS)

    Jacobs, J. Roger

    2014-01-01

    Highlights: • I examine whether a policy invoked under the Precautionary Principle can move beyond provisional status. • I review the certainty of conclusions based upon the Global Ozone Research and Monitoring Project. • There is high certainty that anthropogenic ozone depletion has health consequences in polar regions. • Current research focuses on long term projections of risk that perpetuates high uncertainty. • Establishment of a community to generate Assessments acts to perpetuate the period of uncertainty. - Abstract: Environmental studies identify possible threats to the health of the public or the environment when the scientific certainty of risk is low, but the potential cost is high. Governments may respond by invoking the Precautionary Principle, holding that scientific certainty is not required to take actions that reduce possible risk. EU guidelines suggest that precautionary measures remain provisional until sufficient scientific certainty is generated. Here I study the Scientific Assessments produced for the Montreal Protocol, and the scientific community that generates them, and ask whether a long-standing program of scientific investigation and monitoring can generate sufficient scientific certainty to move beyond dependence on the Precautionary Principle. When the Montreal Protocol was ratified, many scientists strongly suspected that anthropogenic substances like chlorofluorocarbons were depleting stratospheric ozone. Although the risk was uncertain, the perceived cost to public health of ozone depletion was high. A quarter century after formulating the Montreal Protocol, science can define the conditions for ozone depletion with great certainty, but uncertainty remains in determining the scale and distribution of the attributable increase in damaging ultra-violet (UV) radiation. Organisations, such as NASA, and scientists that contribute to the Scientific Assessments comprise the community in which the scientific consensus of risk is

  3. Scenario-based fitted Q-iteration for adaptive control of water reservoir systems under uncertainty

    Science.gov (United States)

    Bertoni, Federica; Giuliani, Matteo; Castelletti, Andrea

    2017-04-01

    Over recent years, mathematical models have largely been used to support planning and management of water resources systems. Yet, the increasing uncertainties in their inputs - due to increased variability in the hydrological regimes - are a major challenge to the optimal operations of these systems. Such uncertainty, boosted by projected changing climate, violates the stationarity principle generally used for describing hydro-meteorological processes, which assumes time persisting statistical characteristics of a given variable as inferred by historical data. As this principle is unlikely to be valid in the future, the probability density function used for modeling stochastic disturbances (e.g., inflows) becomes an additional uncertain parameter of the problem, which can be described in a deterministic and set-membership based fashion. This study contributes a novel method for designing optimal, adaptive policies for controlling water reservoir systems under climate-related uncertainty. The proposed method, called scenario-based Fitted Q-Iteration (sFQI), extends the original Fitted Q-Iteration algorithm by enlarging the state space to include the space of the uncertain system's parameters (i.e., the uncertain climate scenarios). As a result, sFQI embeds the set-membership uncertainty of the future inflow scenarios in the action-value function and is able to approximate, with a single learning process, the optimal control policy associated to any scenario included in the uncertainty set. The method is demonstrated on a synthetic water system, consisting of a regulated lake operated for ensuring reliable water supply to downstream users. Numerical results show that the sFQI algorithm successfully identifies adaptive solutions to operate the system under different inflow scenarios, which outperform the control policy designed under historical conditions. Moreover, the sFQI policy generalizes over inflow scenarios not directly experienced during the policy design
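
    A toy sketch of the plain Fitted Q-Iteration backbone that sFQI extends (a lookup table plays the role of the regressor on a tiny deterministic chain; sFQI would additionally append the uncertain scenario parameter to the state):

```python
# Toy sketch of plain Fitted Q-Iteration: repeatedly regress Q onto Bellman
# targets computed from a fixed batch of transitions. Here the "regressor"
# is a lookup table over a 5-state chain with a reward at the right end.
import itertools

n_states, actions, gamma = 5, (-1, +1), 0.9
clip = lambda s: min(max(s, 0), n_states - 1)
# Batch of transitions (s, a, r, s'); reward 1 for reaching the last state.
batch = [(s, a, 1.0 if clip(s + a) == n_states - 1 else 0.0, clip(s + a))
         for s, a in itertools.product(range(n_states), actions)]

Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
for _ in range(50):                                    # FQI iterations
    targets = {(s, a): r + gamma * max(Q[s2, b] for b in actions)
               for s, a, r, s2 in batch}
    Q.update(targets)                                  # "fit" = table update

policy = {s: max(actions, key=lambda a: Q[s, a]) for s in range(n_states)}
print(policy)    # the greedy policy moves right, toward the rewarded state
```

    In a water-reservoir setting the table would be replaced by a regressor over continuous states (storage, inflow), which is exactly where the "fitted" part of the algorithm matters.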

  4. Children aged 4-8 years treated with parent training and child therapy because of conduct problems: generalisation effects to day-care and school settings.

    Science.gov (United States)

    Drugli, May Britt; Larsson, Bo

    2006-10-01

    In this study, generalisation effects to day-care/school settings were examined in an outpatient clinic sample of 127 children aged 4-8 years treated because of oppositional conduct problems in the home with parent training (PT) and parent training combined with child therapy (CT) ("Incredible Years"). Before treatment all children scored above the 90th percentile on the Eyberg Child Behavior Inventory (ECBI) for home problems, and met criteria for a possible or a confirmed diagnosis of either an oppositional defiant (ODD) or a conduct (CD) disorder. Further, 83% of the children showed clinical levels of conduct problems both at home and in day-care/school before treatment. Although most children improved at home, the majority still showed clinical levels of conduct problems in day-care/school settings after treatment and 1-year later. Combined PT and CT produced the most powerful and significant generalisation effects across the treatment period, however these improvements were not maintained 1-year later for most areas. The results of the present study, therefore, underline the need to target conduct problems not only exhibited at home but also in day-care/school settings, and to develop strategies to maintain positive generalisation effects after treatment for this age and problem-group.

  5. MODELS OF AIR TRAFFIC CONTROLLERS ERRORS PREVENTION IN TERMINAL CONTROL AREAS UNDER UNCERTAINTY CONDITIONS

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2017-03-01

    Full Text Available Purpose: the aim of this study is to research applied models of air traffic controllers' errors prevention in terminal control areas (TMA) under uncertainty conditions. In this work a theoretical framework describing safety events and errors of air traffic controllers connected with operations in the TMA is proposed. Methods: optimisation of the terminal control area formal description based on the Threat and Error Management model and the TMA network model of air traffic flows. Results: the human factors variables associated with safety events in the work of air traffic controllers under uncertainty conditions were obtained. Principles for applying the Threat and Error Management model to air traffic controller operations and the TMA network model of air traffic flows were proposed. Discussion: the information processing context for preventing air traffic controller errors, and examples of threats in the work of air traffic controllers relevant for TMA operations under uncertainty conditions, are discussed.

  6. Tides in a body librating about a spin-orbit resonance: generalisation of the Darwin-Kaula theory

    Science.gov (United States)

    Frouard, Julien; Efroimsky, Michael

    2017-09-01

    The Darwin-Kaula theory of bodily tides is intended for celestial bodies rotating without libration. We demonstrate that this theory, in its customary form, is inapplicable to a librating body. Specifically, in the presence of libration in longitude, the actual spectrum of Fourier tidal modes differs from the conventional spectrum rendered by the Darwin-Kaula theory for a nonlibrating celestial object. This necessitates derivation of formulae for the tidal torque and the tidal heating rate, that are applicable under libration. We derive the tidal spectrum for longitudinal forced libration with one and two main frequencies, generalisation to more main frequencies being straightforward. (By main frequencies we understand those emerging due to the triaxiality of the librating body.) Separately, we consider a case of free libration at one frequency (once again, generalisation to more frequencies being straightforward). We also calculate the tidal torque. This torque provides correction to the triaxiality-caused physical libration. Our theory is not self-consistent: we assume that the tidal torque is much smaller than the permanent-triaxiality-caused torque, so the additional libration due to tides is much weaker than the main libration due to the permanent triaxiality. Finally, we calculate the tidal dissipation rate in a body experiencing forced libration at the main mode, or free libration at one frequency, or superimposed forced and free librations.

  7. The uncertainty of reference standards--a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    Science.gov (United States)

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty and which factors have been considered in the vendor's assignment of uncertainty are critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
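
    A minimal sketch of how such component uncertainties are typically combined into the uncertainty of a certified concentration, by root-sum-of-squares of relative standard uncertainties as in the GUM (the component values below are invented):

```python
# Combine relative standard uncertainties of purity, mass and volume into
# the combined and expanded uncertainty of a certified concentration.
import math

components = {            # relative standard uncertainties, u(x)/x (invented)
    "purity": 0.0015,
    "mass":   0.0008,
    "volume": 0.0010,
}
u_rel = math.sqrt(sum(u**2 for u in components.values()))
U_rel = 2 * u_rel         # expanded uncertainty, coverage factor k = 2
print(f"u_c = {u_rel:.2%}, U (k=2) = {U_rel:.2%}")
```

    This is why a vendor's certificate should state not only the uncertainty but also which components (purity, weighing, solution density) were folded into it, and at what coverage factor.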

  8. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded. Estimation of measurement uncertainty often is not trivial. Several strategies have been developed for this purpose that will be briefly described in this chapter. In addition, the different ways of taking uncertainty into account in compliance assessment are explained.

  9. Collaborative framework for PIV uncertainty quantification: the experimental database

    International Nuclear Information System (INIS)

    Neal, Douglas R; Sciacchitano, Andrea; Scarano, Fulvio; Smith, Barton L

    2015-01-01

    The uncertainty quantification of particle image velocimetry (PIV) measurements has recently become a topic of great interest as shown by the recent appearance of several different methods within the past few years. These approaches have different working principles, merits and limitations, which have been speculated upon in subsequent studies. This paper reports a unique experiment that has been performed specifically to test the efficacy of PIV uncertainty methods. The case of a rectangular jet, as previously studied by Timmins et al (2012) and Wilson and Smith (2013b), is used. The novel aspect of the experiment is simultaneous velocity measurements using two different time-resolved PIV systems and a hot-wire anemometry (HWA) system. The first PIV system, called the PIV measurement system (‘PIV-MS’), is intended for nominal measurements of which the uncertainty is to be evaluated. It is based on a single camera and features a dynamic velocity range (DVR) representative of typical PIV experiments. The second PIV system, called the ‘PIV-HDR’ (high dynamic range) system, features a significantly higher DVR obtained with a higher digital imaging resolution. The hot-wire is placed in close proximity to the PIV measurement domain. The three measurement systems were carefully set to simultaneously measure the flow velocity at the same time and location. The comparison between the PIV-HDR system and the HWA provides an estimate of the measurement precision of the reference velocity for evaluation of the instantaneous error in the measurement system. The discrepancy between the PIV-MS and the reference data provides the measurement error, which is later used to assess the different uncertainty quantification methods proposed in the literature. A detailed comparison of the uncertainty estimation methods based on the present datasets is presented in a second paper from Sciacchitano et al (2015). Furthermore, this database offers the potential to be used for

  10. Intelligent Information Retrieval: Diagnosing Information Need. Part II. Uncertainty Expansion in a Prototype of a Diagnostic IR Tool.

    Science.gov (United States)

    Cole, Charles; Cantero, Pablo; Sauve, Diane

    1998-01-01

    Outlines a prototype of an intelligent information-retrieval tool to facilitate information access for an undergraduate seeking information for a term paper. Topics include diagnosing the information need, Kuhlthau's information-search-process model, Shannon's mathematical theory of communication, and principles of uncertainty expansion and…

  11. NATURAL-SCIENCE EDUCATION: SCIENTIFIC AND RELIGIOUS KNOWLEDGE CORRELATION IN THE VIEW OF A SYMMETRY PRINCIPLE. PART I. THE CONTENT OF A SYMMETRY PRINCIPLE

    Directory of Open Access Journals (Sweden)

    Vitaly L. Gapontsev

    2015-01-01

    Full Text Available The aim of the investigation is to disclose the content of the symmetry principle and to show the systematic hierarchy of its forms, developed in the course of the evolution of scientific knowledge, of society, and of the individual consciousness of the person. Methods. Based on the analysis of existing scientific sources, and on comparison, synthesis and generalisation of their content, the role of symmetry was established in the historical formation of scientific disciplines, in the arrangement of empirical sets of facts, and in their subsequent registration in the form of strict deductive systems. Results. It is argued that the concept of «a symmetry principle» (V. I. Vernadsky was the first to bring this concept into circulation) now denotes the highest level of scientific knowledge. Following E. Wigner's works, it is argued that the set of forms of symmetry determines the structure of scientific knowledge. On the one hand, these forms have a deep empirical basis and a close connection with figurative perception of reality; on the other, they have strict mathematical definitions and generate particular symmetry principles of Mathematics and Physics based on the axiomatic constructions of the exact disciplines. Stages of the formation and development of a number of scientific disciplines such as Mathematics, Physics, Chemistry and Biology are compared; the peculiarities and common features of their evolution are designated. Invariants, and the corresponding symmetries, in the formation of the individual consciousness of the person are identified. Scientific novelty. Developing V. I. Vernadsky's idea, since he gave only a brief characterisation of the general scientific principle of symmetry, the authors of the present study consider symmetry forms in various branches of knowledge as particular manifestations of this principle. Based on the principle of symmetry as a set of symmetry forms, this principle allows the authors to take a fresh look at the decision of methodological

  12. Physical insight into the thermodynamic uncertainty relation using Brownian motion in tilted periodic potentials

    Science.gov (United States)

    Hyeon, Changbong; Hwang, Wonseok

    2017-07-01

    Using Brownian motion in periodic potentials V(x) tilted by a force f, we provide physical insight into the thermodynamic uncertainty relation, a recently conjectured principle for statistical errors and irreversible heat dissipation in nonequilibrium steady states. According to the relation, nonequilibrium output generated from dissipative processes necessarily incurs an energetic cost or heat dissipation q, and in order to limit the output fluctuation within a relative uncertainty ε, at least 2k_BT/ε² of heat must be dissipated. Our model shows that this bound is attained not only at near-equilibrium [f ≪ V'(x)] but also at far-from-equilibrium [f ≫ V'(x)], more generally when the dissipated heat is normally distributed. Furthermore, the energetic cost is maximized near the critical force when the barrier separating the potential wells is about to vanish and the fluctuation of Brownian particles is maximized. These findings indicate that the deviation of the heat distribution from Gaussianity gives rise to the inequality of the uncertainty relation, further clarifying the meaning of the uncertainty relation. Our derivation of the uncertainty relation also recognizes a bound on nonequilibrium fluctuations: the variance of the dissipated heat (σ_q²) increases with its mean (μ_q), and it cannot be smaller than 2k_BT μ_q.
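
    Written out as formulas, the two bounds described in the abstract (ε the relative uncertainty of the output, q the dissipated heat with mean μ_q and variance σ_q²) read:

```latex
\varepsilon^{2}\, q \;\ge\; 2\,k_{B}T ,
\qquad
\sigma_{q}^{2} \;\ge\; 2\,k_{B}T\,\mu_{q}
```

    The first is the thermodynamic uncertainty relation itself; the second is the accompanying bound on heat fluctuations stated in the final sentence.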

  13. Generalised chronic musculoskeletal pain as a rational reaction to a life situation?

    Science.gov (United States)

    Steen, E; Haugli, L

    2000-11-01

    While the biomedical model is still the leading paradigm within modern medicine and health care, and people with generalised chronic musculoskeletal pain are frequent users of health care services, their diagnoses are rated as having the lowest prestige among health care personnel. An epistemological framework for understanding relations between body, emotions, mind and meaning is presented. An approach based on a phenomenological epistemology is discussed as a supplement to actions based on the biomedical model. Within the phenomenological frame of understanding, the body is viewed as a subject and carrier of meaning, and therefore chronic pain can be interpreted as a rational reaction to the totality of a person's life situation. Search for possible hidden individual meanings in painful muscles presupposes meeting health personnel who view the person within a holistic frame of reference.

  14. Geometric Generalisation of Surrogate Model-Based Optimisation to Combinatorial and Program Spaces

    Directory of Open Access Journals (Sweden)

    Yong-Hyuk Kim

    2014-01-01

    Full Text Available Surrogate models (SMs) can profitably be employed, often in conjunction with evolutionary algorithms, in optimisation in which it is expensive to test candidate solutions. The spatial intuition behind SMs makes them naturally suited to continuous problems, and the only combinatorial problems that have been previously addressed are those with solutions that can be encoded as integer vectors. We show how radial basis functions can provide a generalised SM for combinatorial problems which have a geometric solution representation, through the conversion of that representation to a different metric space. This approach allows an SM to be cast in a natural way for the problem at hand, without ad hoc adaptation to a specific representation. We test this adaptation process on problems involving binary strings, permutations, and tree-based genetic programs.
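
    One concrete instance of the idea (all details invented, not taken from the paper): a Gaussian RBF surrogate over binary strings, using the Hamming distance as the metric:

```python
# RBF surrogate for a combinatorial space: Gaussian radial basis functions
# over binary strings, with Hamming distance as the underlying metric.
import numpy as np
from itertools import product

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def rbf_fit(X, y, gamma=0.5):
    K = np.array([[np.exp(-gamma * hamming(a, b)) for b in X] for a in X])
    return np.linalg.solve(K, y)              # interpolation weights

def rbf_predict(X, w, x, gamma=0.5):
    return sum(wi * np.exp(-gamma * hamming(xi, x)) for xi, wi in zip(X, w))

# Pretend this objective is expensive; we only evaluate it at five strings.
f = lambda s: float(sum(s))                   # onemax, for illustration
X = [(0,0,0,0), (1,0,0,1), (1,1,0,1), (0,1,1,0), (1,1,1,1)]
w = rbf_fit(X, np.array([f(s) for s in X]))

# The cheap surrogate then ranks every unseen string in the space.
best = max((s for s in product((0, 1), repeat=4) if s not in X),
           key=lambda s: rbf_predict(X, w, s))
print("surrogate's suggested next evaluation:", best)
```

    The surrogate interpolates the evaluated points exactly, and because only a metric is needed, the same construction carries over to permutations or trees once a suitable distance is chosen.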

  15. When we stay alone: Resist in uncertainty. Emancipation and empowerment practices

    Directory of Open Access Journals (Sweden)

    Ester Jordana Lluch

    2013-03-01

    Full Text Available In view of the uncertainty that surrounds us, the need arises to resist within the university without any diagnosis of the present. To do so, we can start from the critiques of two of the features that have characterised the modern university: autonomy and emancipation. We will try to glimpse what practices of resistance could take place here and now, critically re-elaborating both principles using the reflections of Jacques Rancière and Michel Foucault.

  16. Assessment of the uncertainty associated with systematic errors in digital instruments: an experimental study on offset errors

    International Nuclear Information System (INIS)

    Attivissimo, F; Giaquinto, N; Savino, M; Cataldo, A

    2012-01-01

    This paper deals with the assessment of the uncertainty due to systematic errors, particularly in A/D conversion-based instruments. The problem of defining and assessing systematic errors is briefly discussed, and the conceptual scheme of gauge repeatability and reproducibility is adopted. A practical example regarding the evaluation of the uncertainty caused by the systematic offset error is presented. The experimental results, obtained under various ambient conditions, show that modelling the variability of systematic errors is more problematic than suggested by the ISO 5725 norm. Additionally, the paper demonstrates the substantial difference between the type B uncertainty evaluation, obtained via the maximum entropy principle applied to manufacturer's specifications, and the type A (experimental) uncertainty evaluation, which reflects actually observable reality. Although it is reasonable to assume a uniform distribution of the offset error, experiments demonstrate that the distribution is not centred and that a correction must be applied. In such a context, this work motivates a more pragmatic and experimental approach to uncertainty, with respect to the directions of supplement 1 of GUM. (paper)
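
    The type B evaluation discussed above, under the uniform (rectangular) distribution assumption for an offset bounded by the manufacturer's spec ±a, reduces to u = a/√3 (the spec value below is invented):

```python
# Type B uncertainty from manufacturer's specifications, assuming a uniform
# (rectangular) distribution of the offset error within +/- a.
import math

a = 0.5e-3                  # spec: offset within +/- 0.5 mV (invented)
u_typeB = a / math.sqrt(3)  # standard deviation of a rectangular distribution
print(f"u = {u_typeB * 1e3:.3f} mV")   # 0.289 mV
```

    The paper's point is precisely that this tidy formula can mislead: the experiments show the distribution is not centred, so a correction for the non-zero mean must be applied on top of u.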

  17. Risk, unexpected uncertainty, and estimation uncertainty: Bayesian learning in unstable settings.

    Directory of Open Access Journals (Sweden)

    Elise Payzan-LeNestour

    Full Text Available Recently, evidence has emerged that humans approach learning using Bayesian updating rather than (model-free) reinforcement algorithms in a six-arm restless bandit problem. Here, we investigate what this implies for human appreciation of uncertainty. In our task, a Bayesian learner distinguishes three equally salient levels of uncertainty. First, the Bayesian perceives irreducible uncertainty or risk: even knowing the payoff probabilities of a given arm, the outcome remains uncertain. Second, there is (parameter) estimation uncertainty or ambiguity: payoff probabilities are unknown and need to be estimated. Third, the outcome probabilities of the arms change: the sudden jumps are referred to as unexpected uncertainty. We document how the three levels of uncertainty evolved during the course of our experiment and how they affected the learning rate. We then zoom in on estimation uncertainty, which has been suggested to be a driving force in exploration, in spite of evidence of widespread aversion to ambiguity. Our data corroborate the latter. We discuss neural evidence that foreshadowed the ability of humans to distinguish between the three levels of uncertainty. Finally, we investigate the boundaries of human capacity to implement Bayesian learning. We repeat the experiment with different instructions, reflecting varying levels of structural uncertainty. Under this fourth notion of uncertainty, choices were no better explained by Bayesian updating than by (model-free) reinforcement learning. Exit questionnaires revealed that participants remained unaware of the presence of unexpected uncertainty and failed to acquire the right model with which to implement Bayesian updating.
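    The distinction between estimation uncertainty and risk can be illustrated with a standard Beta-Bernoulli model of a single bandit arm. This is a generic sketch of that textbook model, not the specific learner fitted in the paper: the posterior variance stands in for estimation uncertainty (it shrinks with observations), while the outcome remains risky even once the payoff probability is well estimated.

    ```python
    # Track one arm's unknown payoff probability with a Beta(alpha, beta)
    # posterior; Bernoulli rewards update the posterior conjugately.
    class BetaArm:
        def __init__(self):
            self.alpha = 1.0  # Beta(1, 1) = uniform prior
            self.beta = 1.0

        def update(self, reward):
            """Bayesian update for a Bernoulli reward in {0, 1}."""
            if reward:
                self.alpha += 1
            else:
                self.beta += 1

        def mean(self):
            """Posterior mean estimate of the payoff probability."""
            return self.alpha / (self.alpha + self.beta)

        def variance(self):
            """Posterior variance: a proxy for estimation uncertainty."""
            a, b = self.alpha, self.beta
            return a * b / ((a + b) ** 2 * (a + b + 1))

        def reset(self):
            """Crude response to unexpected uncertainty: after a detected
            jump in outcome probabilities, discard the posterior."""
            self.alpha = self.beta = 1.0
    ```

    After a run of observations the variance falls (estimation uncertainty resolves), yet the mean stays strictly between 0 and 1: the next outcome is still risky, which is the irreducible-uncertainty level the abstract describes.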

  18. Data Fusion: A decision analysis tool that quantifies geological and parametric uncertainty

    International Nuclear Information System (INIS)

    Porter, D.W.

    1995-01-01

    Engineering projects such as siting waste facilities and performing remediation are often driven by geological and hydrogeological uncertainties. Geological understanding and hydrogeological parameters such as hydraulic conductivity are needed to achieve reliable engineering design. Information from non-invasive and minimally invasive data sets offers potential for reduction in uncertainty, but a single data type does not usually meet all needs. Data Fusion uses Bayesian statistics to update prior knowledge with information from diverse data sets as the data is acquired. Prior knowledge takes the form of first principles models (e.g., groundwater flow) and spatial continuity models for heterogeneous properties. The variability of heterogeneous properties is modeled in a form motivated by statistical physics as a Markov random field. A computer reconstruction of targets of interest is produced within a quantified statistical uncertainty. The computed uncertainty provides a rational basis for identifying data gaps and for assessing data worth to optimize data acquisition. Further, the computed uncertainty provides a way to determine the confidence of achieving adequate safety margins in engineering design. Beyond design, Data Fusion provides the basis for real time computer monitoring of remediation. Working with the DOE Office of Technology (OTD), the authors have developed and patented a Data Fusion Workstation system that has been used on jobs at the Hanford, Savannah River, Pantex and Fernald DOE sites. Further applications include an army depot at Letterkenney, PA and commercial industrial sites

  19. Data Fusion: A decision analysis tool that quantifies geological and parametric uncertainty

    International Nuclear Information System (INIS)

    Porter, D.W.

    1996-01-01

    Engineering projects such as siting waste facilities and performing remediation are often driven by geological and hydrogeological uncertainties. Geological understanding and hydrogeological parameters such as hydraulic conductivity are needed to achieve reliable engineering design. Information from non-invasive and minimally invasive data sets offers potential for reduction in uncertainty, but a single data type does not usually meet all needs. Data Fusion uses Bayesian statistics to update prior knowledge with information from diverse data sets as the data is acquired. Prior knowledge takes the form of first principles models (e.g., groundwater flow) and spatial continuity models for heterogeneous properties. The variability of heterogeneous properties is modeled in a form motivated by statistical physics as a Markov random field. A computer reconstruction of targets of interest is produced within a quantified statistical uncertainty. The computed uncertainty provides a rational basis for identifying data gaps for assessing data worth to optimize data acquisition. Further, the computed uncertainty provides a way to determine the confidence of achieving adequate safety margins in engineering design. Beyond design, Data Fusion provides the basis for real time computer monitoring of remediation. Working with the DOE Office of Technology (OTD), the author has developed and patented a Data Fusion Workstation system that has been used on jobs at the Hanford, Savannah River, Pantex and Fernald DOE sites. Further applications include an army depot at Letterkenney, PA and commercial industrial sites
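    The core idea in both Data Fusion records, updating prior knowledge with diverse data while tracking the remaining uncertainty, can be sketched in one dimension with a standard Gaussian conjugate update. This is a generic illustration of Bayesian fusion, not the patented Data Fusion Workstation algorithm; the parameter names and numbers are hypothetical.

    ```python
    # One-dimensional Gaussian Bayesian update: a prior estimate of log
    # hydraulic conductivity (say, from a first-principles flow model) is
    # fused with a noisy field measurement. The posterior variance
    # quantifies the uncertainty that remains after fusing the data.
    def fuse(prior_mean, prior_var, obs, obs_var):
        w = prior_var / (prior_var + obs_var)  # Kalman-style gain
        post_mean = prior_mean + w * (obs - prior_mean)
        post_var = prior_var * obs_var / (prior_var + obs_var)
        return post_mean, post_var

    # Hypothetical values: prior log-conductivity -5.0 with variance 1.0,
    # one measurement of -4.2 with noise variance 0.5.
    m, v = fuse(prior_mean=-5.0, prior_var=1.0, obs=-4.2, obs_var=0.5)
    ```

    The posterior variance is smaller than either the prior variance or the measurement variance, which is the quantified uncertainty reduction the abstract describes; comparing that variance against a required safety margin gives the "confidence of achieving adequate safety margins" in design terms, and a large remaining variance flags a data gap worth filling.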

  20. Blockchain to Rule the Waves - Nascent Design Principles for Reducing Risk and Uncertainty in Decentralized Environments

    OpenAIRE

    Nærland, Kristoffer; Müller-Bloch, Christoph; Beck, Roman; Palmund, Søren

    2017-01-01

    Many decentralized, inter-organizational environments such as supply chains are characterized by high transactional uncertainty and risk. At the same time, blockchain technology promises to mitigate these issues by introducing certainty into economic transactions. This paper discusses the findings of a Design Science Research project involving the construction and evaluation of an information technology artifact in collaboration with Maersk, a leading international shipping company, where cen...