WorldWideScience

Sample records for generalised uncertainty principle

  1. The uncertainty principle

    International Nuclear Information System (INIS)

    Martens, Hans.

    1991-01-01

    The subject of this thesis is the uncertainty principle (UP), one of the most characteristic points of difference between quantum and classical mechanics. The starting point of the thesis is the work of Niels Bohr, which is both discussed and analyzed. For the discussion of the different aspects of the UP, the formalism of Davies and Ludwig is used instead of the more commonly used formalism of von Neumann and Dirac. (author). 214 refs.; 23 figs

  2. Economic uncertainty principle?

    OpenAIRE

    Alexander Harin

    2006-01-01

    The economic principle of (hidden) uncertainty is presented. New probability formulas are offered, and example solutions of three types of fundamental problems are reviewed.

  3. Principles of Uncertainty

    CERN Document Server

    Kadane, Joseph B

    2011-01-01

    An intuitive and mathematical introduction to subjective probability and Bayesian statistics. An accessible, comprehensive guide to the theory of Bayesian statistics, Principles of Uncertainty presents the subjective Bayesian approach, which has played a pivotal role in game theory, economics, and the recent boom in Markov chain Monte Carlo methods. Both rigorous and friendly, the book contains: introductory chapters examining each new concept or assumption; just-in-time mathematics, the presentation of ideas just before they are applied; and a summary and exercises at the end of each chapter. Discus

  4. Conditional uncertainty principle

    Science.gov (United States)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, provide a thorough characterization of it in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations: (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on an arbitrary measurement that Bob makes on his own system. We then compare the obtained relations with their existing entropic counterparts and find that they are mutually independent.

  5. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that lend themselves to only a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)

  6. Quantum Action Principle with Generalized Uncertainty Principle

    OpenAIRE

    Gu, Jie

    2013-01-01

    One of the common features in all promising candidates of quantum gravity is the existence of a minimal length scale, which naturally emerges from a generalized uncertainty principle, or equivalently a modified commutation relation. Schwinger's quantum action principle is modified to incorporate this change and applied to the calculation of the kernel of a free particle, partly recovering the result previously obtained with the path integral.

  7. On the uncertainty principle. V

    International Nuclear Information System (INIS)

    Halpern, O.

    1976-01-01

    The treatment of ideal experiments connected with the uncertainty principle is continued. The author analyzes successively measurements of momentum and position, and discusses the common reason why the results in all cases differ from the conventional ones. A similar difference exists for the measurement of field strengths. The interpretation given by Weizsaecker, who tried to interpret Bohr's complementarity principle by introducing a multi-valued logic, is analyzed. The treatment of the uncertainty principle ΔE Δt is deferred to a later paper, as is the interpretation of the method of variation of constants. Every ideal experiment discussed shows various lower limits for the value of the uncertainty product, which depend on the experimental arrangement and are always (considerably) larger than h. (Auth.)

  8. Gamma-Ray Telescope and Uncertainty Principle

    Science.gov (United States)

    Shivalingaswamy, T.; Kagali, B. A.

    2012-01-01

    Heisenberg's Uncertainty Principle is one of the important basic principles of quantum mechanics. In most of the books on quantum mechanics, this uncertainty principle is generally illustrated with the help of a gamma ray microscope, wherein neither the image formation criterion nor the lens properties are taken into account. Thus a better…

  9. Heisenberg's principle of uncertainty and the uncertainty relations

    International Nuclear Information System (INIS)

    Redei, Miklos

    1987-01-01

    The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous, and different interpretations are used in the literature. Recently, renewed interest has appeared in reinterpreting and reformulating the precise meaning of Heisenberg's principle and in finding an adequate mathematical form. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs
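    The "uncertainty theorem" referred to here is usually the Robertson inequality σ_A σ_B ≥ |⟨[A,B]⟩|/2. As a minimal numerical illustration (my own sketch, not part of the record), it can be verified for random qubit states:

```python
import numpy as np

# Robertson inequality: sigma_A * sigma_B >= |<psi|[A,B]|psi>| / 2,
# checked here for the Pauli observables X and Y on random qubit states.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def stddev(op, psi):
    mean = np.vdot(psi, op @ psi).real
    mean_sq = np.vdot(psi, op @ op @ psi).real
    return np.sqrt(max(mean_sq - mean**2, 0.0))

rng = np.random.default_rng(0)
for _ in range(100):
    v = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi = v / np.linalg.norm(v)
    lhs = stddev(X, psi) * stddev(Y, psi)
    rhs = 0.5 * abs(np.vdot(psi, (X @ Y - Y @ X) @ psi))
    assert lhs >= rhs - 1e-12
print("Robertson inequality verified for 100 random qubit states")
```

    For X and Y the right-hand side is |⟨Z⟩|, so Z eigenstates saturate the bound.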

  10. Expanding Uncertainty Principle to Certainty-Uncertainty Principles with Neutrosophy and Quad-stage Method

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-03-01

    Full Text Available The most famous contribution of Heisenberg is the uncertainty principle, but the original uncertainty principle is improper. Considering all the possible situations (including the case that people can create laws) and applying Neutrosophy and the Quad-stage Method, this paper presents "certainty-uncertainty principles" with a general form and a variable-dimension fractal form. According to the classification of Neutrosophy, the "certainty-uncertainty principles" can be divided into three principles holding under different conditions: the "certainty principle", namely that a particle's position and momentum can be known simultaneously; the "uncertainty principle", namely that a particle's position and momentum cannot be known simultaneously; and the neutral (fuzzy) "indeterminacy principle", namely that whether or not a particle's position and momentum can be known simultaneously is undetermined. The special cases of the "certainty-uncertainty principles" include the original uncertainty principle and the Ozawa inequality. In addition, according to the original uncertainty principle, discussing a high-speed particle's speed and track with Newtonian mechanics is unreasonable; but according to the "certainty-uncertainty principles", Newtonian mechanics can be used to discuss the problem of gravitational deflection of a photon orbit around the Sun (it gives the same deflection angle as general relativity). Finally, because principles and laws that disregard the principle (law) of conservation of energy may be invalid, the "certainty-uncertainty principles" should be restricted (or constrained) by the principle (law) of conservation of energy, so that they satisfy it.

  11. A Variation on Uncertainty Principle and Logarithmic Uncertainty Principle for Continuous Quaternion Wavelet Transforms

    Directory of Open Access Journals (Sweden)

    Mawardi Bahri

    2017-01-01

    Full Text Available The continuous quaternion wavelet transform (CQWT) is a generalization of the classical continuous wavelet transform within the context of quaternion algebra. First of all, we show that the directional quaternion Fourier transform (QFT) uncertainty principle can be obtained using the component-wise QFT uncertainty principle. Based on this method, the directional QFT uncertainty principle in polar-coordinate representation is easily derived. We derive a variation on the uncertainty principle related to the QFT. We show that the CQWT of a quaternion function can be written in terms of the QFT and obtain a variation on the uncertainty principle related to the CQWT. Finally, we apply the extended uncertainty principles and properties of the CQWT to establish logarithmic uncertainty principles related to the generalized transform.

  12. Human perception and the uncertainty principle

    International Nuclear Information System (INIS)

    Harney, R.C.

    1976-01-01

    The concept of the uncertainty principle, that position and momentum cannot be simultaneously specified to arbitrary accuracy, is somewhat difficult to reconcile with experience. This note describes order-of-magnitude calculations which quantify the inadequacy of human perception with regard to direct observation of the breakdown of the trajectory concept implied by the uncertainty principle. Even with the best optical microscope, human vision is inadequate by three orders of magnitude. 1 figure

  13. Quantum wells and the generalized uncertainty principle

    International Nuclear Information System (INIS)

    Blado, Gardo; Owens, Constance; Meyers, Vincent

    2014-01-01

    The finite and infinite square wells are potentials typically discussed in undergraduate quantum mechanics courses. In this paper, we discuss these potentials in the light of the recent studies of the modification of the Heisenberg uncertainty principle into a generalized uncertainty principle (GUP) as a consequence of attempts to formulate a quantum theory of gravity. The fundamental concepts of the minimal length scale and the GUP are discussed and the modified energy eigenvalues and transmission coefficient are derived. (paper)
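    A common form of the GUP discussed in such papers is Δx Δp ≥ (ħ/2)(1 + β Δp²), which implies a minimal position uncertainty Δx_min = ħ√β. A small sketch (my own illustration, not from the record; the numerical values of ħ and β are arbitrary) confirms the minimum numerically:

```python
import numpy as np

# GUP: dx >= (hbar/2) * (1/dp + beta * dp).  Minimizing the right-hand side
# over dp gives the minimal length dx_min = hbar*sqrt(beta) at dp = 1/sqrt(beta).
hbar = 1.0
beta = 0.01          # illustrative value, not a physical bound
dp = np.linspace(0.1, 100.0, 200_000)
bound = 0.5 * hbar * (1.0 / dp + beta * dp)
dx_min_numeric = bound.min()
dx_min_theory = hbar * np.sqrt(beta)
print(dx_min_numeric, dx_min_theory)  # both ~0.1
```

    Unlike the Heisenberg case, the bound cannot be pushed to zero by taking Δp large, which is the origin of the minimal length scale.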

  14. Dilaton cosmology and the modified uncertainty principle

    International Nuclear Information System (INIS)

    Majumder, Barun

    2011-01-01

    Very recently Ali et al. (2009) proposed a new generalized uncertainty principle (with a linear term in the Planck length) which is consistent with doubly special relativity and string theory. The classical and quantum effects of this generalized uncertainty principle (termed the modified uncertainty principle, or MUP) are investigated on the phase space of a dilatonic cosmological model with an exponential dilaton potential in a flat Friedmann-Robertson-Walker background. Interestingly, as a consequence of the MUP, we find that it is possible to get late-time acceleration for this model. For the quantum mechanical description in both the commutative and MUP frameworks, we find the analytical solutions of the Wheeler-DeWitt equation for the early universe and compare our results. We use an approximation method in the case of the MUP.

  15. A revision of the generalized uncertainty principle

    International Nuclear Information System (INIS)

    Bambi, Cosimo

    2008-01-01

    The generalized uncertainty principle arises from the Heisenberg uncertainty principle when gravity is taken into account, so the leading-order correction to the standard formula is expected to be proportional to the gravitational constant G_N = L_Pl^2. On the other hand, the emerging picture suggests a set of departures from the standard theory which demand a revision of all the arguments used to deduce heuristically the new rule. In particular, one can now argue that the leading-order correction to the Heisenberg uncertainty principle is proportional to the first power of the Planck length L_Pl. If so, the departures from ordinary quantum mechanics would be much less suppressed than is commonly thought

  16. The action uncertainty principle and quantum gravity

    Science.gov (United States)

    Mensky, Michael B.

    1992-02-01

    Results of the path-integral approach to the quantum theory of continuous measurements have been formulated in a preceding paper in the form of an inequality of the type of the uncertainty principle. The new inequality was called the action uncertainty principle, AUP. It was shown that the AUP allows one to find in a simple way which outputs of the continuous measurements will occur with high probability. Here a simpler form of the AUP is formulated, δS ≳ ħ. When applied to quantum gravity, it leads in a very simple way to the Rosenfeld inequality for the measurability of the average curvature.

  17. A review of the generalized uncertainty principle

    International Nuclear Information System (INIS)

    Tawfik, Abdel Nasser; Diab, Abdel Magied

    2015-01-01

    Based on string theory, black hole physics, doubly special relativity and some ‘thought’ experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed. (review)

  18. Generalized uncertainty principle and quantum gravity phenomenology

    Science.gov (United States)

    Bosso, Pasquale

    The fundamental physical description of Nature is based on two mutually incompatible theories: Quantum Mechanics and General Relativity. Their unification in a theory of Quantum Gravity (QG) remains one of the main challenges of theoretical physics. Quantum Gravity Phenomenology (QGP) studies QG effects in low-energy systems. The basis of one such phenomenological model is the Generalized Uncertainty Principle (GUP), which is a modified Heisenberg uncertainty relation and predicts a deformed canonical commutator. In this thesis, we compute Planck-scale corrections to angular momentum eigenvalues, the hydrogen atom spectrum, the Stern-Gerlach experiment, and the Clebsch-Gordan coefficients. We then rigorously analyze the GUP-perturbed harmonic oscillator and study new coherent and squeezed states. Furthermore, we introduce a scheme for increasing the sensitivity of optomechanical experiments for testing QG effects. Finally, we suggest future projects that may potentially test QG effects in the laboratory.

  19. Towards Thermodynamics with Generalized Uncertainty Principle

    International Nuclear Information System (INIS)

    Moussa, Mohamed; Farag Ali, Ahmed

    2014-01-01

    Various frameworks of quantum gravity predict a modification of the Heisenberg uncertainty principle to a so-called generalized uncertainty principle (GUP). Introducing the quantum gravity effect makes a considerable change in the density of states inside the volume of the phase space, which changes the statistical and thermodynamical properties of any physical system. In this paper we investigate the modification in the thermodynamic properties of ideal gases and a photon gas. The partition function is calculated, and from it we find a considerable growth in the thermodynamical functions of the systems considered. The growth may be due to an additional repulsive force between the constituents of the gases arising from the GUP, hence predicting a considerable increase in the entropy of the system. Besides, by applying the GUP to an ideal gas in a trapping potential, it is found that the GUP implies a minimum measurable value of the thermal wavelength of the particles, which agrees with the discrete nature of space derived from the GUP in previous studies

  20. The action uncertainty principle for continuous measurements

    Science.gov (United States)

    Mensky, Michael B.

    1996-02-01

    The action uncertainty principle (AUP) for the specification of the most probable readouts of continuous quantum measurements is proved, formulated in different forms and analyzed (for nonlinear as well as linear systems). Continuous monitoring of an observable A(p,q,t) with resolution Δa(t) is considered. The influence of the measurement process on the evolution of the measured system (quantum measurement noise) is presented by an additional term δF(t)A(p,q,t) in the Hamiltonian, where the function δF (a generalized fictitious force) is restricted by the AUP ∫|δF(t)| Δa(t) dt ≲ ħ and arbitrary otherwise. Quantum-nondemolition (QND) measurements are analyzed with the help of the AUP. A simple uncertainty relation for continuous quantum measurements is derived. It states that the area of a certain band in the phase space should be of the order of ħ. The width of the band depends on the measurement resolution, while its length is determined by the deviation of the system, due to the measurement, from classical behavior.

  1. The action uncertainty principle for continuous measurements

    International Nuclear Information System (INIS)

    Mensky, M.B.

    1996-01-01

    The action uncertainty principle (AUP) for the specification of the most probable readouts of continuous quantum measurements is proved, formulated in different forms and analyzed (for nonlinear as well as linear systems). Continuous monitoring of an observable A(p,q,t) with resolution Δa(t) is considered. The influence of the measurement process on the evolution of the measured system (quantum measurement noise) is presented by an additional term δF(t)A(p,q,t) in the Hamiltonian, where the function δF (a generalized fictitious force) is restricted by the AUP ∫|δF(t)| Δa(t) dt ≲ ℏ and arbitrary otherwise. Quantum-nondemolition (QND) measurements are analyzed with the help of the AUP. A simple uncertainty relation for continuous quantum measurements is derived. It states that the area of a certain band in the phase space should be of the order of ℏ. The width of the band depends on the measurement resolution, while its length is determined by the deviation of the system, due to the measurement, from classical behavior. (orig.)

  2. Some Implications of Two Forms of the Generalized Uncertainty Principle

    Directory of Open Access Journals (Sweden)

    Mohammed M. Khalil

    2014-01-01

    Full Text Available Various theories of quantum gravity predict the existence of a minimum length scale, which leads to the modification of the standard uncertainty principle to the Generalized Uncertainty Principle (GUP). In this paper, we study two forms of the GUP and calculate their implications for the energy of the harmonic oscillator and the hydrogen atom more accurately than previous studies. In addition, we show how the GUP modifies the Lorentz force law and the time-energy uncertainty principle.
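    One standard GUP form, the modified commutator [x, p] = iħ(1 + βp²), can be realized to first order in β by p = p₀(1 + βp₀²/3) with canonical (x₀, p₀), so the harmonic oscillator picks up a perturbation (β/3m)p₀⁴ with first-order shifts ΔE_n = (β/3m)(mωħ/2)²(6n² + 6n + 3). The sketch below (my own illustration, not the calculation of the record) checks this against exact diagonalization in a truncated oscillator basis:

```python
import numpy as np

# GUP commutator [x, p] = i*hbar*(1 + beta*p^2), realized to first order in beta
# by p = p0*(1 + beta*p0^2/3) with canonical (x0, p0).  For H = p^2/2m + m*w^2*x^2/2
# this adds (beta/(3m))*p0^4, giving first-order shifts
#   dE_n = (beta/(3m)) * (m*w*hbar/2)^2 * (6n^2 + 6n + 3).
hbar = m = w = 1.0
beta = 1e-3          # illustrative value
N = 80               # truncated oscillator basis size
a = np.diag(np.sqrt(np.arange(1, N)), 1)          # annihilation operator
p0 = 1j * np.sqrt(m * w * hbar / 2) * (a.T - a)   # canonical momentum
x0 = np.sqrt(hbar / (2 * m * w)) * (a.T + a)      # canonical position
H = p0 @ p0 / (2 * m) + 0.5 * m * w**2 * x0 @ x0 \
    + (beta / (3 * m)) * np.linalg.matrix_power(p0, 4)
evals = np.sort(np.linalg.eigvalsh(H))
for n in range(3):
    dE = (beta / (3 * m)) * (m * w * hbar / 2) ** 2 * (6 * n**2 + 6 * n + 3)
    assert abs(evals[n] - (hbar * w * (n + 0.5) + dE)) < 1e-5
print("first-order GUP shifts match exact diagonalization")
```

    The residual discrepancy is the second-order perturbative correction, of order β², which is far below the tolerance used here.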

  3. Lorentz violation and generalized uncertainty principle

    Science.gov (United States)

    Lambiase, Gaetano; Scardigli, Fabio

    2018-04-01

    Investigations of possible violations of Lorentz invariance have been widely pursued in the last decades, from both theoretical and experimental sides. A comprehensive framework in which to formulate the problem is the standard model extension (SME) proposed by A. Kostelecky, where violation of Lorentz invariance is encoded in specific coefficients. Here we present a procedure to link the deformation parameter β of the generalized uncertainty principle to the SME coefficients of the gravity sector. The idea is to compute the Hawking temperature of a black hole in two different ways. The first way involves the deformation parameter β, and therefore we get a deformed Hawking temperature containing β. The second way involves a deformed Schwarzschild metric containing the Lorentz-violating terms s̄^{μν} of the gravity sector of the SME. The comparison between the two techniques yields a relation between β and s̄^{μν}. In this way, bounds on β transferred from s̄^{μν} are improved by many orders of magnitude compared with those derived in other gravitational frameworks. The opposite possibility of bounds transferred from β to s̄^{μν} is also briefly discussed.

  4. Science 101: What, Exactly, Is the Heisenberg Uncertainty Principle?

    Science.gov (United States)

    Robertson, Bill

    2016-01-01

    Bill Robertson is the author of the NSTA Press book series, "Stop Faking It! Finally Understanding Science So You Can Teach It." In this month's issue, Robertson describes and explains the Heisenberg Uncertainty Principle. The Heisenberg Uncertainty Principle was discussed on "The Big Bang Theory," the lead character in…

  5. Limited entropic uncertainty as new principle of quantum physics

    International Nuclear Information System (INIS)

    Ion, D.B.; Ion, M.L.

    2001-01-01

    The Uncertainty Principle (UP) of quantum mechanics discovered by Heisenberg, which constitutes the cornerstone of quantum physics, asserts that there is an irreducible lower bound on the uncertainty in the result of a simultaneous measurement of non-commuting observables. In order to avoid the state-dependence of the standard quantitative formulation of the Heisenberg uncertainty principle, many authors have proposed to use the information entropy as a measure of the uncertainty. In this paper the Principle of Limited Entropic Uncertainty (LEU-Principle), as a new principle in quantum physics, is proved. Consistent experimental tests of the LEU-Principle, obtained by using the available 49 sets of pion-nucleus phase shifts, are then presented for both the extensive (q=1) and nonextensive (q=0.5 and q=2.0) cases. Some results obtained by applying the LEU-Principle to diffraction phenomena are also discussed. The main results and conclusions can be summarized as follows: (i) we introduce a new principle in quantum physics, namely the Principle of Limited Entropic Uncertainty (LEU-Principle). This new principle includes, in a more general and exact form, not only the old Heisenberg uncertainty principle but also an upper limit on the magnitude of the uncertainty in quantum physics. The LEU-Principle asserts that 'there is an irreducible lower bound as well as an upper bound on the uncertainty in the result of a simultaneous measurement of non-commuting observables for any extensive and nonextensive (q ≥ 0) quantum system'; (ii) two important concrete realizations of the LEU-Principle are explicitly obtained in this paper, namely: (a) the LEU-inequalities for the quantum scattering of spinless particles and (b) the LEU-inequalities for diffraction on a single slit of width 2a. In particular, from our general results, in the limit y → +1 we recover in an exact form all the results previously reported. In our paper an

  6. Uncertainty Principles on Two Step Nilpotent Lie Groups

    Indian Academy of Sciences (India)

    Abstract. We extend an uncertainty principle due to Cowling and Price to two-step nilpotent Lie groups, which generalizes a classical theorem of Hardy. We also prove an analogue of the Heisenberg inequality on two-step nilpotent Lie groups.

  7. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    Science.gov (United States)

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are either based on a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows one to achieve lower uncertainties or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potential regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality with either Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.

  8. Generalized uncertainty principles, effective Newton constant and regular black holes

    OpenAIRE

    Li, Xiang; Ling, Yi; Shen, You-Gen; Liu, Cheng-Zhou; He, Hong-Sheng; Xu, Lan-Fang

    2016-01-01

    In this paper, we explore the quantum spacetimes that are potentially connected with the generalized uncertainty principles. By analyzing the gravity-induced quantum interference pattern and the Gedankenexperiment for weighing a photon, we find that the generalized uncertainty principles yield the same effective Newton constant as our previous proposal. A characteristic momentum associated with the tidal effect is suggested, which incorporates the quantum effect with the geometric nature of gravity...

  9. The role of general relativity in the uncertainty principle

    International Nuclear Information System (INIS)

    Padmanabhan, T.

    1986-01-01

    The role played by general relativity in quantum mechanics (especially as regards the uncertainty principle) is investigated. It is confirmed that the validity of the time-energy uncertainty does depend on gravitational time dilation. It is also shown that there exists an intrinsic lower bound to the accuracy with which acceleration due to gravity can be measured. The notion of the equivalence principle in quantum mechanics is clarified. (author)

  10. Uncertainty principle for angular position and angular momentum

    International Nuclear Information System (INIS)

    Franke-Arnold, Sonja; Barnett, Stephen M; Yao, Eric; Leach, Jonathan; Courtial, Johannes; Padgett, Miles

    2004-01-01

    The uncertainty principle places fundamental limits on the accuracy with which we are able to measure the values of different physical quantities (Heisenberg 1949 The Physical Principles of the Quantum Theory (New York: Dover); Robertson 1929 Phys. Rev. 34 127). This has profound effects not only on the microscopic but also on the macroscopic level of physical systems. The most familiar form of the uncertainty principle relates the uncertainties in position and linear momentum. Other manifestations include those relating uncertainty in energy to uncertainty in time duration, phase of an electromagnetic field to photon number and angular position to angular momentum (Vaccaro and Pegg 1990 J. Mod. Opt. 37 17; Barnett and Pegg 1990 Phys. Rev. A 41 3427). In this paper, we report the first observation of the last of these uncertainty relations and derive the associated states that satisfy the equality in the uncertainty relation. We confirm the form of these states by detailed measurement of the angular momentum of a light beam after passage through an appropriate angular aperture. The angular uncertainty principle applies to all physical systems and is particularly important for systems with cylindrical symmetry

  11. The Uncertainty Principle in the Presence of Quantum Memory

    Science.gov (United States)

    Renes, Joseph M.; Berta, Mario; Christandl, Matthias; Colbeck, Roger; Renner, Renato

    2010-03-01

    One consequence of Heisenberg's uncertainty principle is that no observer can predict the outcomes of two incompatible measurements performed on a system to arbitrary precision. However, this implication is invalid if the observer possesses a quantum memory, a distinct possibility in light of recent technological advances. Entanglement between the system and the memory is responsible for the breakdown of the uncertainty principle, as illustrated by the EPR paradox. In this work we present an improved uncertainty principle which takes this entanglement into account. By quantifying uncertainty using entropy, we show that the sum of the entropies associated with incompatible measurements must exceed a quantity which depends on the degree of incompatibility and the amount of entanglement between system and memory. Apart from its foundational significance, the uncertainty principle motivated the first proposals for quantum cryptography, though the possibility of an eavesdropper having a quantum memory rules out using the original version to argue that these proposals are secure. The uncertainty relation introduced here alleviates this problem and paves the way for its widespread use in quantum cryptography.
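    The memoryless baseline of such entropic relations can be checked directly for a qubit. The sketch below (my own illustration, not from the record) verifies the Maassen-Uffink bound H(X) + H(Z) ≥ −log₂ c, where c = 1/2 for the mutually unbiased Pauli X and Z bases, i.e. a bound of 1 bit; the memory-assisted relation reduces to this form when system and memory are uncorrelated:

```python
import numpy as np

# Maassen-Uffink entropic uncertainty relation for a qubit:
# H(X) + H(Z) >= -log2(c), with c = max_ij |<x_i|z_j>|^2 = 1/2
# for the mutually unbiased Pauli X and Z bases, i.e. a 1-bit bound.
def shannon(p):
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

Z_basis = np.eye(2, dtype=complex)                                 # |0>, |1>
X_basis = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # |+>, |->

rng = np.random.default_rng(1)
for _ in range(200):
    v = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi = v / np.linalg.norm(v)
    pz = np.abs(Z_basis.conj().T @ psi) ** 2   # Z-measurement outcome probabilities
    px = np.abs(X_basis.conj().T @ psi) ** 2   # X-measurement outcome probabilities
    assert shannon(pz) + shannon(px) >= 1.0 - 1e-9
print("H(Z) + H(X) >= 1 bit for 200 random pure qubit states")
```

    A Z eigenstate saturates the bound: H(Z) = 0 while H(X) = 1 bit.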

  12. Lacunary Fourier Series and a Qualitative Uncertainty Principle for ...

    Indian Academy of Sciences (India)

    We define lacunary Fourier series on a compact connected semisimple Lie group G. If f ∈ L¹(G) has lacunary Fourier series and vanishes on a non-empty open subset of G, then we prove that f vanishes identically. This result can be viewed as a qualitative uncertainty principle.

  13. Uncertainty principle in loop quantum cosmology by Moyal formalism

    Science.gov (United States)

    Perlov, Leonid

    2018-03-01

    In this paper, we derive the uncertainty principle for the loop quantum cosmology homogeneous and isotropic Friedmann-Lemaître-Robertson-Walker model with the holonomy-flux algebra. The uncertainty principle relates the variables c, with the meaning of connection, and μ, having the meaning of the physical cell volume to the power 2/3, i.e., v^{2/3}, or a plaquette area. Since both μ and c are not operators but rather random variables, the Robertson uncertainty-principle derivation that works for Hermitian operators cannot be used. Instead we use the Wigner-Moyal-Groenewold phase-space formalism. The Wigner-Moyal-Groenewold formalism was originally applied to the Heisenberg algebra of quantum mechanics. One can derive it from both canonical and path-integral quantum mechanics, as well as the uncertainty principle. In this paper, we apply it to the holonomy-flux algebra in the case of homogeneous and isotropic space. Another result is an expression for the Wigner function on the space of cylindrical wave functions defined on R_b, in the c variables rather than in the dual-space μ variables.
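    For orientation, the Wigner transform in the familiar Heisenberg-algebra setting (not the holonomy-flux algebra treated in the record) can be sketched numerically. The example below (my own illustration) computes the Wigner function of the harmonic-oscillator ground state and checks it against the known Gaussian result W(x,p) = e^{−x²−p²}/π for ħ = m = ω = 1:

```python
import numpy as np

# Wigner function in the ordinary Heisenberg-algebra setting:
#   W(x, p) = (1/(2*pi*hbar)) * Int psi*(x + y/2) psi(x - y/2) exp(i*p*y/hbar) dy.
# For the harmonic-oscillator ground state (hbar = m = omega = 1) the exact
# result is the Gaussian W(x, p) = exp(-x^2 - p^2) / pi.
hbar = 1.0
psi = lambda x: np.pi ** -0.25 * np.exp(-x**2 / 2)   # ground-state wave function

def wigner(x, p, y_max=12.0, n=4001):
    y = np.linspace(-y_max, y_max, n)
    integrand = np.conj(psi(x + y / 2)) * psi(x - y / 2) * np.exp(1j * p * y / hbar)
    dy = y[1] - y[0]
    return (integrand.sum() * dy).real / (2 * np.pi * hbar)

w00 = wigner(0.0, 0.0)
print(w00, 1 / np.pi)  # both ~ 0.3183
```

    The ground-state Wigner function is everywhere positive; states with negative regions are the ones usually taken to signal genuinely quantum behavior in this phase-space picture.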

  14. Gauge theories under incorporation of a generalized uncertainty principle

    International Nuclear Information System (INIS)

    Kober, Martin

    2010-01-01

    We consider an extension of gauge theories under the assumption of a generalized uncertainty principle, which implies a minimal length scale. A modification of the usual uncertainty principle implies an extended shape of matter-field equations such as the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field, as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, since the gauge principle is a constitutive ingredient of the standard model, and since even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears to be a very important consideration.
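    A form of the modified commutator commonly used in this literature (a standard convention, not necessarily the exact deformation adopted in this paper) makes the minimal length explicit:

```latex
[\hat{x}, \hat{p}] = i\hbar\,\big(1 + \beta\,\hat{p}^{\,2}\big)
\;\;\Longrightarrow\;\;
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\Big(1 + \beta\,(\Delta p)^2\Big)
\;\;\Longrightarrow\;\;
\Delta x_{\min} = \hbar\sqrt{\beta},
```

    with β > 0 the GUP parameter: minimizing the right-hand side over Δp yields a nonzero lower bound on Δx, the minimal length scale referred to above.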

  15. “Stringy” coherent states inspired by generalized uncertainty principle

    Science.gov (United States)

    Ghosh, Subir; Roy, Pinaki

    2012-05-01

    Coherent states with the fractional revival property, which explicitly satisfy the Generalized Uncertainty Principle (GUP), have been constructed in the context of the generalized harmonic oscillator. The existence of such states is essential in motivating the GUP-based phenomenological results present in the literature, which would otherwise be of purely academic interest. The effective phase space is non-canonical (or non-commutative, in popular terminology). Our results have a smooth commutative limit, equivalent to the Heisenberg Uncertainty Principle. The fractional revival time analysis yields an independent bound on the GUP parameter. Using this and similar bounds obtained here, we derive the largest possible value of the (GUP-induced) minimum length scale. Mandel parameter analysis shows that the statistics are sub-Poissonian. The Correspondence Principle is deformed in an interesting way. Our computational scheme is very simple, as it requires only first-order corrected energy values and undeformed basis states.
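    The Mandel parameter invoked above admits a compact numerical illustration. The sketch below is my own illustrative code, not the authors': it computes Q = (⟨n²⟩ − ⟨n⟩²)/⟨n⟩ − 1 for a photon-number distribution, where Q = 0 marks Poissonian statistics (an ordinary coherent state, the commutative limit) and Q < 0 marks the sub-Poissonian statistics reported in the abstract.

```python
import numpy as np
from math import exp, factorial

# Mandel parameter: Q = (<n^2> - <n>^2)/<n> - 1.
# Q = 0 -> Poissonian statistics; Q < 0 -> sub-Poissonian.
def mandel_q(probs):
    n = np.arange(len(probs))
    mean = np.sum(n * probs)
    var = np.sum(n**2 * probs) - mean**2
    return var / mean - 1.0

# Photon-number distribution of an ordinary (undeformed) coherent state
# with mean photon number 4: Poissonian, so Q vanishes.
alpha2 = 4.0
probs = np.array([exp(-alpha2) * alpha2**k / factorial(k) for k in range(80)])
assert abs(mandel_q(probs)) < 1e-9  # the commutative-limit benchmark
```

    A deformed (GUP) coherent state would feed its corrected number distribution into the same function and return a strictly negative Q.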

  16. “Stringy” coherent states inspired by generalized uncertainty principle

    International Nuclear Information System (INIS)

    Ghosh, Subir; Roy, Pinaki

    2012-01-01

    Coherent states with the fractional revival property, which explicitly satisfy the Generalized Uncertainty Principle (GUP), have been constructed in the context of the generalized harmonic oscillator. The existence of such states is essential in motivating the GUP-based phenomenological results present in the literature, which would otherwise be of purely academic interest. The effective phase space is non-canonical (or non-commutative, in popular terminology). Our results have a smooth commutative limit, equivalent to the Heisenberg Uncertainty Principle. The fractional revival time analysis yields an independent bound on the GUP parameter. Using this and similar bounds obtained here, we derive the largest possible value of the (GUP-induced) minimum length scale. Mandel parameter analysis shows that the statistics are sub-Poissonian. The Correspondence Principle is deformed in an interesting way. Our computational scheme is very simple, as it requires only first-order corrected energy values and undeformed basis states.

  17. Data assimilation and uncertainty analysis of environmental assessment problems--an application of Stochastic Transfer Function and Generalised Likelihood Uncertainty Estimation techniques

    International Nuclear Information System (INIS)

    Romanowicz, Renata; Young, Peter C.

    2003-01-01

    Stochastic Transfer Function (STF) and Generalised Likelihood Uncertainty Estimation (GLUE) techniques are outlined and applied to an environmental problem concerned with marine dose assessment. The goal of both methods in this application is the estimation and prediction of the environmental variables, together with their associated probability distributions. In particular, they are used to estimate the amount of radionuclides transferred to marine biota from a given source: the British Nuclear Fuel Ltd (BNFL) repository plant in Sellafield, UK. The complexity of the processes involved, together with the large dispersion and scarcity of observations regarding radionuclide concentrations in the marine environment, requires efficient data assimilation techniques. In this regard, the basic STF methods search for identifiable, linear model structures that capture the maximum amount of information contained in the data with a minimal parameterisation. They can be extended for on-line use, based on recursively updated Bayesian estimation and, although applicable only to constant or time-variable parameter (non-stationary) linear systems in the form used in this paper, they have the potential for application to non-linear systems using recently developed State Dependent Parameter (SDP) non-linear STF models. The GLUE-based methods, on the other hand, formulate the problem of estimation using a more general Bayesian approach, usually without prior statistical identification of the model structure. As a result, they are applicable to almost any linear or non-linear stochastic model, although they are much less efficient both computationally and in their use of the information contained in the observations. As expected in this particular environmental application, it is shown that the STF methods give much narrower confidence limits for the estimates due to their more efficient use of the information contained in the data. Exploiting Monte Carlo Simulation (MCS) analysis
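    The GLUE procedure outlined above can be sketched in a few lines. The toy exponential model, uniform priors, and inverse-error likelihood below are hypothetical stand-ins for the paper's radionuclide transfer model, kept only to show the three characteristic steps: Monte Carlo sampling, informal likelihood weighting, and retention of "behavioural" parameter sets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model standing in for the paper's radionuclide model:
# y(t) = a * exp(-b * t), observed with noise.
t = np.linspace(0.0, 5.0, 20)
y_obs = 2.0 * np.exp(-0.7 * t) + rng.normal(0.0, 0.05, t.size)

def model(a, b):
    return a * np.exp(-b * t)

# 1. Monte Carlo sampling of parameter sets from uniform priors.
a_s = rng.uniform(0.5, 4.0, 20000)
b_s = rng.uniform(0.1, 2.0, 20000)

# 2. Informal (GLUE-style) likelihood for each sample: inverse error sum.
sse = np.array([np.sum((model(a, b) - y_obs) ** 2) for a, b in zip(a_s, b_s)])
lik = 1.0 / sse

# 3. Retain 'behavioural' samples and form likelihood-weighted predictions;
#    their spread plays the role of the (wide) GLUE confidence limits.
keep = lik > np.quantile(lik, 0.95)
w = lik[keep] / lik[keep].sum()
pred = np.array([model(a, b) for a, b in zip(a_s[keep], b_s[keep])])
mean_pred = w @ pred  # likelihood-weighted predictive mean
```

    The inefficiency noted in the abstract is visible here: most of the 20,000 model runs are discarded, which is why GLUE intervals are wider than those of the recursively estimated STF models.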

  18. The Bertlmann-Martin Inequalities and the Uncertainty Principle

    International Nuclear Information System (INIS)

    Ighezou, F.Z.; Kerris, A.T.; Lombard, R.J.

    2008-01-01

    A lower bound to ⟨r⟩_1s is established from the Thomas-Reiche-Kuhn sum rule applied to the reduced equation for the s-states. It is linked to the average value of ⟨r²⟩_1s. We discuss, in a few examples, how the use of approximate values for ⟨r²⟩_1s, derived from the generalized Bertlmann and Martin inequalities, preserves the lower-bound character of ⟨r⟩_1s. Finally, by using the uncertainty principle and the uncertainty in the radial position, we derive a lower bound to the ground-state kinetic energy.

  19. Unconditional security of quantum key distribution and the uncertainty principle

    International Nuclear Information System (INIS)

    Koashi, Masato

    2006-01-01

    An approach to the unconditional security of quantum key distribution protocols is presented, which is based on the uncertainty principle. The approach applies to every case that has been treated via the argument by Shor and Preskill, but it is not necessary to find quantum error correcting codes. It can also treat the cases with uncharacterized apparatuses. The proof can be applied to cases where the secret key rate is larger than the distillable entanglement

  20. Universal uncertainty principle in the measurement operator formalism

    International Nuclear Information System (INIS)

    Ozawa, Masanao

    2005-01-01

    Heisenberg's uncertainty principle has been understood to set a limitation on measurements; however, the long-standing mathematical formulation established by Heisenberg, Kennard, and Robertson does not allow such an interpretation. Recently, a new relation was found to give a universally valid relation between noise and disturbance in general quantum measurements, and it has become clear that the new relation plays the role of a first principle from which various quantum limits on measurement and information processing can be derived in a unified treatment. This paper examines the above development on the noise-disturbance uncertainty principle in the model-independent approach based on the measurement operator formalism, which is widely accepted to describe a class of generalized measurements in the field of quantum information. We obtain explicit formulae for the noise and disturbance of measurements given by measurement operators, and show that projective measurements do not satisfy the Heisenberg-type noise-disturbance relation that is typical of the gamma-ray microscope thought experiments. We also show that the disturbance on a Pauli operator of a projective measurement of another Pauli operator constantly equals √2, and examine how this measurement violates the Heisenberg-type relation but satisfies the new noise-disturbance relation.

  1. The 'Herbivory Uncertainty Principle': application in a cerrado site

    Directory of Open Access Journals (Sweden)

    CA Gadotti

    Researchers may alter the ecology of the organisms they study, even when carrying out apparently beneficial activities, as in herbivory studies, where they may alter herbivory damage. We tested whether visit frequency altered herbivory damage, as predicted by the 'Herbivory Uncertainty Principle'. In a cerrado site, we established 80 quadrats, in which we sampled all woody individuals. We used four visit frequencies (high, medium, low, and control), quantifying, at the end of three months, herbivory damage for each species in each treatment. We did not corroborate the 'Herbivory Uncertainty Principle', since visiting frequency did not alter herbivory damage, at least when the whole plant community was taken into account. However, when we analysed each species separately, four out of 11 species presented significant differences in herbivory damage, suggesting that researchers are not independent of their measurements. The principle could be tested in other ecological studies in which it may apply, such as those on animal behaviour, human ecology, population dynamics, and conservation.

  2. Continuous quantum measurements and the action uncertainty principle

    Science.gov (United States)

    Mensky, Michael B.

    1992-09-01

    The path-integral approach to the quantum theory of continuous measurements has been developed in preceding works of the author. According to this approach, the measurement amplitude determining the probabilities of different outputs of the measurement can be evaluated in the form of a restricted path integral (a path integral "in finite limits"). With the help of the measurement amplitude, the maximum deviation of measurement outputs from the classical one can be easily determined. The aim of the present paper is to express this variance in the simpler and more transparent form of a specific uncertainty principle (called the action uncertainty principle, AUP). The simplest (but weak) form of the AUP is δS ≳ ℏ, where S is the action functional. It can be applied for a simple derivation of the Bohr-Rosenfeld inequality for the measurability of the gravitational field. A stronger form of the AUP, with wider application (for ideal measurements performed in the quantum regime), is |∫_{t'}^{t''} (δS[q]/δq(t)) Δq(t) dt| ≃ ℏ, where the paths [q] and [Δq] stand, correspondingly, for the measurement output and the measurement error. It can also be presented in the symbolic form Δ(Equation) Δ(Path) ≃ ℏ. This means that the deviation of the observed (measured) motion from that obeying the classical equation of motion is reciprocally proportional to the uncertainty in the path (the latter uncertainty resulting from the measurement error). The consequence of the AUP is that improving the measurement precision beyond the threshold of the quantum regime leads to decreasing information resulting from the measurement.

  3. Generalized uncertainty principle, quantum gravity and Horava-Lifshitz gravity

    International Nuclear Information System (INIS)

    Myung, Yun Soo

    2009-01-01

    We investigate a close connection between the generalized uncertainty principle (GUP) and deformed Horava-Lifshitz (HL) gravity. The GUP commutation relations correspond to the UV quantum theory, while the canonical commutation relations represent the IR quantum theory. Inspired by this UV/IR quantum mechanics, we obtain the GUP-corrected graviton propagator by introducing the UV momentum p_i = p_0i (1 + β p_0²) and compare it with the tensor propagators in HL gravity. The two are the same up to p_0⁴ order.
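    The UV-momentum substitution quoted above can be checked symbolically. The sketch below (my own illustration, with β the GUP parameter) expands the scalar invariant p² = p₀²(1 + βp₀²)² to first order in β, exposing the single p₀⁴ correction at the order to which the comparison with the HL tensor propagator is made.

```python
import sympy as sp

p0, beta = sp.symbols('p0 beta', positive=True)

# UV-momentum substitution from the abstract: p_i = p0_i (1 + beta*p0^2),
# so the scalar invariant becomes p^2 = p0^2 (1 + beta*p0^2)^2.
p2 = (p0 * (1 + beta * p0**2))**2

# To first order in beta the GUP correction is a single p0^4 term --
# the order to which the propagators are compared in the abstract.
expansion = sp.expand(sp.series(p2, beta, 0, 2).removeO())
assert sp.simplify(expansion - (p0**2 + 2 * beta * p0**4)) == 0
```

    The discarded O(β²) piece is the p₀⁶ term, consistent with the statement that the agreement holds only up to p₀⁴ order.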

  4. The Precautionary Principle and statistical approaches to uncertainty

    DEFF Research Database (Denmark)

    Keiding, Niels; Budtz-Jørgensen, Esben

    2004-01-01

    is unhelpful, because lack of significance can be due either to uninformative data or to genuine lack of effect (the Type II error problem). Its inversion, bioequivalence testing, might sometimes be a model for the Precautionary Principle in its ability to "prove the null hypothesis". Current procedures for setting safe exposure levels are essentially derived from these classical statistical ideas, and we outline how uncertainties in the exposure and response measurements affect the no observed adverse effect level, the Benchmark approach and the "Hockey Stick" model. A particular problem concerns model

  5. On the principled assignment of probabilities for uncertainty analysis

    International Nuclear Information System (INIS)

    Unwin, S.D.; Cook, I.

    1986-01-01

    The authors sympathize with those who raise the questions of inscrutability and over-precision in connection with probabilistic techniques as currently implemented in nuclear PRA. This inscrutability also renders the probabilistic approach, as practiced, open to abuse. They believe that the appropriate remedy is not the discarding of the probabilistic representation of uncertainty in favour of a more simply structured, but logically inconsistent approach such as that of bounding analysis. This would be like forbidding the use of arithmetic in order to prevent the issuing of fraudulent company prospectuses. The remedy, in this analogy, is the enforcement of accounting standards for the valuation of inventory, rates of depreciation etc. They require an analogue of such standards in the PRA domain. What is needed is not the interdiction of probabilistic judgment, but the interdiction of private, inscrutable judgment. Some principles may be conventional in character, as are certain accounting principles. They expound a set of controlling principles which they suggest should govern the formulation of probabilities in nuclear risk analysis. A fuller derivation and consideration of these principles can be found

  6. Reducing Uncertainty: Implementation of Heisenberg Principle to Measure Company Performance

    Directory of Open Access Journals (Sweden)

    Anna Svirina

    2015-08-01

    The paper addresses the problem of uncertainty reduction in the estimation of future company performance, which results from the wide range of probable efficiencies of an enterprise's intangible assets. To reduce this uncertainty, the paper suggests using quantum-economy principles, i.e., applying the Heisenberg principle to measure the efficiency and potential of a company's intangible assets. It is proposed that for intangibles it is not possible to estimate both potential and efficiency at a certain time point. To provide a proof of this thesis, data on resource potential and efficiency from mid-Russian companies were evaluated within a deterministic approach, which did not allow evaluation of the probability of achieving a certain resource efficiency, and within a quantum approach, which allowed estimation of the central point around which the probable efficiency of resources is concentrated. Visualization of these approaches was performed by means of LabView software. It was proven that for tangible-asset performance estimation a deterministic approach should be used, while for intangible assets the quantum approach allows a better quality of future performance prediction. On the basis of these findings we propose a holistic approach to the estimation of company resource efficiency in order to reduce uncertainty in modelling company performance.

  7. Generalised Filtering

    Directory of Open Access Journals (Sweden)

    Karl Friston

    2010-01-01

    We describe a Bayesian filtering scheme for nonlinear state-space models in continuous time. This scheme is called Generalised Filtering and furnishes posterior (conditional) densities on hidden states and unknown parameters generating observed data. Crucially, the scheme operates online, assimilating data to optimize the conditional density on time-varying states and time-invariant parameters. In contrast to Kalman and particle smoothing, Generalised Filtering does not require a backwards pass. In contrast to variational schemes, it does not assume conditional independence between the states and parameters. Generalised Filtering optimises the conditional density with respect to a free-energy bound on the model's log-evidence. This optimisation uses the generalised motion of hidden states and parameters, under the prior assumption that the motion of the parameters is small. We describe the scheme, present comparative evaluations with a fixed-form variational version, and conclude with an illustrative application to a nonlinear state-space model of brain imaging time-series.

  8. What is the uncertainty principle of non-relativistic quantum mechanics?

    Science.gov (United States)

    Riggs, Peter J.

    2018-05-01

    After more than ninety years of discussions over the uncertainty principle, there is still no universal agreement on what the principle states. The Robertson uncertainty relation (incorporating standard deviations) is given as the mathematical expression of the principle in most quantum mechanics textbooks. However, the uncertainty principle is not merely a statement of what any of the several uncertainty relations affirm. It is suggested that a better approach would be to present the uncertainty principle as a statement about the probability distributions of incompatible variables and the resulting restrictions on quantum states.
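    The suggestion above, that the principle is best presented as a statement about probability distributions, can be made concrete numerically. The sketch below (my own illustration, with ħ = 1) builds a Gaussian position distribution and checks that the standard deviations of position and momentum saturate the Robertson bound of 1/2, independently of the chosen width.

```python
import numpy as np

# Gaussian position distribution (hbar = 1); the standard deviations of
# position and momentum then saturate the bound sigma_x * sigma_p = 1/2.
x = np.linspace(-20.0, 20.0, 40001)
dx = x[1] - x[0]
s = 1.7  # arbitrary width parameter
psi = (1.0 / (np.pi * s**2)) ** 0.25 * np.exp(-x**2 / (2 * s**2))

sigma_x = np.sqrt(np.sum(x**2 * psi**2) * dx)   # position spread
dpsi = np.gradient(psi, dx)                     # d(psi)/dx
sigma_p = np.sqrt(np.sum(dpsi**2) * dx)         # momentum spread, hbar = 1
print(round(sigma_x * sigma_p, 4))  # -> 0.5
```

    Repeating the calculation for any non-Gaussian distribution gives a product strictly above 1/2, which is exactly the distribution-level restriction on quantum states that the article advocates teaching.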

  9. Horizon Wavefunction of Generalized Uncertainty Principle Black Holes

    Directory of Open Access Journals (Sweden)

    Luciano Manfredi

    2016-01-01

    We study the Horizon Wavefunction (HWF) description of a Generalized Uncertainty Principle inspired metric that admits sub-Planckian black holes, where the black hole mass m is replaced by M = m(1 + (β/2) M_Pl²/m²). Considering the case of a wave packet shaped by a Gaussian distribution, we compute the HWF and the probability P_BH that the source is a (quantum) black hole, that is, that it lies within its horizon radius. For a certain range of β a minimum in P_BH is encountered, meaning that every particle has some probability of decaying to a black hole. Furthermore, for sufficiently large β we find that every particle is a quantum black hole, in agreement with the intuitive effect of increasing β, which creates larger M and R_H terms. This is likely due to a "dimensional reduction" feature of the model, where the black hole characteristics for sub-Planckian black holes mimic those in (1+1) dimensions and the horizon size grows as R_H ~ M^(-1).

  10. The Hayes principles: learning from the national pilot of information technology and core generalisable theory in informatics.

    Science.gov (United States)

    de Lusignan, Simon; Krause, Paul

    2010-01-01

    There has been much criticism of the NHS national programme for information technology (IT); it has been an expensive programme and some elements appear to have achieved little. The Hayes report was written as an independent review of health and social care IT in England. Our aim is to identify key principles for health IT implementation which may have relevance beyond the critique of NHS IT. We elicit ten principles from the Hayes report, which, if followed, may result in more effective IT implementation in health care. They divide into patient-centred, subsidiarity and strategic principles. The patient-centred principles are: 1) the patient must be at the centre of all information systems; 2) the provision of patient-level operational data should form the foundation - avoid the dataset mentality; 3) store health data as close to the patient as possible; 4) enable the patient to take a more active role with their health data within a trusted doctor-patient relationship. The subsidiarity principles set out to balance local and health-system-wide needs: 5) standardise centrally - patients must be able to benefit from interoperability; 6) provide a standard procurement package and an approved process that ensures safety standards and provision of interoperable systems; 7) authorise a range of local suppliers so that health providers can select the system best meeting local needs; 8) allow local migration from legacy systems, as and when improved functionality for patients is available. And finally the strategic principles: 9) evaluate health IT systems in terms of measurable benefits to patients; 10) strategic planning of systems should reflect strategic goals for the health of patients/the population. Had the Hayes principles been embedded within our approach to health IT, and in particular to medical record implementation, we might have avoided many of the costly mistakes with the UK national programme. However, these principles need application within the modern IT

  11. The Hayes principles: learning from the national pilot of information technology and core generalisable theory in informatics

    Directory of Open Access Journals (Sweden)

    Simon de Lusignan

    2010-06-01

    Conclusions Had the Hayes principles been embedded within our approach to health IT, and in particular to medical record implementation, we might have avoided many of the costly mistakes with the UK national programme. However, these principles need application within the modern IT environment. Closeness to the patient must not be interpreted as physical but instead as a virtual patient-centred space; data will be secure within the cloud and we should dump the vault and infrastructure mentality. Health IT should be developed as an adaptive ecosystem.

  12. A connection between the Uncertainty Principles on the real line and on the circle

    OpenAIRE

    Andersen, Nils Byrial

    2013-01-01

    The purpose of this short note is to exhibit a new connection between the Heisenberg Uncertainty Principle on the line and the Breitenberger Uncertainty Principle on the circle, by considering the commutator of the multiplication and difference operators on Bernstein functions

  13. Supersymmetry Breaking as a new source for the Generalized Uncertainty Principle

    OpenAIRE

    Faizal, Mir

    2016-01-01

    In this letter, we will demonstrate that the breaking of supersymmetry by a non-anticommutative deformation can be used to generate the generalized uncertainty principle. We will analyze the physical reasons for this observation, in the framework of string theory. We also discuss the relation between the generalized uncertainty principle and the Lee–Wick field theories.

  14. Supersymmetry breaking as a new source for the generalized uncertainty principle

    Energy Technology Data Exchange (ETDEWEB)

    Faizal, Mir, E-mail: mirfaizalmir@gmail.com

    2016-06-10

    In this letter, we will demonstrate that the breaking of supersymmetry by a non-anticommutative deformation can be used to generate the generalized uncertainty principle. We will analyze the physical reasons for this observation, in the framework of string theory. We also discuss the relation between the generalized uncertainty principle and the Lee–Wick field theories.

  15. The Precautionary Principle and statistical approaches to uncertainty

    DEFF Research Database (Denmark)

    Keiding, Niels; Budtz-Jørgensen, Esben

    2003-01-01

    Bayesian model averaging; Benchmark approach to safety standards in toxicology; dose-response relationship; environmental standards; exposure measurement uncertainty; Popper falsification

  16. The Precautionary Principle and Statistical Approaches to Uncertainty

    DEFF Research Database (Denmark)

    Keiding, Niels; Budtz-Jørgensen, Esben

    2005-01-01

    Bayesian model averaging; Benchmark approach to safety standards in toxicology; dose-response relationships; environmental standards; exposure measurement uncertainty; Popper falsification

  17. Generalized uncertainty principle as a consequence of the effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Faizal, Mir, E-mail: mirfaizalmir@gmail.com [Irving K. Barber School of Arts and Sciences, University of British Columbia – Okanagan, Kelowna, British Columbia V1V 1V7 (Canada); Department of Physics and Astronomy, University of Lethbridge, Lethbridge, Alberta T1K 3M4 (Canada); Ali, Ahmed Farag, E-mail: ahmed.ali@fsc.bu.edu.eg [Department of Physics, Faculty of Science, Benha University, Benha, 13518 (Egypt); Netherlands Institute for Advanced Study, Korte Spinhuissteeg 3, 1012 CG Amsterdam (Netherlands); Nassar, Ali, E-mail: anassar@zewailcity.edu.eg [Department of Physics, Zewail City of Science and Technology, 12588, Giza (Egypt)

    2017-02-10

    We will demonstrate that the generalized uncertainty principle exists because of the derivative expansion in the effective field theories. This is because in the framework of the effective field theories, the minimum measurable length scale has to be integrated away to obtain the low energy effective action. We will analyze the deformation of a massive free scalar field theory by the generalized uncertainty principle, and demonstrate that the minimum measurable length scale corresponds to a second more massive scale in the theory, which has been integrated away. We will also analyze CFT operators dual to this deformed scalar field theory, and observe that scaling of the new CFT operators indicates that they are dual to this more massive scale in the theory. We will use holographic renormalization to explicitly calculate the renormalized boundary action with counter terms for this scalar field theory deformed by generalized uncertainty principle, and show that the generalized uncertainty principle contributes to the matter conformal anomaly.

  18. Generalized uncertainty principle as a consequence of the effective field theory

    Directory of Open Access Journals (Sweden)

    Mir Faizal

    2017-02-01

    We will demonstrate that the generalized uncertainty principle exists because of the derivative expansion in the effective field theories. This is because in the framework of the effective field theories, the minimum measurable length scale has to be integrated away to obtain the low energy effective action. We will analyze the deformation of a massive free scalar field theory by the generalized uncertainty principle, and demonstrate that the minimum measurable length scale corresponds to a second more massive scale in the theory, which has been integrated away. We will also analyze CFT operators dual to this deformed scalar field theory, and observe that scaling of the new CFT operators indicates that they are dual to this more massive scale in the theory. We will use holographic renormalization to explicitly calculate the renormalized boundary action with counter terms for this scalar field theory deformed by generalized uncertainty principle, and show that the generalized uncertainty principle contributes to the matter conformal anomaly.

  19. Theoretical formulation of finite-dimensional discrete phase spaces: I. Algebraic structures and uncertainty principles

    International Nuclear Information System (INIS)

    Marchiolli, M.A.; Ruzzi, M.

    2012-01-01

    We propose a self-consistent theoretical framework for a wide class of physical systems characterized by a finite space of states which allows us, with several mathematical virtues, to construct a discrete version of the Weyl–Wigner–Moyal (WWM) formalism for finite-dimensional discrete phase spaces with toroidal topology. As a first and important application of this ab initio approach, we initially investigate the Robertson–Schrödinger (RS) uncertainty principle related to the discrete coordinate and momentum operators, as well as its implications for physical systems with periodic boundary conditions. The second interesting application is associated with a particular uncertainty principle inherent to the unitary operators, which is based on the Wiener–Khinchin theorem for signal processing. Furthermore, we also establish a modified discrete version of the well-known Heisenberg–Kennard–Robertson (HKR) uncertainty principle, which exhibits additional terms (or corrections) that resemble the generalized uncertainty principle (GUP) in the context of quantum gravity. The results obtained from this new algebraic approach touch on some fundamental questions inherent to quantum mechanics and certainly represent an object of future investigations in physics. - Highlights: ► We construct a discrete version of the Weyl–Wigner–Moyal formalism. ► Coherent states for finite-dimensional discrete phase spaces are established. ► Discrete coordinate and momentum operators are properly defined. ► Uncertainty principles depend on the topology of finite physical systems. ► Corrections for the discrete Heisenberg uncertainty relation are also obtained.
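    Finite-dimensional phase spaces of the kind discussed above are commonly built from Schwinger's unitary clock and shift operators. The sketch below is a generic textbook construction, not necessarily the authors' specific one; it verifies the Weyl commutation relation, the finite analogue of the Heisenberg algebra, in which the commutator of the two unitaries is a phase rather than a c-number.

```python
import numpy as np

d = 5  # dimension of the finite state space
omega = np.exp(2j * np.pi / d)

# Schwinger's unitary 'clock' Z and 'shift' X operators -- a generic
# construction for finite-dimensional discrete phase spaces.
Z = np.diag(omega ** np.arange(d))   # Z|k> = omega^k |k>
X = np.roll(np.eye(d), 1, axis=0)    # X|k> = |k+1 mod d>

# Weyl commutation relation X Z = omega^{-1} Z X: the two unitaries
# commute only up to a phase, the finite analogue of [x, p] = i*hbar.
print(np.allclose(X @ Z, omega**(-1) * (Z @ X)))  # True
```

    Uncertainty relations for such unitary operators (rather than for Hermitian coordinate and momentum) are exactly the kind addressed via the Wiener–Khinchin theorem in the abstract.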

  20. Experimental Realization of Popper's Experiment: Violation of Uncertainty Principle?

    Science.gov (United States)

    Kim, Yoon-Ho; Yu, Rong; Shih, Yanhua

    An entangled pair of photons, 1 and 2, is emitted in opposite directions along the positive and negative x-axis. A narrow slit is placed in the path of photon 1, which provides precise knowledge of its position along the y-axis; because of the quantum entanglement, this in turn provides precise knowledge of the position y of its twin, photon 2. Does photon 2 experience a greater uncertainty in its momentum, i.e., a greater Δpy, due to the precise knowledge of its position y? This is the historical thought experiment of Sir Karl Popper, which was aimed at undermining the Copenhagen interpretation in favor of a realistic viewpoint of quantum mechanics. This paper reports an experimental realization of Popper's experiment. One may not agree with Popper's position on quantum mechanics; however, it calls for a correct understanding and interpretation of the experimental results.

  1. Uncertainty principles for inverse source problems for electromagnetic and elastic waves

    Science.gov (United States)

    Griesmaier, Roland; Sylvester, John

    2018-06-01

    In isotropic homogeneous media, far fields of time-harmonic electromagnetic waves radiated by compactly supported volume currents, and elastic waves radiated by compactly supported body force densities can be modelled in very similar fashions. Both are projected restricted Fourier transforms of vector-valued source terms. In this work we generalize two types of uncertainty principles recently developed for far fields of scalar-valued time-harmonic waves in Griesmaier and Sylvester (2017 SIAM J. Appl. Math. 77 154–80) to this vector-valued setting. These uncertainty principles yield stability criteria and algorithms for splitting far fields radiated by collections of well-separated sources into the far fields radiated by individual source components, and for the restoration of missing data segments. We discuss proper regularization strategies for these inverse problems, provide stability estimates based on the new uncertainty principles, and comment on reconstruction schemes. A numerical example illustrates our theoretical findings.

  2. Generalized uncertainty principle and entropy of three-dimensional rotating acoustic black hole

    International Nuclear Information System (INIS)

    Zhao, HuiHua; Li, GuangLiang; Zhang, LiChun

    2012-01-01

    Using the new equation of state density from the generalized uncertainty principle, we investigate the statistical entropy of a 3-dimensional rotating acoustic black hole. When the parameter λ introduced in the generalized uncertainty principle takes a specific value, we obtain an area entropy and a correction term associated with the acoustic black hole. In this method there is no divergence, and the small-mass approximation of the original brick-wall model is not needed. -- Highlights: ► The statistical entropy of a 3-dimensional rotating acoustic black hole is studied. ► We obtain an area entropy and a correction term associated with it. ► The parameter λ introduced in the generalized uncertainty principle takes a specific value. ► No divergence arises in this method.

  3. Verification of the uncertainty principle by using diffraction of light waves

    International Nuclear Information System (INIS)

    Nikolic, D; Nesic, Lj

    2011-01-01

    We describe a simple idea for experimental verification of the uncertainty principle for light waves. We used single-slit diffraction of a laser beam to measure the angular width of the zero-order diffraction maximum and obtained the corresponding wave-number uncertainty, taking the slit width as the uncertainty in position. A computer was used to acquire the experimental data and to analyse them. Because of its simplicity, this experiment is very suitable for demonstration, as well as for a quantitative exercise at universities and in the final year of high school.
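
    The estimate behind such an experiment can be sketched numerically. The laser wavelength and slit width below are illustrative assumptions, not values from the paper; the calculation is the standard single-slit one: taking the slit width a as Δy, and the half-angle of the zero-order maximum (first minimum at sin θ = λ/a) as the transverse momentum spread, gives Δy·Δp_y ≈ h, safely above the Heisenberg bound of ħ/2.

```python
import math

# Physical constants (SI units)
h = 6.62607015e-34        # Planck constant, J*s
hbar = h / (2 * math.pi)

# Illustrative values, not taken from the experiment described above:
wavelength = 632.8e-9     # He-Ne laser wavelength, m
slit_width = 1.0e-4       # slit width a, taken as the position uncertainty, m

p = h / wavelength                   # photon momentum
sin_theta = wavelength / slit_width  # first diffraction minimum: sin(theta) = lambda / a
delta_p = p * sin_theta              # transverse momentum spread, = h / a

product = slit_width * delta_p       # Delta_y * Delta_p_y = h, i.e. 2*pi*hbar
print(product / hbar)                # ~6.28, well above the bound of 0.5
```

    Note that the product h/a · a = h is independent of the wavelength: any slit narrow enough to localize the photon spreads its momentum by exactly the compensating amount.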

  4. Risk analysis under uncertainty, the precautionary principle, and the new EU chemicals strategy.

    Science.gov (United States)

    Rogers, Michael D

    2003-06-01

    Three categories of uncertainty in relation to risk assessment are defined: uncertainty in effect, uncertainty in cause, and uncertainty in the relationship between a hypothesised cause and effect. The Precautionary Principle (PP) relates to the third type of uncertainty. Three broad descriptions of the PP are set out: uncertainty justifies action, uncertainty requires action, and uncertainty requires a reversal of the burden of proof for risk assessments. The application of the PP is controversial, but what matters in practice is the precautionary action (PA) that follows. The criteria by which the PAs should be judged are detailed. This framework for risk assessment and management under uncertainty is then applied to the envisaged European system for the regulation of chemicals. A new EU regulatory system has been proposed which shifts the burden of proof concerning risk assessments from the regulator to the producer, and embodies the PP in all three of its main regulatory stages. The proposals are critically discussed in relation to three chemicals, namely, atrazine (an endocrine disrupter), cadmium (toxic and possibly carcinogenic), and hydrogen fluoride (a toxic, high-production-volume chemical). Reversing the burden of proof will speed up the regulatory process, but the examples demonstrate that applying the PP appropriately, and balancing the countervailing risks and the socio-economic benefits, will continue to be a difficult task for the regulator. The paper concludes with a discussion of the role of precaution in the management of change and of the importance of trust in the effective regulation of uncertain risks.

  5. Principles and applications of measurement and uncertainty analysis in research and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.
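
    As an illustration of the kind of analysis described above, the sketch below combines elemental systematic (bias) uncertainties and a random (precision) uncertainty by root-sum-square, in the spirit of ANSI/ASME PTC 19.1. All numerical values, the three bias sources, and the simple two-term RSS model are assumptions made for this sketch, not data or formulas taken from the paper.

```python
import math

# Illustrative elemental uncertainties, same units as the measurand
# (values are assumptions for the sketch, not from the paper):
bias_sources = [0.10, 0.05, 0.08]  # e.g. reference standard, linearity, drift
s_mean = 0.04                      # standard deviation of the mean of repeated readings
t_95 = 2.0                         # coverage factor for ~95 % confidence (large sample)

# Root-sum-square of the elemental systematic (bias) limits
B = math.sqrt(sum(b * b for b in bias_sources))

# Overall uncertainty: RSS combination of systematic and random parts
U_rss = math.sqrt(B ** 2 + (t_95 * s_mean) ** 2)

print(f"B = {B:.4f}, U_rss = {U_rss:.4f}")  # B = 0.1375, U_rss = 0.1591
```

    The point Ku's quotation makes is visible here: the reported value is only as informative as the statement of B and U that accompanies it.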

  7. Impacts of generalized uncertainty principle on black hole thermodynamics and Salecker-Wigner inequalities

    International Nuclear Information System (INIS)

    Tawfik, A.

    2013-01-01

    We investigate the impacts of the Generalized Uncertainty Principle (GUP), proposed by approaches to quantum gravity such as String Theory and Doubly Special Relativity, on black hole thermodynamics and the Salecker-Wigner inequalities. Utilizing the Heisenberg uncertainty principle, the Hawking temperature, Bekenstein entropy, specific heat, emission rate and decay time are calculated. As evaporation consumes the entire black hole mass, the specific heat vanishes and the temperature approaches infinity with an infinite radiation rate. It is found that the GUP approach prevents the black hole from evaporating completely, implying the existence of remnants at which the specific heat vanishes. The same role is played by the Heisenberg uncertainty principle in constructing the hydrogen atom. We discuss how the linear GUP approach solves the complete-evaporation problem. Furthermore, the black hole lifetime can be estimated using another approach, the Salecker-Wigner inequalities. Assuming that the quantum position uncertainty is limited by the minimum wavelength of the measuring signal, Wigner's second inequality can be obtained. If the spread of the quantum clock is limited to some minimum value, then the modified black hole lifetime can be deduced. Based on the linear GUP approach, the resulting lifetime difference depends on the black hole's relative mass, and the difference between the black hole mass with and without GUP is not negligible.
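
    The uncertainty-principle estimate of the Hawking temperature, and the remnant a GUP introduces, can be sketched numerically. As an assumption for this sketch, the code uses the quadratic GUP in the Adler-Chen-Santiago form (Δx ≥ 1/Δp + Δp in Planck units) rather than the linear GUP of the paper above; the qualitative conclusion, a finite-temperature Planck-mass remnant where evaporation halts, is the same.

```python
import math

def hawking_T(M):
    """Standard Hawking temperature in Planck units (G = hbar = c = k_B = 1)."""
    return 1.0 / (8.0 * math.pi * M)

def gup_T(M):
    """Quadratic-GUP temperature (Adler-Chen-Santiago form), Planck units.

    Solves Delta_x = 1/Delta_p + Delta_p with Delta_x ~ 2M and calibrates
    to the Hawking result for M >> 1. Defined only for M >= 1: below the
    Planck mass there is no real solution, i.e. a remnant remains.
    """
    if M < 1.0:
        raise ValueError("no solution below the Planck-mass remnant")
    return (M - math.sqrt(M * M - 1.0)) / (4.0 * math.pi)

for M in (100.0, 10.0, 2.0, 1.0):
    print(M, hawking_T(M), gup_T(M))
# For M >> 1 the two temperatures agree; at M = 1 the GUP temperature
# stays finite (1 / 4pi) while evaporation halts, leaving a remnant.
```

    The remnant appears because the quadratic in Δp has no real root below the Planck mass, which is exactly the mechanism the abstract invokes to stop complete evaporation.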

  8. Using Uncertainty Principle to Find the Ground-State Energy of the Helium and a Helium-like Hookean Atom

    Science.gov (United States)

    Harbola, Varun

    2011-01-01

    In this paper, we accurately estimate the ground-state energy and the atomic radius of the helium atom and a helium-like Hookean atom by employing the uncertainty principle in conjunction with the variational approach. We show that with the use of the uncertainty principle, electrons are found to be spread over a radial region, giving an electron…

  9. The Quark-Gluon Plasma Equation of State and the Generalized Uncertainty Principle

    Directory of Open Access Journals (Sweden)

    L. I. Abou-Salem

    2015-01-01

    The quark-gluon plasma (QGP) equation of state within a minimal length scenario, or Generalized Uncertainty Principle (GUP), is studied. The GUP is implemented in deriving the thermodynamics of an ideal QGP at vanishing chemical potential. We find a significant effect of the GUP term. The main features of QCD lattice results were quantitatively reproduced for nf=0, nf=2, and nf=2+1 flavors for the energy density, the pressure, and the interaction measure. A notable point is the large value of the bag pressure, especially for nf=2+1 flavors, which reflects the expected strong correlation between quarks in the bag. One can notice that the asymptotic behavior, characterized by the Stefan-Boltzmann limit, is satisfied.

  10. The most general form of deformation of the Heisenberg algebra from the generalized uncertainty principle

    Energy Technology Data Exchange (ETDEWEB)

    Masood, Syed [Department of Physics, International Islamic University, H-10 Sector, Islamabad (Pakistan); Faizal, Mir, E-mail: mirfaizalmir@gmail.com [Irving K. Barber School of Arts and Sciences, University of British Columbia – Okanagan, Kelowna, BC V1V 1V7 (Canada); Department of Physics and Astronomy, University of Lethbridge, Lethbridge, AB T1K 3M4 (Canada); Zaz, Zaid [Department of Electronics and Communication Engineering, University of Kashmir, Srinagar, Kashmir, 190006 (India); Ali, Ahmed Farag [Department of Physics, Faculty of Science, Benha University, Benha, 13518 (Egypt); Raza, Jamil [Department of Physics, International Islamic University, H-10 Sector, Islamabad (Pakistan); Shah, Mushtaq B. [Department of Physics, National Institute of Technology, Srinagar, Kashmir, 190006 (India)

    2016-12-10

    In this paper, we propose the most general form of the deformation of the Heisenberg algebra motivated by the generalized uncertainty principle. This deformation of the Heisenberg algebra will deform all quantum mechanical systems. The form of the generalized uncertainty principle used here is suggested by space-fractional quantum mechanics and by non-locality in quantum mechanical systems. We also analyse a specific limit of this generalized deformation for one-dimensional systems; in that limit, a nonlocal deformation of the momentum operator generates a local deformation of all one-dimensional quantum mechanical systems. We analyse the low-energy effects of this deformation on a harmonic oscillator, Landau levels, the Lamb shift, and a potential barrier. We also demonstrate that this deformation leads to a discretization of space.

  11. Blockchain to Rule the Waves - Nascent Design Principles for Reducing Risk and Uncertainty in Decentralized Environments

    DEFF Research Database (Denmark)

    Nærland, Kristoffer; Müller-Bloch, Christoph; Beck, Roman

    2017-01-01

    Many decentralized, inter-organizational environments such as supply chains are characterized by high transactional uncertainty and risk. At the same time, blockchain technology promises to mitigate these issues by introducing certainty into economic transactions. This paper discusses the findings of a Design Science Research project involving the construction and evaluation of an information technology artifact in collaboration with Maersk, a leading international shipping company, where central documents in shipping, such as the Bill of Lading, are turned into a smart contract on blockchain. Based on our insights from the project, we provide first evidence for preliminary design principles for applications that aim to mitigate the transactional risk and uncertainty in decentralized environments using blockchain. Both the artifact and the first evidence for emerging design principles are novel.

  12. The most general form of deformation of the Heisenberg algebra from the generalized uncertainty principle

    International Nuclear Information System (INIS)

    Masood, Syed; Faizal, Mir; Zaz, Zaid; Ali, Ahmed Farag; Raza, Jamil; Shah, Mushtaq B.

    2016-01-01

    In this paper, we propose the most general form of the deformation of the Heisenberg algebra motivated by the generalized uncertainty principle. This deformation of the Heisenberg algebra will deform all quantum mechanical systems. The form of the generalized uncertainty principle used here is suggested by space-fractional quantum mechanics and by non-locality in quantum mechanical systems. We also analyse a specific limit of this generalized deformation for one-dimensional systems; in that limit, a nonlocal deformation of the momentum operator generates a local deformation of all one-dimensional quantum mechanical systems. We analyse the low-energy effects of this deformation on a harmonic oscillator, Landau levels, the Lamb shift, and a potential barrier. We also demonstrate that this deformation leads to a discretization of space.

  13. Generalized uncertainty principle and the maximum mass of ideal white dwarfs

    Energy Technology Data Exchange (ETDEWEB)

    Rashidi, Reza, E-mail: reza.rashidi@srttu.edu

    2016-11-15

    The effects of a generalized uncertainty principle on the structure of an ideal white dwarf star are investigated. The equation describing the equilibrium configuration of the star is a generalized form of the Lane–Emden equation. It is proved that the star always has a finite size. It is then argued that the maximum mass of such an ideal white dwarf tends to infinity, as opposed to the conventional case where it has a finite value.
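
    For context, the conventional (non-GUP) Lane–Emden equation that the paper generalizes, θ'' + (2/ξ)θ' + θⁿ = 0 with θ(0) = 1, θ'(0) = 0, can be integrated numerically to find the dimensionless stellar radius ξ₁, the first zero of θ. The step size, starting offset, and series expansion near the center are implementation choices for this sketch, not taken from the paper.

```python
import math

def lane_emden_first_zero(n, h=1e-4):
    """Integrate theta'' + (2/xi) theta' + theta^n = 0, theta(0) = 1,
    theta'(0) = 0, with classical RK4, and return the first zero xi_1
    (the dimensionless surface radius of the polytrope)."""
    xi = 1e-6                         # start just off the xi = 0 singularity
    theta = 1.0 - xi * xi / 6.0       # series expansion near the center
    dtheta = -xi / 3.0

    def f(x, th, dth):
        rhs = -(th ** n if th > 0.0 else 0.0) - 2.0 * dth / x
        return dth, rhs

    while theta > 0.0:
        k1t, k1d = f(xi, theta, dtheta)
        k2t, k2d = f(xi + h / 2, theta + h / 2 * k1t, dtheta + h / 2 * k1d)
        k3t, k3d = f(xi + h / 2, theta + h / 2 * k2t, dtheta + h / 2 * k2d)
        k4t, k4d = f(xi + h, theta + h * k3t, dtheta + h * k3d)
        theta += h / 6 * (k1t + 2 * k2t + 2 * k3t + k4t)
        dtheta += h / 6 * (k1d + 2 * k2d + 2 * k3d + k4d)
        xi += h
    return xi

# n = 1 has the closed form theta = sin(xi)/xi, so xi_1 = pi; for the
# relativistic-degenerate case n = 3, xi_1 is approximately 6.897.
print(lane_emden_first_zero(1), lane_emden_first_zero(3))
```

    In the conventional n = 3 case, the finite ξ₁ together with the polytropic constant fixes the Chandrasekhar mass; the abstract's claim is that the GUP-modified equation changes this conclusion, letting the maximum mass grow without bound.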

  14. A Simplified Proof of Uncertainty Principle for Quaternion Linear Canonical Transform

    Directory of Open Access Journals (Sweden)

    Mawardi Bahri

    2016-01-01

    We provide a short and simple proof of an uncertainty principle associated with the quaternion linear canonical transform (QLCT) by considering the fundamental relationship between the QLCT and the quaternion Fourier transform (QFT). We show how this relation allows us to derive the inverse transform and the Parseval and Plancherel formulas associated with the QLCT. Some other properties of the QLCT are also studied.

  15. Generalized Uncertainty Principle and Black Hole Entropy of Higher-Dimensional de Sitter Spacetime

    International Nuclear Information System (INIS)

    Zhao Haixia; Hu Shuangqi; Zhao Ren; Li Huaifan

    2007-01-01

    Recently, much attention has been devoted to resolving the quantum corrections to the Bekenstein-Hawking black hole entropy. In particular, many researchers have taken a particular interest in the coefficient of the logarithmic term of the black hole entropy correction. In this paper, we calculate the correction to the black hole entropy by utilizing the generalized uncertainty principle and obtain the correction term it induces. Because our calculation assumes that the Bekenstein-Hawking area theorem remains valid after considering the generalized uncertainty principle, we derive a positive coefficient for the logarithmic correction term. This result differs from the presently known result. Our method is valid not only for four-dimensional spacetimes but also for higher-dimensional spacetimes. Throughout, the physical idea is clear and the calculation is simple. It offers a new way of studying the entropy correction of complicated spacetimes.

  16. On the connection between complementarity and uncertainty principles in the Mach–Zehnder interferometric setting

    International Nuclear Information System (INIS)

    Bosyk, G M; Portesi, M; Holik, F; Plastino, A

    2013-01-01

    We revisit the connection between the complementarity and uncertainty principles of quantum mechanics within the framework of Mach–Zehnder interferometry. We focus our attention on the trade-off relation between complementary path information and fringe visibility. This relation is equivalent to the uncertainty relation of Schrödinger and Robertson for a suitably chosen pair of observables. We show that it is equivalent as well to the uncertainty inequality provided by Landau and Pollak. We also study the relationship of this trade-off relation with a family of entropic uncertainty relations based on Rényi entropies. There is no equivalence in this case, but the different values of the entropic parameter do define regimes that provide us with a tool to discriminate between non-trivial states of minimum uncertainty. The existence of such regimes agrees with previous results of Luis (2011 Phys. Rev. A 84 034101), although their meaning was not sufficiently clear. We discuss the origin of these regimes with the intention of gaining a deeper understanding of entropic measures. (paper)

  17. Methodologies of Uncertainty Propagation Calculation

    International Nuclear Information System (INIS)

    Chojnacki, Eric

    2002-01-01

    After recalling the theoretical principle and the practical difficulties of the methodologies of uncertainty propagation calculation, the author discussed how to propagate input uncertainties. He said there were two kinds of input uncertainty: variability, i.e. uncertainty due to heterogeneity, and lack of knowledge, i.e. uncertainty due to ignorance. It is therefore necessary to use two different propagation methods. He demonstrated this in a simple example, which he then generalised, treating the variability uncertainty by probability theory and the lack-of-knowledge uncertainty by fuzzy theory. He cautioned, however, against the systematic use of probability theory, which may lead to unjustifiable and illegitimately precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability and lack of knowledge increases as the problem becomes more complex in terms of the number of parameters or time steps, and that it is necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory.
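
    A minimal sketch of this two-kind distinction follows, with plain interval arithmetic standing in for the fuzzy-theoretic treatment of lack of knowledge (a deliberate simplification); the toy model y = a·b, the distribution, and the interval bounds are all invented for illustration, not taken from the summary above.

```python
import random

random.seed(0)  # reproducible sketch

# Toy model output y = a * b (invented for illustration):
#  - a varies across a population (variability): known distribution
#  - b is poorly known (lack of knowledge): only an interval is defensible
N = 100_000
a_samples = [random.gauss(10.0, 1.0) for _ in range(N)]  # variability
b_lo, b_hi = 1.8, 2.2                                    # ignorance interval

# Propagate each sample of a together with the whole interval for b:
# the output percentile is then itself an interval, not a single number.
y_lo = sorted(a * b_lo for a in a_samples)
y_hi = sorted(a * b_hi for a in a_samples)

def percentile(xs, q):
    """q-th quantile of a pre-sorted sample (nearest-rank style)."""
    return xs[int(q * (len(xs) - 1))]

lo95, hi95 = percentile(y_lo, 0.95), percentile(y_hi, 0.95)
print(f"95th percentile of y lies in [{lo95:.1f}, {hi95:.1f}]")
```

    Collapsing the interval for b into a single distribution would yield one precise percentile, which is exactly the kind of unjustifiably precise answer the summary warns against.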

  18. Acute generalised exanthematous pustulosis

    Directory of Open Access Journals (Sweden)

    Criton S

    2001-01-01

    Acute generalised exanthematous pustulosis (AGEP) is a condition characterised by the sudden onset of non-follicular aseptic pustules all over the body. It is distinct from pustular psoriasis, with characteristic morphology, histopathology and evolution.

  19. Acute generalised exanthematous pustulosis.

    Science.gov (United States)

    Criton, S; Sofia, B

    2001-01-01

    Acute generalised exanthematous pustulosis (AGEP) is a condition characterised by the sudden onset of non-follicular aseptic pustules all over the body. It is distinct from pustular psoriasis, with characteristic morphology, histopathology and evolution.

  20. Vacuum thermalization of high intensity laser beams and the uncertainty principle

    International Nuclear Information System (INIS)

    Gupta, R.P.; Bhakar, B.S.; Panarella, E.

    1983-01-01

    This chapter phenomenologically calculates the cross section for photon-photon scattering in high intensity laser beams. The consequences of the Heisenberg uncertainty principle must be taken into account in any photon-photon scattering calculation when many photons are present within the uncertainty volume: an exact determination of the number of scattering centers present in the scattering region is precluded when high intensity laser beams are involved. Predictions are presented which suggest an upper limit to which coherent photon densities can be increased, either during amplification or focusing, before scattering becomes predominant. The results of multiphoton ionization of gases, and of future laser-induced CTR plasmas, may be significantly affected by the enhancement of the photon scattering investigated here.

  1. Determining the minimal length scale of the generalized uncertainty principle from the entropy-area relationship

    International Nuclear Information System (INIS)

    Kim, Wontae; Oh, John J.

    2008-01-01

    We derive the formula of the black hole entropy with a minimal length of the Planck size by counting quantum modes of scalar fields in the vicinity of the black hole horizon, taking into account the generalized uncertainty principle (GUP). This formula is applied to some intriguing examples of black holes - the Schwarzschild black hole, the Reissner-Nordstrom black hole, and the magnetically charged dilatonic black hole. As a result, it is shown that the GUP parameter can be determined by imposing the black hole entropy-area relationship, which has a Planck length scale and a universal form within the near-horizon expansion

  2. Completeness, special functions and uncertainty principles over q-linear grids

    International Nuclear Information System (INIS)

    Abreu, LuIs Daniel

    2006-01-01

    We derive completeness criteria for sequences of functions of the form f(xλ_n), where λ_n is the nth zero of a suitably chosen entire function. Using these criteria, we construct complete nonorthogonal systems of Fourier-Bessel functions and their q-analogues, as well as other complete sets of q-special functions. We discuss connections with uncertainty principles over q-linear grids, and the completeness of certain sets of q-Bessel functions is used to prove that, if a function f and its q-Hankel transform both vanish at the points {q^{-n}}_{n=1}^{∞}, 0 < q < 1, then f vanishes on the whole q-linear grid {q^{n}}_{n=-∞}^{∞}.

  3. Before and beyond the precautionary principle: Epistemology of uncertainty in science and law

    International Nuclear Information System (INIS)

    Tallacchini, Mariachiara

    2005-01-01

    The precautionary principle has become, in European regulation of science and technology, a general principle for the protection of the health of human beings, animals, plants, and the environment. It requires that '[w]here there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation'. By focusing on situations of scientific uncertainty where data are lacking, insufficient, or inconclusive, the principle introduced a shift from a neutral legal attitude towards science to a bias in favor of safety, and a shift from the paradigm of science certain and objective to the awareness that the legal regulation of science involves decisions about values and interests. Implementation of the precautionary principle is highly variable. A crucial question still needs to be answered regarding the assumption that scientific certainty is a 'normal' characteristic of scientific knowledge. The relationship between technoscience and society has moved into a situation where uncertain knowledge is the rule. From this perspective, a more general framework for a democratic governance of science is needed. In democratic society, science may still have a special authoritative voice, but it cannot be the ultimate word on decisions that only the broader society may make. Therefore, the precautionary model of scientific regulation needs to be informed by an 'extended participatory model' of the relationship between science and society

  4. Generalised anxiety disorder

    Directory of Open Access Journals (Sweden)

    Bojana Avguštin Avčin

    2013-10-01

    Generalised anxiety disorder is characterised by persistent, excessive and difficult-to-control worry, which may be accompanied by several psychic and somatic symptoms, including suicidality. Generalised anxiety disorder is the most common psychiatric disorder in primary care, although it is often underrecognised and undertreated. It is typically a chronic condition with low short- and medium-term remission rates. Clinical presentations often include depression, somatic illness, pain, fatigue and problems sleeping. The evaluation of prognosis is complicated by frequent comorbidity with other anxiety disorders and depression, which worsen the long-term outcome and the accompanying burden of disability. The two main treatments for generalised anxiety disorder are medications and psychotherapy. Selective serotonin reuptake inhibitors and serotonin-norepinephrine reuptake inhibitors represent the first-line psychopharmacologic treatment. The most extensively studied psychotherapy for anxiety is cognitive behavioural therapy, which has demonstrated efficacy in controlled studies.

  5. Covariant energy–momentum and an uncertainty principle for general relativity

    Energy Technology Data Exchange (ETDEWEB)

    Cooperstock, F.I., E-mail: cooperst@uvic.ca [Department of Physics and Astronomy, University of Victoria, P.O. Box 3055, Victoria, B.C. V8W 3P6 (Canada); Dupre, M.J., E-mail: mdupre@tulane.edu [Department of Mathematics, Tulane University, New Orleans, LA 70118 (United States)

    2013-12-15

    We introduce a naturally-defined totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. The extension links seamlessly to the action integral for the gravitational field. The demand that the general expression for arbitrary systems reduce to the Tolman integral in the case of stationary bounded distributions leads to the matter-localized Ricci integral for energy–momentum, in support of the energy localization hypothesis. The role of the observer is addressed and, as an extension of the special relativistic case, the field of observers comoving with the matter is seen to compute the intrinsic global energy of a system. The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. It is suggested that in the extreme of strong gravity, the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum. -- Highlights: •We present a totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. •Demand for the general expression to reduce to the Tolman integral for stationary systems supports the Ricci integral as energy–momentum. •Localized energy via the Ricci integral is consistent with the energy localization hypothesis. •New localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. •Suggest the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum in strong gravity extreme.

  6. An Inconvenient Deliberation. The Precautionary Principle's Contribution to the Uncertainties Surrounding Climate Change Liability

    International Nuclear Information System (INIS)

    Haritz, M.M.

    2011-01-01

    There is increasing evidence to suggest that adaptation to the inevitable is as relevant to climate change policymaking as mitigation efforts. Both mitigation and adaptation, as well as the unavoidable damage occurring both now and that is predicted to occur, all involve costs at the expense of diverse climate change victims. The allocation of responsibilities - implicit in terms of the burden-sharing mechanisms that currently exist in public and private governance - demands recourse under liability law, especially as it has become clear that most companies will only start reducing emissions if verifiable costs of the economic consequences of climate change, including the likelihood of liability, outweigh the costs of taking precautionary measures. This vitally important book asks: Can the precautionary principle make uncertainty judiciable in the context of liability for the consequences of climate change, and, if so, to what extent? Drawing on the full range of pertinent existing literature and case law, the author examines the precautionary principle both in terms of its content and application and in the context of liability law. She analyses the indirect means offered by existing legislation being used by environmental groups and affected individuals before the courts to challenge both companies and regulators as responsible agents of climate change damage. In the process of responding to its fundamental question, the analysis explores such further questions as the following: (a) What is the role of the precautionary principle in resolving uncertainty in scientific risk assessment when faced with inconclusive evidence, and how does it affect decision-making, particularly in the regulatory choices concerning climate change? To this end, what is the concrete content of the precautionary principle? (b) How does liability law generally handle scientific uncertainty? What different types of liability exist, and how are they equipped to handle a climate change

  7. Effect of Generalized Uncertainty Principle on Main-Sequence Stars and White Dwarfs

    Directory of Open Access Journals (Sweden)

    Mohamed Moussa

    2015-01-01

    This paper addresses the effect of the generalized uncertainty principle, which emerges from different approaches to quantum gravity at the Planck scale, on the thermodynamic properties of photons, nonrelativistic ideal gases, and degenerate fermions. Modifications in pressure, particle number, and energy density are calculated. Astrophysical objects such as main-sequence stars and white dwarfs are examined and discussed as an application. A modification in the Lane-Emden equation, due to a change in the polytropic relation caused by the presence of quantum gravity, is investigated. The applicable range of the quantum gravity parameters is estimated. The bounds on the perturbed parameters are relatively large, but they may be considered reasonable values in the astrophysical regime.

  8. Quantum corrections to the thermodynamics of Schwarzschild-Tangherlini black hole and the generalized uncertainty principle

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Z.W.; Zu, X.T. [University of Electronic Science and Technology of China, School of Physical Electronics, Chengdu (China); Li, H.L. [University of Electronic Science and Technology of China, School of Physical Electronics, Chengdu (China); Shenyang Normal University, College of Physics Science and Technology, Shenyang (China); Yang, S.Z. [China West Normal University, Physics and Space Science College, Nanchong (China)

    2016-04-15

    We investigate the thermodynamics of the Schwarzschild-Tangherlini black hole in the context of the generalized uncertainty principle (GUP). The corrections to the Hawking temperature, entropy and heat capacity are obtained via the modified Hamilton-Jacobi equation. These modifications show that the GUP changes the evolution of the Schwarzschild-Tangherlini black hole. In particular, the GUP effect becomes significant when the radius or mass of the black hole approaches the order of the Planck scale: the black hole stops radiating, leaving a black hole remnant. The Planck-scale remnant can be confirmed through analysis of the heat capacity. These phenomena imply that the GUP may offer a way to resolve the information paradox. In addition, we investigate the possibility of observing such a black hole at the Large Hadron Collider (LHC), and the results demonstrate that the black hole cannot be produced at current LHC energies. (orig.)

  9. Thermodynamics of a class of regular black holes with a generalized uncertainty principle

    Science.gov (United States)

    Maluf, R. V.; Neves, Juliano C. S.

    2018-05-01

    In this article, we present a study of the thermodynamics of a class of regular black holes that includes the Bardeen and Hayward solutions. We obtain thermodynamic quantities such as the Hawking temperature, entropy, and heat capacity for the entire class. As part of an effort to identify a physical observable that distinguishes regular black holes from singular black holes, we suggest that regular black holes are colder than singular black holes. Moreover, contrary to the Schwarzschild black hole, this class of regular black holes may be thermodynamically stable. From a generalized uncertainty principle, we also obtain the quantum-corrected thermodynamics for the studied class. These quantum corrections provide a logarithmic term in the quantum-corrected entropy.

  10. Massive vector particles tunneling from black holes influenced by the generalized uncertainty principle

    Directory of Open Access Journals (Sweden)

    Xiang-Qian Li

    2016-12-01

    This study considers the generalized uncertainty principle, which incorporates the central idea of large extra dimensions, to investigate the processes involved when massive spin-1 particles tunnel from Reissner-Nordström and Kerr black holes under the effects of quantum gravity. For the black hole, the quantum gravity correction decelerates the increase in temperature. Up to O(1/M_f^2), the corrected temperatures are affected by the mass and angular momentum of the emitted vector bosons. In addition, the temperature of the Kerr black hole becomes uneven due to rotation. When the mass of the black hole approaches the order of the higher-dimensional Planck mass M_f, it stops radiating and yields a black hole remnant.

  11. Quantum speed limits: from Heisenberg’s uncertainty principle to optimal quantum control

    Science.gov (United States)

    Deffner, Sebastian; Campbell, Steve

    2017-11-01

    One of the most widely known building blocks of modern physics is Heisenberg’s indeterminacy principle. Among the different statements of this fundamental property of the full quantum mechanical nature of physical reality, the uncertainty relation for energy and time has a special place. Its interpretation and its consequences have inspired continued research efforts for almost a century. In its modern formulation, the uncertainty relation is understood as setting a fundamental bound on how fast any quantum system can evolve. In this topical review we describe important milestones, such as the Mandelstam-Tamm and the Margolus-Levitin bounds on the quantum speed limit, and summarise recent applications in a variety of current research fields—including quantum information theory, quantum computing, and quantum thermodynamics amongst several others. To bring order and to provide an access point into the many different notions and concepts, we have grouped the various approaches into the minimal time approach and the geometric approach, where the former relies on quantum control theory, and the latter arises from measuring the distinguishability of quantum states. Due to the volume of the literature, this topical review can only present a snapshot of the current state of the art and can never be fully comprehensive. Therefore, we highlight but a few works, hoping that our selection can serve as a representative starting point for the interested reader.
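
    The two milestone bounds named in this record take the following standard forms for the minimal time needed to evolve to an orthogonal state, with \hbar the reduced Planck constant, \Delta E the energy spread, and \langle E\rangle the mean energy above the ground state:

```latex
\tau \;\ge\; \tau_{\mathrm{QSL}}
= \max\!\left\{\frac{\pi\hbar}{2\,\Delta E},\;
               \frac{\pi\hbar}{2\,\langle E\rangle}\right\},
```

    where the first entry is the Mandelstam-Tamm bound and the second the Margolus-Levitin bound; the combined maximum of the two is known to be attainable.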

  13. Generalised anxiety disorder

    OpenAIRE

    Gale, Christopher K; Millichamp, Jane

    2011-01-01

    Generalised anxiety disorder is characterised by persistent, excessive and difficult-to-control worry, which may be accompanied by several psychic and somatic symptoms, including suicidality. It is the most common psychiatric disorder in primary care, although it is often under-recognised and undertreated. Generalised anxiety disorder is typically a chronic condition with low short- and medium-term remission rates. Clinical presentations often include depression, ...

  14. Generalised twisted partition functions

    CERN Document Server

    Petkova, V B

    2001-01-01

    We consider the set of partition functions that result from the insertion of twist operators compatible with conformal invariance in a given 2D Conformal Field Theory (CFT). A consistency equation, which gives a classification of twists, is written and solved in particular cases. This generalises old results on twisted torus boundary conditions, gives a physical interpretation of Ocneanu's algebraic construction, and might offer a new route to the study of properties of CFT.

  15. Routine internal- and external-quality control data in clinical laboratories for estimating measurement and diagnostic uncertainty using GUM principles.

    Science.gov (United States)

    Magnusson, Bertil; Ossowicki, Haakan; Rienitz, Olaf; Theodorsson, Elvar

    2012-05-01

    Healthcare laboratories are increasingly joining into larger laboratory organizations encompassing several physical laboratories. This creates important new opportunities for re-defining the concept of a 'laboratory' to encompass all laboratories and measurement methods measuring the same measurand for a population of patients. In order to make measurement results comparable, bias should be minimized or eliminated and measurement uncertainty properly evaluated for all methods used for a particular patient population. Both the measurement and the diagnostic uncertainty can be evaluated from internal and external quality control results using GUM principles. In this paper the uncertainty evaluations are described in detail using only two main components, within-laboratory reproducibility and the uncertainty of the bias component, according to a Nordtest guideline. The evaluation is exemplified for the determination of creatinine in serum for a conglomerate of laboratories, expressed both in absolute units (μmol/L) and in relative terms (%). An expanded measurement uncertainty of 12 μmol/L was estimated for creatinine concentrations below 120 μmol/L, and of 10% for concentrations above 120 μmol/L. The diagnostic uncertainty encompasses both measurement uncertainty and biological variation, and can be estimated for a single value and for a difference. This diagnostic uncertainty for the difference between two samples from the same patient was determined to be 14 μmol/L for creatinine concentrations below 100 μmol/L and 14% for concentrations above 100 μmol/L.
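
    The two-component combination described above can be sketched in a few lines. This is a hedged illustration of the quadrature rule only: the function name and the numerical inputs below are invented for the example and are not taken from the paper.

```python
import math

def expanded_uncertainty(u_rw, u_bias, k=2.0):
    """Nordtest-style combination: within-laboratory reproducibility (u_Rw)
    and the uncertainty of the bias component are combined in quadrature,
    then multiplied by the coverage factor k (k = 2 for ~95 % coverage)."""
    u_c = math.sqrt(u_rw ** 2 + u_bias ** 2)   # combined standard uncertainty
    return k * u_c

# Purely illustrative numbers (NOT the paper's inputs), in umol/L:
u_rw = 4.0     # within-laboratory reproducibility, from internal QC
u_bias = 4.5   # uncertainty of the bias, from external QA / reference values
print(expanded_uncertainty(u_rw, u_bias))   # -> about 12 umol/L
```

    With these made-up inputs the expanded uncertainty comes out near the 12 μmol/L figure quoted in the abstract, which is what makes the example a plausible sketch of the workflow.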

  16. Imperfect pitch: Gabor's uncertainty principle and the pitch of extremely brief sounds.

    Science.gov (United States)

    Hsieh, I-Hui; Saberi, Kourosh

    2016-02-01

    How brief must a sound be before its pitch is no longer perceived? The uncertainty tradeoff between temporal and spectral resolution (Gabor's principle) limits the minimum duration required for accurate pitch identification or discrimination. Prior studies have reported that pitch can be extracted from sinusoidal pulses as brief as half a cycle. This finding has been used in a number of classic papers to develop models of pitch encoding. We have found that phase randomization, which eliminates timbre confounds, degrades this ability to chance, raising serious concerns over the foundation on which classic pitch models have been built. The current study investigated whether subthreshold pitch cues may still exist in partial-cycle pulses, revealed through statistical integration in a time series containing multiple pulses. To this end, we measured frequency-discrimination thresholds in a two-interval forced-choice task for trains of partial-cycle random-phase tone pulses. We found that residual pitch cues exist in these pulses, but discriminating them requires a frequency difference an order of magnitude (ten times) larger than that reported previously, necessitating a re-evaluation of pitch models built on earlier findings. We also found that as pulse duration is decreased to less than two cycles, its pitch becomes biased toward higher frequencies, consistent with predictions of an auto-correlation model of pitch extraction.
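
    Gabor's time-frequency tradeoff, which sets the limits discussed above, can be checked numerically: for a Gaussian pulse, the product of RMS duration and RMS bandwidth sits at the theoretical minimum 1/(4π). The script below is an illustrative sketch in pure Python (direct O(N²) Fourier sum), not code from the study; all parameters are chosen for the demonstration.

```python
import math

def rms_width(xs, weights):
    """RMS (standard-deviation) width of a distribution given sample
    positions xs and non-negative weights."""
    total = sum(weights)
    mean = sum(x * w for x, w in zip(xs, weights)) / total
    var = sum((x - mean) ** 2 * w for x, w in zip(xs, weights)) / total
    return math.sqrt(var)

# Sample a Gaussian pulse g(t) = exp(-t^2 / (2 sigma^2)).
N, sigma = 256, 1.0
dt = 32.0 / N                         # sampling step; window covers [-16, 16)
ts = [(i - N // 2) * dt for i in range(N)]
g = [math.exp(-t * t / (2 * sigma * sigma)) for t in ts]

# Power spectrum via a direct discrete Fourier transform (O(N^2), fine here).
fs, power = [], []
for k in range(N):
    re = sum(g[n] * math.cos(2 * math.pi * k * n / N) for n in range(N))
    im = -sum(g[n] * math.sin(2 * math.pi * k * n / N) for n in range(N))
    power.append(re * re + im * im)
    f = k / (N * dt)
    fs.append(f if k < N // 2 else f - 1.0 / dt)  # centre the frequency axis

dt_rms = rms_width(ts, [a * a for a in g])   # effective duration
df_rms = rms_width(fs, power)                # effective bandwidth
print(dt_rms * df_rms, 1 / (4 * math.pi))    # product sits at the Gabor limit
```

    Any non-Gaussian pulse shape gives a strictly larger product, which is why very brief pulses cannot simultaneously have a well-defined time of occurrence and a well-defined frequency (pitch).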

  17. f(R) in Holographic and Agegraphic Dark Energy Models and the Generalized Uncertainty Principle

    Directory of Open Access Journals (Sweden)

    Barun Majumder

    2013-01-01

    We studied a unified approach to the holographic, new agegraphic, and f(R) dark energy models in order to construct the form of f(R) which is, in general, responsible for the curvature-driven explanation of very early inflation together with the presently observed late-time acceleration. We considered the generalized uncertainty principle in our approach, which incorporates corrections to the entropy-area relation and thereby modifies the energy densities of the cosmological dark energy models considered. We found that holographic and new agegraphic f(R) gravity models can behave like phantom or quintessence models in the spatially flat FRW universe. We also found a distinct term in the form of f(R) which goes as R^{3/2}, due to the GUP-modified energy densities. Although the presence of this term in the action can be important for explaining the early inflationary scenario, Capozziello et al. recently showed that f(R) ~ R^{3/2} leads to an accelerated expansion, that is, a negative value of the deceleration parameter q, which fits well with SNeIa and WMAP data.

  18. Generalised Batho correction factor

    International Nuclear Information System (INIS)

    Siddon, R.L.

    1984-01-01

    There are various approximate algorithms available to calculate the radiation dose in the presence of a heterogeneous medium. The Webb and Fox product over layers formulation of the generalised Batho correction factor requires determination of the number of layers and the layer densities for each ray path. It has been shown that the Webb and Fox expression is inefficient for the heterogeneous medium which is expressed as regions of inhomogeneity rather than layers. The inefficiency of the layer formulation is identified as the repeated problem of determining for each ray path which inhomogeneity region corresponds to a particular layer. It has been shown that the formulation of the Batho correction factor as a product over inhomogeneity regions avoids that topological problem entirely. The formulation in terms of a product over regions simplifies the computer code and reduces the time required to calculate the Batho correction factor for the general heterogeneous medium. (U.K.)

  19. Generalising the staircase models

    International Nuclear Information System (INIS)

    Dorey, P.; Ravanini, F.

    1993-01-01

    Systems of integral equations are proposed which generalise those previously encountered in connection with the so-called staircase models. Under the assumption that these equations describe the finite-size effects of relativistic field theories via the thermodynamic Bethe ansatz, analytical and numerical evidence is given for the existence of a variety of new roaming renormalisation group trajectories. For each positive integer k and s = 0, ..., k-1, there is a one-parameter family of trajectories, passing close by the coset conformal field theories G(k) x G(nk+s) / G((n+1)k+s) before finally flowing to a massive theory for s = 0, or to another coset model for s ≠ 0. (orig.)

  20. Practical application of the ALARA principle in management of the nuclear legacy: optimization under uncertainty

    International Nuclear Information System (INIS)

    Smith, Graham; Sneve, Malgorzata K.

    2008-01-01

    Full text: Radiological protection has a long and distinguished history in taking a balanced approach to optimization. Both utilitarian and individual interests and perspectives are addressed through a process of constrained optimisation, with optimisation intended to lead to the most benefit to the most people, and constraints being operative to limit the degree of inequity among the individuals exposed. At least, expressed simplistically, that is what the recommendations on protection are intended to achieve. This paper examines the difficulties in achieving that objective, based on consideration of the active role of optimisation in regulatory supervision of the historic nuclear legacy. This example is chosen because the application of the ALARA principle has important implications for some very major projects whose objective is remediation of existing legacy facilities. But it is also relevant because timely, effective and cost efficient completion of those projects has implications for confidence in the future development of nuclear power and other uses of radioactive materials. It is also an interesting example because legacy management includes mitigation of some major short and long term hazards, but those mitigating measures themselves involve operations with their own risk, cost and benefit profiles. Like any other complex activity, a legacy management project has to be broken down into logistically feasible parts. However, from a regulatory perspective, simultaneous application of ALARA to worker protection, major accident risk mitigation and long-term environmental and human health protection presents its own challenges. Major uncertainties which exacerbate the problem arise from ill-characterised source terms, estimation of the likelihood of unlikely failures in operational processes, and prospective assessment of radiological impacts over many hundreds of years and longer. The projects themselves are set to run over decades, during which time the

  1. Uncertainties

    Indian Academy of Sciences (India)

    To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and chemistry of all the substances are needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle and its feedback into climate, to be ...

  2. Uncertainty

    International Nuclear Information System (INIS)

    Silva, T.A. da

    1988-01-01

    A comparison between the uncertainty methods recommended by the International Atomic Energy Agency (IAEA) and the International Committee for Weights and Measures (CIPM) is presented for the calibration of clinical dosimeters in a Secondary Standard Dosimetry Laboratory (SSDL). (C.G.C.) [pt

  3. Quantification of uncertainty in first-principles predicted mechanical properties of solids: Application to solid ion conductors

    Science.gov (United States)

    Ahmad, Zeeshan; Viswanathan, Venkatasubramanian

    2016-08-01

    Computationally guided material discovery is increasingly employed using descriptor-based screening through the calculation of a few properties of interest. A precise understanding of the uncertainty associated with property values calculated by first-principles density functional theory is important for the success of descriptor-based screening. The Bayesian error estimation approach has been built into several recently developed exchange-correlation functionals, which allows an estimate of the uncertainty associated with properties related to the ground-state energy, for example adsorption energies. Here, we propose a robust and computationally efficient method for quantifying uncertainty in mechanical properties, which depend on derivatives of the energy. The procedure involves calculating energies around the equilibrium cell volume with different strains and fitting the obtained energies to the corresponding energy-strain relationship. At each strain, we use, instead of a single energy, an ensemble of energies, giving us an ensemble of fits and thereby an ensemble of mechanical properties associated with each fit, whose spread can be used to quantify the uncertainty. The generation of the ensemble of energies is only a post-processing step involving a perturbation of the parameters of the exchange-correlation functional and solving for the energy non-self-consistently. The proposed method is computationally efficient and provides a more robust uncertainty estimate compared to the approach of self-consistent calculations employing several different exchange-correlation functionals. We demonstrate the method by calculating uncertainty bounds for several materials belonging to different classes and having different structures. We show that the calculated uncertainty bounds the property values obtained using three different GGA functionals: PBE, PBEsol, and RPBE. Finally, we apply the approach to calculate the uncertainty
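
    The ensemble-of-fits idea described above can be mimicked with synthetic data: fit an energy-strain parabola for each member of an ensemble of perturbed energy curves and report the spread of the fitted curvature. Everything below (the 5% perturbation level, the stiffness value, the function names) is invented for illustration; the actual method perturbs exchange-correlation parameters inside a DFT code.

```python
import math
import random

def quad_fit(xs, ys):
    """Least-squares fit of y = a + b*x + c*x**2 via the normal equations,
    solved by Gaussian elimination with partial pivoting."""
    S = [sum(x ** k for x in xs) for k in range(5)]
    A = [[S[0], S[1], S[2]], [S[1], S[2], S[3]], [S[2], S[3], S[4]]]
    r = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    for i in range(3):
        piv = max(range(i, 3), key=lambda k: abs(A[k][i]))
        A[i], A[piv] = A[piv], A[i]
        r[i], r[piv] = r[piv], r[i]
        for k in range(i + 1, 3):
            f = A[k][i] / A[i][i]
            A[k] = [A[k][j] - f * A[i][j] for j in range(3)]
            r[k] -= f * r[i]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        coef[i] = (r[i] - sum(A[i][j] * coef[j] for j in range(i + 1, 3))) / A[i][i]
    return coef  # [a, b, c]

random.seed(0)
strains = [-0.02, -0.01, 0.0, 0.01, 0.02]
curvatures = []
for _ in range(200):
    # Synthetic ensemble member: E(s) = 0.5 * k * s^2 with 5 % scatter in k.
    k = 500.0 * (1.0 + random.gauss(0.0, 0.05))
    energies = [0.5 * k * s * s for s in strains]
    a, b, c = quad_fit(strains, energies)
    curvatures.append(2.0 * c)          # d2E/ds2 at s = 0
mean = sum(curvatures) / len(curvatures)
spread = math.sqrt(sum((x - mean) ** 2 for x in curvatures) / len(curvatures))
print(f"curvature = {mean:.1f} +/- {spread:.1f}")
```

    The spread of the curvature across the ensemble plays the role of the uncertainty bound; in the real method, this spread is inherited from the Bayesian error estimation ensemble rather than injected by hand as here.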

  4. Sequential experimental design based generalised ANOVA

    Energy Technology Data Exchange (ETDEWEB)

    Chakraborty, Souvik, E-mail: csouvik41@gmail.com; Chowdhury, Rajib, E-mail: rajibfce@iitr.ac.in

    2016-07-15

    Over the last decade, surrogate modelling techniques have gained wide popularity in the fields of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and on regression/interpolation to generate the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution-adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component functions using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been applied to predicting the probability of failure in three structural mechanics problems. It is observed that the proposed approach yields accurate and computationally efficient estimates of the failure probability.

  5. Concurrent analysis: towards generalisable qualitative research.

    Science.gov (United States)

    Snowden, Austyn; Martin, Colin R

    2011-10-01

    This study develops an original method of qualitative analysis coherent with its interpretivist principles. The objective is to increase the likelihood of achieving generalisability and so improve the chance of the findings being translated into practice. Good qualitative research depends on coherent analysis of different types of data. The limitations of existing methodologies are first discussed to justify the need for a novel approach. To illustrate this approach, primary evidence is presented using the new methodology. The primary evidence consists of a constructivist grounded theory of how mental health nurses with prescribing authority integrate prescribing into practice. This theory is built concurrently from interviews, reflective accounts and case study data from the literature. Concurrent analysis. Ten research articles and 13 semi-structured interviews were sampled purposively and then theoretically and analysed concurrently using constructivist grounded theory. A theory of the process of becoming competent in mental health nurse prescribing was generated through this process. This theory was validated by 32 practising mental health nurse prescribers as an accurate representation of their experience. The methodology generated a coherent and generalisable theory. It is therefore claimed that concurrent analysis engenders consistent and iterative treatment of different sources of qualitative data in a manageable manner. This process supports facilitation of the highest standard of qualitative research. Concurrent analysis removes the artificial delineation of relevant literature from other forms of constructed data. This gives researchers clear direction to treat qualitative data consistently raising the chances of generalisability of the findings. Raising the generalisability of qualitative research will increase its chances of informing clinical practice. © 2010 Blackwell Publishing Ltd.

  6. When the uncertainty principle goes up to 11 or how to explain quantum physics with heavy metal

    CERN Document Server

    Moriarty, Philip

    2018-01-01

    There are deep and fascinating links between heavy metal and quantum physics. No, there are. Really. While teaching at the University of Nottingham, physicist Philip Moriarty noticed something odd--a surprising number of his students were heavily into metal music. Colleagues, too: a Venn diagram of physicists and metal fans would show a shocking amount of overlap. What's more, it turns out that heavy metal music is uniquely well-suited to explaining quantum principles. In When the Uncertainty Principle Goes Up to Eleven, Moriarty explains the mysteries of the universe's inner workings via drum beats and feedback: You'll discover how the Heisenberg uncertainty principle comes into play with every chugging guitar riff, what wave interference has to do with Iron Maiden, and why metalheads in mosh pits behave just like molecules in a gas. If you're a metal fan trying to grasp the complexities of quantum physics, a quantum physicist baffled by heavy metal, or just someone who'd like to know how the fundamental sci...

  7. Entropic formulation of the uncertainty principle for the number and annihilation operators

    International Nuclear Information System (INIS)

    Rastegin, Alexey E

    2011-01-01

    An entropic approach to formulating uncertainty relations for the number-annihilation pair is considered. We construct a normal operator that traces the annihilation operator, as well as commuting quadratures with a complete system of common eigenfunctions. Expanding the measured wave function with respect to them, one obtains a relevant probability distribution. Another distribution is naturally generated by measuring the number operator. Due to the Riesz-Thorin theorem, there exists a nontrivial inequality between corresponding functionals of the above distributions. We find the bound in this inequality and further derive uncertainty relations in terms of both the Rényi and Tsallis entropies. Entropic uncertainty relations for a continuous distribution as well as relations for a discretized one are presented. (comment)
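
    For orientation, the best-known entropic uncertainty relation of this general type is the Maassen-Uffink bound for a pair of observables with eigenbases {|a_i>}, {|b_j>}; it is quoted here as background, not as the number-annihilation relation derived in the paper:

```latex
H(A) + H(B) \;\ge\; -2\log c,
\qquad
c = \max_{i,j}\,\bigl|\langle a_i \mid b_j \rangle\bigr|,
```

    with a Rényi-entropy generalisation H_\alpha(A) + H_\beta(B) \ge -2\log c valid whenever 1/\alpha + 1/\beta = 2; the Riesz-Thorin interpolation theorem mentioned in the abstract is the standard tool for proving bounds of exactly this kind.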

  8. Applying the conservativeness principle to REDD to deal with the uncertainties of the estimates

    International Nuclear Information System (INIS)

    Grassi, Giacomo; Monni, Suvi; Achard, Frederic; Mollicone, Danilo; Federici, Sandro

    2008-01-01

    A common paradigm, when the reduction of emissions from deforestation is estimated for the purpose of promoting it as a mitigation option under the United Nations Framework Convention on Climate Change (UNFCCC), is that high uncertainties in input data (i.e., area change and C stock change per area) may seriously undermine the credibility of the estimates, and therefore of reduced deforestation as a mitigation option. In this paper, we show how a series of concepts and methodological tools already existing in UNFCCC decisions and IPCC guidance documents may greatly help in dealing with the uncertainties of estimates of reduced emissions from deforestation.

  10. A Framework for Generalising the Newton Method and Other Iterative Methods from Euclidean Space to Manifolds

    OpenAIRE

    Manton, Jonathan H.

    2012-01-01

    The Newton iteration is a popular method for minimising a cost function on Euclidean space. Various generalisations to cost functions defined on manifolds appear in the literature. In each case, the convergence rate of the generalised Newton iteration needed to be established from first principles. The present paper presents a framework for generalising iterative methods from Euclidean space to manifolds that ensures local convergence rates are preserved. It applies to any (memoryless) iterative m...

  11. Trans-Planckian Effects in Inflationary Cosmology and the Modified Uncertainty Principle

    DEFF Research Database (Denmark)

    F. Hassan, S.; Sloth, Martin Snoager

    2002-01-01

    There are good indications that fundamental physics gives rise to a modified space-momentum uncertainty relation that implies the existence of a minimum length scale. We implement this idea in the scalar field theory that describes density perturbations in flat Robertson-Walker space-time. This l...

  12. Living with uncertainty: from the precautionary principle to the methodology of ongoing normative assessment

    International Nuclear Information System (INIS)

    Dupuy, J.P.; Grinbaum, A.

    2005-01-01

    The analysis of our epistemic situation regarding singular events, such as abrupt climate change, shows essential limitations in the traditional modes of dealing with uncertainty. Typical cognitive barriers lead to paralysis of action. What is needed is to take the reality of the future seriously. We argue for the application of the methodology of ongoing normative assessment. We show that it is, paradoxically, a matter of forming a project on the basis of a fixed future which one does not want, and this in a coordinated way at the level of social institutions. Ongoing assessment may be viewed as a prescription to live with uncertainty, in a particular sense of the term, in order for a future catastrophe not to occur. The assessment is necessarily normative in that it must include the anticipation of a retrospective ethical judgment on present choices (the notion of moral luck). (authors)

  13. Blockchain to Rule the Waves - Nascent Design Principles for Reducing Risk and Uncertainty in Decentralized Environments

    OpenAIRE

    Nærland, Kristoffer; Müller-Bloch, Christoph; Beck, Roman; Palmund, Søren

    2017-01-01

    Many decentralized, inter-organizational environments such as supply chains are characterized by high transactional uncertainty and risk. At the same time, blockchain technology promises to mitigate these issues by introducing certainty into economic transactions. This paper discusses the findings of a Design Science Research project involving the construction and evaluation of an information technology artifact in collaboration with Maersk, a leading international shipping company, where cen...

  14. Uncertainty estimation of the self-thinning process by Maximum-Entropy Principle

    Science.gov (United States)

    Shoufan Fang; George Z. Gertner

    2000-01-01

    When available information is scarce, the Maximum-Entropy Principle can be used to estimate the distributions of parameters. In our case study, we estimated the distributions of the parameters of the forest self-thinning process based on literature information, and we derived the conditional distribution functions and estimated the 95 percent confidence interval (CI) of the self-...
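
    The Maximum-Entropy Principle used above can be illustrated with a classic toy problem: among all distributions on {1, ..., 6} with a prescribed mean, the entropy-maximising one has Gibbs form p_i proportional to exp(lam * x_i), with lam fixed by the mean constraint. The code below is a generic sketch of that principle, not the forest self-thinning application from the record.

```python
import math

def maxent_dist(values, target_mean, tol=1e-12):
    """Maximum-entropy distribution on `values` subject to a fixed mean.
    The solution has Gibbs form p_i proportional to exp(lam * x_i); the
    implied mean increases monotonically with lam, so lam can be found
    by bisection."""
    def mean_for(lam):
        w = [math.exp(lam * x) for x in values]
        return sum(x * wi for x, wi in zip(values, w)) / sum(w)
    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    w = [math.exp(0.5 * (lo + hi) * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

# Classic example: die faces whose average is 4.5 instead of the fair 3.5.
p = maxent_dist([1, 2, 3, 4, 5, 6], 4.5)
print([round(pi, 4) for pi in p])   # probabilities increase towards face 6
```

    The same machinery extends to interval-valued parameters and extra moment constraints, which is how MaxEnt turns scarce literature information into full parameter distributions.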

  15. Wavelets-Computational Aspects of Sterian Realistic Approach to Uncertainty Principle in High Energy Physics: A Transient Approach

    Directory of Open Access Journals (Sweden)

    Cristian Toma

    2013-01-01

    This study presents wavelet-computational aspects of the Sterian realistic approach to the uncertainty principle in high-energy physics. According to this approach, one cannot make a device for the simultaneous measurement of canonically conjugate variables in reciprocal Fourier spaces. However, such aspects regarding the use of conjugate Fourier spaces can also be noticed in quantum field theory, where the position representation of a quantum wave is replaced by the momentum representation before computing the interaction at a certain point of space, at a certain moment of time. For this reason, certain properties regarding the switch from one representation to another in these conjugate Fourier spaces should be established. It is shown that the best results can be obtained using wavelet aspects and support macroscopic functions for computing (i) wave-train nonlinear relativistic transformations, (ii) reflection/refraction with a constant shift, (iii) diffraction considered as interaction with a null phase shift without annihilation of the associated wave, (iv) deflection by external electromagnetic fields without phase loss, and (v) annihilation of the associated wave train through fast and spatially extended phenomena according to the uncertainty principle.

  16. Generalised shot noise Cox processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Torrisi, Giovanni Luca

    2005-01-01

    We introduce a class of Cox cluster processes called generalised shot noise Cox processes (GSNCPs), which extends the definition of shot noise Cox processes (SNCPs) in two directions: the point process that drives the shot noise is not necessarily Poisson, and the kernel of the shot noise can...

  17. Dyads, a generalisation of monads

    NARCIS (Netherlands)

    Fokkinga, M.M.

    The concept of dyad is defined as the least common generalisation of monads and co-monads. So, taking some of the ingredients to be the identity, the concept specialises to the concept of monad, and taking other ingredients to be the identity it specialises to co-monads. Except for one axiom, all

  18. Embracing uncertainty, managing complexity: applying complexity thinking principles to transformation efforts in healthcare systems.

    Science.gov (United States)

    Khan, Sobia; Vandermorris, Ashley; Shepherd, John; Begun, James W; Lanham, Holly Jordan; Uhl-Bien, Mary; Berta, Whitney

    2018-03-21

    Complexity thinking is increasingly being embraced in healthcare, which is often described as a complex adaptive system (CAS). Applying CAS to healthcare as an explanatory model for understanding the nature of the system, and to stimulate changes and transformations within the system, is valuable. A seminar series on systems and complexity thinking hosted at the University of Toronto in 2016 offered a number of insights on applications of CAS perspectives to healthcare that we explore here. We synthesized topics from this series into a set of six insights on how complexity thinking fosters a deeper understanding of accepted ideas in healthcare, applications of CAS to actors within the system, and paradoxes in applications of complexity thinking that may require further debate: 1) a complexity lens helps us better understand the nebulous term "context"; 2) concepts of CAS may be applied differently when actors are cognizant of the system in which they operate; 3) actor responses to uncertainty within a CAS is a mechanism for emergent and intentional adaptation; 4) acknowledging complexity supports patient-centred intersectional approaches to patient care; 5) complexity perspectives can support ways that leaders manage change (and transformation) in healthcare; and 6) complexity demands different ways of implementing ideas and assessing the system. To enhance our exploration of key insights, we augmented the knowledge gleaned from the series with key articles on complexity in the literature. Ultimately, complexity thinking acknowledges the "messiness" that we seek to control in healthcare and encourages us to embrace it. This means seeing challenges as opportunities for adaptation, stimulating innovative solutions to ensure positive adaptation, leveraging the social system to enable ideas to emerge and spread across the system, and even more important, acknowledging that these adaptive actions are part of system behaviour just as much as periods of stability are. 

  19. Logarithmic corrections to the uncertainty principle and infinitude of the number of bound states of n-particle systems

    International Nuclear Information System (INIS)

    Perez, J.F.; Coutinho, F.A.B.; Malta, C.P.

    1985-01-01

    It is shown that the critical long-distance behaviour of a two-body potential, which determines whether the number of negative eigenvalues of Schrodinger operators in ν dimensions is finite or infinite, is given by v_k(r) = −[(ν−2)/2r]² − 1/(2r ln r)² − … − 1/(2r ln r · ln ln r ⋯ ln_k r)², where k = 0,1,… for ν ≠ 2 and k = 1,2,… if ν = 2. This result is a consequence of logarithmic corrections to an inequality known as the Uncertainty Principle. If the continuum threshold in the N-body problem is defined by a two-cluster break-up, our results generate corrections to the existing sufficient conditions for the existence of infinitely many bound states. (Author) [pt

  20. Primary small bowel anastomosis in generalised peritonitis

    NARCIS (Netherlands)

    deGraaf, JS; van Goor, Harry; Bleichrodt, RP

    Objective: To find out if primary anastomosis of the small bowel is safe in patients with generalised peritonitis who are treated by planned relaparotomies. Design: Retrospective study. Setting: University hospital, The Netherlands. Subjects: 10 patients with generalised purulent peritonitis

  1. Maximally Localized States and Quantum Corrections of Black Hole Thermodynamics in the Framework of a New Generalized Uncertainty Principle

    International Nuclear Information System (INIS)

    Zhang, Shao-Jun; Miao, Yan-Gang; Zhao, Ying-Jie

    2015-01-01

    As a generalized uncertainty principle (GUP) leads to the effects of the minimal length of the order of the Planck scale and UV/IR mixing, some significant physical concepts and quantities are modified or corrected correspondingly. On the one hand, we derive the maximally localized states, i.e. the physical states displaying the minimal length uncertainty associated with a new GUP proposed in our previous work. On the other hand, in the framework of this new GUP we calculate quantum corrections to the thermodynamic quantities of the Schwarzschild black hole, such as the Hawking temperature, the entropy, and the heat capacity, and give a remnant mass of the black hole at the end of the evaporation process. Moreover, we compare our results with those obtained in the frameworks of several other GUPs. In particular, we observe a significant difference between the situations with and without the consideration of the UV/IR mixing effect in the quantum corrections to the evaporation rate and the decay time. That is, the decay time can greatly be prolonged in the former case, which implies that the quantum correction from the UV/IR mixing effect may give rise to a radical rather than a tiny influence on the Hawking radiation.
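
For scale, the uncorrected semiclassical Hawking temperature T = ħc³/(8πGMk_B) can be evaluated directly; the GUP-corrected expressions derived in the record are not reproduced here, so this sketch gives only the baseline that the quantum corrections modify.

```python
import math

# Semiclassical (uncorrected) Hawking temperature T = hbar c^3 / (8 pi G M k_B).
# Baseline only -- the GUP corrections discussed in the record are not included.
HBAR = 1.054571817e-34   # J s
C = 2.99792458e8         # m/s
G = 6.67430e-11          # m^3 kg^-1 s^-2
K_B = 1.380649e-23       # J/K
M_SUN = 1.989e30         # kg

def hawking_temperature(mass_kg: float) -> float:
    """Hawking temperature in kelvin of a Schwarzschild black hole."""
    return HBAR * C**3 / (8 * math.pi * G * mass_kg * K_B)

t_sun = hawking_temperature(M_SUN)   # ~6e-8 K, far colder than the CMB
```

The inverse dependence on mass is why evaporation accelerates as the hole shrinks, the regime where the GUP corrections described above become relevant.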

  2. An economic uncertainty principle

    Czech Academy of Sciences Publication Activity Database

    Vošvrda, Miloslav

    2000-01-01

    Roč. 8, č. 2 (2000), s. 79-87 ISSN 0572-3043 R&D Projects: GA ČR GA402/97/0007; GA ČR GA402/97/0770 Institutional research plan: AV0Z1075907 Subject RIV: BB - Applied Statistics, Operational Research

  3. Schrodinger's Uncertainty Principle?

    Indian Academy of Sciences (India)

    Research Institute, mainly on applications of optical and statistical ... deserves to be better known in the classroom. Let us recall the basic algebraic steps in the textbook proof. We consider the wave function (which has a free real parameter a): (x̂ + ia p̂)ψ ≡ x̂ψ(x) + ia(−iℏ ∂ψ/∂x) ≡ φ(x). The hat sign over x and p reminds ...

  4. Generalised shot noise Cox processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Torrisi, Giovanni Luca

    We introduce a new class of Cox cluster processes called generalised shot noise Cox processes (GSNCPs), which extends the definition of shot noise Cox processes (SNCPs) in two directions: the point process which drives the shot noise is not necessarily Poisson, and the kernel of the shot noise can be random. Thereby a very large class of models for aggregated or clustered point patterns is obtained. Due to the structure of GSNCPs, a number of useful results can be established. We focus first on deriving summary statistics for GSNCPs and next on how to make simulation for GSNCPs. Particularly, results for first and second order moment measures, reduced Palm distributions, the -function, simulation with or without edge effects, and conditional simulation of the intensity function driving a GSNCP are given. Our results are exemplified for special important cases of GSNCPs, and we discuss the relation...
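
A basic (non-generalised) shot noise Cox process with Poisson parents and a fixed Gaussian kernel can be simulated as a Thomas-type cluster process. The parameter values below are arbitrary illustrative choices; the non-Poisson drivers and random kernels that define GSNCPs are not modelled.

```python
import numpy as np

# Thomas-type simulation of a basic shot noise Cox process on [0,1]^2:
# Poisson parents each contribute a Gaussian kernel to the random driving
# intensity, so offspring are Poisson-distributed around each parent.
rng = np.random.default_rng(42)
kappa = 10      # parent (driving process) intensity -- illustrative value
mu = 20         # mean offspring per parent -- illustrative value
sigma = 0.03    # Gaussian kernel bandwidth -- illustrative value

n_parents = rng.poisson(kappa)
parents = rng.uniform(0.0, 1.0, size=(n_parents, 2))

clusters = []
for parent in parents:
    n_off = rng.poisson(mu)
    clusters.append(parent + rng.normal(scale=sigma, size=(n_off, 2)))
points = np.vstack(clusters) if clusters else np.empty((0, 2))
```

The resulting pattern is strongly clustered, which is the aggregation behaviour the record's generalised framework is designed to model more flexibly.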

  5. Cloverleaf skull with generalised bone dysplasia

    International Nuclear Information System (INIS)

    Kozlowski, K.; Warren, P.S.; Fisher, C.C.; Royal Hospital for Women, Camperdown

    1985-01-01

    A case of cloverleaf skull with generalised bone dysplasia is reported. The authors believe that the bone dysplasia associated with cloverleaf skull is neither identical with thanatophoric dysplasia nor with achondroplasia. Until the identity of thanatophoric dysplasia and cloverleaf skull with generalised bone dysplasia is proved, the diseases should be looked upon as separate entities and the wording ''thanatophoric dysplasia with cloverleaf skull'' should be abolished. (orig.)

  6. Wagner’s theory of generalised heaps

    CERN Document Server

    Hollings, Christopher D

    2017-01-01

    The theories of V. V. Wagner (1908-1981) on abstractions of systems of binary relations are presented here within their historical and mathematical contexts. This book contains the first translation from Russian into English of a selection of Wagner’s papers, the ideas of which are connected to present-day mathematical research. Along with a translation of Wagner’s main work in this area, his 1953 paper ‘Theory of generalised heaps and generalised groups,’ the book also includes translations of three short precursor articles that provide additional context for his major work. Researchers and students interested in both algebra (in particular, heaps, semiheaps, generalised heaps, semigroups, and groups) and differential geometry will benefit from the techniques offered by these translations, owing to the natural connections between generalised heaps and generalised groups, and the role played by these concepts in differential geometry. This book gives examples from present-day mathematics where ideas r...

  7. The energy-time uncertainty principle and the EPR paradox: Experiments involving correlated two-photon emission in parametric down-conversion

    Science.gov (United States)

    Chiao, Raymond Y.; Kwiat, Paul G.; Steinberg, Aephraim M.

    1992-01-01

    The energy-time uncertainty principle is on a different footing than the momentum-position uncertainty principle: in contrast to position, time is a c-number parameter, and not an operator. As Aharonov and Bohm have pointed out, this leads to different interpretations of the two uncertainty principles. In particular, one must distinguish between an inner and an outer time in the definition of the spread in time, delta t. It is the inner time which enters the energy-time uncertainty principle. We have checked this by means of a correlated two-photon light source in which the individual energies of the two photons are broad in spectrum, but in which their sum is sharp. In other words, the pair of photons is in an entangled state of energy. By passing one member of the photon pair through a filter with width delta E, it is observed that the other member's wave packet collapses upon coincidence detection to a duration delta t, such that delta E · delta t is approximately equal to Planck's constant divided by 2π, where this duration delta t is an inner time, in the sense of Aharonov and Bohm. We have measured delta t by means of a Michelson interferometer by monitoring the visibility of the fringes seen in coincidence detection. This is a nonlocal effect, in the sense that the two photons are far away from each other when the collapse occurs. We have excluded classical-wave explanations of this effect by means of triple coincidence measurements in conjunction with a beam splitter which follows the Michelson interferometer. Since Bell's inequalities are known to be violated, we believe that it is also incorrect to interpret this experimental outcome as if energy were a local hidden variable, i.e., as if each photon, viewed as a particle, possessed some definite but unknown energy before its detection.
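
The Fourier mechanics behind the energy-time relation can be checked numerically: a transform-limited Gaussian pulse saturates the time-bandwidth bound σ_t·σ_ω = 1/2, which becomes ΔE·Δt ≈ ħ/2 after multiplying by ħ. This sketch verifies only that classical Fourier bound, not the entangled two-photon collapse reported in the record.

```python
import numpy as np

# Time-bandwidth product of a Gaussian pulse: sigma_t * sigma_omega = 1/2,
# the Fourier fact underlying Delta E * Delta t ~ hbar / 2.
sigma_t = 2.0
t = np.linspace(-40, 40, 16384)
dt = t[1] - t[0]
field = np.exp(-t**2 / (4 * sigma_t**2))   # amplitude; intensity std is sigma_t

intensity = np.abs(field)**2
intensity /= intensity.sum()
st = np.sqrt((intensity * t**2).sum() - (intensity * t).sum()**2)

spectrum = np.abs(np.fft.fftshift(np.fft.fft(field)))**2
omega = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(t.size, d=dt))
spectrum /= spectrum.sum()
so = np.sqrt((spectrum * omega**2).sum() - (spectrum * omega).sum()**2)

product = st * so   # ~0.5: the Gaussian saturates the bound
```

Narrowing the filter (smaller σ_ω) necessarily stretches the conjugate duration, which is the single-photon analogue of the collapse-to-duration-Δt effect measured in the experiment.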

  8. An ontological system for interoperable spatial generalisation in biodiversity monitoring

    Science.gov (United States)

    Nieland, Simon; Moran, Niklas; Kleinschmit, Birgit; Förster, Michael

    2015-11-01

    Semantic heterogeneity remains a barrier to data comparability and standardisation of results in different fields of spatial research. Because of its thematic complexity, differing acquisition methods and national nomenclatures, interoperability of biodiversity monitoring information is especially difficult. Since data collection methods and interpretation manuals broadly vary there is a need for automatised, objective methodologies for the generation of comparable data-sets. Ontology-based applications offer vast opportunities in data management and standardisation. This study examines two data-sets of protected heathlands in Germany and Belgium which are based on remote sensing image classification and semantically formalised in an OWL2 ontology. The proposed methodology uses semantic relations of the two data-sets, which are (semi-)automatically derived from remote sensing imagery, to generate objective and comparable information about the status of protected areas by utilising kernel-based spatial reclassification. This automatised method suggests a generalisation approach, which is able to generate delineation of Special Areas of Conservation (SAC) of the European biodiversity Natura 2000 network. Furthermore, it is able to transfer generalisation rules between areas surveyed with varying acquisition methods in different countries by taking into account automated inference of the underlying semantics. The generalisation results were compared with the manual delineation of terrestrial monitoring. For the different habitats in the two sites an accuracy of above 70% was detected. However, it has to be highlighted that the delineation of the ground-truth data inherits a high degree of uncertainty, which is discussed in this study.

  9. Quantum generalisation of feedforward neural networks

    Science.gov (United States)

    Wan, Kwok Ho; Dahlsten, Oscar; Kristjánsson, Hlér; Gardner, Robert; Kim, M. S.

    2017-09-01

    We propose a quantum generalisation of a classical neural network. The classical neurons are firstly rendered reversible by adding ancillary bits. Then they are generalised to being quantum reversible, i.e., unitary (the classical networks we generalise are called feedforward, and have step-function activation functions). The quantum network can be trained efficiently using gradient descent on a cost function to perform quantum generalisations of classical tasks. We demonstrate numerically that it can: (i) compress quantum states onto a minimal number of qubits, creating a quantum autoencoder, and (ii) discover quantum communication protocols such as teleportation. Our general recipe is theoretical and implementation-independent. The quantum neuron module can naturally be implemented photonically.

  10. Cloverleaf skull with generalised bone dysplasia

    Energy Technology Data Exchange (ETDEWEB)

    Kozlowski, K.; Warren, P.S.; Fisher, C.C.

    1985-09-01

    A case of cloverleaf skull with generalised bone dysplasia is reported. The authors believe that the bone dysplasia associated with cloverleaf skull is neither identical with thanatophoric dysplasia nor with achondroplasia. Until the identity of thanatophoric dysplasia and cloverleaf skull with generalised bone dysplasia is proved, the diseases should be looked upon as separate entities and the wording ''thanatophoric dysplasia with cloverleaf skull'' should be abolished.

  11. Improvement on generalised synchronisation of chaotic systems

    International Nuclear Information System (INIS)

    Hui-Bin, Zhu; Fang, Qiu; Bao-Tong, Cui

    2010-01-01

    In this paper, the problem of generalised synchronisation of two different chaotic systems is investigated. Conditions less conservative than existing results are derived using linear matrix inequalities. Furthermore, a simple adaptive control scheme is proposed to achieve generalised synchronisation of chaotic systems. The proposed method is simple and easy to implement in practice and can be applied to secure communications. Numerical simulations are also given to demonstrate the effectiveness and feasibility of the theoretical analysis

  12. Further Generalisations of Twisted Gabidulin Codes

    DEFF Research Database (Denmark)

    Puchinger, Sven; Rosenkilde, Johan Sebastian Heesemann; Sheekey, John

    2017-01-01

    We present a new family of maximum rank distance (MRD) codes. The new class contains codes that are neither equivalent to a generalised Gabidulin nor to a twisted Gabidulin code, the only two known general constructions of linear MRD codes.

  13. The generalised Sylvester matrix equations over the generalised bisymmetric and skew-symmetric matrices

    Science.gov (United States)

    Dehghan, Mehdi; Hajarian, Masoud

    2012-08-01

    A matrix P is called symmetric orthogonal if P = P^T = P^(-1). A matrix X is said to be generalised bisymmetric with respect to P if X = X^T = PXP. It is obvious that any symmetric matrix is also a generalised bisymmetric matrix with respect to I (the identity matrix). By extending the idea of the Jacobi and the Gauss-Seidel iterations, this article proposes two new iterative methods, respectively, for computing the generalised bisymmetric (containing the symmetric solution as a special case) and skew-symmetric solutions of the generalised Sylvester matrix equation ? (including the Sylvester and Lyapunov matrix equations as special cases), which is encountered in many systems and control applications. When the generalised Sylvester matrix equation has a unique generalised bisymmetric (skew-symmetric) solution, the first (second) iterative method converges to the generalised bisymmetric (skew-symmetric) solution of this matrix equation for any initial generalised bisymmetric (skew-symmetric) matrix. Finally, some numerical results are given to illustrate the effectiveness of the theoretical results.
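
The classical Sylvester equation AX + XB = Q is the simplest special case of the generalised equation discussed above. A direct solver (SciPy's `solve_sylvester`, shown here instead of the paper's Jacobi/Gauss-Seidel-style iterations) makes the setup concrete:

```python
import numpy as np
from scipy.linalg import solve_sylvester

# Solve the classical Sylvester equation A X + X B = Q directly.
# This is the unstructured special case; the record's iterative methods
# additionally enforce generalised bisymmetric / skew-symmetric structure.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((3, 3))
Q = rng.standard_normal((4, 3))

X = solve_sylvester(A, B, Q)                    # solvable when A and -B share no eigenvalues
residual = np.linalg.norm(A @ X + X @ B - Q)    # ~0 up to round-off
```

Iterative schemes such as those in the record become attractive when X must satisfy structural constraints that a direct dense solve cannot enforce.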

  14. Generalisability of a composite student selection programme

    DEFF Research Database (Denmark)

    O'Neill, Lotte Dyhrberg; Korsholm, Lars; Wallstedt, Birgitta

    2009-01-01

    format); general knowledge (multiple-choice test), and a semi-structured admission interview. The aim of this study was to estimate the generalisability of a composite selection. METHODS: Data from 307 applicants who participated in the admission to medicine in 2007 were available for analysis. Each admission parameter was double-scored using two random, blinded and independent raters. Variance components for applicant, rater and residual effects were estimated for a mixed model with the restricted maximum likelihood (REML) method. The reliability of obtained applicant ranks (G coefficients) was calculated for individual admission criteria and for composite admission procedures. RESULTS: A pre-selection procedure combining qualification and motivation scores showed insufficient generalisability (G = 0.45). The written motivation in particular displayed low generalisability (G = 0.10). Good...

  15. Automatic map generalisation from research to production

    Science.gov (United States)

    Nyberg, Rose; Johansson, Mikael; Zhang, Yang

    2018-05-01

    The manual work of map generalisation is known to be a complex and time-consuming task. With the development of technology and societies, the demands for more flexible map products of higher quality are growing. The Swedish mapping, cadastral and land registration authority Lantmäteriet has manual production lines for databases in five different scales, 1 : 10 000 (SE10), 1 : 50 000 (SE50), 1 : 100 000 (SE100), 1 : 250 000 (SE250) and 1 : 1 million (SE1M). To streamline this work, Lantmäteriet started a project to automatically generalise geographic information. The planned timespan for the project is 2015-2022. Below, the project background and the methods for automatic generalisation are described. The paper closes with a description of results and conclusions.

  16. Generalised Computability and Applications to Hybrid Systems

    DEFF Research Database (Denmark)

    Korovina, Margarita V.; Kudinov, Oleg V.

    2001-01-01

    We investigate the concept of generalised computability of operators and functionals defined on the set of continuous functions, first introduced in [9]. By working in the reals, with equality and without equality, we study properties of generalised computable operators and functionals. Also we propose an interesting application to the formalisation of hybrid systems. We obtain some class of hybrid systems whose trajectories are computable in the sense of computable analysis. This research was supported in part by the RFBR (grants N 99-01-00485, N 00-01-00810) and by the Siberian Branch of RAS (a grant for young researchers, 2000)

  17. The oculocerebral syndrome in association with generalised ...

    African Journals Online (AJOL)

    A 14-year-old girl with generalised hypopigmentation, mental retardation, abnormal movements, and ocular anomalies is described. It is suggested that she represents a further case of oculocerebral albinism, a rare autosomal recessive condition. Reference is made to previous similar cases.

  18. On Generalisation of Polynomials in Complex Plane

    Directory of Open Access Journals (Sweden)

    Maslina Darus

    2010-01-01

    Full Text Available The generalised Bell and Laguerre polynomials of fractional order in the complex z-plane are defined. Some properties are studied. Moreover, we prove that these polynomials are univalent solutions of second-order differential equations. Also, Laguerre-type analogues of some special functions are introduced.

  19. Exactly marginal deformations from exceptional generalised geometry

    Energy Technology Data Exchange (ETDEWEB)

    Ashmore, Anthony [Merton College, University of Oxford,Merton Street, Oxford, OX1 4JD (United Kingdom); Mathematical Institute, University of Oxford,Andrew Wiles Building, Woodstock Road, Oxford, OX2 6GG (United Kingdom); Gabella, Maxime [Institute for Advanced Study,Einstein Drive, Princeton, NJ 08540 (United States); Graña, Mariana [Institut de Physique Théorique, CEA/Saclay,91191 Gif-sur-Yvette (France); Petrini, Michela [Sorbonne Université, UPMC Paris 05, UMR 7589, LPTHE,75005 Paris (France); Waldram, Daniel [Department of Physics, Imperial College London,Prince Consort Road, London, SW7 2AZ (United Kingdom)

    2017-01-27

    We apply exceptional generalised geometry to the study of exactly marginal deformations of N=1 SCFTs that are dual to generic AdS{sub 5} flux backgrounds in type IIB or eleven-dimensional supergravity. In the gauge theory, marginal deformations are parametrised by the space of chiral primary operators of conformal dimension three, while exactly marginal deformations correspond to quotienting this space by the complexified global symmetry group. We show how the supergravity analysis gives a geometric interpretation of the gauge theory results. The marginal deformations arise from deformations of generalised structures that solve moment maps for the generalised diffeomorphism group and have the correct charge under the generalised Reeb vector, generating the R-symmetry. If this is the only symmetry of the background, all marginal deformations are exactly marginal. If the background possesses extra isometries, there are obstructions that come from fixed points of the moment maps. The exactly marginal deformations are then given by a further quotient by these extra isometries. Our analysis holds for any N=2 AdS{sub 5} flux background. Focussing on the particular case of type IIB Sasaki-Einstein backgrounds we recover the result that marginal deformations correspond to perturbing the solution by three-form flux at first order. In various explicit examples, we show that our expression for the three-form flux matches those in the literature and the obstruction conditions match the one-loop beta functions of the dual SCFT.

  20. Generalised fluid dynamics and quantum mechanics

    NARCIS (Netherlands)

    Broer, L.J.F.

    1974-01-01

    A generalised theory of irrotational fluid flow is developed in hamiltonian form. This allows a systematic derivation of equations for momentum, energy and the rate of work. It is shown that a nonlinear field equation for weakly interacting condensed bosons as given by Gross1) and the one-electron

  1. Generalised phase contrast: microscopy, manipulation and more

    DEFF Research Database (Denmark)

    Palima, Darwin; Glückstad, Jesper

    2010-01-01

    Generalised phase contrast (GPC) not only leads to more accurate phase imaging beyond thin biological samples, but serves as an enabling framework in developing tools over a wide spectrum of contemporary applications in optics and photonics, including optical trapping and micromanipulation, optic...

  2. Hyperscaling violating solutions in generalised EMD theory

    Directory of Open Access Journals (Sweden)

    Li Li

    2017-04-01

    Full Text Available This short note is devoted to deriving scaling but hyperscaling violating solutions in a generalised Einstein–Maxwell-Dilaton theory with an arbitrary number of scalars and vectors. We obtain analytic solutions in some special cases and discuss the physical constraints on the allowed parameter range in order to have a well-defined holographic ground-state solution.

  3. Hyperscaling violating solutions in generalised EMD theory

    Energy Technology Data Exchange (ETDEWEB)

    Li, Li, E-mail: lil416@lehigh.edu [Crete Center for Theoretical Physics, Institute for Theoretical and Computational Physics, Department of Physics, University of Crete, 71003 Heraklion (Greece); Crete Center for Quantum Complexity and Nanotechnology, Department of Physics, University of Crete, 71003 Heraklion (Greece); Department of Physics, Lehigh University, Bethlehem, PA, 18018 (United States)

    2017-04-10

    This short note is devoted to deriving scaling but hyperscaling violating solutions in a generalised Einstein–Maxwell-Dilaton theory with an arbitrary number of scalars and vectors. We obtain analytic solutions in some special cases and discuss the physical constraints on the allowed parameter range in order to have a well-defined holographic ground-state solution.

  4. On the Momentum Transported by the Radiation Field of a Long Transient Dipole and Time Energy Uncertainty Principle

    Directory of Open Access Journals (Sweden)

    Vernon Cooray

    2016-11-01

    Full Text Available The paper describes the net momentum transported by the transient electromagnetic radiation field of a long transient dipole in free space. In the dipole a current is initiated at one end and propagates towards the other end, where it is absorbed. The results show that the net momentum transported by the radiation is directed along the axis of the dipole where the currents are propagating. In general, the net momentum P transported by the electromagnetic radiation of the dipole is less than the quantity U/c, where U is the total energy radiated by the dipole and c is the speed of light in free space. In the case of a Hertzian dipole, the net momentum transported by the radiation field is zero because of the spatial symmetry of the radiation field. As the effective wavelength of the current decreases with respect to the length of the dipole (or the duration of the current decreases with respect to the travel time of the current along the dipole), the net momentum transported by the radiation field becomes closer and closer to U/c, and for effective wavelengths which are much shorter than the length of the dipole, P ≈ U/c. The results show that when the condition P ≈ U/c is satisfied, the radiated fields satisfy the condition Δt ΔU ≥ h/4π, where Δt is the duration of the radiation, ΔU is the uncertainty in the dissipated energy and h is the Planck constant.

  5. The special theory of Brownian relativity: equivalence principle for dynamic and static random paths and uncertainty relation for diffusion.

    Science.gov (United States)

    Mezzasalma, Stefano A

    2007-03-15

    The theoretical basis of a recent theory of Brownian relativity for polymer solutions is deepened and reexamined. After the problem of relative diffusion in polymer solutions is addressed, its two postulates are formulated in all generality. The former builds a statistical equivalence between (uncorrelated) timelike and shapelike reference frames, that is, among dynamical trajectories of liquid molecules and static configurations of polymer chains. The latter defines the "diffusive horizon" as the invariant quantity to work with in the special version of the theory. Particularly, the concept of universality in polymer physics corresponds in Brownian relativity to that of covariance in the Einstein formulation. Here, a "universal" law consists of a privileged observation, performed from the laboratory rest frame and agreeing with any diffusive reference system. From the joint lack of covariance and simultaneity implied by the Brownian Lorentz-Poincaré transforms, a relative uncertainty arises, in a certain analogy with quantum mechanics. It is driven by the difference between local diffusion coefficients in the liquid solution. The same transformation class can be used to infer Fick's second law of diffusion, playing here the role of a gauge invariance preserving covariance of the spacetime increments. An overall, noteworthy conclusion emerging from this view concerns the statistics of (i) static macromolecular configurations and (ii) the motion of liquid molecules, which would be much more related than expected.
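
Fick's second law, which the record derives from the transformation class, can be integrated with a simple explicit finite-difference scheme; the spreading Gaussian below conserves mass, and its variance grows as 2Dt, the standard diffusive signature. This is a generic numerical sketch, unrelated to the Brownian-relativity formalism beyond the diffusion law itself.

```python
import numpy as np

# Fick's second law dc/dt = D d2c/dx2 via explicit finite differences.
# A Gaussian profile conserves total mass and its variance grows as 2*D*t.
D = 1.0
dx, dt = 0.1, 0.002                 # dt < dx^2 / (2 D) for stability
x = np.arange(-20.0, 20.0, dx)
c = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # initial variance 1.0

steps = 500                          # total time t = steps * dt = 1.0
for _ in range(steps):
    lap = (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2
    c = c + dt * D * lap

mass = c.sum() * dx                  # conserved, ~1
var = (c * x**2).sum() * dx / mass   # ~ 1 + 2*D*t = 3
```

The linear growth of variance in time is exactly the invariant-preserving behaviour that the Brownian Lorentz-Poincaré transforms are constructed to respect.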

  6. Quantum mechanics of a generalised rigid body

    International Nuclear Information System (INIS)

    Gripaios, Ben; Sutherland, Dave

    2016-01-01

    We consider the quantum version of Arnold’s generalisation of a rigid body in classical mechanics. Thus, we quantise the motion on an arbitrary Lie group manifold of a particle whose classical trajectories correspond to the geodesics of any one-sided-invariant metric. We show how the derivation of the spectrum of energy eigenstates can be simplified by making use of automorphisms of the Lie algebra and (for groups of type I) by methods of harmonic analysis. We show how the method can be extended to cosets, generalising the linear rigid rotor. As examples, we consider all connected and simply connected Lie groups up to dimension 3. This includes the universal cover of the archetypical rigid body, along with a number of new exactly solvable models. We also discuss a possible application to the topical problem of quantising a perfect fluid. (paper)

  7. Support vector machines and generalisation in HEP

    Science.gov (United States)

    Bevan, Adrian; Gamboa Goñi, Rodrigo; Hays, Jon; Stevenson, Tom

    2017-10-01

    We review the concept of Support Vector Machines (SVMs) and discuss examples of their use in a number of scenarios. Several SVM implementations have been used in HEP and we exemplify this algorithm using the Toolkit for Multivariate Analysis (TMVA) implementation. We discuss examples relevant to HEP including background suppression for H → τ⁺τ⁻ at the LHC with several different kernel functions. Performance benchmarking leads to the issue of generalisation of hyper-parameter selection. The avoidance of fine tuning (over training or over fitting) in MVA hyper-parameter optimisation, i.e. the ability to ensure generalised performance of an MVA that is independent of the training, validation and test samples, is of utmost importance. We discuss this issue and compare and contrast performance of hold-out and k-fold cross-validation. We have extended the SVM functionality and introduced tools to facilitate cross validation in TMVA and present results based on these improvements.
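
The k-fold cross-validation workflow discussed above can be sketched outside TMVA. The fragment below uses scikit-learn (an assumption for illustration, since the record's benchmarks were done in TMVA) to select RBF-SVM hyper-parameters with 5-fold cross-validation and then estimate generalised performance:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for a signal/background classification task.
X, y = make_classification(n_samples=400, n_features=10, random_state=0)

# Hyper-parameter selection by 5-fold cross-validation guards against
# over-training: each candidate (C, gamma) is scored on held-out folds.
search = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [0.1, 1, 10], "gamma": ["scale", 0.01]},
    cv=5,
)
search.fit(X, y)
best = search.best_estimator_

# Generalisation estimate for the chosen model, again via k folds.
scores = cross_val_score(best, X, y, cv=5)
```

Compared with a single hold-out split, the k-fold estimate uses every event for both training and validation across folds, which is the stability argument made in the record.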

  8. Open quantum generalisation of Hopfield neural networks

    Science.gov (United States)

    Rotondo, P.; Marcuzzi, M.; Garrahan, J. P.; Lesanovsky, I.; Müller, M.

    2018-03-01

    We propose a new framework to understand how quantum effects may impact on the dynamics of neural networks. We implement the dynamics of neural networks in terms of Markovian open quantum systems, which allows us to treat thermal and quantum coherent effects on the same footing. In particular, we propose an open quantum generalisation of the Hopfield neural network, the simplest toy model of associative memory. We determine its phase diagram and show that quantum fluctuations give rise to a qualitatively new non-equilibrium phase. This novel phase is characterised by limit cycles corresponding to high-dimensional stationary manifolds that may be regarded as a generalisation of storage patterns to the quantum domain.
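For reference, the classical model that the paper generalises can be stated in a few lines: Hebbian storage of patterns and asynchronous sign updates. The sketch below is the standard classical (zero-temperature) Hopfield network only, not the open quantum generalisation; the network size, number of patterns and corruption level are illustrative assumptions.

```python
import numpy as np

def train_hebbian(patterns):
    """Hebbian weights W_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, zero diagonal."""
    p = np.asarray(patterns, dtype=float)
    w = p.T @ p / p.shape[1]
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, state, max_sweeps=20):
    """Asynchronous sign updates until a fixed point (or the sweep limit)."""
    s = np.array(state, dtype=float)
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(s)):
            new = 1.0 if w[i] @ s >= 0.0 else -1.0
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:
            break
    return s

rng = np.random.default_rng(0)
n = 64
patterns = rng.choice([-1.0, 1.0], size=(2, n))   # two stored patterns
w = train_hebbian(patterns)

# Flip 8 of the 64 spins of the first pattern and let the dynamics relax.
noisy = patterns[0].copy()
noisy[rng.choice(n, size=8, replace=False)] *= -1.0
recovered = recall(w, noisy)
overlap = float(recovered @ patterns[0]) / n       # 1.0 means perfect recall
```

The stored patterns act as attractors of this dynamics, which is exactly the notion of associative memory that the quantum fluctuations in the paper deform into limit cycles.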

  9. Quantum field theory in generalised Snyder spaces

    International Nuclear Information System (INIS)

    Meljanac, S.; Meljanac, D.; Mignemi, S.; Štrajn, R.

    2017-01-01

    We discuss the generalisation of the Snyder model that includes all possible deformations of the Heisenberg algebra compatible with Lorentz invariance and investigate its properties. We calculate perturbatively the law of addition of momenta and the star product in the general case. We also undertake the construction of a scalar field theory on these noncommutative spaces showing that the free theory is equivalent to the commutative one, like in other models of noncommutative QFT.

  10. Quantum field theory in generalised Snyder spaces

    Energy Technology Data Exchange (ETDEWEB)

    Meljanac, S.; Meljanac, D. [Rudjer Bošković Institute, Bijenička cesta 54, 10002 Zagreb (Croatia); Mignemi, S., E-mail: smignemi@unica.it [Dipartimento di Matematica e Informatica, Università di Cagliari, viale Merello 92, 09123 Cagliari (Italy); INFN, Sezione di Cagliari, Cittadella Universitaria, 09042 Monserrato (Italy); Štrajn, R. [Dipartimento di Matematica e Informatica, Università di Cagliari, viale Merello 92, 09123 Cagliari (Italy); INFN, Sezione di Cagliari, Cittadella Universitaria, 09042 Monserrato (Italy)

    2017-05-10

    We discuss the generalisation of the Snyder model that includes all possible deformations of the Heisenberg algebra compatible with Lorentz invariance and investigate its properties. We calculate perturbatively the law of addition of momenta and the star product in the general case. We also undertake the construction of a scalar field theory on these noncommutative spaces showing that the free theory is equivalent to the commutative one, like in other models of noncommutative QFT.

  11. Generalised linear models for correlated pseudo-observations, with applications to multi-state models

    DEFF Research Database (Denmark)

    Andersen, Per Kragh; Klein, John P.; Rosthøj, Susanne

    2003-01-01

    Keywords: Generalised estimating equation; Generalised linear model; Jackknife pseudo-value; Logistic regression; Markov model; Multi-state model.

  12. Generalised model for anisotropic compact stars

    Energy Technology Data Exchange (ETDEWEB)

    Maurya, S.K. [University of Nizwa, Department of Mathematical and Physical Sciences College of Arts and Science, Nizwa (Oman); Gupta, Y.K. [Raj Kumar Goel Institute of Technology, Department of Mathematics, Ghaziabad, Uttar Pradesh (India); Ray, Saibal [Government College of Engineering and Ceramic Technology, Department of Physics, Kolkata, West Bengal (India); Deb, Debabrata [Indian Institute of Engineering Science and Technology, Shibpur, Department of Physics, Howrah, West Bengal (India)

    2016-12-15

    In the present investigation an exact generalised model for anisotropic compact stars of embedding class 1 is sought with a general relativistic background. The generic solutions are verified by exploring different physical aspects, viz. energy conditions, the mass-radius relation and the stability of the models, in connection with their validity. It is observed that the model presented here for compact stars is compatible with all these physical tests and thus physically acceptable as far as the compact star candidates RXJ 1856-37, SAX J 1808.4-3658 (SS1) and SAX J 1808.4-3658 (SS2) are concerned. (orig.)

  13. Uncertainty Principles and Fourier Analysis

    Indian Academy of Sciences (India)

    The article presupposes some familiarity with Fourier analysis on the part of the reader; those who are not familiar with Fourier analysis are encouraged to look up Box 1 along with [3]. (A) Heisenberg's inequality: let us measure concentration in terms of the standard deviation, i.e. for a square integrable function f defined on ℝ and normalised so that ∫_ℝ |f(x)|² dx = 1, …
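With that normalisation, and assuming the Fourier convention f̂(ξ) = ∫ f(x) e^(−2πixξ) dx (a choice made here for illustration; the article may use another), Heisenberg's inequality reads σ(f)² σ(f̂)² ≥ 1/(16π²), with equality exactly for Gaussians. A quick numerical check of the equality case:

```python
import numpy as np

# Uniform grid wide enough that the Gaussian has decayed to ~0 at the ends.
x = np.linspace(-8.0, 8.0, 4001)
dx = x[1] - x[0]

# f(x) = 2^(1/4) exp(-pi x^2): normalised, and equal to its own Fourier
# transform under the convention assumed in this sketch.
f = 2.0 ** 0.25 * np.exp(-np.pi * x ** 2)
density = np.abs(f) ** 2

norm = density.sum() * dx                      # ~ 1 by construction
mean = (x * density).sum() * dx / norm
var_x = ((x - mean) ** 2 * density).sum() * dx / norm

# Self-dual Gaussian: the frequency-side variance equals the space-side one.
product = var_x * var_x
bound = 1.0 / (16.0 * np.pi ** 2)              # Heisenberg lower bound
```

Because the chosen Gaussian is its own Fourier transform under this convention, one variance serves for both sides, and the uncertainty product lands exactly on the lower bound 1/(16π²).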

  14. Generalised structures for N=1 AdS backgrounds

    Energy Technology Data Exchange (ETDEWEB)

    Coimbra, André [Institut für Theoretische Physik & Center for Quantum Engineering and Spacetime Research,Leibniz Universität Hannover,Appelstraße 2, 30167 Hannover (Germany); Strickland-Constable, Charles [Institut de physique théorique, Université Paris Saclay, CEA, CNRS, Orme des Merisiers, F-91191 Gif-sur-Yvette (France)

    2016-11-16

    We expand upon a claim made in a recent paper [http://arxiv.org/abs/1411.5721] that generic minimally supersymmetric AdS backgrounds of warped flux compactifications of Type II and M theory can be understood as satisfying a straightforward weak integrability condition in the language of E_{d(d)}×ℝ⁺ generalised geometry. Namely, they are spaces admitting a generalised G-structure set by the Killing spinor and with constant singlet generalised intrinsic torsion.

  15. On uncertainty in information and ignorance in knowledge

    Science.gov (United States)

    Ayyub, Bilal M.

    2010-05-01

    This paper provides an overview of working definitions of knowledge, ignorance, information and uncertainty, and summarises a formalised philosophical and mathematical framework for their analysis. It provides a comparative examination of the generalised information theory and the generalised theory of uncertainty. It summarises the foundational bases for assessing the reliability of knowledge constructed as a collective set of justified true beliefs. It discusses system complexity for ancestor simulation potentials. It offers value-driven communication means of knowledge and contrarian knowledge using memes and memetics.

  16. A generalised Dynamic Overflow Risk Assessment (DORA) for Real Time Control of urban drainage systems

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Grum, Morten

    2014-01-01

    An innovative and generalised approach to the integrated Real Time Control of urban drainage systems is presented. The Dynamic Overflow Risk Assessment (DORA) strategy aims to minimise the expected Combined Sewer Overflow (CSO) risk by considering (i) the water volume presently stored in the drainage system … and their uncertainty contributed to further improving the performance of drainage systems. The results of this paper will contribute to the wider usage of global RTC methods in the management of urban drainage networks.

  17. Asymptotic Behaviour of Total Generalised Variation

    KAUST Repository

    Papafitsoros, Konstantinos; Valkonen, Tuomo

    2015-01-01

    The recently introduced second order total generalised variation functional TGV²_{β,α} has been a successful regulariser for image processing purposes. Its definition involves two positive parameters α and β whose values determine the amount and the quality of the regularisation. In this paper we report on the behaviour of TGV²_{β,α} in the cases where the parameters α and β, as well as their ratio β/α, become very large or very small. Among others, we prove that for sufficiently symmetric two-dimensional data and large ratio β/α, TGV²_{β,α} regularisation coincides with total variation (TV) regularisation.

  18. Acute generalised exanthematous pustulosis: An update

    Directory of Open Access Journals (Sweden)

    Abhishek De

    2018-01-01

    Acute generalised exanthematous pustulosis (AGEP) is a severe cutaneous adverse reaction and is attributed to drugs in more than 90% of cases. It is a rare disease, with an estimated incidence of 1–5 patients per million per year. The clinical manifestations are characterised by the rapid development of sterile pustular lesions, fever and leucocytosis. A number of drugs have been reported to be associated with AGEP, the most common being antibiotics. Histopathologically, there are intraepidermal pustules and papillary dermal oedema with neutrophilic and eosinophilic infiltration. Systemic involvement can be present in more severe cases. Early diagnosis with withdrawal of the causative drug is the most important step in management. Treatment includes supportive care, avoidance of unnecessary antibiotics and use of a potent topical steroid.

  19. Threshold corrections, generalised prepotentials and Eichler integrals

    CERN Document Server

    Angelantonj, Carlo; Pioline, Boris

    2015-06-12

    We continue our study of one-loop integrals associated to BPS-saturated amplitudes in N = 2 heterotic vacua. We compute their large-volume behaviour, and express them as Fourier series in the complexified volume, with Fourier coefficients given in terms of Niebur–Poincaré series in the complex structure modulus. The closure of Niebur–Poincaré series under modular derivatives implies that such integrals derive from holomorphic prepotentials f_n, generalising the familiar prepotential of N = 2 supergravity. These holomorphic prepotentials transform anomalously under T-duality, in a way characteristic of Eichler integrals. We use this observation to compute their quantum monodromies under the duality group. We extend the analysis to modular integrals with respect to Hecke congruence subgroups, which naturally arise in compactifications on non-factorisable tori and freely-acting orbifolds. In this case, we derive new explicit results including closed-form expressions for integrals involv…

  20. Acute Generalised Exanthematous Pustulosis: An Update.

    Science.gov (United States)

    De, Abhishek; Das, Sudip; Sarda, Aarti; Pal, Dayamay; Biswas, Projna

    2018-01-01

    Acute generalised exanthematous pustulosis (AGEP) is a severe cutaneous adverse reaction and is attributed to drugs in more than 90% of cases. It is a rare disease, with an estimated incidence of 1-5 patients per million per year. The clinical manifestations are characterised by the rapid development of sterile pustular lesions, fever and leucocytosis. A number of drugs have been reported to be associated with AGEP, the most common being antibiotics. Histopathologically, there are intraepidermal pustules and papillary dermal oedema with neutrophilic and eosinophilic infiltration. Systemic involvement can be present in more severe cases. Early diagnosis with withdrawal of the causative drug is the most important step in management. Treatment includes supportive care, avoidance of unnecessary antibiotics and use of a potent topical steroid.

  1. On a quaternionic generalisation of the Riccati differential equation

    OpenAIRE

    Kravchenko, Viktor; Kravchenko, Vladislav; Williams, Benjamin

    2001-01-01

    A quaternionic partial differential equation is shown to be a generalisation of the Riccati ordinary differential equation and its relationship with the Schrodinger equation is established. Various approaches to the problem of finding particular solutions are explored, and the generalisations of two theorems of Euler on the Riccati differential equation, which correspond to the quaternionic equation, are given.

  2. A generalised groundwater flow equation using the concept of non ...

    African Journals Online (AJOL)

    The classical Darcy law is generalised by regarding the water flow as a function of a non-integer order derivative of the piezometric head. This generalised law and the law of conservation of mass are then used to derive a new equation for groundwater flow. Numerical solutions of this equation for various fractional orders of ...
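The non-integer order derivative entering the generalised Darcy law can be approximated numerically. The sketch below uses the Grünwald–Letnikov discretisation, a standard scheme for fractional derivatives; it illustrates the operator only, not the paper's groundwater solver, and the test function and step size are assumptions made here.

```python
import numpy as np

def gl_fractional_derivative(f_vals, alpha, h):
    """Gruenwald-Letnikov approximation of the order-alpha derivative on a
    uniform grid: D^alpha f(x_n) ~ h^(-alpha) * sum_k g_k f(x_{n-k}),
    where g_0 = 1 and g_k = g_{k-1} * (k - 1 - alpha) / k."""
    n = len(f_vals)
    g = np.empty(n)
    g[0] = 1.0
    for k in range(1, n):
        g[k] = g[k - 1] * (k - 1 - alpha) / k
    out = np.empty(n)
    for i in range(n):
        # Weighted history of f from x_i back to x_0 (the memory effect
        # characteristic of fractional-order flow laws).
        out[i] = g[: i + 1] @ f_vals[i::-1] / h ** alpha
    return out

# Illustrative check on f(x) = x: the order-1/2 derivative is
# x^(1/2) / Gamma(3/2) = 2 * sqrt(x / pi).
x = np.linspace(0.0, 1.0, 1001)
h = x[1] - x[0]
numeric = gl_fractional_derivative(x, 0.5, h)
exact_at_1 = 2.0 * np.sqrt(x[-1] / np.pi)
```

For α = 1 the weights collapse to (1, −1, 0, 0, …), i.e. the ordinary backward difference, which is a quick sanity check on the recursion.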

  3. Generalised Brown Clustering and Roll-up Feature Generation

    DEFF Research Database (Denmark)

    Derczynski, Leon; Chester, Sean

    2016-01-01

    active set size. Moreover, the generalisation permits a novel approach to feature selection from Brown clusters: We show that the standard approach of shearing the Brown clustering output tree at arbitrary bitlengths is lossy and that features should be chosen instead by rolling up Generalised Brown...

  4. On the exceptional generalised Lie derivative for d≥7

    International Nuclear Information System (INIS)

    Rosabal, J.A.

    2015-01-01

    In this work we revisit the E_8×ℝ⁺ generalised Lie derivative encoding the algebra of diffeomorphisms and gauge transformations of compactifications of M-theory on eight-dimensional manifolds, by extending certain features of the E_7×ℝ⁺ one. Compared to its E_d×ℝ⁺, d≤7, counterparts, a new term is needed for consistency. However, we find that no compensating parameters need to be introduced, but rather that the new term can be written in terms of the ordinary generalised gauge parameters by means of a connection. This implies that no further degrees of freedom, beyond those of the field content of the E_8 group, are needed to have a well defined theory. We discuss the implications of the structure of the E_8×ℝ⁺ generalised transformation on the construction of the d=8 generalised geometry. Finally, we suggest how to lift the generalised Lie derivative to eleven dimensions.

  5. Deformations of the generalised Picard bundle

    International Nuclear Information System (INIS)

    Biswas, I.; Brambila-Paz, L.; Newstead, P.E.

    2004-08-01

    Let X be a nonsingular algebraic curve of genus g ≥ 3, and let M_ξ denote the moduli space of stable vector bundles of rank n ≥ 2 and degree d with fixed determinant ξ over X such that n and d are coprime. We assume that if g = 3 then n ≥ 4 and if g = 4 then n ≥ 3, and suppose further that n₀, d₀ are integers such that n₀ ≥ 1 and nd₀ + n₀d > nn₀(2g − 2). Let E be a semistable vector bundle over X of rank n₀ and degree d₀. The generalised Picard bundle W_ξ(E) is by definition the vector bundle over M_ξ defined by the direct image p_{M_ξ*}(U_ξ ⊗ p_X*E), where U_ξ is a universal vector bundle over X × M_ξ. We obtain an inversion formula allowing us to recover E from W_ξ(E) and show that the space of infinitesimal deformations of W_ξ(E) is isomorphic to H¹(X, End(E)). This construction gives a locally complete family of vector bundles over M_ξ parametrised by the moduli space M(n₀, d₀) of stable bundles of rank n₀ and degree d₀ over X. If (n₀, d₀) = 1 and W_ξ(E) is stable for all E ∈ M(n₀, d₀), the construction determines an isomorphism from M(n₀, d₀) to a connected component M₀ of a moduli space of stable sheaves over M_ξ. This applies in particular when n₀ = 1, in which case M₀ is isomorphic to the Jacobian J of X as a polarised variety. The paper as a whole is a generalisation of results of Kempf and Mukai on Picard bundles over J, and is also related to a paper of Tyurin on the geometry of moduli of vector bundles. (author)

  6. Thermodynamic properties of ideal Fermi gases in a harmonic potential in an n-dimensional space under the generalized uncertainty principle

    Science.gov (United States)

    Li, Heling; Ren, Jinxiu; Wang, Wenwei; Yang, Bin; Shen, Hongjun

    2018-02-01

    Using the semi-classical (Thomas-Fermi) approximation, the thermodynamic properties of ideal Fermi gases in a harmonic potential in an n-dimensional space are studied under the generalized uncertainty principle (GUP). The mean particle number, internal energy, heat capacity and other thermodynamic variables of the Fermi system are calculated analytically. Then, analytical expressions of the mean particle number, internal energy, heat capacity, chemical potential, Fermi energy, ground state energy and amendments of the GUP are obtained at low temperatures. The influence of both the GUP and the harmonic potential on the thermodynamic properties of a copper-electron gas and other systems with higher electron densities is studied numerically at low temperatures. We find: (1) When the GUP is considered, the influence of the harmonic potential is much larger, and the amendments produced by the GUP increase by eight to nine orders of magnitude compared to when no external potential is applied to the electron gas. (2) The larger the particle density, or the smaller the particle masses, the bigger the influence of the GUP. (3) The effect of the GUP increases with the increase in the spatial dimensions. (4) The amendments of the chemical potential, Fermi energy and ground state energy increase with an increase in temperature, while the heat capacity decreases. T_F0 is the Fermi temperature of the ideal Fermi system in a harmonic potential. When the temperature is lower than a certain value (0.22 T_F0 for the copper-electron gas, and this value decreases with increasing electron density), the amendment to the internal energy is positive; however, the amendment decreases with increasing temperature. When the temperature increases to that value, the amendment is zero, and when the temperature is higher than that value, the amendment to the internal energy is negative and the absolute value of the amendment increases with increasing temperature. (5) When electron…

  7. Work and entropy production in generalised Gibbs ensembles

    International Nuclear Information System (INIS)

    Perarnau-Llobet, Martí; Riera, Arnau; Gallego, Rodrigo; Wilming, Henrik; Eisert, Jens

    2016-01-01

    Recent years have seen an enormous revival of interest in the study of thermodynamic notions in the quantum regime. This applies both to the study of notions of work extraction in thermal machines in the quantum regime, as well as to questions of equilibration and thermalisation of interacting quantum many-body systems as such. In this work we bring together these two lines of research by studying work extraction in a closed system that undergoes a sequence of quenches and equilibration steps concomitant with free evolutions. In this way, we incorporate an important insight from the study of the dynamics of quantum many-body systems: the evolution of closed systems is expected to be well described, for relevant observables and most times, by a suitable equilibrium state. We will consider three kinds of equilibration, namely to (i) the time averaged state, (ii) the Gibbs ensemble and (iii) the generalised Gibbs ensemble, reflecting further constants of motion in integrable models. For each effective description, we investigate notions of entropy production, the validity of the minimal work principle and properties of optimal work extraction protocols. While we keep the discussion general, much room is dedicated to the discussion of paradigmatic non-interacting fermionic quantum many-body systems, for which we identify significant differences with respect to the role of the minimal work principle. Our work not only has implications for experiments with cold atoms, but also can be viewed as suggesting a mindset for quantum thermodynamics where the role of the external heat baths is instead played by the system itself, with its internal degrees of freedom bringing coarse-grained observables to equilibrium. (paper)

  8. Do horses generalise between objects during habituation?

    DEFF Research Database (Denmark)

    Christensen, Janne Winther; Zharkikh, Tatjana; Ladevig, Jan

    2008-01-01

    Habituation to frightening stimuli plays an important role in horse training. To investigate the extent to which horses generalise between different visual objects, 2-year-old stallions were habituated to feeding from a container placed inside a test arena and assigned as TEST (n = 12) or REFERENCE horses (n = 12). In Experiment 1, TEST horses were habituated to six objects (ball, barrel, board, box, cone, cylinder) presented in sequence in a balanced order. The objects were of similar size but different colour. Each object was placed 0.5 m in front of the feed container, forcing the horses to pass the object to get to the food. TEST horses received as many 2 min exposures to each object as required to meet a habituation criterion. We recorded behavioural reactions to the object, latency to feed, total eating time, and heart rate (HR) during all exposures. There was no significant decrease in initial…

  9. Generalised derived limits for radioisotopes of iodine

    International Nuclear Information System (INIS)

    Hughes, J.S.; Haywood, S.M.; Simmonds, J.R.

    1984-04-01

    Generalised Derived Limits (GDLs) are evaluated for iodine-125,129,131,132,133,134,135 in selected materials from the terrestrial and aquatic environments and for discharge to atmosphere. They are intended for use as convenient reference levels against which the results of environmental monitoring can be compared and atmospheric discharges assessed. GDLs are intended for use when the environmental contamination or discharge to atmosphere is less than about 5% of the GDL. If the level of environmental contamination or discharge to the atmosphere exceeds this percentage of the GDL it does not necessarily mean that the dose equivalents to members of the public are approaching the dose equivalent limit. It is rather an indication that it may be appropriate to obtain a more specific derived limit for the particular situation by reviewing the values of the parameters involved in the calculation. GDL values are specified for iodine radionuclides in water, soil, grass, sediments and various foodstuffs derived from the terrestrial and aquatic environments. GDLs are also given for iodine radionuclides on terrestrial surfaces and for their discharge to atmosphere. (author)

  10. Threshold corrections, generalised prepotentials and Eichler integrals

    Directory of Open Access Journals (Sweden)

    Carlo Angelantonj

    2015-08-01

    We continue our study of one-loop integrals associated to BPS-saturated amplitudes in N=2 heterotic vacua. We compute their large-volume behaviour, and express them as Fourier series in the complexified volume, with Fourier coefficients given in terms of Niebur–Poincaré series in the complex structure modulus. The closure of Niebur–Poincaré series under modular derivatives implies that such integrals derive from holomorphic prepotentials f_n, generalising the familiar prepotential of N=2 supergravity. These holomorphic prepotentials transform anomalously under T-duality, in a way characteristic of Eichler integrals. We use this observation to compute their quantum monodromies under the duality group. We extend the analysis to modular integrals with respect to Hecke congruence subgroups, which naturally arise in compactifications on non-factorisable tori and freely-acting orbifolds. In this case, we derive new explicit results including closed-form expressions for integrals involving the Γ_0(N) Hauptmodul, a full characterisation of holomorphic prepotentials including their quantum monodromies, as well as concrete formulæ for holomorphic Yukawa couplings.

  11. An environmental generalised Luenberger-Hicks-Moorsteen productivity indicator and an environmental generalised Hicks-Moorsteen productivity index.

    Science.gov (United States)

    Abad, A

    2015-09-15

    The purpose of this paper is to introduce an environmental generalised productivity indicator and its ratio-based counterpart. The innovative environmental generalised total factor productivity measures inherit the basic structure of both Hicks-Moorsteen productivity index and Luenberger-Hicks-Moorsteen productivity indicator. This methodological contribution shows that these new environmental generalised total factor productivity measures yield the earlier standard Hicks-Moorsteen index and Luenberger-Hicks-Moorsteen indicator, as well as environmental performance index, as special cases.

  12. Rational first integrals of geodesic equations and generalised hidden symmetries

    International Nuclear Information System (INIS)

    Aoki, Arata; Houri, Tsuyoshi; Tomoda, Kentaro

    2016-01-01

    We discuss novel generalisations of Killing tensors, which are introduced by considering rational first integrals of geodesic equations. We introduce the notion of inconstructible generalised Killing tensors, which cannot be constructed from ordinary Killing tensors. Moreover, we introduce inconstructible rational first integrals, which are constructed from inconstructible generalised Killing tensors, and provide a method for checking the inconstructibility of a rational first integral. Using the method, we show that the rational first integral of the Collinson–O’Donnell solution is not inconstructible. We also provide several examples of metrics admitting an inconstructible rational first integral in two and four dimensions, by using the Maciejewski–Przybylska system. Furthermore, we attempt to generalise other hidden symmetries such as Killing–Yano tensors. (paper)

  13. Supersymmetric backgrounds, the Killing superalgebra, and generalised special holonomy

    Energy Technology Data Exchange (ETDEWEB)

    Coimbra, André [Institut des Hautes Études Scientifiques, Le Bois-Marie,35 route de Chartres, F-91440 Bures-sur-Yvette (France); Strickland-Constable, Charles [Institut des Hautes Études Scientifiques, Le Bois-Marie,35 route de Chartres, F-91440 Bures-sur-Yvette (France); Institut de physique théorique, Université Paris Saclay, CEA, CNRS,Orme des Merisiers, F-91191 Gif-sur-Yvette (France)

    2016-11-10

    We prove that, for M theory or type II, generic Minkowski flux backgrounds preserving N supersymmetries in dimensions D≥4 correspond precisely to integrable generalised G_N structures, where G_N is the generalised structure group defined by the Killing spinors. In other words, they are the analogues of special holonomy manifolds in E_{d(d)}×ℝ⁺ generalised geometry. In establishing this result, we introduce the Kosmann–Dorfman bracket, a generalisation of Kosmann’s Lie derivative of spinors. This allows us to write down the internal sector of the Killing superalgebra, which takes a rather simple form and whose closure is the key step in proving the main result. In addition, we find that the eleven-dimensional Killing superalgebra of these backgrounds is necessarily the supertranslational part of the N-extended super-Poincaré algebra.

  14. Kolkata Restaurant Problem as a Generalised El Farol Bar Problem

    Science.gov (United States)

    Chakrabarti, Bikas K.

    Generalisation of the El Farol bar problem to that of many bars here leads to the Kolkata restaurant problem, where the decision to go to any restaurant or not is much simpler (depending on the previous experience of course, as in the El Farol bar problem). This generalised problem can be exactly analysed in some limiting cases discussed here. The fluctuation in the restaurant service can be shown to have precisely an inverse cubic behaviour, as widely seen in the stock market fluctuations.
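The random-choice limiting case of such a game is straightforward to simulate. In the sketch below each of N agents picks one of N restaurants uniformly at random every evening, and each restaurant serves at most one agent; the sizes and number of evenings are illustrative assumptions. In this limit the mean fraction of restaurants utilised per evening is known to approach 1 − 1/e ≈ 0.63.

```python
import numpy as np

def simulate_random_choice(n_agents, n_restaurants, evenings, seed=0):
    """Each evening every agent independently picks a restaurant at random;
    a restaurant is 'utilised' iff at least one agent chose it. Returns the
    utilisation fraction for each evening."""
    rng = np.random.default_rng(seed)
    fractions = np.empty(evenings)
    for t in range(evenings):
        choices = rng.integers(0, n_restaurants, size=n_agents)
        fractions[t] = np.unique(choices).size / n_restaurants
    return fractions

f = simulate_random_choice(n_agents=1000, n_restaurants=1000, evenings=200)
mean_util = f.mean()   # close to 1 - 1/e in the random-choice limit
```

Smarter strategies (e.g. avoiding yesterday's crowded restaurants) raise this utilisation above the 1 − 1/e baseline, which is what the learning dynamics studied in this literature quantify.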

  15. Loop Amplitudes in Pure Yang-Mills from Generalised Unitarity

    OpenAIRE

    Brandhuber, Andreas; McNamara, Simon; Spence, Bill; Travaglini, Gabriele

    2005-01-01

    We show how generalised unitarity cuts in D = 4 - 2 epsilon dimensions can be used to calculate efficiently complete one-loop scattering amplitudes in non-supersymmetric Yang-Mills theory. This approach naturally generates the rational terms in the amplitudes, as well as the cut-constructible parts. We test the validity of our method by re-deriving the one-loop ++++, -+++, --++, -+-+ and +++++ gluon scattering amplitudes using generalised quadruple cuts and triple cuts in D dimensions.

  16. Loop amplitudes in pure Yang-Mills from generalised unitarity

    International Nuclear Information System (INIS)

    Brandhuber, Andreas; McNamara, Simon; Spence, Bill; Travaglini, Gabriele

    2005-01-01

    We show how generalised unitarity cuts in D = 4-2ε dimensions can be used to calculate efficiently complete one-loop scattering amplitudes in non-supersymmetric Yang-Mills theory. This approach naturally generates the rational terms in the amplitudes, as well as the cut-constructible parts. We test the validity of our method by re-deriving the one-loop ++++, -+++, --++, -+-+ and +++++ gluon scattering amplitudes using generalised quadruple cuts and triple cuts in D dimensions

  17. Loop amplitudes in pure Yang-Mills from generalised unitarity

    Energy Technology Data Exchange (ETDEWEB)

    Brandhuber, Andreas [Department of Physics, Queen Mary, University of London, Mile End Road, London, E1 4NS (United Kingdom); McNamara, Simon [Department of Physics, Queen Mary, University of London, Mile End Road, London, E1 4NS (United Kingdom); Spence, Bill [Department of Physics, Queen Mary, University of London, Mile End Road, London, E1 4NS (United Kingdom); Travaglini, Gabriele [Department of Physics, Queen Mary, University of London, Mile End Road, London, E1 4NS (United Kingdom)

    2005-10-15

    We show how generalised unitarity cuts in D = 4-2{epsilon} dimensions can be used to calculate efficiently complete one-loop scattering amplitudes in non-supersymmetric Yang-Mills theory. This approach naturally generates the rational terms in the amplitudes, as well as the cut-constructible parts. We test the validity of our method by re-deriving the one-loop ++++, -+++, --++, -+-+ and +++++ gluon scattering amplitudes using generalised quadruple cuts and triple cuts in D dimensions.

  18. Towards a 'pointless' generalisation of Yang-Mills theory

    International Nuclear Information System (INIS)

    Chan Hongmo; Tsou Sheungtsun

    1989-05-01

    We examine some generalisations in physical concepts of gauge theories, leading towards a scenario corresponding to non-commutative geometry, where the concept of locality loses its usual meaning of being associated with points on a base manifold and becomes intertwined with the concept of internal symmetry, suggesting thereby a gauge theory of extended objects. Examples are given where such generalised gauge structures can be realised, in particular that of string theory. (author)

  19. The certainty principle (review)

    OpenAIRE

    Arbatsky, D. A.

    2006-01-01

    The certainty principle (2005) made it possible to conceptualise, on more fundamental grounds, both the Heisenberg uncertainty principle (1927) and the Mandelshtam–Tamm relation (1945). In this review I give a detailed explanation and discussion of the certainty principle, oriented to all physicists, both theorists and experimenters.

  20. On the Action of the Radiation Field Generated by a Traveling-Wave Element and Its Connection to the Time Energy Uncertainty Principle, Elementary Charge and the Fine Structure Constant

    Directory of Open Access Journals (Sweden)

    Vernon Cooray

    2017-02-01

    Recently, we published two papers in this journal. One of the papers dealt with the action of the radiation fields generated by a traveling-wave element, and the other dealt with the momentum transferred by the same radiation fields and their connection to the time energy uncertainty principle. The traveling-wave element is defined as a conductor through which a current pulse propagates with the speed of light in free space from one end of the conductor to the other without attenuation. The goal of this letter is to combine the information provided in these two papers and to make conclusive statements concerning the connection between the energy dissipated by the radiation fields, the time energy uncertainty principle and the elementary charge. As we will show here, the results presented in these two papers, when combined, show that the time energy uncertainty principle can be applied to the classical radiation emitted by a traveling-wave element, and it results in the prediction that the smallest charge associated with the current that can be detected using radiated energy as a vehicle is on the order of the elementary charge. Based on the results, an expression for the fine structure constant is obtained. This is the first time that an order of magnitude estimation of the elementary charge based on electromagnetic radiation fields has been obtained. Even though the results obtained in this paper have to be considered as order of magnitude estimations, a strict interpretation of the derived equations shows that the fine structure constant or the elementary charge may change as the size or the age of the universe increases.

  1. Regulatory decision making in the presence of uncertainty in the context of the disposal of long lived radioactive wastes. Third report of the Working group on principles and criteria for radioactive waste disposal

    International Nuclear Information System (INIS)

    1997-10-01

    Plans for disposing of radioactive wastes have raised a number of unique and mostly philosophical problems, mainly due to the very long time-scales which have to be considered. While there is general agreement on disposal concepts and on many aspects of a safety philosophy, consensus on a number of issues remains to be achieved. The IAEA established a subgroup under the International Radioactive Waste Management Advisory Committee (INWAC). The subgroup started its work in 1991 as the ''INWAC Subgroup on Principles and Criteria for Radioactive Waste Disposal''. With the reorganization in 1995 of IAEA senior advisory committees in the nuclear safety area, the title of the group was changed to ''Working Group on Principles and Criteria for Radioactive Waste Disposal''. The working group is intended to provide an open forum for: (1) the discussion and resolution of contentious issues, especially those with an international component, in the area of principles and criteria for safe disposal of waste; (2) the review and analysis of new ideas and concepts in the subject area; (3) establishing areas of consensus; (4) the consideration of issues related to safety principles and criteria in the IAEA's Radioactive Waste Safety Standards (RADWASS) programme; (5) the exchange of information on national safety criteria and policies for radioactive waste disposal. This is the third report of the working group and it deals with the subject of regulatory decision making under conditions of uncertainty which is a matter of concern with respect to disposal of radioactive wastes underground. 14 refs

  2. Thoracic involvement in generalised lymphatic anomaly (or lymphangiomatosis)

    Directory of Open Access Journals (Sweden)

    Francesca Luisi

    2016-06-01

    Full Text Available Generalised lymphatic anomaly (GLA, also known as lymphangiomatosis, is a rare disease caused by congenital abnormalities of lymphatic development. It usually presents in childhood but can also be diagnosed in adults. GLA encompasses a wide spectrum of clinical manifestations ranging from single-organ involvement to generalised disease. Given the rarity of the disease, most of the information regarding it comes from case reports. To date, no clinical trials concerning treatment are available. This review focuses on thoracic GLA and summarises possible diagnostic and therapeutic approaches.

  3. Generalised discrete torsion and mirror symmetry for G2 manifolds

    International Nuclear Information System (INIS)

    Gaberdiel, Matthias R.; Kaste, Peter

    2004-01-01

    A generalisation of discrete torsion is introduced in which different discrete torsion phases are considered for the different fixed points or twist fields of a twisted sector. The constraints that arise from modular invariance are analysed carefully. As an application we show how all the different resolutions of the T^7/Z_2^3 orbifold of Joyce have an interpretation in terms of such generalised discrete torsion orbifolds. Furthermore, we show that these manifolds are pairwise identified under G_2 mirror symmetry. From a conformal field theory point of view, this mirror symmetry arises from an automorphism of the extended chiral algebra of the G_2 compactification. (author)

  4. Uncertainty and complementarity in axiomatic quantum mechanics

    International Nuclear Information System (INIS)

    Lahti, P.J.

    1980-01-01

    An investigation of the uncertainty principle and the complementarity principle is carried through. The physical content of these principles and their representation in the conventional Hilbert space formulation of quantum mechanics forms a natural starting point. Thereafter a more general axiomatic framework for quantum mechanics is presented, namely a probability function formulation of the theory. Two extra axioms are stated, reflecting the ideas of the uncertainty principle and the complementarity principle, respectively. The quantal features of these axioms are explicated. (author)

  5. Relativistic generalisation of the Kroll-Watson formula

    International Nuclear Information System (INIS)

    Kaminski, J.Z.

    1985-01-01

    The relativistic analogue of the space-translation method is derived. Using this method the generalisation of the Kroll-Watson formula [1973, Phys. Rev. A 8, 804] is obtained for the scattering of an arbitrary charged particle (e.g. mesons, hyperons, quarks). The separation of the background and resonant parts of the scattering amplitude is predicted. (author)

  6. A note on a generalisation of Weyl's theory of gravitation

    International Nuclear Information System (INIS)

    Dereli, T.; Tucker, R.W.

    1982-01-01

    A scale-invariant gravitational theory due to Bach and Weyl is generalised by the inclusion of space-time torsion. The difference between the arbitrary and zero torsion constrained variations of the Weyl action is elucidated. Conformal rescaling properties of the gravitational fields are discussed. A new class of classical solutions with torsion is presented. (author)

  7. Commitment of mathematicians in medicine: a personal experience, and generalisations.

    Science.gov (United States)

    Clairambault, Jean

    2011-12-01

    I will present here a personal point of view on the commitment of mathematicians in medicine. Starting from my personal experience, I will suggest generalisations including favourable signs and caveats to show how mathematicians can be welcome and helpful in medicine, both in a theoretical and in a practical way.

  8. Gait analysis of adults with generalised joint hypermobility

    DEFF Research Database (Denmark)

    Simonsen, Erik B; Tegner, Heidi; Alkjær, Tine

    2012-01-01

    BACKGROUND: The majority of adults with Generalised Joint Hypermobility experience symptoms such as pain and joint instability, which is likely to influence their gait pattern. Accordingly, the purpose of the present project was to perform a biomechanical gait analysis on a group of patients...

  9. Generalised time functions and finiteness of the Lorentzian distance

    OpenAIRE

    Rennie, Adam; Whale, Ben E.

    2014-01-01

    We show that finiteness of the Lorentzian distance is equivalent to the existence of generalised time functions with gradient uniformly bounded away from light cones. To derive this result we introduce new techniques to construct and manipulate achronal sets. As a consequence of these techniques we obtain a functional description of the Lorentzian distance extending the work of Franco and Moretti.

  10. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  11. Generalisability of an online randomised controlled trial: an empirical analysis.

    Science.gov (United States)

    Wang, Cheng; Mollan, Katie R; Hudgens, Michael G; Tucker, Joseph D; Zheng, Heping; Tang, Weiming; Ling, Li

    2018-02-01

    Investigators increasingly use online methods to recruit participants for randomised controlled trials (RCTs). However, the extent to which participants recruited online represent populations of interest is unknown. We evaluated how generalisable an online RCT sample is to men who have sex with men in China. Inverse probability of sampling weights (IPSW) and the G-formula were used to examine the generalisability of an online RCT using model-based approaches. Online RCT data and national cross-sectional study data from China were analysed to illustrate the process of quantitatively assessing generalisability. The RCT (identifier NCT02248558) randomly assigned participants to a crowdsourced or health marketing video for promotion of HIV testing. The primary outcome was self-reported HIV testing within 4 weeks, with a non-inferiority margin of -3%. In the original online RCT analysis, the estimated difference in proportions of HIV tested between the two arms (crowdsourcing and health marketing) was 2.1% (95% CI, -5.4% to 9.7%). The hypothesis that the crowdsourced video was not inferior to the health marketing video to promote HIV testing was not demonstrated. The IPSW and G-formula estimated differences were -2.6% (95% CI, -14.2 to 8.9) and 2.7% (95% CI, -10.7 to 16.2), with both approaches also not establishing non-inferiority. Conducting generalisability analysis of an online RCT is feasible. Examining the generalisability of online RCTs is an important step before an intervention is scaled up. NCT02248558. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
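The IPSW idea used in this record can be sketched for the simplest possible case of a single binary covariate (a toy construction of ours with synthetic numbers, not the authors' analysis): trial participants are weighted by the ratio of target-population to trial-sample covariate frequencies, so that the weighted outcome mean estimates the target-population mean.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup: in the target population 30% have X=1, but the online
# trial sample over-represents X=1 (70%). The outcome Y depends on X
# (E[Y|X=1]=2, E[Y|X=0]=1), so the naive trial mean is biased.
n = 100_000
x_target = rng.random(n) < 0.30
x_trial = rng.random(n) < 0.70
y_trial = np.where(x_trial, 2.0, 1.0)

# Inverse probability of sampling weights for one binary covariate:
# w(x) = P(X=x in target) / P(X=x in trial sample)
p_t1, p_s1 = x_target.mean(), x_trial.mean()
w = np.where(x_trial, p_t1 / p_s1, (1 - p_t1) / (1 - p_s1))

naive = y_trial.mean()                 # biased towards E[Y|X=1]
ipsw = np.average(y_trial, weights=w)  # ~ target mean 0.3*2 + 0.7*1 = 1.3
```

With a single binary covariate the weighted mean reduces exactly to the target-weighted mixture of the two strata; in realistic analyses the sampling probabilities are estimated from a model over many covariates.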

  12. Retina as Reciprocal Spatial Fourier Transform Space Implies ``Wave-transformation'' Functions, String Theory, the Inappropriate Uncertainty Principle, and Predicts ``Quarked'' Protons.

    Science.gov (United States)

    Mc Leod, Roger David; Mc Leod, David M.

    2007-10-01

    Vision, via transform space: "Nature behaves in a reciprocal way"; also, Rect x pressure-input sense-reports as Sinc p, indicating the brain interprets reciprocal "p" space as object space. Use Mott's and Sneddon's Wave Mechanics and Its Applications. Wave transformation functions are strings of positron, electron, proton, and neutron; uncertainty is a semantic artifact. Neutrino-string de Broglie-Schrödinger wave-function models for electron, positron, suggest three-quark models for protons, neutrons. Variably vibrating neutrino-quills of this model, with appropriate mass-energy, can be a vertical proton string, quills leftward; thread string circumferentially, forming three interlinked circles with "overpasses". Diameters are 2:1:2, center circle has quills radially outward; call it a down quark, charge -1/3, charge 2/3 for outward quills, the up quarks of outer circles. String overlap summations are nodes; nodes also far left and right. Strong nuclear forces may be -px. "Dislodging" a positron with a neutrino switches the quark-circle configuration to 1:2:1, 'downers' outside. Unstable neutron charge is 0. Atoms build. With scale factors, retinal/vision's and quantum mechanics' spatial Fourier transforms/inverses are equivalent.

  13. A Generalised Fault Protection Structure Proposed for Uni-grounded Low-Voltage AC Microgrids

    Science.gov (United States)

    Bui, Duong Minh; Chen, Shi-Lin; Lien, Keng-Yu; Jiang, Jheng-Lun

    2016-04-01

    This paper presents three main configurations of uni-grounded low-voltage AC microgrids. Transient situations of a uni-grounded low-voltage (LV) AC microgrid (MG) are simulated through various fault tests and operation transition tests between grid-connected and islanded modes. Based on the transient simulation results, available fault protection methods are proposed for main and back-up protection of a uni-grounded AC microgrid. In addition, the concept of a generalised fault protection structure for uni-grounded LVAC MGs is presented. The main contributions of the paper are: (i) definition of different uni-grounded LVAC MG configurations; (ii) analysing transient responses of a uni-grounded LVAC microgrid through line-to-line faults, line-to-ground faults, three-phase faults and a microgrid operation transition test; (iii) proposing available fault protection methods for uni-grounded microgrids, such as: non-directional or directional overcurrent protection, under/over voltage protection, differential current protection, voltage-restrained overcurrent protection, and other fault protection principles not based on phase currents and voltages (e.g. total harmonic distortion detection of currents and voltages, using sequence components of current and voltage, 3I0 or 3V0 components); and (iv) developing a generalised fault protection structure with six individual protection zones suitable for different uni-grounded AC MG configurations.
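The 3I0/3V0 quantities mentioned among the protection principles come from the standard symmetrical-component transform of the three phase quantities. A minimal sketch (an illustrative implementation of the textbook transform; function and variable names are ours):

```python
import cmath

# Symmetrical-component decomposition: with a = exp(j*2*pi/3),
#   I0 = (Ia + Ib + Ic) / 3        (zero sequence; 3*I0 indicates ground faults)
#   I1 = (Ia + a*Ib + a^2*Ic) / 3  (positive sequence)
#   I2 = (Ia + a^2*Ib + a*Ic) / 3  (negative sequence)
a = cmath.exp(2j * cmath.pi / 3)

def sequence_components(ia, ib, ic):
    i0 = (ia + ib + ic) / 3
    i1 = (ia + a * ib + a * a * ic) / 3
    i2 = (ia + a * a * ib + a * ic) / 3
    return i0, i1, i2

# A balanced set (1 p.u., phases 0, -120, +120 degrees) has no zero- or
# negative-sequence component, so 3*I0-based protection stays quiet.
bal = sequence_components(1, a * a, a)
```

An unbalanced set, such as that produced by a single line-to-ground fault, yields a nonzero I0, which is exactly what 3I0-based ground-fault elements detect.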

  14. Change and uncertainty in quantum systems

    International Nuclear Information System (INIS)

    Franson, J.D.

    1996-01-01

    A simple inequality shows that any change in the expectation value of an observable quantity must be associated with some degree of uncertainty. This inequality is often more restrictive than the Heisenberg uncertainty principle. copyright 1996 The American Physical Society
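An inequality of this kind can be obtained from textbook ingredients: combining the Heisenberg equation of motion with the Robertson uncertainty relation bounds the rate of change of any expectation value (a standard Mandelstam-Tamm-type derivation, not necessarily Franson's exact inequality):

```latex
\frac{d\langle A\rangle}{dt} = \frac{i}{\hbar}\,\langle [H, A]\rangle,
\qquad
\Delta A\,\Delta H \ge \frac{1}{2}\,\bigl|\langle [H, A]\rangle\bigr|
\quad\Longrightarrow\quad
\left|\frac{d\langle A\rangle}{dt}\right| \le \frac{2}{\hbar}\,\Delta A\,\Delta H .
```

Any change in the expectation value of A therefore forces a nonzero product of uncertainties, consistent with the claim in the abstract.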

  15. An uncertainty principle for star formation - II. A new method for characterising the cloud-scale physics of star formation and feedback across cosmic history

    Science.gov (United States)

    Kruijssen, J. M. Diederik; Schruba, Andreas; Hygate, Alexander P. S.; Hu, Chia-Yu; Haydon, Daniel T.; Longmore, Steven N.

    2018-05-01

    The cloud-scale physics of star formation and feedback represent the main uncertainty in galaxy formation studies. Progress is hampered by the limited empirical constraints outside the restricted environment of the Local Group. In particular, the poorly-quantified time evolution of the molecular cloud lifecycle, star formation, and feedback obstructs robust predictions on the scales smaller than the disc scale height that are resolved in modern galaxy formation simulations. We present a new statistical method to derive the evolutionary timeline of molecular clouds and star-forming regions. By quantifying the excess or deficit of the gas-to-stellar flux ratio around peaks of gas or star formation tracer emission, we directly measure the relative rarity of these peaks, which allows us to derive their lifetimes. We present a step-by-step, quantitative description of the method and demonstrate its practical application. The method's accuracy is tested in nearly 300 experiments using simulated galaxy maps, showing that it is capable of constraining the molecular cloud lifetime and feedback time-scale to <0.1 dex precision. Access to the evolutionary timeline provides a variety of additional physical quantities, such as the cloud-scale star formation efficiency, the feedback outflow velocity, the mass loading factor, and the feedback energy or momentum coupling efficiencies to the ambient medium. We show that the results are robust for a wide variety of gas and star formation tracers, spatial resolutions, galaxy inclinations, and galaxy sizes. Finally, we demonstrate that our method can be applied out to high redshift (z≲ 4) with a feasible time investment on current large-scale observatories. This is a major shift from previous studies that constrained the physics of star formation and feedback in the immediate vicinity of the Sun.

  16. Free Fall and the Equivalence Principle Revisited

    Science.gov (United States)

    Pendrill, Ann-Marie

    2017-01-01

    Free fall is commonly discussed as an example of the equivalence principle, in the context of a homogeneous gravitational field, which is a reasonable approximation for small test masses falling moderate distances. Newton's law of gravity provides a generalisation to larger distances, and also brings in an inhomogeneity in the gravitational field.…
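The inhomogeneity that Newton's law introduces is the tidal variation of g over the extent of a falling body: to first order in Δr/r, Δg ≈ 2GMΔr/r³. A rough numerical check using standard constants (an illustrative sketch, not taken from the article):

```python
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24    # mass of the Earth, kg
r = 6.371e6     # mean radius of the Earth, m

def g(radius):
    # Newtonian field strength at distance `radius` from the centre
    return G * M / radius**2

# Difference in g across a 2 m tall falling body:
# exact difference vs the first-order tidal estimate 2*G*M*dr / r^3
dr = 2.0
exact = g(r) - g(r + dr)
tidal = 2 * G * M * dr / r**3
```

Both come out around a few microns per second squared, which is why the homogeneous-field approximation is so good for small test masses over moderate distances.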

  17. Myocardial infarction and generalised anxiety disorder : 10-year follow-up

    NARCIS (Netherlands)

    Roest, Annelieke M.; Zuidersma, Marij; de Jonge, Peter

    Background Few studies have addressed the relationship between generalised anxiety disorder and cardiovascular prognosis using a diagnostic interview. Aims To assess the association between generalised anxiety disorder and adverse outcomes in patients with myocardial infarction. Method Patients with

  18. Darwin without borders? Looking at 'generalised Darwinism' through the prism of the 'hourglass model'.

    Science.gov (United States)

    Levit, Georgy S; Hossfeld, Uwe

    2011-12-01

    This article critically analyzes the arguments of the 'generalized Darwinism' recently proposed for the analysis of socio-economic systems. We argue that 'generalized Darwinism' is both restrictive and empty. It is restrictive because it excludes alternative (non-selectionist) evolutionary mechanisms such as orthogenesis, saltationism and mutationism without any examination of their suitability for modeling socio-economic processes, and it ignores their important roles in the development of contemporary evolutionary theory. It is empty because it reduces Darwinism to an abstract triple-principle scheme (variation, selection and inheritance), thus ignoring the actual structure of Darwinism as a complex and dynamic theoretical structure inseparable from a very detailed system of theoretical constraints. Arguing against 'generalised Darwinism' we present our vision of the history of evolutionary biology with the help of the 'hourglass model' reflecting the internal dynamic of competing theories of evolution.

  19. Generalising the logistic map through the q-product

    International Nuclear Information System (INIS)

    Pessoa, R W S; Borges, E P

    2011-01-01

    We investigate a generalisation of the logistic map as x_{n+1} = 1 - a x_n ⊗_{q_map} x_n (-1 ≤ x_n ≤ 1, 0 < a ≤ 2), where ⊗_q is the q-product [Borges, E.P., Physica A 340, 95 (2004)]. The usual product, and consequently the usual logistic map, is recovered in the limit q_map → 1; the tent map is also a particular case for q_map → ∞. The generalisation of this (and other) algebraic operators has been widely used within the nonextensive statistical mechanics context (see C. Tsallis, Introduction to Nonextensive Statistical Mechanics, Springer, NY, 2009). We focus the analysis on q_map > 1 at the edge of chaos, particularly at the first critical point a_c, which depends on the value of q_map. Bifurcation diagrams, sensitivity to initial conditions, fractal dimension and rate of entropy growth are evaluated at a_c(q_map), and connections with nonextensive statistical mechanics are explored.

  20. Generalising the logistic map through the q-product

    Science.gov (United States)

    Pessoa, R. W. S.; Borges, E. P.

    2011-03-01

    We investigate a generalisation of the logistic map as x_{n+1} = 1 - a x_n ⊗_{q_map} x_n (-1 ≤ x_n ≤ 1, 0 < a ≤ 2), where ⊗_q is the q-product [Borges, E.P., Physica A 340, 95 (2004)]. The usual product, and consequently the usual logistic map, is recovered in the limit q → 1. The tent map is also a particular case for q_map → ∞. The generalisation of this (and other) algebraic operators has been widely used within the nonextensive statistical mechanics context (see C. Tsallis, Introduction to Nonextensive Statistical Mechanics, Springer, NY, 2009). We focus the analysis on q_map > 1 at the edge of chaos, particularly at the first critical point a_c, which depends on the value of q_map. Bifurcation diagrams, sensitivity to initial conditions, fractal dimension and rate of entropy growth are evaluated at a_c(q_map), and connections with nonextensive statistical mechanics are explored.
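The q-product and one step of the generalised map can be sketched as follows (an illustrative implementation: we apply the q-product to |x_n| so that q → 1 recovers x_n², a sign convention of ours that may differ in detail from the paper):

```python
def q_product(x, y, q):
    # Borges' q-product: x ⊗_q y = [x^(1-q) + y^(1-q) - 1]_+^(1/(1-q));
    # the ordinary product x*y is recovered in the limit q -> 1.
    if x <= 0 or y <= 0:
        return 0.0
    if abs(q - 1.0) < 1e-12:
        return x * y
    base = x ** (1 - q) + y ** (1 - q) - 1
    return base ** (1 / (1 - q)) if base > 0 else 0.0

def q_logistic_step(x, a, q):
    # x_{n+1} = 1 - a * (x_n ⊗_q x_n); with q -> 1 this is 1 - a*x_n**2.
    return 1.0 - a * q_product(abs(x), abs(x), q)

# One illustrative step with q_map > 1 (parameter values are arbitrary)
x1 = q_logistic_step(0.3, 1.4, 1.5)
```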

  1. Object recognition and generalisation during habituation in horses

    DEFF Research Database (Denmark)

    Christensen, Janne Winther; Zharkikh, Tjatjana; Chovaux, Elodie

    2011-01-01

    The ability of horses to habituate to frightening stimuli greatly increases safety in the horse–human relationship. A recent experiment suggested, however, that habituation to frightening visual stimuli is relatively stimulus-specific in horses and that shape and colour are important factors...... for object generalisation (Christensen et al., 2008). In a series of experiments, we aimed to further explore the ability of horses (n = 30, 1 and 2-year-old mares) to recognise and generalise between objects during habituation. TEST horses (n = 15) were habituated to a complex object, composed of five...... simple objects of varying shape and colour, whereas CONTROL horses (n = 15) were habituated to the test arena, but not to the complex object. In the first experiment, we investigated whether TEST horses subsequently reacted less to i) simple objects that were previously part of the complex object (i...

  2. ''Nature is unknowable''. The idea of uncertainty

    International Nuclear Information System (INIS)

    Crozon, M.

    2000-01-01

    This paper deals with one of the great ideas of the twentieth century, the uncertainty principle of Heisenberg. Taking a philosophical approach, the author explains this principle and presents its cultural impact on thought. (A.L.B.)

  3. Generalisation of geographic information cartographic modelling and applications

    CERN Document Server

    Mackaness, William A; Sarjakoski, L Tiina

    2011-01-01

    Theoretical and Applied Solutions in Multi-Scale Mapping. Users have come to expect instant access to up-to-date geographical information, with global coverage, presented at widely varying levels of detail, as digital and paper products, and customisable data that can readily be combined with other geographic information. These requirements present an immense challenge to those supporting the delivery of such services (national mapping agencies (NMAs), government departments, and private businesses). Generalisation of Geographic Information: Cartographic Modelling and Applications provides a detailed review

  4. Projecting UK mortality using Bayesian generalised additive models

    OpenAIRE

    Hilton, Jason; Dodd, Erengul; Forster, Jonathan; Smith, Peter W.F.

    2018-01-01

    Forecasts of mortality provide vital information about future populations, with implications for pension and health-care policy as well as for decisions made by private companies about life insurance and annuity pricing. This paper presents a Bayesian approach to the forecasting of mortality that jointly estimates a Generalised Additive Model (GAM) for mortality for the majority of the age-range and a parametric model for older ages where the data are sparser. The GAM allows smooth components...

  5. Generalised pole-placement control of steam turbine speed

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez-del-Busto, R. [ITESM, Cuernavaca (Mexico). Div. de Ingenieria y Ciencias; Munoz, J. [ITESM, Xochimilco (Mexico). Div. de Ingenieria y Ciencias

    1996-12-31

    An application of a pole-placement self-tuning predictive control algorithm is developed to regulate speed of a power plant steam turbine model. Two types of system representation (CARMA and CARIMA) are used to test the control algorithm. Simulation results show that when using a CARMA model better results are produced. Two further comparisons are made when using a PI controller and a generalised predictive controller. (author)

  6. The stability of vacuum solutions in generalised gravity

    Energy Technology Data Exchange (ETDEWEB)

    Madsen, M.S. (Sussex Univ., Brighton (UK). Astronomy Centre); Low, R.J. (Coventry (Lanchester) Polytechnic (UK). Dept. of Mathematics)

    1990-05-10

    The stability of the Ricci-flat solutions of a large class of generalised gravity theories is examined. It is shown by use of complementary methods that all such solutions are stable in a given theory if that theory admits a truncation to a quadratic theory in which the solution is stable. In particular, this means that the exterior Schwarzschild solution is stable in any gravity theory constructed purely from the Ricci scalar, provided that it exists in that theory. (orig.).

  7. The stability of vacuum solutions in generalised gravity

    International Nuclear Information System (INIS)

    Madsen, M.S.; Low, R.J.

    1990-01-01

    The stability of the Ricci-flat solutions of a large class of generalised gravity theories is examined. It is shown by use of complementary methods that all such solutions are stable in a given theory if that theory admits a truncation to a quadratic theory in which the solution is stable. In particular, this means that the exterior Schwarzschild solution is stable in any gravity theory constructed purely from the Ricci scalar, provided that it exists in that theory. (orig.)

  8. Learning and Generalisation in Neural Networks with Local Preprocessing

    OpenAIRE

    Kutsia, Merab

    2007-01-01

    We study learning and generalisation ability of a specific two-layer feed-forward neural network and compare its properties to that of a simple perceptron. The input patterns are mapped nonlinearly onto a hidden layer, much larger than the input layer, and this mapping is either fixed or may result from an unsupervised learning process. Such preprocessing of initially uncorrelated random patterns results in the correlated patterns in the hidden layer. The hidden-to-output mapping of the net...

  9. Generalised Multiplicative Indices of Polycyclic Aromatic Hydrocarbons and Benzenoid Systems

    Science.gov (United States)

    Kulli, V. R.; Stone, Branden; Wang, Shaohui; Wei, Bing

    2017-05-01

    Many types of topological indices, such as degree-based, distance-based, and counting-related topological indices, have been explored in recent years. Among the degree-based topological indices, the Zagreb indices are the oldest and have been studied extensively. In this paper, we define a generalised multiplicative version of these indices and compute exact formulas for Polycyclic Aromatic Hydrocarbons and jagged-rectangle Benzenoid systems.
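A multiplicative degree-based index of the kind described can be sketched as follows (an illustrative definition with a free exponent a; `multiplicative_zagreb` is our hypothetical helper, and the exact generalisation used in the paper may differ):

```python
from math import prod

def multiplicative_zagreb(edges, a=1):
    # Generalised multiplicative Zagreb-type indices of a simple graph:
    #   PI1_a(G) = prod over vertices v of deg(v)^a
    #   PI2_a(G) = prod over edges uv of (deg(u)*deg(v))^a
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    pi1 = prod(d ** a for d in deg.values())
    pi2 = prod((deg[u] * deg[v]) ** a for u, v in edges)
    return pi1, pi2

# Benzene ring (C6 cycle): every vertex has degree 2, so with a = 1
# PI1 = 2^6 = 64 and PI2 = (2*2)^6 = 4096.
c6 = [(i, (i + 1) % 6) for i in range(6)]
pi1, pi2 = multiplicative_zagreb(c6, a=1)
```

Larger benzenoid systems are handled the same way once their edge lists are written down; only the degree sequence matters.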

  10. Generalised Chou-Yang model and recent results

    International Nuclear Information System (INIS)

    Fazal-e-Aleem; Rashid, H.

    1995-09-01

    It is shown that the most recent results of the E710 and UA4/2 collaborations for the total cross section and ρ, together with earlier measurements, give good agreement with measurements of the differential cross section at 546 and 1800 GeV within the framework of the Generalised Chou-Yang model. These results are also compared with the predictions of other models. (author). 16 refs, 2 figs

  11. Generalised Chou-Yang model and recent results

    Energy Technology Data Exchange (ETDEWEB)

    Fazal-e-Aleem [International Centre for Theoretical Physics, Trieste (Italy); Rashid, H. [Punjab Univ., Lahore (Pakistan). Centre for High Energy Physics

    1996-12-31

    It is shown that most recent results of E710 and UA4/2 collaboration for the total cross section and ρ together with earlier measurements give good agreement with measurements for the differential cross section at 546 and 1800 GeV within the framework of Generalised Chou-Yang model. These results are also compared with the predictions of other models. (author) 16 refs.

  12. Generalised Chou-Yang model and recent results

    International Nuclear Information System (INIS)

    Fazal-e-Aleem; Rashid, H.

    1996-01-01

    It is shown that most recent results of E710 and UA4/2 collaboration for the total cross section and ρ together with earlier measurements give good agreement with measurements for the differential cross section at 546 and 1800 GeV within the framework of Generalised Chou-Yang model. These results are also compared with the predictions of other models. (author)

  13. A study of idiopathic generalised epilepsy in an Irish population.

    LENUS (Irish Health Repository)

    Mullins, G M

    2012-02-03

    Idiopathic generalised epilepsy (IGE) is subdivided into syndromes based on clinical and EEG features. PURPOSE: The aim of this study was to characterise all cases of IGE with supportive EEG abnormalities in terms of gender differences, seizure types reported, IGE syndromes, family history of epilepsy and EEG findings. We also calculated the limited duration prevalence of IGE in our cohort. METHODS: Data on abnormal EEGs were collected retrospectively from two EEG databases at two tertiary referral centres for neurology. Clinical information was obtained from EEG request forms, standardised EEG questionnaires and medical notes of patients. RESULTS: Two hundred and twenty-three patients met our inclusion criteria, 89 (39.9%) male and 134 (60.1%) female. Tonic clonic seizures were the most common seizure type reported, with 162 (72.65%) having a generalised tonic clonic seizure (GTCS) at some time. IGE with GTCS only (EGTCSA) was the most common syndrome in our cohort, being present in 94 patients (34 male, 60 female), with 42 (15 male, 27 female) patients diagnosed with juvenile myoclonic epilepsy (JME), 23 (9 male, 14 female) with juvenile absence epilepsy (JAE) and 20 (9 male, 11 female) with childhood absence epilepsy (CAE). EEG studies in all patients showed generalised epileptiform activity. CONCLUSIONS: More women than men were diagnosed with generalised epilepsy. Tonic clonic seizures were the most common seizure type reported. EGTCSA was the most frequent syndrome seen. Gender differences were evident for JAE and JME, as previously reported, and for EGTCSA, which had not been reported to date; the differences reached statistical significance for EGTCSA and JME.

  14. Generalised pruritus as a presentation of Graves' disease

    OpenAIRE

    Tan, CE; Loh, KY

    2013-01-01

    Pruritus is a lesser known symptom of hyperthyroidism, particularly in autoimmune thyroid disorders. This is a case report of a 27-year-old woman who presented with generalised pruritus at a primary care clinic. Incidental findings of tachycardia and a goiter led to the investigations of her thyroid status. The thyroid function test revealed elevated serum free T4 and suppressed thyroid stimulating hormone levels. The anti-thyroid antibodies were positive. She was diagnosed with Graves’ disea...

  15. Dirac equations for generalised Yang-Mills systems

    International Nuclear Information System (INIS)

    Lechtenfeld, O.; Nahm, W.; Tchrakian, D.H.

    1985-06-01

    We present Dirac equations in 4p dimensions for the generalised Yang-Mills (GYM) theories introduced earlier. These Dirac equations are related to the self-duality equations of the GYM and are checked to be elliptic in a 'BPST' background. In this background these Dirac equations are integrated exactly. The possibility of imposing supersymmetry in the GYM-Dirac system is investigated, with negative results. (orig.)

  16. Generalised summation-by-parts operators and variable coefficients

    Science.gov (United States)

    Ranocha, Hendrik

    2018-06-01

    High-order methods for conservation laws can be highly efficient if their stability is ensured. A suitable means of mimicking estimates of the continuous level is provided by summation-by-parts (SBP) operators and the weak enforcement of boundary conditions. Recently, there has been an increasing interest in generalised SBP operators both in the finite difference and the discontinuous Galerkin spectral element framework. However, if generalised SBP operators are used, the treatment of the boundaries becomes more difficult since some properties of the continuous level are no longer mimicked discretely - interpolating the product of two functions will in general result in a value different from the product of the interpolations. Thus, desired properties such as conservation and stability are more difficult to obtain. Here, new formulations are proposed, allowing the creation of discretisations using general SBP operators that are both conservative and stable. Thus, several shortcomings that might be attributed to generalised SBP operators are overcome (cf. Nordström and Ruggiu (2017) [38] and Manzanero et al. (2017) [39]).
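The SBP property the record refers to, a discrete mimic of integration by parts, can be illustrated with the classical (non-generalised) second-order operator (a minimal sketch; the generalised operators of the paper relax exactly these boundary assumptions):

```python
import numpy as np

# Classical second-order SBP first-derivative operator on n grid points:
# D = P^{-1} Q with P a diagonal quadrature (norm) matrix and
# Q + Q^T = B = diag(-1, 0, ..., 0, 1), mimicking integration by parts:
#   u^T P (D v) + (D u)^T P v = u_N v_N - u_0 v_0.
n, h = 11, 0.1
P = h * np.diag([0.5] + [1.0] * (n - 2) + [0.5])

D = np.zeros((n, n))
D[0, :2] = [-1, 1]; D[0] /= h           # one-sided stencil at the left boundary
D[-1, -2:] = [-1, 1]; D[-1] /= h        # one-sided stencil at the right boundary
for i in range(1, n - 1):               # central difference in the interior
    D[i, i - 1], D[i, i + 1] = -1 / (2 * h), 1 / (2 * h)

Q = P @ D
B = np.zeros((n, n)); B[0, 0], B[-1, -1] = -1.0, 1.0

assert np.allclose(Q + Q.T, B)          # the summation-by-parts property
```

For generalised SBP operators the quadrature P need not be diagonal and the boundary terms need not be point values, which is what complicates products and interpolation in the way the abstract describes.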

  17. Generalised pollination systems for three invasive milkweeds in Australia.

    Science.gov (United States)

    Ward, M; Johnson, S D

    2013-05-01

    Because most plants require pollinator visits for seed production, the ability of an introduced plant species to establish pollinator relationships in a new ecosystem may have a central role in determining its success or failure as an invader. We investigated the pollination ecology of three milkweed species - Asclepias curassavica, Gomphocarpus fruticosus and G. physocarpus - in their invaded range in southeast Queensland, Australia. The complex floral morphology of milkweeds has often been interpreted as a general trend towards specialised pollination requirements. Based on this interpretation, invasion by milkweeds contradicts the expectation that plant species with specialised pollination systems are less likely to become invasive than those with more generalised pollination requirements. However, observations of flower visitors in natural populations of the three study species revealed that their pollination systems are essentially specialised at the taxonomic level of the order, but generalised at the species level. Specifically, pollinators of the two Gomphocarpus species included various species of Hymenoptera (particularly vespid wasps), while pollinators of A. curassavica were primarily Lepidoptera (particularly nymphalid butterflies). Pollinators of all three species are rewarded with copious amounts of highly concentrated nectar. It is likely that successful invasion by these three milkweed species is attributable, at least in part, to their generalised pollinator requirements. The results of this study are discussed in terms of how data from the native range may be useful in predicting pollination success of species in a new environment. © 2012 German Botanical Society and The Royal Botanical Society of the Netherlands.

  18. Influence of colour on acquisition and generalisation of graphic symbols.

    Science.gov (United States)

    Hetzroni, O E; Ne'eman, A

    2013-07-01

    Children with autism may benefit from using graphic symbols for their communication, language and literacy development. The purpose of this study was to investigate the influence of colour versus grey-scale displays on the identification of graphic symbols using a computer-based intervention. An alternating treatment design was employed to examine the learning and generalisation of 58 colour and grey-scale symbols by four preschool children with autism. The graphic symbols were taught via a meaning-based intervention using stories and educational games. Results demonstrate that all of the children were able to learn and maintain symbol identification over time for both symbol displays with no apparent differences. Differences were apparent for two of the children, who exhibited better generalisation when learning grey-scale symbols first. The other two showed no noticeable difference between displays when generalising from one display to the other. Implications and further research are discussed. © 2012 The Authors. Journal of Intellectual Disability Research © 2012 John Wiley & Sons Ltd, MENCAP & IASSID.

  19. A Generalised Approach to Petri Nets and Algebraic Specifications

    International Nuclear Information System (INIS)

    Sivertsen, Terje

    1998-02-01

    The present report represents a continuation of the work on Petri nets and algebraic specifications. The reported research has focused on generalising the approach introduced in HWR-454, with the aim of facilitating the translation of a wider class of Petri nets into algebraic specifications. This includes autonomous Petri nets with increased descriptive power, as well as non-autonomous Petri nets allowing the modelling of systems (1) involving extensive data processing; (2) with transitions synchronized on external events; (3) whose evolutions are time dependent. The generalised approach has the important property of being modular in the sense that the translated specifications can be gradually extended to include data processing, synchronization, and timing. The report also discusses the relative merits of state-based and transition-based specifications, and includes a non-trivial case study involving automated proofs of a large number of interrelated theorems. The examples in the report illustrate the use of the new HRP Prover. Of particular importance in this context is the automatic transformation between state-based and transition-based specifications. It is expected that the approach introduced in HWR-454 and generalised in the present report will prove useful in future work on the combination of a wide variety of specification techniques.

  20. Working dogs cooperate among one another by generalised reciprocity.

    Science.gov (United States)

    Gfrerer, Nastassja; Taborsky, Michael

    2017-03-06

    Cooperation by generalised reciprocity implies that individuals apply the decision rule "help anyone if helped by someone". This mechanism has been shown to generate evolutionarily stable levels of cooperation, but as yet it is unclear how widely this cooperation mechanism is applied among animals. Dogs (Canis familiaris) are highly social animals with considerable cognitive potential and the ability to differentiate between individual social partners. But although dogs can solve complex problems, they may use simple rules for behavioural decisions. Here we show that dogs trained in an instrumental cooperative task to provide food to a social partner help conspecifics more often after receiving help from a dog before. Remarkably, in so doing they show no distinction between partners that had helped them before and completely unfamiliar conspecifics. Apparently, dogs use the simple decision rule characterising generalised reciprocity, although they are probably capable of using the more complex decision rule of direct reciprocity: "help someone who has helped you". However, generalised reciprocity involves lower information processing costs and is therefore a cheaper cooperation strategy. Our results imply that generalised reciprocity might be applied more commonly than direct reciprocity in other mutually cooperating animals as well.
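The contrast between the two decision rules quoted in this abstract can be made concrete with a toy model. The class and partner names below are hypothetical illustrations, not the study's methodology:

```python
# Toy sketch of the two decision rules from the text:
# generalised reciprocity needs only a single "was I helped?" bit,
# while direct reciprocity must track individual partners.

class GeneralisedReciprocator:
    def __init__(self):
        self.helped_recently = False        # one bit of memory

    def decides_to_help(self, partner_id):
        return self.helped_recently         # "help anyone if helped by someone"

    def receive_help(self, partner_id):
        self.helped_recently = True

class DirectReciprocator:
    def __init__(self):
        self.helpers = set()                # must remember individuals

    def decides_to_help(self, partner_id):
        return partner_id in self.helpers   # "help someone who has helped you"

    def receive_help(self, partner_id):
        self.helpers.add(partner_id)

gr, dr = GeneralisedReciprocator(), DirectReciprocator()
gr.receive_help("dog_A")
dr.receive_help("dog_A")

# A stranger asks for help: only the generalised rule extends cooperation.
print(gr.decides_to_help("stranger"))  # True
print(dr.decides_to_help("stranger"))  # False
```

The single-bit state of the first class is the "lower information processing cost" the abstract refers to.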

  1. Decommissioning funding: ethics, implementation, uncertainties

    International Nuclear Information System (INIS)

    2006-01-01

    This status report on Decommissioning Funding: Ethics, Implementation, Uncertainties also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). The report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems. (authors)

  2. The Uncertainty of Measurement Results

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Factors affecting the uncertainty of measurement are explained, basic statistical formulae given, and the theoretical concept explained in the context of pesticide formulation analysis. Practical guidance is provided on how to determine individual uncertainty components within an analytical procedure. An extended and comprehensive table containing the relevant mathematical/statistical expressions elucidates the relevant underlying principles. Appendix I provides a practical elaborated example on measurement uncertainty estimation, above all utilizing experimental repeatability and reproducibility laboratory data. (author)
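The combination of uncertainty components described above can be sketched as follows; all numbers are invented for the example and are not taken from the cited guidance:

```python
import math

# Illustrative sketch: combine a Type A repeatability component with
# Type B components in quadrature, then expand with coverage factor k = 2.

def type_a_uncertainty(replicates):
    """Standard uncertainty of the mean from repeated measurements."""
    n = len(replicates)
    mean = sum(replicates) / n
    var = sum((x - mean) ** 2 for x in replicates) / (n - 1)  # sample variance
    return math.sqrt(var / n)

replicates = [10.12, 10.08, 10.15, 10.11, 10.09]  # hypothetical assay results
u_rep = type_a_uncertainty(replicates)
u_cal = 0.02   # balance calibration (Type B, assumed)
u_vol = 0.03   # volumetric glassware (Type B, assumed)

u_c = math.sqrt(u_rep**2 + u_cal**2 + u_vol**2)  # combined standard uncertainty
U = 2 * u_c                                      # expanded uncertainty, k = 2
print(round(u_c, 4), round(U, 4))
```

The quadrature sum assumes the components are independent, which is the usual starting point in such budgets.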

  3. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." - Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  4. Uncertainty and Complementarity in Axiomatic Quantum Mechanics

    Science.gov (United States)

    Lahti, Pekka J.

    1980-11-01

    In this work an investigation of the uncertainty principle and the complementarity principle is carried out. A study of the physical content of these principles and their representation in the conventional Hilbert space formulation of quantum mechanics forms a natural starting point for this analysis. Thereafter a more general axiomatic framework for quantum mechanics is presented, namely a probability function formulation of the theory. In this general framework two extra axioms are stated, reflecting the ideas of the uncertainty principle and the complementarity principle, respectively. The quantal features of these axioms are explicated. The sufficiency of the state system guarantees that the observables satisfying the uncertainty principle are unbounded and noncompatible. The complementarity principle implies a non-Boolean proposition structure for the theory. Moreover, nonconstant complementary observables are always noncompatible. The uncertainty principle and the complementarity principle, as formulated in this work, are mutually independent. Some order is thus brought into the confused discussion about the interrelations of these two important principles. A comparison of the present formulations of the uncertainty principle and the complementarity principle with the Jauch formulation of the superposition principle is also given. The mutual independence of the three fundamental principles of the quantum theory is hereby revealed.

  5. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded or not. Estimation of measurement uncertainty is often not trivial. Several strategies have been developed for this purpose and are described briefly in this chapter. In addition, the different possibilities to take the uncertainty into account in compliance assessment are explained.

  6. Coherence Generalises Duality: A Logical Explanation of Multiparty Session Types

    DEFF Research Database (Denmark)

    Carbone, Marco; Lindley, Sam; Montesi, Fabrizio

    2016-01-01

    Wadler introduced Classical Processes (CP), a calculus based on a propositions-as-types correspondence between propositions of classical linear logic and session types. Carbone et al. introduced Multiparty Classical Processes, a calculus that generalises CP to multiparty session types, by replacing the duality of classical linear logic (relating two types) with a more general notion of coherence (relating an arbitrary number of types). This paper introduces variants of CP and MCP, plus a new intermediate calculus of Globally-governed Classical Processes (GCP). We show a tight relation between...

  7. On quantization, the generalised Schroedinger equation and classical mechanics

    International Nuclear Information System (INIS)

    Jones, K.R.W.

    1991-01-01

    A ψ-dependent linear functional operator was defined, which solves the problem of quantization in non-relativistic quantum mechanics. Weyl ordering is implemented automatically and permits derivation of many of the quantum to classical correspondences. The parameter λ presents a natural C∞ deformation of the dynamical structure of quantum mechanics via a non-linear integro-differential 'Generalised Schroedinger Equation', admitting an infinite family of soliton solutions. All these solutions are presented and it is shown that this equation gives an exact dynamic and energetic reproduction of classical mechanics with the correct measurement theoretic limit. 23 refs

  8. Generalised boundary terms for higher derivative theories of gravity

    Energy Technology Data Exchange (ETDEWEB)

    Teimouri, Ali; Talaganis, Spyridon; Edholm, James [Consortium for Fundamental Physics, Lancaster University,North West Drive, Lancaster, LA1 4YB (United Kingdom); Mazumdar, Anupam [Consortium for Fundamental Physics, Lancaster University,North West Drive, Lancaster, LA1 4YB (United Kingdom); Kapteyn Astronomical Institute, University of Groningen,9700 AV Groningen (Netherlands)

    2016-08-24

    In this paper we wish to find the corresponding Gibbons-Hawking-York term for the most general quadratic in curvature gravity by using Coframe slicing within the Arnowitt-Deser-Misner (ADM) decomposition of spacetime in four dimensions. In order to make sure that the higher derivative gravity is ghost and tachyon free at a perturbative level, one requires infinite covariant derivatives, which yields a generalised covariant infinite derivative theory of gravity. We will be exploring the boundary term for such a covariant infinite derivative theory of gravity.

  9. Building Abelian Functions with Generalised Baker-Hirota Operators

    Directory of Open Access Journals (Sweden)

    Matthew England

    2012-06-01

    Full Text Available We present a new systematic method to construct Abelian functions on Jacobian varieties of plane algebraic curves. The main tool used is a symmetric generalisation of the bilinear operator defined in the work of Baker and Hirota. We give explicit formulae for the multiple applications of the operators, use them to define infinite sequences of Abelian functions of a prescribed pole structure and deduce the key properties of these functions. We apply the theory to the two canonical curves of genus three, presenting new explicit examples of vector space bases of Abelian functions. These reveal previously unseen similarities between the theories of functions associated to curves of the same genus.

  10. Application of a Bayesian/generalised least-squares method to generate correlations between independent neutron fission yield data

    International Nuclear Information System (INIS)

    Fiorito, L.; Diez, C.; Cabellos, O.; Stankovskiy, A.; Van den Eynde, G.; Labeau, P.E.

    2014-01-01

    Fission product yields are fundamental parameters for several nuclear engineering calculations and in particular for burn-up/activation problems. The impact of their uncertainties was widely studied in the past and evaluations were released, although still incomplete. Recently, the nuclear community expressed the need for full fission yield covariance matrices to produce inventory calculation results that take into account the complete uncertainty data. In this work, we studied and applied a Bayesian/generalised least-squares method for covariance generation, and compared the generated uncertainties to the original data stored in the JEFF-3.1.2 library. Then, we focused on the effect of fission yield covariance information on fission pulse decay heat results for thermal fission of 235 U. Calculations were carried out using different codes (ACAB and ALEPH-2) after introducing the new covariance values. Results were compared with those obtained with the uncertainty data currently provided by the library. The uncertainty quantification was performed with the Monte Carlo sampling technique. Indeed, correlations between fission yields strongly affect the statistics of decay heat. (authors)
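The covariance-generation idea described above, a generalised least-squares update that turns independent yield uncertainties into a correlated matrix by imposing a constraint, can be sketched as follows. The yields, uncertainties and sum rule below are illustrative placeholders, not JEFF-3.1.2 data:

```python
import numpy as np

# Hedged sketch of a GLS/Bayesian update: start from a diagonal (independent)
# prior covariance C0 and impose a linear constraint S y = target (here, that
# the yields sum to a known value). The update induces correlations.

y0 = np.array([0.40, 0.35, 0.25])          # prior (independent) yields
C0 = np.diag([0.02, 0.03, 0.01]) ** 2      # prior diagonal covariance
S = np.ones((1, 3))                        # constraint operator: sum of yields
v = np.array([[1e-6]])                     # small uncertainty on the constraint
target = np.array([1.0])                   # constraint value (illustrative)

K = C0 @ S.T @ np.linalg.inv(S @ C0 @ S.T + v)   # GLS gain
y1 = y0 + K @ (target - S @ y0)                  # updated yields
C1 = C0 - K @ S @ C0                             # updated covariance

corr = C1 / np.sqrt(np.outer(np.diag(C1), np.diag(C1)))
print(y1.sum())        # constraint now (nearly) satisfied
print(corr[0, 1])      # negative: the sum rule anti-correlates the yields
```

The induced anti-correlations are exactly the kind of off-diagonal information the abstract says is needed for realistic decay-heat uncertainty statistics.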

  11. A realization of the uncertainty principle

    Directory of Open Access Journals (Sweden)

    V. M. Dilnyi

    2015-07-01

    Full Text Available We obtain the statement about the imitation behavior of the sum of functions on the real half-line by each of the summands under some conditions for these functions and their Laplace transforms.

  12. Effect Displays in R for Generalised Linear Models

    Directory of Open Access Journals (Sweden)

    John Fox

    2003-07-01

    Full Text Available This paper describes the implementation in R of a method for tabular or graphical display of terms in a complex generalised linear model. By complex, I mean a model that contains terms related by marginality or hierarchy, such as polynomial terms, or main effects and interactions. I call these tables or graphs effect displays. Effect displays are constructed by identifying high-order terms in a generalised linear model. Fitted values under the model are computed for each such term. The lower-order "relatives" of a high-order term (e.g., main effects marginal to an interaction) are absorbed into the term, allowing the predictors appearing in the high-order term to range over their values. The values of other predictors are fixed at typical values: for example, a covariate could be fixed at its mean or median, a factor at its proportional distribution in the data, or at equal proportions in its several levels. Variations of effect displays are also described, including representation of terms higher-order to any appearing in the model.
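The construction described, fitted values for a focal term with the other predictors fixed at typical values, can be sketched outside R. The logistic coefficients below are hypothetical, chosen only to illustrate the mechanics:

```python
import math

# Minimal sketch of an effect display (not Fox's R code): for a logistic
# model with focal predictor x1 and covariate x2, evaluate the fitted
# probability over a grid of x1 while x2 is held at a typical value.

b0, b1, b2 = -1.0, 0.8, 0.5                    # hypothetical fitted coefficients
x2_values = [1.2, 0.7, 1.9, 1.0, 1.4]
x2_typical = sum(x2_values) / len(x2_values)   # covariate fixed at its mean

def fitted(x1, x2):
    eta = b0 + b1 * x1 + b2 * x2               # linear predictor
    return 1.0 / (1.0 + math.exp(-eta))        # inverse logit link

# Effect of x1: fitted values on a grid, x2 absorbed at its typical value
effect_x1 = [(x1, fitted(x1, x2_typical)) for x1 in range(-2, 3)]
for x1, p in effect_x1:
    print(x1, round(p, 3))
```

Plotting the (x1, p) pairs gives the graphical form of the display; the tabular form is the list itself.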

  13. Cosmological principles. II. Physical principles

    International Nuclear Information System (INIS)

    Harrison, E.R.

    1974-01-01

    The discussion of cosmological principle covers the uniformity principle of the laws of physics, the gravitation and cognizability principles, and the Dirac creation, chaos, and bootstrap principles. (U.S.)

  14. Supersymmetry for gauged double field theory and generalised Scherk–Schwarz reductions

    International Nuclear Information System (INIS)

    Berman, David S.; Lee, Kanghoon

    2014-01-01

    Previous constructions of supersymmetry for double field theory have relied on the so-called strong constraint. In this paper, the strong constraint is relaxed and the theory is shown to possess supersymmetry once the generalised Scherk–Schwarz reduction is imposed. The equivalence between the generalised Scherk–Schwarz reduced theory and the gauged double field theory is then examined in detail for the supersymmetric theory. As a by-product, we write the generalised Killing spinor equations for the supersymmetric double field theory.

  15. Generalised relativistic Ohm's laws, extended gauge transformations, and magnetic linking

    International Nuclear Information System (INIS)

    Pegoraro, F.

    2015-01-01

    Generalisations of the relativistic ideal Ohm's law are presented that include specific dynamical features of the current carrying particles in a plasma. Cases of interest for space and laboratory plasmas are identified where these generalisations allow for the definition of generalised electromagnetic fields that transform under a Lorentz boost in the same way as the real electromagnetic fields and that obey the same set of homogeneous Maxwell's equations

  16. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  17. Vacancies and a generalised melting curve of metals

    International Nuclear Information System (INIS)

    Gorecki, T.

    1979-01-01

    The vacancy mechanism of the melting process is used as a starting point for deriving an expression for the pressure dependence of the melting temperature of metals. The results obtained for the initial slope of the melting curve are compared with experimental data for 45 metals and in most cases the agreement is very good. The nonlinearity of the melting curve and the appearance of a maximum on the melting curve at a pressure approximately equal to the bulk modules is also predicted, with qualitative agreement with experimental data. A relation between bonding energy, atomic volume, and bulk modulus of metals is established. On the basis of this relation and the proposed vacancy mechanism, a generalised equation for the pressure dependence of the melting temperature of metals is derived. (author)

  18. Generalised additive modelling approach to the fermentation process of glutamate.

    Science.gov (United States)

    Liu, Chun-Bo; Li, Yun; Pan, Feng; Shi, Zhong-Ping

    2011-03-01

    In this work, generalised additive models (GAMs) were used for the first time to model the fermentation of glutamate (Glu). It was found that three fermentation parameters, fermentation time (T), dissolved oxygen (DO) and oxygen uptake rate (OUR), could capture 97% of the variance of the production of Glu during the fermentation process through a GAM model calibrated using online data from 15 fermentation experiments. This model was applied to investigate the individual and combined effects of T, DO and OUR on the production of Glu. The conditions to optimize the fermentation process were proposed based on the simulation study from this model. Results suggested that the production of Glu can reach a high level by controlling the concentration levels of DO and OUR to the proposed optimization conditions during the fermentation process. The GAM approach therefore provides an alternative way to model and optimize the fermentation process of Glu. Crown Copyright © 2010. Published by Elsevier Ltd. All rights reserved.
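The additive structure underlying a GAM can be illustrated with a toy backfitting loop on synthetic data. This is a generic sketch of the model class, not the authors' calibrated fermentation model:

```python
import numpy as np

# Hedged sketch of the additive-model idea behind a GAM: fit
# y ~ a + f1(x1) + f2(x2) by backfitting, using a crude moving-average
# smoother. Data are synthetic, not fermentation measurements.

rng = np.random.default_rng(0)
n = 400
x1 = rng.uniform(-2, 2, n)
x2 = rng.uniform(-2, 2, n)
y = np.sin(x1) + 0.5 * x2**2 + rng.normal(0, 0.1, n)

def smooth(x, r, window=31):
    """Moving-average smoother of partial residuals r, ordered by x."""
    order = np.argsort(x)
    sm = np.convolve(r[order], np.ones(window) / window, mode="same")
    out = np.empty_like(r)
    out[order] = sm
    return out - out.mean()          # centre so the intercept absorbs the mean

a = y.mean()
f1 = np.zeros(n)
f2 = np.zeros(n)
for _ in range(20):                  # backfitting iterations
    f1 = smooth(x1, y - a - f2)      # update f1 holding f2 fixed
    f2 = smooth(x2, y - a - f1)      # update f2 holding f1 fixed

resid = y - a - f1 - f2
r2 = 1 - resid.var() / y.var()       # fraction of variance captured
print(round(r2, 3))
```

Production GAM software replaces the moving average with penalised splines, but the alternating update is the same idea as quantifying how much variance a few smooth terms capture.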

  19. Spatial generalised linear mixed models based on distances.

    Science.gov (United States)

    Melo, Oscar O; Mateu, Jorge; Melo, Carlos E

    2016-10-01

    Risk models derived from environmental data have been widely shown to be effective in delineating geographical areas of risk because they are intuitively easy to understand. We present a new method based on distances, which allows the modelling of continuous and non-continuous random variables through distance-based spatial generalised linear mixed models. The parameters are estimated using Markov chain Monte Carlo maximum likelihood, which is a feasible and a useful technique. The proposed method depends on a detrending step built from continuous or categorical explanatory variables, or a mixture among them, by using an appropriate Euclidean distance. The method is illustrated through the analysis of the variation in the prevalence of Loa loa among a sample of village residents in Cameroon, where the explanatory variables included elevation, together with maximum normalised-difference vegetation index and the standard deviation of normalised-difference vegetation index calculated from repeated satellite scans over time. © The Author(s) 2013.

  20. Adapting Metacognitive Therapy to Children with Generalised Anxiety Disorder

    DEFF Research Database (Denmark)

    Esbjørn, Barbara Hoff; Normann, Nicoline; Reinholdt-Dunne, Marie Louise

    2015-01-01

    The metacognitive model and therapy has proven to be a promising theory and intervention for emotional disorders in adults. The model has also received empirical support in normal and clinical child samples. The purpose of the present study was to adapt metacognitive therapy to children (MCT-c) with generalised anxiety disorder (GAD) and create suggestions for an adapted manual. The adaptation was based on the structure and techniques used in MCT for adults with GAD. However, the developmental limitations of children were taken into account. For instance, therapy was aided with worksheets and practical exercises and delivered in a group format. Overall, the intervention relied heavily on practising MCT techniques in vivo with therapist assistance. A detailed description of how the manual was adapted for this age group is given, and examples from a group of four children are presented in a case series...

  1. Visceral obesity and psychosocial stress: a generalised control theory model

    Science.gov (United States)

    Wallace, Rodrick

    2016-07-01

    The linking of control theory and information theory via the Data Rate Theorem and its generalisations allows for the construction of necessary-conditions statistical models of body mass regulation in the context of interaction with a complex dynamic environment. By focusing on the stress-related induction of central obesity via failure of HPA axis regulation, we explore implications for strategies of prevention and treatment. It rapidly becomes evident that individual-centred biomedical reductionism is an inadequate paradigm. Without mitigation of HPA axis or related dysfunctions arising from social pathologies of power imbalance, economic insecurity, and so on, it is unlikely that permanent changes in visceral obesity for individuals can be maintained without constant therapeutic effort, an expensive - and likely unsustainable - public policy.

  2. Multiplicative quiver varieties and generalised Ruijsenaars-Schneider models

    Science.gov (United States)

    Chalykh, Oleg; Fairon, Maxime

    2017-11-01

    We study some classical integrable systems naturally associated with multiplicative quiver varieties for the (extended) cyclic quiver with m vertices. The phase space of our integrable systems is obtained by quasi-Hamiltonian reduction from the space of representations of the quiver. Three families of Poisson-commuting functions are constructed and written explicitly in suitable Darboux coordinates. The case m = 1 corresponds to the tadpole quiver and the Ruijsenaars-Schneider system and its variants, while for m > 1 we obtain new integrable systems that generalise the Ruijsenaars-Schneider system. These systems and their quantum versions also appeared recently in the context of supersymmetric gauge theory and cyclotomic DAHAs (Braverman et al. [32,34,35] and Kodera and Nakajima [36]), as well as in the context of the Macdonald theory (Chalykh and Etingof, 2013).

  3. Optimising, generalising and integrating educational practice using neuroscience

    Science.gov (United States)

    Colvin, Robert

    2016-07-01

    Practical collaboration at the intersection of education and neuroscience research is difficult because the combined discipline encompasses both the activity of microscopic neurons and the complex social interactions of teachers and students in a classroom. Taking a pragmatic view, this paper discusses three education objectives to which neuroscience can be effectively applied: optimising, generalising and integrating instructional techniques. These objectives are characterised by: (1) being of practical importance; (2) building on existing education and cognitive research; and (3) being infeasible to address based on behavioural experiments alone. The focus of the neuroscientific aspect of collaborative research should be on the activity of the brain before, during and after learning a task, as opposed to performance of a task. The objectives are informed by literature that highlights possible pitfalls with educational neuroscience research, and are described with respect to the static and dynamic aspects of brain physiology that can be measured by current technology.

  4. Generalised pruritus as a presentation of Grave’s disease

    Directory of Open Access Journals (Sweden)

    Tan CE

    2013-05-01

    Full Text Available Pruritus is a lesser known symptom of hyperthyroidism, particularly in autoimmune thyroid disorders. This is a case report of a 27-year-old woman who presented with generalised pruritus at a primary care clinic. Incidental findings of tachycardia and a goiter led to the investigations of her thyroid status. The thyroid function test revealed elevated serum free T4 and suppressed thyroid stimulating hormone levels. The anti-thyroid antibodies were positive. She was diagnosed with Graves’ disease and treated with carbimazole until her symptoms subsided. Graves’ disease should be considered as an underlying cause for patients presenting with pruritus. A thorough history and complete physical examination are crucial in making an accurate diagnosis. Underlying causes must be determined before treating the symptoms.

  5. Generalised two target localisation using passive monopulse radar

    KAUST Repository

    Jardak, Seifallah

    2017-04-07

    The simultaneous lobing technique, also known as the monopulse technique, has been widely used for fast target localisation and tracking purposes. Many works focused on accurately localising one or two targets lying within a narrow beam centred around the monopulse antenna boresight. In this study, a new approach is proposed, which uses the outputs of four antennas to rapidly localise two point targets present in the hemisphere. If both targets have the same elevation angle, the proposed scheme cannot detect them. To detect such targets, a second set of antennas is required. In this study, to detect two targets at generalised locations, the antenna array is divided into multiple overlapping sets, each of four antennas. Two algorithms are proposed to combine the outputs from multiple sets and improve the detection performance. Simulation results show that the algorithm is able to localise both targets with less than 2° mean square error in azimuth and elevation.
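A textbook amplitude-comparison monopulse estimate for a single target can be sketched as below. The Gaussian beam model and the squint and beamwidth values are assumptions for illustration; this does not reproduce the authors' two-target passive scheme:

```python
import math

# Sketch of amplitude-comparison monopulse with four squinted Gaussian
# beams. For Gaussian patterns the log-ratio of opposite beams is exactly
# linear in the target angle, so the angle can be recovered in closed form.

s, sigma = 2.0, 4.0                 # squint and beamwidth parameters (deg), assumed

def gain(daz, delv):
    return math.exp(-(daz**2 + delv**2) / (2 * sigma**2))

def received(az, el):
    """Amplitudes of the four beams squinted to (±s, ±s)."""
    return {(i, j): gain(az - i * s, el - j * s)
            for i in (+1, -1) for j in (+1, -1)}

def estimate(beams):
    # log-ratio of the beam pair squinted right vs. left (and up vs. down),
    # each pair multiplied over the orthogonal squint so it cancels out
    az = sigma**2 / (4 * s) * math.log(
        (beams[(1, 1)] * beams[(1, -1)]) / (beams[(-1, 1)] * beams[(-1, -1)]))
    el = sigma**2 / (4 * s) * math.log(
        (beams[(1, 1)] * beams[(-1, 1)]) / (beams[(1, -1)] * beams[(-1, -1)]))
    return az, el

az_hat, el_hat = estimate(received(1.3, -0.7))
print(round(az_hat, 3), round(el_hat, 3))   # recovers (1.3, -0.7)
```

The closed-form inversion works because ln(A/C) for two Gaussian beams squinted at ±s equals 2θs/σ²; real antenna patterns require a calibrated monopulse slope instead.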

  6. Generalised pruritus as a presentation of Grave's disease.

    Science.gov (United States)

    Tan, Ce; Loh, Ky

    2013-01-01

    Pruritus is a lesser known symptom of hyperthyroidism, particularly in autoimmune thyroid disorders. This is a case report of a 27-year-old woman who presented with generalised pruritus at a primary care clinic. Incidental findings of tachycardia and a goiter led to the investigations of her thyroid status. The thyroid function test revealed elevated serum free T4 and suppressed thyroid stimulating hormone levels. The anti-thyroid antibodies were positive. She was diagnosed with Graves' disease and treated with carbimazole until her symptoms subsided. Graves' disease should be considered as an underlying cause for patients presenting with pruritus. A thorough history and complete physical examination are crucial in making an accurate diagnosis. Underlying causes must be determined before treating the symptoms.

  7. Generalised partition functions: inferences on phase space distributions

    Directory of Open Access Journals (Sweden)

    R. A. Treumann

    2016-06-01

    Full Text Available It is demonstrated that the statistical mechanical partition function can be used to construct various different forms of phase space distributions. This indicates that its structure is not restricted to the Gibbs–Boltzmann factor prescription which is based on counting statistics. With the widely used replacement of the Boltzmann factor by a generalised Lorentzian (also known as the q-deformed exponential function), where κ = 1∕|q − 1| with κ, q ∈ R, both the kappa-Bose and kappa-Fermi partition functions are obtained in quite a straightforward way, from which the conventional Bose and Fermi distributions follow for κ → ∞. For κ ≠ ∞ these are subject to the restrictions that they can be used only at temperatures far from zero. They thus, as shown earlier, have little value for quantum physics. This is reasonable, because physical κ systems imply strong correlations which are absent at zero temperature where apart from stochastics all dynamical interactions are frozen. In the classical large temperature limit one obtains physically reasonable κ distributions which depend on energy (respectively momentum) as well as on chemical potential. Looking for other functional dependencies, we examine whether Bessel functions can be used for obtaining valid distributions. Again and for the same reason, no Fermi and Bose distributions exist in the low temperature limit. However, a classical Bessel–Boltzmann distribution can be constructed which is a Bessel-modified Lorentzian distribution. Whether it makes any physical sense remains an open question. This is not investigated here. The choice of Bessel functions is motivated solely by their convergence properties and not by reference to any physical demands. This result suggests that the Gibbs–Boltzmann partition function is fundamental not only to Gibbs–Boltzmann but also to a large class of generalised Lorentzian distributions as well as to the
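The κ → ∞ limit claimed above can be checked numerically. The sketch below uses one common convention for the q-deformed exponential, (1 + x/κ)^(−κ); the exact normalisation is an assumption, since the abstract does not fix it:

```python
import math

# The generalised Lorentzian (q-deformed exponential) in one common
# convention. As kappa grows it should converge to the Boltzmann factor
# exp(-x), matching the stated kappa -> infinity limit.

def lorentzian_factor(x, kappa):
    return (1.0 + x / kappa) ** (-kappa)

x = 1.5
diffs = [abs(lorentzian_factor(x, k) - math.exp(-x))
         for k in (5, 50, 500, 5000)]
print(diffs)   # shrinks monotonically toward 0 as kappa grows
```

The monotone shrinkage of the differences is the numerical counterpart of the limit (1 + x/κ)^(−κ) → e^(−x).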

  8. An anisotropic elastoplastic constitutive formulation generalised for orthotropic materials

    Science.gov (United States)

    Mohd Nor, M. K.; Ma'at, N.; Ho, C. S.

    2018-03-01

    This paper presents a finite strain constitutive model to predict a complex elastoplastic deformation behaviour that involves very high pressures and shockwaves in orthotropic materials using an anisotropic Hill's yield criterion by means of the evolving structural tensors. The yield surface of this hyperelastic-plastic constitutive model is aligned uniquely within the principal stress space due to the combination of Mandel stress tensor and a new generalised orthotropic pressure. The formulation is developed in the isoclinic configuration and allows for a unique treatment for elastic and plastic orthotropy. An isotropic hardening is adopted to define the evolution of plastic orthotropy. The important feature of the proposed hyperelastic-plastic constitutive model is the introduction of anisotropic effect in the Mie-Gruneisen equation of state (EOS). The formulation is further combined with Grady spall failure model to predict spall failure in the materials. The proposed constitutive model is implemented as a new material model in the Lawrence Livermore National Laboratory (LLNL)-DYNA3D code of UTHM's version, named Material Type 92 (Mat92). The combination of the proposed stress tensor decomposition and the Mie-Gruneisen EOS requires some modifications in the code to reflect the formulation of the generalised orthotropic pressure. The validation approach is also presented in this paper for guidance purpose. The ψ tensor used to define the alignment of the adopted yield surface is first validated. This is continued with an internal validation related to the elastic isotropic, elastic orthotropic and elastic-plastic orthotropic behaviour of the proposed formulation before a comparison against a range of plate impact test data at 234, 450 and 895 m/s impact velocities is performed. A good agreement is obtained in each test.

  9. Managing uncertainty for sustainability of complex projects

    DEFF Research Database (Denmark)

    Brink, Tove

    2017-01-01

    Purpose – The purpose of this paper is to reveal how management of uncertainty can enable sustainability of complex projects. Design/methodology/approach – The research was conducted from June 2014 to May 2015 using a qualitative deductive approach among operation and maintenance actors in offshore … wind farms. The research contains a focus group interview with 11 companies, 20 individual interviews and a seminar presenting preliminary findings with 60 participants. Findings – The findings reveal the need for management of uncertainty through two different paths. First, project management needs … to join efforts. Research limitations/implications – Further research is needed to reveal the generalisability of the findings in other complex project contexts containing “unknown unknowns”. Practical implications – The research leads to the development of a tool for uncertainty management …

  10. Teaching Uncertainties

    Science.gov (United States)

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  11. Calibration uncertainty

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Anglov, Thomas

    2002-01-01

    Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...

  12. Bernoulli's Principle

    Science.gov (United States)

    Hewitt, Paul G.

    2004-01-01

    Some teachers have difficulty understanding Bernoulli's principle particularly when the principle is applied to the aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it…

  13. Superstatistical generalised Langevin equation: non-Gaussian viscoelastic anomalous diffusion

    Science.gov (United States)

    Ślęzak, Jakub; Metzler, Ralf; Magdziarz, Marcin

    2018-02-01

    Recent advances in single particle tracking and supercomputing techniques demonstrate the emergence of normal or anomalous, viscoelastic diffusion in conjunction with non-Gaussian distributions in soft, biological, and active matter systems. We here formulate a stochastic model based on a generalised Langevin equation in which non-Gaussian shapes of the probability density function and normal or anomalous diffusion have a common origin, namely a random parametrisation of the stochastic force. We perform a detailed analysis demonstrating how various types of parameter distributions for the memory kernel result in exponential, power law, or power-log law tails of the memory functions. The studied system is also shown to exhibit a further unusual property: the velocity has a Gaussian one point probability density but non-Gaussian joint distributions. This behaviour is reflected in the relaxation from a Gaussian to a non-Gaussian distribution observed for the position variable. We show that our theoretical results are in excellent agreement with stochastic simulations.
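
    The core mechanism — each path is Gaussian, but a randomly distributed transport coefficient makes the ensemble non-Gaussian — can be illustrated in a few lines. The exponential diffusivity distribution below is an illustrative assumption, not the paper's specific parametrisation of the memory kernel:

```python
import numpy as np

rng = np.random.default_rng(0)
n_traj, n_steps, dt = 20000, 100, 0.01

# Superstatistics: every trajectory draws its own diffusivity D once,
# then performs ordinary Brownian motion with that D.
D = rng.exponential(1.0, size=n_traj)
steps = rng.normal(0.0, 1.0, size=(n_traj, n_steps))
x = (np.sqrt(2.0 * D * dt)[:, None] * steps).sum(axis=1)

# Excess kurtosis > 0 signals heavier-than-Gaussian tails of the ensemble,
# even though each individual trajectory is exactly Gaussian.
kurt = ((x - x.mean()) ** 4).mean() / x.var() ** 2
print(kurt)
```

With an exponential mixing distribution the aggregate displacement distribution is Laplace-like, so the measured kurtosis exceeds the Gaussian value of 3.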

  14. The creep analysis of shell structures using generalised models

    International Nuclear Information System (INIS)

    Boyle, J.T.; Spence, J.

    1981-01-01

    In this paper a new, more complete estimate of the accuracy of the stationary creep model is given for the general case through the evaluation of exact and approximate energy surfaces. In addition, the stationary model is extended to include more general non-stationary (combined elastic-creep) behaviour and to include the possibility of material deterioration through damage. The resulting models are then compared to existing exact solutions for several shell structures - e.g. a thin pressurised cylinder, a curved pipe in bending and an S-bellows under axial extension with large deflections. In each case very good agreement is obtained. Although requiring similar computing effort, so that the same solution techniques can be utilised, the calculation times are shown to be significantly reduced using the generalised approach. In conclusion, it has been demonstrated that a new simple mechanical model of a thin shell in creep, with or without material deterioration can be constructed; the model is assessed in detail and successfully compared to existing solutions. (orig./HP)

  15. Generalised perturbation theory and source of information through chemical measurements

    International Nuclear Information System (INIS)

    Lelek, V.; Marek, T.

    2001-01-01

    It is important to make all analyses and collect all information from the operation of the new facility (which the transmutation demonstration unit will surely be), to verify that the operation corresponds to the forecast or to correct the equations describing the facility. The behaviour of the molten salt reactor, and in particular its system of measurement, are very different from those of the solid-fuel reactor. Key information from the long-time kinetics could be nearly on-line knowledge of the fuel composition. In this work it is shown how to include this in the control and how to use such data to correct neutron cross-sections for the higher actinides or other characteristics. The problem of safety - the change from a boundary-value problem to an initial-value problem - is also mentioned. The problem is transformed into generalised perturbation theory, in which the adjoint function is obtained by solving the equations with a right-hand side having the form of a source. Such an approach should be a theoretical basis for the calculation of the sensitivity coefficients. (authors)

  16. Demand Uncertainty

    DEFF Research Database (Denmark)

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models. … This retooling addresses several shortcomings. First, the imperfect correlation of demands reconciles the sales variation observed in and across destinations. Second, since demands for the firm's output are correlated across destinations, a firm can use previously realized demands to forecast unknown demands … in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles …

  17. Optimal entropic uncertainty relation for successive measurements in quantum information theory

    Indian Academy of Sciences (India)

    M D SRINIVAS

    … derived by Robertson in 1929 [2] from the first principles of quantum theory, does not … systems and may hence be referred to as 'uncertainty relations for distinct measurements'.

  18. Fearing shades of grey: individual differences in fear responding towards generalisation stimuli.

    Science.gov (United States)

    Arnaudova, Inna; Krypotos, Angelos-Miltiadis; Effting, Marieke; Kindt, Merel; Beckers, Tom

    2017-09-01

    Individual differences in fear generalisation have been proposed to play a role in the aetiology and/or maintenance of anxiety disorders, but few data are available to directly support that claim. The research that is available has focused mostly on generalisation of peripheral and central physiological fear responses. Far less is known about the generalisation of avoidance, the behavioural component of fear. In two experiments, we evaluated how neuroticism, a known vulnerability factor for anxiety, modulates an array of fear responses, including avoidance tendencies, towards generalisation stimuli (GS). Participants underwent differential fear conditioning, in which one conditioned stimulus (CS+) was repeatedly paired with an aversive outcome (shock; unconditioned stimulus, US), whereas another was not (CS-). Fear generalisation was observed across measures in Experiment 1 (US expectancy and evaluative ratings) and Experiment 2 (US expectancy, evaluative ratings, skin conductance, startle responses, safety behaviours), with overall highest responding to the CS+, lowest to the CS- and intermediate responding to the GSs. Neuroticism had very little impact on fear generalisation (but did affect GS recognition rates in Experiment 1), in line with the idea that fear generalisation is largely an adaptive process.

  19. Uncertainty enabled Sensor Observation Services

    Science.gov (United States)

    Cornford, Dan; Williams, Matthew; Bastin, Lucy

    2010-05-01

    Almost all observations of reality are contaminated with errors, which introduce uncertainties into the actual observation result. Such uncertainty is often held to be a data quality issue, and quantification of this uncertainty is essential for the principled exploitation of the observations. Many existing systems treat data quality in a relatively ad-hoc manner, however if the observation uncertainty is a reliable estimate of the error on the observation with respect to reality then knowledge of this uncertainty enables optimal exploitation of the observations in further processes, or decision making. We would argue that the most natural formalism for expressing uncertainty is Bayesian probability theory. In this work we show how the Open Geospatial Consortium Sensor Observation Service can be implemented to enable the support of explicit uncertainty about observations. We show how the UncertML candidate standard is used to provide a rich and flexible representation of uncertainty in this context. We illustrate this on a data set of user contributed weather data where the INTAMAP interpolation Web Processing Service is used to help estimate the uncertainty on the observations of unknown quality, using observations with known uncertainty properties. We then go on to discuss the implications of uncertainty for a range of existing Open Geospatial Consortium standards including SWE common and Observations and Measurements. We discuss the difficult decisions in the design of the UncertML schema and its relation and usage within existing standards and show various options. We conclude with some indications of the likely future directions for UncertML in the context of Open Geospatial Consortium services.

  20. Classical r-matrices for the generalised Chern–Simons formulation of 3d gravity

    Science.gov (United States)

    Osei, Prince K.; Schroers, Bernd J.

    2018-04-01

    We study the conditions for classical r-matrices to be compatible with the generalised Chern–Simons action for 3d gravity. Compatibility means solving the classical Yang–Baxter equations with a prescribed symmetric part for each of the real Lie algebras and bilinear pairings arising in the generalised Chern–Simons action. We give a new construction of r-matrices via a generalised complexification and derive a non-linear set of matrix equations determining the most general compatible r-matrix. We exhibit new families of solutions and show that they contain some known r-matrices for special parameter values.

  1. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    International Nuclear Information System (INIS)

    Klos, Richard

    2008-03-01

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment
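
    The compartment-and-flux idea behind such a model can be sketched in a few lines. The two compartments and all rate constants below are hypothetical illustrations of the modelling style, not GEMA's actual structure or parameters:

```python
# Toy two-compartment tracer model: radionuclide inventory moves from a
# soil compartment to a water compartment (rate k12), leaves the water
# compartment by outflow (rate k2), and decays everywhere (rate lam).
k12, k2, lam, dt = 0.05, 0.1, 0.01, 0.1   # illustrative rates, 1/year; dt in years
soil, water = 1.0, 0.0                     # normalised initial inventory

for _ in range(1000):                      # forward-Euler integration to t = 100
    d_soil = -(k12 + lam) * soil
    d_water = k12 * soil - (k2 + lam) * water
    soil += dt * d_soil
    water += dt * d_water

print(soil, water)                         # inventories after 100 years
```

Linking many such modules through their water and solid-material fluxes is the sense in which a compartmental landscape model is "modular".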

  2. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Klos, Richard

    2008-03-15

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment

  3. Designing synthetic networks in silico: a generalised evolutionary algorithm approach.

    Science.gov (United States)

    Smith, Robert W; van Sluijs, Bob; Fleck, Christian

    2017-12-02

    Evolution has led to the development of biological networks that are shaped by environmental signals. Elucidating, understanding and then reconstructing important network motifs is one of the principal aims of Systems & Synthetic Biology. Consequently, previous research has focused on finding optimal network structures and reaction rates that respond to pulses or produce stable oscillations. In this work we present a generalised in silico evolutionary algorithm that simultaneously finds network structures and reaction rates (genotypes) that can satisfy multiple defined objectives (phenotypes). The key step in our approach is to translate a schema/binary-based description of biological networks into systems of ordinary differential equations (ODEs). The ODEs can then be solved numerically to provide dynamic information about an evolved network's functionality. Initially we benchmark algorithm performance by finding optimal networks that can recapitulate concentration time-series data and perform parameter optimisation on oscillatory dynamics of the Repressilator. We go on to show the utility of our algorithm by finding new designs for robust synthetic oscillators, and by performing multi-objective optimisation to find a set of oscillators and feed-forward loops that are optimal at balancing different system properties. In sum, our results not only confirm and build on previous observations but also provide new designs of synthetic oscillators for experimental construction. In this work we have presented and tested an evolutionary algorithm that can design a biological network to produce a desired output. Given that previous designs of synthetic networks have been limited to subregions of network- and parameter-space, the use of our evolutionary optimisation algorithm will enable Synthetic Biologists to construct new systems with the potential to display a wider range of complex responses.
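
    The genotype-to-dynamics loop can be caricatured in a few lines. Below, a toy evolutionary search (not the authors' schema/binary encoding) tunes the two rates of a one-variable ODE, dx/dt = a − b·x, so that its steady state hits a target value — the same evaluate-by-integration idea at minimal scale:

```python
import random

def simulate(a, b, x0=0.0, dt=0.01, steps=2000):
    # Forward-Euler integration of dx/dt = a - b*x up to t = 20.
    x = x0
    for _ in range(steps):
        x += dt * (a - b * x)
    return x

def fitness(genotype, target=2.0):
    # Phenotype score: closeness of the final state to the target (higher is better).
    a, b = genotype
    return -abs(simulate(a, b) - target)

random.seed(1)
pop = [(random.uniform(0.1, 5.0), random.uniform(0.1, 5.0)) for _ in range(20)]
for _ in range(40):                       # elitist selection plus Gaussian mutation
    pop.sort(key=fitness, reverse=True)
    parents = pop[:5]
    pop = parents + [(max(0.1, a + random.gauss(0.0, 0.2)),
                      max(0.1, b + random.gauss(0.0, 0.2)))
                     for a, b in parents for _ in range(3)]

best = max(pop, key=fitness)
print(best, simulate(*best))              # steady state a/b should approach 2
```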

  4. Photometric Uncertainties

    Science.gov (United States)

    Zou, Xiao-Duan; Li, Jian-Yang; Clark, Beth Ellen; Golish, Dathon

    2018-01-01

    The OSIRIS-REx spacecraft, launched in September 2016, will study the asteroid Bennu and return a sample from its surface to Earth in 2023. Bennu is a near-Earth carbonaceous asteroid which will provide insight into the formation and evolution of the solar system. OSIRIS-REx will first approach Bennu in August 2018 and will study the asteroid for approximately two years before sampling. OSIRIS-REx will develop its photometric model (including Lommel-Seeliger, ROLO, McEwen, Minnaert and Akimov) of Bennu with OCAM and OVIRS during the Detailed Survey mission phase. The model developed during this phase will be used to photometrically correct the OCAM and OVIRS data. Here we present the analysis of the error for the photometric corrections. Based on our testing data sets, we find: 1. The model uncertainties are correct only when computed with the covariance matrix, because the parameters are highly correlated. 2. There is no evidence that any single parameter dominates in any model. 3. Both the model error and the data error contribute comparably to the final correction error. 4. We tested the uncertainty module on simulated and real data sets, and find that model performance depends on the data coverage and data quality. These tests gave us a better understanding of how the different models behave in different cases. 5. The L-S model is more reliable than the others, perhaps because the simulated data are based on the L-S model; however, the test on real data (SPDIF) also shows a slight advantage for L-S. ROLO is not reliable for calculating Bond albedo. The uncertainty of the McEwen model is large in most cases. Akimov performs unphysically on SOPIE 1 data. 6. L-S is therefore the default choice, a conclusion based mainly on our tests on SOPIE data and IPDIF.

  5. Modelling of extreme minimum rainfall using generalised extreme value distribution for Zimbabwe

    Directory of Open Access Journals (Sweden)

    Delson Chikobvu

    2015-09-01

    Full Text Available We modelled the mean annual rainfall for data recorded in Zimbabwe from 1901 to 2009. Extreme value theory was used to estimate the probabilities of meteorological droughts. Droughts can be viewed as extreme events which go beyond and/or below normal rainfall occurrences, such as exceptionally low mean annual rainfall. The duality between the distributions of the minima and maxima was exploited and used to fit the generalised extreme value distribution (GEVD) to the data and hence find probabilities of extremely low levels of mean annual rainfall. The augmented Dickey-Fuller test confirmed that the rainfall data were stationary, while the normal quantile-quantile plot indicated that the rainfall data deviated from the normality assumption at both tails of the distribution. The maximum likelihood estimation method and the Bayesian approach were used to find the parameters of the GEVD. The Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests showed that the Weibull class of distributions was a good fit to the minima of mean annual rainfall using the maximum likelihood estimation method. The mean return period estimate of a meteorological drought using the threshold value of mean annual rainfall of 473 mm was 8 years. This implies that if in a given year there is a meteorological drought, then another drought of the same intensity or greater is expected after 8 years. It is expected that the use of Bayesian inference may better quantify the level of uncertainty associated with the GEVD parameter estimates than the maximum likelihood estimation method. The Markov chain Monte Carlo algorithm for the GEVD was applied to construct the model parameter estimates using the Bayesian approach. These findings are significant because results based on non-informative priors (Bayesian method) and on the maximum likelihood method are expected to be similar.
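
    The minima/maxima duality used to fit the GEVD can be sketched with SciPy. The gamma-distributed series below is a synthetic stand-in (an assumption for illustration); the real Zimbabwean rainfall data are not reproduced here:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Synthetic stand-in for 109 years of mean annual rainfall, in mm.
rainfall = rng.gamma(shape=20.0, scale=30.0, size=109)

# Duality: min(X) = -max(-X), so fit the GEV to the negated sample.
c, loc, scale = genextreme.fit(-rainfall)

# P(annual rainfall <= 473 mm) = P(-rainfall >= -473) = survival function at -473.
p_drought = genextreme.sf(-473.0, c, loc=loc, scale=scale)
return_period = 1.0 / p_drought
print(p_drought, return_period)
```

The return period is simply the reciprocal of the annual exceedance (here, shortfall) probability, which is how the 8-year figure in the abstract is obtained from the fitted distribution.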

  6. Uncertainty analysis

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
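
    The Latin hypercube construction evaluated above is easy to sketch: stratify each input dimension into n equal bins, take exactly one jittered sample per bin, and pair the bins across dimensions at random:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    # Each column is a random permutation of the n_samples strata,
    # jittered uniformly within each stratum, then scaled to [0, 1).
    strata = np.tile(np.arange(n_samples), (n_dims, 1))
    perms = rng.permuted(strata, axis=1).T          # shape (n_samples, n_dims)
    return (perms + rng.random((n_samples, n_dims))) / n_samples

rng = np.random.default_rng(42)
x = latin_hypercube(100, 3, rng)

# Propagate through a toy model and inspect the output spread.
y = (x ** 2).sum(axis=1)
print(y.mean(), y.std())
```

Every one-dimensional margin is perfectly stratified, which is what gives LHS its variance-reduction appeal for estimating output distributions; the rank transformations and stepwise regression mentioned in the abstract are a separate post-processing step.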

  7. Emotion recognition training using composite faces generalises across identities but not all emotions.

    Science.gov (United States)

    Dalili, Michael N; Schofield-Toloza, Lawrence; Munafò, Marcus R; Penton-Voak, Ian S

    2017-08-01

    Many cognitive bias modification (CBM) tasks use facial expressions of emotion as stimuli. Some tasks use unique facial stimuli, while others use composite stimuli, given evidence that emotion is encoded prototypically. However, CBM using composite stimuli may be identity- or emotion-specific, and may not generalise to other stimuli. We investigated the generalisability of effects using composite faces in two experiments. Healthy adults in each study were randomised to one of four training conditions: two stimulus-congruent conditions, where same faces were used during all phases of the task, and two stimulus-incongruent conditions, where faces of the opposite sex (Experiment 1) or faces depicting another emotion (Experiment 2) were used after the modification phase. Our results suggested that training effects generalised across identities. However, our results indicated only partial generalisation across emotions. These findings suggest effects obtained using composite stimuli may extend beyond the stimuli used in the task but remain emotion-specific.

  8. Dosimetric quantities and basic data for the evaluation of generalised derived limits

    International Nuclear Information System (INIS)

    Harrison, N.T.; Simmonds, J.R.

    1980-12-01

    The procedures, dosimetric quantities and basic data to be used for the evaluation of Generalised Derived Limits (GDLs) in environmental materials and of Generalised Derived Limits for discharges to atmosphere are described. The dosimetric considerations and the appropriate intake rates for both children and adults are discussed. In most situations in the nuclear industry and in those institutions, hospitals and laboratories which use relatively small quantities of radioactive material, the Generalised Derived Limits provide convenient reference levels against which the results of environmental monitoring can be compared, and atmospheric discharges can be assessed. They are intended for application when the environmental contamination or discharge to atmosphere is less than about 5% of the Generalised Derived Limit; above this level, it will usually be necessary to undertake a more detailed site-specific assessment. (author)
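
    The comparison logic behind a derived limit can be illustrated with a back-of-envelope calculation; every number below is hypothetical and chosen purely for illustration, not an NRPB value:

```python
# A derived concentration limit from a dose criterion, an annual intake
# rate and a committed dose per unit intake (all values hypothetical).
dose_criterion = 1e-3          # Sv per year
intake_rate = 500.0            # kg of the foodstuff consumed per year (adult)
dose_per_bq = 1.3e-8           # Sv per Bq ingested

# Concentration at which the intake pathway alone reaches the criterion.
gdl = dose_criterion / (intake_rate * dose_per_bq)   # Bq per kg
print(gdl)
```

Measured environmental concentrations are then compared against this reference level; the abstract's 5% threshold is the point beyond which a site-specific assessment replaces the generalised one.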

  9. Generalised brain edema and brain infarct in ergotamine abuse: Visualization by CT, MR and angiography

    International Nuclear Information System (INIS)

    Toedt, C.; Hoetzinger, H.; Salbeck, R.; Beyer, H.K.

    1989-01-01

    Abuse of ergotamine can cause generalised brain edema and brain infarctions, which can be visualized by CT, MR and angiography. The cause, however, can only be established from the patient's history. (orig.) [de

  10. Control configuration selection for bilinear systems via generalised Hankel interaction index array

    DEFF Research Database (Denmark)

    Shaker, Hamid Reza; Tahavori, Maryamsadat

    2015-01-01

    Decentralised and partially decentralised control strategies are very popular in practice. To come up with a suitable decentralised or partially decentralised control structure, it is important to select the appropriate input and output pairs for control design. This procedure is called control configuration selection. It is well known that a suitable control configuration selection is an important prerequisite for a successful industrial control. In this paper the problem of control configuration selection for multiple-input and multiple-output (MIMO) bilinear processes is addressed. First, … way, an iterative method for solving the generalised Sylvester equation is proposed. The generalised cross-gramian is used to form the generalised Hankel interaction index array, which is used for control configuration selection of MIMO bilinear processes. Most …
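
    For the linear special case the construction reduces to one Sylvester equation per input-output pair. The sketch below is a linear (not bilinear) toy, and uses |trace| of the pairwise cross-gramian as a simple scalar pairing measure — an illustrative choice, not necessarily the index used in the paper:

```python
import numpy as np
from scipy.linalg import solve_sylvester

# Stable 2-state system with two inputs and two outputs.
A = np.array([[-1.0, 0.2],
              [0.0, -3.0]])
B = np.eye(2)          # input matrix: columns are input directions
C = np.eye(2)          # output matrix: rows are output directions

H = np.zeros((2, 2))   # Hankel-type interaction index array
for i in range(2):         # output index
    for j in range(2):     # input index
        # Cross-gramian X of the SISO subsystem (A, b_j, c_i):
        # A X + X A + b_j c_i = 0, solved as a Sylvester equation.
        bc = np.outer(B[:, j], C[i, :])
        X = solve_sylvester(A, A, -bc)
        H[i, j] = abs(np.trace(X))

print(H / H.sum())     # large entries suggest which pairings to close
```

With this near-diagonal A, the diagonal pairings dominate, which is the kind of conclusion the interaction index array is meant to deliver.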

  11. Enhancing generalisation in biofeedback intervention using the challenge point framework: A case study

    Science.gov (United States)

    HITCHCOCK, ELAINE R.; BYUN, TARA McALLISTER

    2014-01-01

    Biofeedback intervention can help children achieve correct production of a treatment-resistant error sound, but generalisation is often limited. This case study suggests that generalisation can be enhanced when biofeedback intervention is structured in accordance with a “challenge point” framework for speech-motor learning. The participant was an 11-year-old with residual /r/ misarticulation who had previously attained correct /r/ production through a structured course of ultrasound biofeedback treatment but did not generalise these gains beyond the word level. Treatment difficulty was adjusted in an adaptive manner following predetermined criteria for advancing, maintaining, or moving back a level in a multidimensional hierarchy of functional task complexity. The participant achieved and maintained virtually 100% accuracy in producing /r/ at both word and sentence levels. These preliminary results support the efficacy of a semi-structured implementation of the challenge point framework as a means of achieving generalisation and maintenance of treatment gains. PMID:25216375

  12. Fundamental principles of quantum theory

    International Nuclear Information System (INIS)

    Bugajski, S.

    1980-01-01

    After introducing general versions of three fundamental quantum postulates - the superposition principle, the uncertainty principle and the complementarity principle - the question of whether the three principles are sufficiently strong to restrict the general Mackey description of quantum systems to the standard Hilbert-space quantum theory is discussed. An example which shows that the answer must be negative is constructed. An abstract version of the projection postulate is introduced and it is demonstrated that it could serve as the missing physical link between the general Mackey description and the standard quantum theory. (author)

  13. The diagnostic value of the alveolar lamina dura in generalised bone disease

    International Nuclear Information System (INIS)

    Kuhlencordt, J.; Kruse, H.P.; Franke, J.; Hamburg Univ.

    1981-01-01

    Changes in the alveolar lamina dura in 134 patients have been analysed. They included 32 cases with urolithiasis in whom generalised bone disease had been excluded, 37 cases of primary hyperparathyroidism, 31 cases of secondary hyperparathyroidism and 34 cases with primary osteoporosis. The state of the lamina dura was related to biochemical, radiological and histological findings in the various groups. The value of the lamina dura in the diagnosis of generalised skeletal abnormalities has been defined. (orig.) [de

  14. libmpdata++ 1.0: a library of parallel MPDATA solvers for systems of generalised transport equations

    Science.gov (United States)

    Jaruga, A.; Arabas, S.; Jarecka, D.; Pawlowska, H.; Smolarkiewicz, P. K.; Waruszewski, M.

    2015-04-01

    This paper accompanies the first release of libmpdata++, a C++ library implementing the multi-dimensional positive-definite advection transport algorithm (MPDATA) on regular structured grid. The library offers basic numerical solvers for systems of generalised transport equations. The solvers are forward-in-time, conservative and non-linearly stable. The libmpdata++ library covers the basic second-order-accurate formulation of MPDATA, its third-order variant, the infinite-gauge option for variable-sign fields and a flux-corrected transport extension to guarantee non-oscillatory solutions. The library is equipped with a non-symmetric variational elliptic solver for implicit evaluation of pressure gradient terms. All solvers offer parallelisation through domain decomposition using shared-memory parallelisation. The paper describes the library programming interface, and serves as a user guide. Supported options are illustrated with benchmarks discussed in the MPDATA literature. Benchmark descriptions include code snippets as well as quantitative representations of simulation results. Examples of applications include homogeneous transport in one, two and three dimensions in Cartesian and spherical domains; a shallow-water system compared with analytical solution (originally derived for a 2-D case); and a buoyant convection problem in an incompressible Boussinesq fluid with interfacial instability. All the examples are implemented out of the library tree. Regardless of the differences in the problem dimensionality, right-hand-side terms, boundary conditions and parallelisation approach, all the examples use the same unmodified library, which is a key goal of libmpdata++ design. The design, based on the principle of separation of concerns, prioritises the user and developer productivity. The libmpdata++ library is implemented in C++, making use of the Blitz++ multi-dimensional array containers, and is released as free/libre and open-source software.
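    The MPDATA scheme this record describes consists of a donor-cell (upwind) pass followed by a corrective upwind pass using an antidiffusive pseudo-velocity. It can be sketched for 1-D constant-velocity advection on a periodic grid as below. This is a minimal Python illustration of the basic algorithm only, not the libmpdata++ C++ interface; the function names and the single corrective pass are my own choices:

```python
import numpy as np

def donor_cell(psi, C):
    # Upwind (donor-cell) update on a periodic 1-D grid.
    # C holds the face Courant numbers: C[i] belongs to face i+1/2.
    psi_r = np.roll(psi, -1)
    flux = np.maximum(C, 0) * psi + np.minimum(C, 0) * psi_r   # F_{i+1/2}
    return psi - (flux - np.roll(flux, 1))                      # F_{i+1/2} - F_{i-1/2}

def mpdata_step(psi, C, eps=1e-15):
    # One MPDATA step: first-order upwind pass, then one corrective
    # upwind pass with the antidiffusive pseudo-velocity.
    psi1 = donor_cell(psi, np.full_like(psi, C))
    psi_r = np.roll(psi1, -1)
    # Antidiffusive velocity for constant C (non-negative fields assumed)
    C_ant = (abs(C) - C**2) * (psi_r - psi1) / (psi_r + psi1 + eps)
    return donor_cell(psi1, C_ant)
```

    With |C| ≤ 1 and a non-negative input field, both passes conserve the grid total and preserve sign, which is the positive-definite property the abstract refers to.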

  15. libmpdata++ 0.1: a library of parallel MPDATA solvers for systems of generalised transport equations

    Science.gov (United States)

    Jaruga, A.; Arabas, S.; Jarecka, D.; Pawlowska, H.; Smolarkiewicz, P. K.; Waruszewski, M.

    2014-11-01

    This paper accompanies first release of libmpdata++, a C++ library implementing the Multidimensional Positive-Definite Advection Transport Algorithm (MPDATA). The library offers basic numerical solvers for systems of generalised transport equations. The solvers are forward-in-time, conservative and non-linearly stable. The libmpdata++ library covers the basic second-order-accurate formulation of MPDATA, its third-order variant, the infinite-gauge option for variable-sign fields and a flux-corrected transport extension to guarantee non-oscillatory solutions. The library is equipped with a non-symmetric variational elliptic solver for implicit evaluation of pressure gradient terms. All solvers offer parallelisation through domain decomposition using shared-memory parallelisation. The paper describes the library programming interface, and serves as a user guide. Supported options are illustrated with benchmarks discussed in the MPDATA literature. Benchmark descriptions include code snippets as well as quantitative representations of simulation results. Examples of applications include: homogeneous transport in one, two and three dimensions in Cartesian and spherical domains; shallow-water system compared with analytical solution (originally derived for a 2-D case); and a buoyant convection problem in an incompressible Boussinesq fluid with interfacial instability. All the examples are implemented out of the library tree. Regardless of the differences in the problem dimensionality, right-hand-side terms, boundary conditions and parallelisation approach, all the examples use the same unmodified library, which is a key goal of libmpdata++ design. The design, based on the principle of separation of concerns, prioritises the user and developer productivity. The libmpdata++ library is implemented in C++, making use of the Blitz++ multi-dimensional array containers, and is released as free/libre and open-source software.

  16. Bayesian uncertainty analyses of probabilistic risk models

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1989-01-01

    Applications of Bayesian principles to the uncertainty analyses are discussed in the paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so-called probabilistic structures is presented in order to develop a method of quantitative evaluation of modelling uncertainties. The method is applied to a small example case. Ideas for application areas for the proposed method are discussed.
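    As a concrete instance of the maximum-entropy construction mentioned in this record: on a finite support with a known mean, the maximum-entropy distribution takes the exponential form p_i ∝ exp(λ·x_i), with λ fixed by the mean constraint. The sketch below solves the classic dice-style problem by bisection; the support, solver, and function name are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def maxent_prior(xs, mean, iters=100):
    # Maximum-entropy distribution on finite support xs subject to a
    # fixed mean: p_i proportional to exp(lam * x_i). The Lagrange
    # multiplier lam is found by bisection (the mean is monotone in lam).
    xs = np.asarray(xs, dtype=float)

    def mean_at(lam):
        w = np.exp(lam * (xs - xs.mean()))   # shift for numerical stability
        p = w / w.sum()
        return p @ xs

    lo, hi = -50.0, 50.0
    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean_at(mid) < mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = np.exp(lam * (xs - xs.mean()))
    return w / w.sum()
```

    For a six-sided die constrained to mean 4.5 this reproduces the well-known tilted distribution: probabilities increase monotonically toward the larger faces.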

  17. Uncertainty in the classroom—teaching quantum physics

    International Nuclear Information System (INIS)

    Johansson, K E; Milstead, D

    2008-01-01

    The teaching of the Heisenberg uncertainty principle provides one of those rare moments when science appears to contradict everyday life experiences, sparking the curiosity of the interested student. Written at a level appropriate for an able high school student, this article provides ideas for introducing the uncertainty principle and showing how it can be used to elucidate many topics in modern physics

  18. Effets Josephson generalises entre antiferroaimants et entre supraconducteurs antiferromagnetiques

    Science.gov (United States)

    Chasse, Dominique

    The Josephson effect is generally presented as the result of coherent tunnelling of Cooper pairs across a tunnel junction between two superconductors, but it can be explained in a more general context. For example, Esposito et al. recently showed that the DC Josephson effect can be described by means of the pseudo-Goldstone boson of two coupled systems, each of which breaks the abelian U(1) symmetry. Since this description generalises naturally to breakings of non-abelian continuous symmetries, an analogue of the Josephson effect should therefore exist for types of long-range order other than superconductivity. The case of two itinerant ferromagnets (O(3) symmetry breaking) coupled through a tunnel junction has already been treated in the literature. To highlight the generality of the phenomenon, and with the aim of making predictions from a realistic model, we study the case of a tunnel junction between two itinerant antiferromagnets. Adopting an approach similar to that of Ambegaokar & Baratoff for a Josephson junction, we find a staggered-magnetisation current across the junction proportional to sG × sD, where sG and sD are the Néel vectors on either side of the junction. The sine function characteristic of the standard Josephson current is thus replaced here by a vector product. We show that, from a microscopic point of view, this phenomenon results from the coherent tunnelling of particle-hole pairs of spin 1 with net wave vector equal to the antiferromagnetic wave vector Q. We also obtain the temperature dependence of the analogue of the critical current. In the presence of an external magnetic field, we obtain the analogue of the AC Josephson effect, and the complete description we give of it also applies to the case of a tunnel junction between ferromagnets (in the latter case, earlier treatments of this AC effect prove incomplete). We

  19. The generalised anxiety stigma scale (GASS): psychometric properties in a community sample

    Science.gov (United States)

    2011-01-01

    Background Although there is substantial concern about negative attitudes to mental illness, little is known about the stigma associated with Generalised Anxiety Disorder (GAD) or its measurement. The aim of this study was to develop a multi-item measure of Generalised Anxiety Disorder stigma (the GASS). Methods Stigma items were developed from a thematic analysis of web-based text about the stigma associated with GAD. Six hundred and seventeen members of the public completed a survey comprising the resulting 20 stigma items and measures designed to evaluate construct validity. Follow-up data were collected for a subset of the participants (n = 212). Results The factor structure comprised two components: Personal Stigma (views about Generalised Anxiety Disorder); and Perceived Stigma (views about the beliefs of most others in the community). There was evidence of good construct validity and reliability for each of the Generalised Anxiety Stigma Scale (GASS) subscales. Conclusions The GASS is a promising brief measure of the stigma associated with Generalised Anxiety Disorder. PMID:22108099

  20. The exceptional generalised geometry of supersymmetric AdS flux backgrounds

    Energy Technology Data Exchange (ETDEWEB)

    Ashmore, Anthony [Merton College, University of Oxford,Merton Street, Oxford, OX1 4JD (United Kingdom); Mathematical Institute, University of Oxford, Andrew Wiles Building,Woodstock Road, Oxford, OX2 6GG (United Kingdom); Petrini, Michela [Sorbonne Université, UPMC Paris 06, UMR 7589,LPTHE, 75005 Paris (France); Waldram, Daniel [Department of Physics, Imperial College London,Prince Consort Road, London, SW7 2AZ (United Kingdom)

    2016-12-29

    We analyse generic AdS flux backgrounds preserving eight supercharges in D=4 and D=5 dimensions using exceptional generalised geometry. We show that they are described by a pair of globally defined, generalised structures, identical to those that appear for flat flux backgrounds but with different integrability conditions. We give a number of explicit examples of such “exceptional Sasaki-Einstein” backgrounds in type IIB supergravity and M-theory. In particular, we give the complete analysis of the generic AdS_5 M-theory backgrounds. We also briefly discuss the structure of the moduli space of solutions. In all cases, one structure defines a “generalised Reeb vector” that generates a Killing symmetry of the background corresponding to the R-symmetry of the dual field theory, and in addition encodes the generic contact structures that appear in the D=4 M-theory and D=5 type IIB cases. Finally, we investigate the relation between generalised structures and quantities in the dual field theory, showing that the central charge and R-charge of BPS wrapped-brane states are both encoded by the generalised Reeb vector, as well as discussing how volume minimisation (the dual of a- and F-maximisation) is encoded.

  1. The generalised anxiety stigma scale (GASS): psychometric properties in a community sample

    Directory of Open Access Journals (Sweden)

    Griffiths Kathleen M

    2011-11-01

    Full Text Available Abstract Background Although there is substantial concern about negative attitudes to mental illness, little is known about the stigma associated with Generalised Anxiety Disorder (GAD) or its measurement. The aim of this study was to develop a multi-item measure of Generalised Anxiety Disorder stigma (the GASS). Methods Stigma items were developed from a thematic analysis of web-based text about the stigma associated with GAD. Six hundred and seventeen members of the public completed a survey comprising the resulting 20 stigma items and measures designed to evaluate construct validity. Follow-up data were collected for a subset of the participants (n = 212). Results The factor structure comprised two components: Personal Stigma (views about Generalised Anxiety Disorder); and Perceived Stigma (views about the beliefs of most others in the community). There was evidence of good construct validity and reliability for each of the Generalised Anxiety Stigma Scale (GASS) subscales. Conclusions The GASS is a promising brief measure of the stigma associated with Generalised Anxiety Disorder.

  2. Decommissioning Funding: Ethics, Implementation, Uncertainties

    International Nuclear Information System (INIS)

    2007-01-01

    This status report on decommissioning funding: ethics, implementation, uncertainties is based on a review of recent literature and materials presented at NEA meetings in 2003 and 2004, and particularly at a topical session organised in November 2004 on funding issues associated with the decommissioning of nuclear power facilities. The report also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). This report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems

  3. Investment, regulation, and uncertainty

    Science.gov (United States)

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence, which ones might be expected to see an investment decline. PMID:24499745

  4. The one-dimensional normalised generalised equivalence theory (NGET) for generating equivalent diffusion theory group constants for PWR reflector regions

    International Nuclear Information System (INIS)

    Mueller, E.Z.

    1991-01-01

    An equivalent diffusion theory PWR reflector model is presented, which has as its basis Smith's generalisation of Koebke's Equivalence Theory. This method is an adaptation, in one-dimensional slab geometry, of the Generalised Equivalence Theory (GET). Since the method involves the renormalisation of the GET discontinuity factors at nodal interfaces, it is called the Normalised Generalised Equivalence Theory (NGET) method. The advantages of the NGET method for modelling the ex-core nodes of a PWR are summarized. 23 refs

  5. Calculation of nuclear reactivity using the generalised Adams-Bashforth-Moulton predictor corrector method

    Energy Technology Data Exchange (ETDEWEB)

    Suescun-Diaz, Daniel [Surcolombiana Univ., Neiva (Colombia). Groupo de Fisica Teorica; Narvaez-Paredes, Mauricio [Javeriana Univ., Cali (Colombia). Groupo de Matematica y Estadistica Aplicada Pontificia; Lozano-Parada, Jamie H. [Univ. del Valle, Cali (Colombia). Dept. de Ingenieria

    2016-03-15

    In this paper, the generalisation of the 4th-order Adams-Bashforth-Moulton predictor-corrector method is proposed to numerically solve the point kinetic equations of the nuclear reactivity calculations without using the nuclear power history. Due to the nature of the point kinetic equations, different predictor modifiers are used in order to improve the precision of the approximations obtained. The results obtained with the prediction formulas and generalised corrections improve the precision when compared with previous methods and are valid for various forms of nuclear power and different time steps.
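    For reference, the classic (non-generalised) 4th-order Adams-Bashforth-Moulton predictor-corrector that the paper builds on can be sketched as follows. This is only the textbook scheme on a scalar test equation; the RK4 start-up and the test problem are my choices, and the paper's predictor modifiers for the point-kinetics system are not reproduced:

```python
import numpy as np

def rk4_step(f, t, y, h):
    # One classical Runge-Kutta step, used to bootstrap the multistep method.
    k1 = f(t, y)
    k2 = f(t + h/2, y + h/2 * k1)
    k3 = f(t + h/2, y + h/2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h/6 * (k1 + 2*k2 + 2*k3 + k4)

def abm4(f, t0, y0, h, n):
    # 4th-order Adams-Bashforth predictor + Adams-Moulton corrector (PECE).
    t = t0 + h * np.arange(n + 1)
    y = [y0]
    for i in range(3):                       # three RK4 start-up steps
        y.append(rk4_step(f, t[i], y[i], h))
    fs = [f(t[i], y[i]) for i in range(4)]
    for i in range(3, n):
        # AB4 predictor
        yp = y[i] + h/24 * (55*fs[i] - 59*fs[i-1] + 37*fs[i-2] - 9*fs[i-3])
        fp = f(t[i+1], yp)
        # AM corrector, evaluated with the predicted slope
        yc = y[i] + h/24 * (9*fp + 19*fs[i] - 5*fs[i-1] + fs[i-2])
        y.append(yc)
        fs.append(f(t[i+1], yc))
    return t, np.array(y)
```

    On y' = -y the scheme reproduces exp(-t) to roughly O(h^4) global accuracy, which is the precision class the abstract's generalised corrections aim to improve upon.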

  6. Anaesthesia for caesarean section in a patient with acute generalised pustular psoriasis.

    Science.gov (United States)

    Samieh-Tucker, A; Rupasinghe, M

    2007-10-01

    We describe a 30-year-old parturient with acute generalised pustular psoriasis who presented for urgent caesarean section. A multidisciplinary team was involved and general anaesthesia was used successfully. Management of this condition is discussed and the literature reviewed. While generalised pustular psoriasis or impetigo herpetiformis is well recognised in pregnancy, it has not hitherto been reported in obstetric anaesthesia literature. The purpose of this article is to delineate the clinical picture of this disease, its treatment, and the effect on the mother and the fetus.

  7. Generalised synchronisation of spatiotemporal chaos using feedback control method and phase compression

    International Nuclear Information System (INIS)

    Xing-Yuan, Wang; Na, Zhang

    2010-01-01

    Coupled map lattices are taken as examples to study the synchronisation of spatiotemporal chaotic systems. First, generalised synchronisation of two coupled map lattices is realised by selecting an appropriate feedback function and an appropriate range of the feedback parameter. Based on this method, the phase compression method is then used to extend the range of the parameter. Thus, the feedback control method is integrated with the phase compression method to implement generalised synchronisation and to obtain an exact range of the feedback parameter. This technique is simple to implement in practice. Numerical simulations show the effectiveness and feasibility of the proposed scheme. (general)
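    A toy version of the setup — two coupled logistic-map lattices, with the response lattice driven toward the drive lattice by a linear feedback term — shows how a sufficiently strong feedback gain collapses the synchronisation error. The simple linear-feedback scheme and all parameter values below are illustrative assumptions; the paper's phase-compression step is not reproduced:

```python
import numpy as np

def cml_step(x, eps=0.3, a=4.0):
    # One step of a diffusively coupled logistic-map lattice (periodic ring).
    f = a * x * (1 - x)
    return (1 - eps) * f + eps/2 * (np.roll(f, 1) + np.roll(f, -1))

def sync_error(n_steps=200, N=64, k=0.9, seed=0):
    # Drive-response synchronisation via linear feedback of strength k.
    # Since |f'| <= 4 on [0, 1], the error contracts by at least (1-k)*4
    # per step, so k = 0.9 guarantees convergence.
    rng = np.random.default_rng(seed)
    x = rng.random(N)              # drive lattice
    y = rng.random(N)              # response lattice
    for _ in range(n_steps):
        fx, fy = cml_step(x), cml_step(y)
        x, y = fx, fy + k * (fx - fy)   # feedback pulls response toward drive
    return np.max(np.abs(x - y))
```

    With the feedback switched off (k = 0) the two chaotic lattices stay uncorrelated, while k = 0.9 drives the maximum site-wise error to numerical zero within a few hundred iterations.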

  8. Impulsivity modulates performance under response uncertainty in a reaching task.

    Science.gov (United States)

    Tzagarakis, C; Pellizzer, G; Rogers, R D

    2013-03-01

    We sought to explore the interaction of the impulsivity trait with response uncertainty. To this end, we used a reaching task (Pellizzer and Hedges in Exp Brain Res 150:276-289, 2003) where a motor response direction was cued at different levels of uncertainty (1 cue, i.e., no uncertainty, 2 cues or 3 cues). Data from 95 healthy adults (54 F, 41 M) were analysed. Impulsivity was measured using the Barratt Impulsiveness Scale version 11 (BIS-11). Behavioral variables recorded were reaction time (RT), errors of commission (referred to as 'early errors') and errors of precision. Data analysis employed generalised linear mixed models and generalised additive mixed models. For the early errors, there was an interaction of impulsivity with uncertainty and gender, with increased errors for high impulsivity in the one-cue condition for women and the three-cue condition for men. There was no effect of impulsivity on precision errors or RT. However, the analysis of the effect of RT and impulsivity on precision errors showed a different pattern for high versus low impulsives in the high uncertainty (3 cue) condition. In addition, there was a significant early error speed-accuracy trade-off for women, primarily in low uncertainty and a 'reverse' speed-accuracy trade-off for men in high uncertainty. These results extend those of past studies of impulsivity which help define it as a behavioural trait that modulates speed versus accuracy response styles depending on environmental constraints and highlight once more the importance of gender in the interplay of personality and behaviour.

  9. The Role Of Tasks That Supports Making Algebraic Generalisation In Forming 7th Grade Students’ Ability To Generalise

    Directory of Open Access Journals (Sweden)

    Rukiye Gökce

    2017-01-01

    Number patterns play an important role in the formation of mathematical concepts, since mathematics can be treated as a science of patterns and relations and since generalization is central to learning mathematics. With the reform of the middle school mathematics curriculum, the introduction of the pattern concept has brought some learning difficulties in the context of generalization. In this study, the potential of tasks developed in light of student difficulties reported in the literature, the algebraic generalization process and task design principles to shape generalization skills is examined. The study was conducted with thirteen students over five weeks (16 hours). Data were collected through notes, video and audio recordings, and observations made during the implementation process, and were analyzed qualitatively. The results indicate that tasks can play an important role in strategy and notation use, algebraic generalization, and the effective use of visual models in finding a rule.

  10. Uncertainty and measurement

    International Nuclear Information System (INIS)

    Landsberg, P.T.

    1990-01-01

    This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)

  11. Uncertainty in spatial planning proceedings

    Directory of Open Access Journals (Sweden)

    Aleš Mlakar

    2009-01-01

    Uncertainty is distinctive of spatial planning, as it arises from the necessity to co-ordinate the various interests within an area, from the urgency of adopting spatial planning decisions, from the complexity of the environment, physical space and society, from addressing the uncertainty of the future, and from the uncertainty of actually making the right decision. The response to uncertainty is a series of measures that mitigate the effects of uncertainty itself. These measures are based on two fundamental principles – standardization and optimization. The measures relate to knowledge enhancement and spatial planning comprehension, to the legal regulation of changes, to the existence of spatial planning as a means of co-ordinating different interests, to active planning and the constructive resolution of current spatial problems, to the integration of spatial planning and the environmental protection process, to the implementation of analysis as the foundation of spatial planners' activities, to methods of thinking outside the parameters, to forming clear spatial concepts, to creating a transparent spatial management system, and to the enforcement of participatory processes.

  12. Investigation of the cognitive variables associated with worry in children with Generalised Anxiety Disorder and their parents.

    Science.gov (United States)

    Donovan, Caroline L; Holmes, Monique C; Farrell, Lara J

    2016-03-01

    Intolerance of uncertainty (IU), negative beliefs about worry (NBW), positive beliefs about worry (PBW), negative problem orientation (NPO) and cognitive avoidance (CA) have been found to be integral in the conceptualisation of Generalised Anxiety Disorder (GAD) in adults, yet they have rarely been investigated in children with GAD. This study sought to determine (a) whether IU, NBW, PBW, NPO and CA differ between children diagnosed with GAD and non-anxious children and (b) whether IU, NBW, PBW, NPO and CA differ between parents of children diagnosed with GAD and parents of children without an anxiety disorder. Participants were 50 children (aged 7-12 years), plus one of their parents. The 25 GAD children and 25 non-anxious children were matched on age and gender. Parents and children completed clinical diagnostic interviews, as well as a battery of questionnaires measuring worry, IU, NBW, PBW, NPO and CA. Children with GAD endorsed significantly higher levels of worry, IU, NBW, NPO and CA, but not PBW, compared to non-anxious children. Parents of children with GAD did not differ from parents of non-anxious children on any of the variables. The study was limited by its use of modified adult measures for some variables and a lack of heterogeneity in the sample. The cognitive variables of IU, NBW, NPO and CA may be as important in the conceptualisation and treatment of GAD in children as they are in adults. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Uncertainty, causality and decision: The case of social risks and nuclear risk in particular

    International Nuclear Information System (INIS)

    Lahidji, R.

    2012-01-01

    Probability and causality are two indispensable tools for addressing situations of social risk. Causal relations are the foundation for building risk assessment models and identifying risk prevention, mitigation and compensation measures. Probability enables us to quantify risk assessments and to calibrate intervention measures. It therefore seems not only natural, but also necessary to make the role of causality and probability explicit in the definition of decision problems in situations of social risk. Such is the aim of this thesis. By reviewing the terminology of risk and the logic of public interventions in various fields of social risk, we gain a better understanding of the notion and of the issues that one faces when trying to model it. We further elaborate our analysis in the case of nuclear safety, examining in detail how methods and policies have been developed in this field and how they have evolved through time. This leads to a number of observations concerning risk and safety assessments. Generalising the concept of intervention in a Bayesian network allows us to develop a variety of causal Bayesian networks adapted to our needs. In this framework, we propose a definition of risk which seems to be relevant for a broad range of issues. We then offer simple applications of our model to specific aspects of the Fukushima accident and other nuclear safety problems. In addition to specific lessons, the analysis leads to the conclusion that a systematic approach for identifying uncertainties is needed in this area. When applied to decision theory, our tool evolves into a dynamic decision model in which acts cause consequences and are causally interconnected. The model provides a causal interpretation of Savage's conceptual framework, solves some of its paradoxes and clarifies certain aspects. It leads us to considering uncertainty with regard to a problem's causal structure as the source of ambiguity in decision-making, an interpretation which corresponds to a

  14. Dynamics of screw dislocations : a generalised minimising-movements scheme approach

    NARCIS (Netherlands)

    Bonaschi, G.A.; Meurs, van P.J.P.; Morandotti, M.

    2015-01-01

    The gradient flow structure of the model introduced in [CG99] for the dynamics of screw dislocations is investigated by means of a generalised minimising-movements scheme approach. The assumption of a finite number of available glide directions, together with the "maximal dissipation criterion" that

  15. Multi-Trial Guruswami–Sudan Decoding for Generalised Reed–Solomon Codes

    DEFF Research Database (Denmark)

    Nielsen, Johan Sebastian Rosenkilde; Zeh, Alexander

    2013-01-01

    An iterated refinement procedure for the Guruswami–Sudan list decoding algorithm for Generalised Reed–Solomon codes based on Alekhnovich’s module minimisation is proposed. The method is parametrisable and allows variants of the usual list decoding approach. In particular, finding the list...

  16. A retrospective study of carbamazepine therapy in the treatment of idiopathic generalised epilepsy

    LENUS (Irish Health Repository)

    O'Connor, G

    2011-05-01

    Objective: The exacerbation of idiopathic generalised epilepsy (IGE) by some anti-epileptic drugs (AEDs) such as carbamazepine (CBZ) has been well documented. However, it is unclear whether IGE is always worsened by the use of CBZ, or whether some patients with IGE benefit from its use.

  17. Generalisation of a 1:10k map from municipal data

    NARCIS (Netherlands)

    Van Altena, V.; Bakermans, J.; Lentjes, P.; Nijhuis, R.; Post, M.; Reuvers, M.; Stoter, J.E.

    2014-01-01

    This paper reports about the feasibility study carried out by the Dutch Kadaster to automatically generalise the largest scale topographical data set maintained by the Kadaster (i.e. TOP10NL) from the 1:1k topographical object oriented data set, which is currently being collected and structured by

  18. Passivation controller design for turbo-generators based on generalised Hamiltonian system theory

    NARCIS (Netherlands)

    Cao, M.; Shen, T.L.; Song, Y.H.

    2002-01-01

    A method of pre-feedback to formulate the generalised forced Hamiltonian system model for speed governor control systems is proposed. Furthermore, passivation controllers are designed based on the scheme of Hamiltonian structure for single machine infinite bus and multimachine power systems. In

  19. W-algebra symmetries of generalised Drinfel'd-Sokolov hierarchies

    International Nuclear Information System (INIS)

    Spence, B.

    1992-01-01

    Using the zero curvature formulation, it is shown that W-algebra transformations are symmetries of corresponding generalised Drinfel'd-Sokolov hierarchies. This result is illustrated with the examples of the KdV and Boussinesq hierarchies, and the hierarchy associated to the Polyakov-Bershadsky W-algebra. (orig.)

  20. Brief Report: Generalisation of Word-Picture Relations in Children with Autism and Typically Developing Children

    Science.gov (United States)

    Hartley, Calum; Allen, Melissa L.

    2014-01-01

    We investigated whether low-functioning children with autism generalise labels from colour photographs based on sameness of shape, colour, or both. Children with autism and language-matched controls were taught novel words paired with photographs of unfamiliar objects, and then sorted pictures and objects into two buckets according to whether or…

  1. Situational and Generalised Conduct Problems and Later Life Outcomes: Evidence from a New Zealand Birth Cohort

    Science.gov (United States)

    Fergusson, David M.; Boden, Joseph M.; Horwood, L. John

    2009-01-01

    Background: There is considerable evidence suggesting that many children show conduct problems that are specific to a given context (home; school). What is less well understood is the extent to which children with situation-specific conduct problems show similar outcomes to those with generalised conduct problems. Methods: Data were gathered as…

  2. Generalised Multi-sequence Shift-Register Synthesis using Module Minimisation

    DEFF Research Database (Denmark)

    Nielsen, Johan Sebastian Rosenkilde

    2013-01-01

    We show how to solve a generalised version of the Multi-sequence Linear Feedback Shift-Register (MLFSR) problem using minimisation of free modules over F[x]. We show how two existing algorithms for minimising such modules run particularly fast on these instances. Furthermore, we show how one...
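    For background, the single-sequence special case of the MLFSR problem is solved by the classic Berlekamp-Massey algorithm, which the module-minimisation approach of this record generalises to several sequences at once. Below is a standard GF(2) version for illustration only; the paper works with minimisation of free modules over F[x], not with this scalar algorithm:

```python
def berlekamp_massey(s):
    # Shortest LFSR over GF(2) generating the bit sequence s.
    # Returns (L, C): the LFSR length and the connection polynomial
    # coefficients C[0..L] with C[0] = 1.
    n_len = len(s)
    C = [1] + [0] * n_len    # current connection polynomial
    B = [1] + [0] * n_len    # copy from the last length change
    L, m = 0, 1              # LFSR length, steps since last length change
    for n in range(n_len):
        # Discrepancy between s[n] and the LFSR's prediction
        d = s[n]
        for i in range(1, L + 1):
            d ^= C[i] & s[n - i]
        if d == 1:
            T = C[:]
            for i in range(n_len - m + 1):   # C(x) += x^m * B(x)
                C[i + m] ^= B[i]
            if 2 * L <= n:                   # length change required
                L, B, m = n + 1 - L, T, 1
            else:
                m += 1
        else:
            m += 1
    return L, C[:L + 1]
```

    For the period-3 sequence 1, 1, 0, 1, 1, 0 the algorithm recovers the length-2 recurrence s_n = s_{n-1} + s_{n-2} (mod 2), i.e. connection polynomial 1 + x + x².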

  3. Generalised Partially Linear Regression with Misclassified Data and an Application to Labour Market Transitions

    DEFF Research Database (Denmark)

    Dlugosz, Stephan; Mammen, Enno; Wilke, Ralf

    We consider the semiparametric generalised linear regression model which has mainstream empirical models such as the (partially) linear mean regression, logistic and multinomial regression as special cases. As an extension to related literature we allow a misclassified covariate to be interacted...

  4. Modelling Problem-Solving Situations into Number Theory Tasks: The Route towards Generalisation

    Science.gov (United States)

    Papadopoulos, Ioannis; Iatridou, Maria

    2010-01-01

    This paper examines the way two 10th graders cope with a non-standard generalisation problem that involves elementary concepts of number theory (more specifically linear Diophantine equations) in the geometrical context of a rectangle's area. Emphasis is given on how the students' past experience of problem solving (expressed through interplay…

  5. Processing bias in children with separation anxiety disorder, social phobia and generalised anxiety disorder

    NARCIS (Netherlands)

    Kindt, M.; Bögels, S.M.; Morren, M.

    2003-01-01

    The present study examined processing bias in children suffering from anxiety disorders. Processing bias was assessed using the emotional Stroop task in clinically referred children with separation anxiety disorder (SAD), social phobia (SP), and/or generalised anxiety disorder (GAD) and normal

  6. Specificity of dysfunctional thinking in children with symptoms of social anxiety, separation anxiety and generalised anxiety

    NARCIS (Netherlands)

    Bogels, S.M.; Snieder, N.; Kindt, M.

    2003-01-01

    The present study investigated whether children with high symptom levels of either social phobia (SP), separation anxiety disorder (SAD), or generalised anxiety disorder (GAD) are characterised by a specific set of dysfunctional interpretations that are consistent with the cognitive model of their

  7. First-ever generalised tonic-clonic seizures in adults in the ...

    African Journals Online (AJOL)

    First-ever generalised tonic-clonic seizures in adults in the emergency room: Review of cranial computed tomography of 76 cases in a tertiary hospital in Benin-city, Nigeria. ... Clinical and CT diagnoses agreed only in 8.4% of the cases.

  8. [Epileptic seizures during childbirth in a patient with idiopathic generalised epilepsy]

    NARCIS (Netherlands)

    Voermans, N.C.; Zwarts, M.J.; Renier, W.O.; Bloem, B.R.

    2005-01-01

    During her first pregnancy, a 37-year-old woman with idiopathic generalised epilepsy that was adequately controlled with lamotrigine experienced a series of epileptic seizures following an elective caesarean section. The attacks were terminated with diazepam. The following day, she developed

  9. Quantitative modelling of HDPE spurt experiments using wall slip and generalised Newtonian flow

    NARCIS (Netherlands)

    Doelder, den C.F.J.; Koopmans, R.J.; Molenaar, J.

    1998-01-01

    A quantitative model to describe capillary rheometer experiments is presented. The model can generate 'two-branched' discontinuous flow curves and the associated pressure oscillations. Polymer compressibility in the barrel, incompressible axisymmetric generalised Newtonian flow in the die, and a

  10. Efficacy and safety of pregabalin in generalised anxiety disorder : A critical review of the literature

    NARCIS (Netherlands)

    Baldwin, David S.; den Boer, Johan A.; Lyndon, Gavin; Emir, Birol; Schweizer, Edward; Haswell, Hannah

    2015-01-01

    The aim of this review is to summarise the literature on the efficacy and safety of pregabalin for the treatment of generalised anxiety disorder (GAD). Of 241 literature citations, 13 clinical trials were identified that were specifically designed to evaluate the efficacy and safety of pregabalin in

  11. Issues in the Analysis of Focus Groups: Generalisability, Quantifiability, Treatment of Context and Quotations

    Science.gov (United States)

    Vicsek, Lilla

    2010-01-01

    In this paper I discuss some concerns related to the analysis of focus groups: (a) the issue of generalisation; (b) the problems of using numbers and quantifying in the analysis; (c) how the concrete situation of the focus groups could be included in the analysis, and (d) what formats can be used when quoting from focus groups. Problems with…

  12. An attempt to introduce dynamics into generalised exergy considerations

    International Nuclear Information System (INIS)

    Grubbstroem, Robert W.

    2007-01-01

    In previous research, the author developed a general abstract framework for the exergy content of a system of finite objects [Grubbstroem RW. Towards a generalized exergy concept. In: van Gool W, Bruggink JJC, editors. Energy and time in the economic and physical sciences. Amsterdam: North-Holland; 1985. p. 41-56]. Each such object is characterised by its initial extensive properties and has an inner energy written as a function of these properties. It was shown that if these objects were allowed to interact, there is a maximum amount of work that can be extracted from the system as a whole, and a general formula for this potential was provided. It was also shown that if one of the objects was allowed to be of infinite magnitude initially, taking on the role as an environment having constant intensive properties, then the formula provided took on the same form as the classical expression for exergy. As a side result, the theoretical considerations demonstrated that the second law of thermodynamics could be interpreted as the inner energy function being a (weakly) convex function of its arguments, when these are chosen as the extensive properties. Since exergy considerations are based on the principle that total entropy is conserved when extracting work, these processes would take an infinite time to complete. In the current paper, instead, a differential-equation approach is introduced to describe the interaction in finite time between given finite objects of a system. Differences in intensive properties between the objects provide a force enabling an exchange of energy and matter. An example of such an interaction is heat conduction. The resulting considerations explain how the power extracted from the system will be limited by the processes being required to perform within finite-time constraints. Applying finite-time processes, in which entropy necessarily is generated, leads to formulating a theory for a maximal power output from the system. It is shown that

  13. The uncertainties in estimating measurement uncertainties

    International Nuclear Information System (INIS)

    Clark, J.P.; Shull, A.H.

    1994-01-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper will discuss the concepts of measurement, measurement errors (accuracy or bias and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample whose matrix differs significantly from that of the calibration standards). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement itself. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements are reviewed in this report and recommendations made for improving measurement uncertainties
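    The quadrature (root-sum-square) combination of independent standard uncertainty components, as recommended by guides such as the GUM, can be sketched as follows. This is a minimal generic illustration; the component values are hypothetical and not taken from the paper:

    ```python
    import math

    def combined_standard_uncertainty(components):
        """Combine independent standard uncertainty components in quadrature
        (root-sum-square), as in the GUM for uncorrelated inputs."""
        return math.sqrt(sum(u ** 2 for u in components))

    # Hypothetical components: bias estimate, random error, standard's uncertainty
    u_c = combined_standard_uncertainty([0.3, 0.4, 0.0])
    print(u_c)  # ≈ 0.5
    ```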

  14. Variational principles

    CERN Document Server

    Moiseiwitsch, B L

    2004-01-01

    This graduate-level text's primary objective is to demonstrate the expression of the equations of the various branches of mathematical physics in the succinct and elegant form of variational principles (and thereby illuminate their interrelationship). Its related intentions are to show how variational principles may be employed to determine the discrete eigenvalues for stationary state problems and to illustrate how to find the values of quantities (such as the phase shifts) that arise in the theory of scattering. Chapter-by-chapter treatment consists of analytical dynamics; optics, wave mecha

  15. Quantification of uncertainties of modeling and simulation

    International Nuclear Information System (INIS)

    Ma Zhibo; Yin Jianwei

    2012-01-01

    The principles of Modeling and Simulation (M and S) are interpreted by a functional relation, from which the total uncertainties of M and S are identified and sorted into three parts considered to vary along with the conceptual models' parameters. Following the idea of verification and validation, the parameter space is partitioned into verified and applied domains; uncertainties in the verified domain are quantified by comparison between numerical and standard results, and those in the applied domain are quantified by a newly developed extrapolation method. Examples are presented to demonstrate and qualify the ideas, aimed at building a framework to quantify the uncertainties of M and S. (authors)

  16. Safety Principles

    Directory of Open Access Journals (Sweden)

    V. A. Grinenko

    2011-06-01

    Full Text Available The material in this article is arranged so that the reader can form a complete picture of the concept of "safety", its intrinsic characteristics and the possibilities for its formalization. Principles and possible safety strategies are considered. The article is intended for experts working on problems of safety.

  17. Maquet principle

    Energy Technology Data Exchange (ETDEWEB)

    Levine, R.B.; Stassi, J.; Karasick, D.

    1985-04-01

    Anterior displacement of the tibial tubercle is a well-accepted orthopedic procedure in the treatment of certain patellofemoral disorders. The radiologic appearance of surgical procedures utilizing the Maquet principle has not been described in the radiologic literature. Familiarity with the physiologic and biomechanical basis for the procedure and its postoperative appearance is necessary for appropriate roentgenographic evaluation and the radiographic recognition of complications.

  18. Cosmological principle

    International Nuclear Information System (INIS)

    Wesson, P.S.

    1979-01-01

    The Cosmological Principle states: the universe looks the same to all observers regardless of where they are located. To most astronomers today the Cosmological Principle means the universe looks the same to all observers because the density of the galaxies is the same in all places. A new Cosmological Principle is proposed, called the Dimensional Cosmological Principle. It uses the properties of matter in the universe: density (ρ), pressure (p), and mass (m) within some region of space of length (l). The laws of physics require incorporation of constants for gravity (G) and the speed of light (c). After combining the six parameters into dimensionless numbers, the best choices are: 8πGl²ρ/c², 8πGl²p/c⁴, and 2Gm/c²l (the Schwarzschild factor). The Dimensional Cosmological Principle came about because old ideas conflicted with the rapidly-growing body of observational evidence indicating that galaxies in the universe have a clumpy rather than uniform distribution
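    The third dimensionless number can be illustrated numerically. A small sketch, using standard solar mass and radius values purely for illustration (not drawn from the paper):

    ```python
    G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8    # speed of light, m/s

    def schwarzschild_factor(m, l):
        """Dimensionless Schwarzschild factor 2Gm/(c^2 l)."""
        return 2.0 * G * m / (c ** 2 * l)

    # Illustration: the Sun's mass within a region the size of the solar radius
    print(schwarzschild_factor(1.989e30, 6.957e8))  # ~4.2e-6, far below the black-hole value of 1
    ```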

  19. Extracting drug mechanism and pharmacodynamic information from clinical electroencephalographic data using generalised semi-linear canonical correlation analysis

    International Nuclear Information System (INIS)

    Brain, P; Strimenopoulou, F; Ivarsson, M; Wilson, F J; Diukova, A; Wise, R G; Berry, E; Jolly, A; Hall, J E

    2014-01-01

    Conventional analysis of clinical resting electroencephalography (EEG) recordings typically involves assessment of spectral power in pre-defined frequency bands at specific electrodes. EEG is a potentially useful technique in drug development for measuring the pharmacodynamic (PD) effects of a centrally acting compound and hence to assess the likelihood of success of a novel drug based on pharmacokinetic–pharmacodynamic (PK–PD) principles. However, the need to define the electrodes and spectral bands to be analysed a priori is limiting where the nature of the drug-induced EEG effects is initially not known. We describe the extension to human EEG data of a generalised semi-linear canonical correlation analysis (GSLCCA), developed for small animal data. GSLCCA uses data from the whole spectrum, the entire recording duration and multiple electrodes. It provides interpretable information on the mechanism of drug action and a PD measure suitable for use in PK–PD modelling. Data from a study with low (analgesic) doses of the μ-opioid agonist, remifentanil, in 12 healthy subjects were analysed using conventional spectral edge analysis and GSLCCA. At this low dose, the conventional analysis was unsuccessful but plausible results consistent with previous observations were obtained using GSLCCA, confirming that GSLCCA can be successfully applied to clinical EEG data. (paper)

  20. Large-uncertainty intelligent states for angular momentum and angle

    International Nuclear Information System (INIS)

    Goette, Joerg B; Zambrini, Roberta; Franke-Arnold, Sonja; Barnett, Stephen M

    2005-01-01

    The equality in the uncertainty principle for linear momentum and position is obtained for states which also minimize the uncertainty product. However, in the uncertainty relation for angular momentum and angular position both sides of the inequality are state dependent and therefore the intelligent states, which satisfy the equality, do not necessarily give a minimum for the uncertainty product. In this paper, we highlight the difference between intelligent states and minimum uncertainty states by investigating a class of intelligent states which obey the equality in the angular uncertainty relation while having an arbitrarily large uncertainty product. To develop an understanding for the uncertainties of angle and angular momentum for the large-uncertainty intelligent states we compare exact solutions with analytical approximations in two limiting cases
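    The state-dependent angular relation referred to here is commonly written in the following standard form (due to Barnett and Pegg; quoted for orientation, with $P(\theta_0)$ the angular probability density at the boundary of the chosen $2\pi$ window):

    ```latex
    \Delta L_z \, \Delta \phi \;\geq\; \frac{\hbar}{2}\,\bigl|\,1 - 2\pi P(\theta_0)\,\bigr|
    ```

    Both sides depend on the state through $P(\theta_0)$, which is why equality (an "intelligent" state) need not coincide with a minimum of the product $\Delta L_z \, \Delta \phi$.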

  1. Uncertainty for Part Density Determination: An Update

    Energy Technology Data Exchange (ETDEWEB)

    Valdez, Mario Orlando [Los Alamos National Laboratory

    2016-12-14

    Accurate and precise density measurement by hydrostatic weighing requires the use of an analytical balance, configured with a suspension system, to measure the weight of a part both in water and in air. Additionally, the densities of these fluid media (water and air) must be precisely known for the part density determination. To validate the accuracy and precision of these measurements, uncertainty statements are required. The work in this report is a revision of an original report written more than a decade ago, specifically applying principles and guidelines suggested by the Guide to the Expression of Uncertainty in Measurement (GUM) for determining the part density uncertainty through sensitivity analysis. In this work, updated derivations are provided; an original example is revised with the updated derivations; and an appendix is provided, devoted solely to uncertainty evaluation using Monte Carlo techniques, specifically the NIST Uncertainty Machine, as a viable alternative method.
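    The Monte Carlo alternative mentioned at the end can be sketched as follows. This is a generic illustration of the Archimedes-based hydrostatic density formula with hypothetical weights and standard uncertainties, not the report's actual derivation or data:

    ```python
    import random
    import statistics

    def part_density(w_air, w_water, rho_water=997.0, rho_air=1.2):
        """Density (kg/m^3) from hydrostatic weighing via Archimedes' principle."""
        return w_air * (rho_water - rho_air) / (w_air - w_water) + rho_air

    # Monte Carlo propagation: sample each measured weight from an assumed
    # normal distribution (means and standard uncertainties are hypothetical)
    random.seed(0)
    samples = [part_density(random.gauss(100.0, 0.05), random.gauss(60.0, 0.05))
               for _ in range(50_000)]
    print(statistics.mean(samples))   # central density estimate
    print(statistics.stdev(samples))  # Monte Carlo standard uncertainty
    ```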

  2. Uncertainty Relations and Possible Experience

    Directory of Open Access Journals (Sweden)

    Gregg Jaeger

    2016-06-01

    Full Text Available The uncertainty principle can be understood as a condition of joint indeterminacy of classes of properties in quantum theory. The mathematical expressions most closely associated with this principle have been the uncertainty relations, various inequalities exemplified by the well-known expression regarding position and momentum introduced by Heisenberg. Here, recent work involving a new sort of "logical" indeterminacy principle and associated relations introduced by Pitowsky, expressible directly in terms of probabilities of outcomes of measurements of sharp quantum observables, is reviewed and its quantum nature is discussed. These novel relations are derivable from Boolean "conditions of possible experience" of the quantum realm and have been considered both as fundamentally logical and as fundamentally geometrical. This work focuses on the relationship of indeterminacy to the propositions regarding the values of discrete, sharp observables of quantum systems. Here, reasons for favoring each of these two positions are considered. Finally, with an eye toward future research related to indeterminacy relations, further novel approaches grounded in category theory and intended to capture and reconceptualize the complementarity characteristics of quantum propositions are discussed in relation to the former.
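    For reference, the position-momentum relation mentioned above takes the standard textbook form (not specific to Pitowsky's relations):

    ```latex
    \sigma_x \, \sigma_p \;\geq\; \frac{\hbar}{2}
    ```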

  3. Uncertainty in social dilemmas

    OpenAIRE

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size uncertainty). Several researchers have therefore asked themselves the question as to how such uncertainty influences people’s choice behavior. These researchers have repeatedly concluded that uncertainty...

  4. Uncertainty vs. Information (Invited)

    Science.gov (United States)

    Nearing, Grey

    2017-04-01

    Information theory is the branch of logic that describes how rational epistemic states evolve in the presence of empirical data (Knuth, 2005), and any logic of science is incomplete without such a theory. Developing a formal philosophy of science that recognizes this fact results in essentially trivial solutions to several longstanding problems that are generally considered intractable, including: • Alleviating the need for any likelihood function or error model. • Derivation of purely logical falsification criteria for hypothesis testing. • Specification of a general quantitative method for process-level model diagnostics. More generally, I make the following arguments: 1. Model evaluation should not proceed by quantifying and/or reducing error or uncertainty, and instead should be approached as a problem of ensuring that our models contain as much information as our experimental data. I propose that the latter is the only question a scientist actually has the ability to ask. 2. Instead of building geophysical models as solutions to differential equations that represent conservation laws, we should build models as maximum entropy distributions constrained by conservation symmetries. This will allow us to derive predictive probabilities directly from first principles. Knuth, K. H. (2005) 'Lattice duality: The origin of probability and entropy', Neurocomputing, 67, pp. 245-274.

  5. Quantum Uncertainty and Fundamental Interactions

    Directory of Open Access Journals (Sweden)

    Tosto S.

    2013-04-01

    Full Text Available The paper proposes a simplified theoretical approach to infer some essential concepts on the fundamental interactions between charged particles and their relative strengths at comparable energies by exploiting the quantum uncertainty only. The worth of the present approach lies in the way of obtaining the results, rather than in the results themselves: concepts today acknowledged as fingerprints of the electroweak and strong interactions appear indeed rooted in the same theoretical frame, which also includes the basic principles of special and general relativity along with the gravity force.

  6. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport-related benefits and underestimating...... to supplement Optimism Bias and the associated Reference Class Forecasting (RCF) technique with a new technique that makes use of a scenario-grid. We tentatively introduce and refer to this as Reference Scenario Forecasting (RSF). The final RSF output from the CBA-DK model consists of a set of scenario......-based graphs which function as risk-related decision support for the appraised transport infrastructure project....

  7. General principles of quantum mechanics

    International Nuclear Information System (INIS)

    Pauli, W.

    1980-01-01

    This book is a textbook for a course in quantum mechanics. Starting from complementarity and the uncertainty principle, Schroedinger's equation is introduced together with the operator calculus. Then stationary states are treated as eigenvalue problems. Furthermore, matrix mechanics is briefly discussed. Thereafter the theory of measurements is considered. Then, as approximation methods, perturbation theory and the WKB approximation are introduced. Then identical particles, spin, and the exclusion principle are discussed. Thereafter the semiclassical theory of radiation and the relativistic one-particle problem are discussed. Finally an introduction is given into quantum electrodynamics. (HSI)

  8. New Inequalities and Uncertainty Relations on Linear Canonical Transform Revisit

    Directory of Open Access Journals (Sweden)

    Xu Guanlei

    2009-01-01

    Full Text Available The uncertainty principle plays an important role in mathematics, physics, signal processing, and so on. Firstly, based on the definition of the linear canonical transform (LCT) and the traditional Pitt's inequality, one novel Pitt's inequality in the LCT domains is obtained, which is connected with the LCT parameters a and b. Then one novel logarithmic uncertainty principle is derived from this novel Pitt's inequality in the LCT domains, which is associated with the parameters of the two LCTs. Secondly, from the relation between the original function and its LCT, one entropic uncertainty principle and one Heisenberg's uncertainty principle in the LCT domains are derived, which are associated with the LCT parameters a and b. The reason why the three lower bounds are only associated with the LCT parameters a and b, and are independent of c and d, is presented. The results show it is possible that the bounds tend to zero.
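    One commonly quoted form of the Heisenberg-type bound in LCT domains, for a real-valued signal and an LCT with parameter matrix $(a, b; c, d)$, is (a standard result from the LCT literature, stated here for orientation only):

    ```latex
    \Delta x^{2} \, \Delta u^{2} \;\geq\; \frac{b^{2}}{4}
    ```

    which reduces to the classical bound in the Fourier case $(a,b;c,d) = (0,1;-1,0)$ and illustrates why such bounds can depend on $b$ but not on $c$ or $d$.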

  9. Generalised BRST symmetry and gaugeon formalism for perturbative quantum gravity: Novel observation

    International Nuclear Information System (INIS)

    Upadhyay, Sudhaker

    2014-01-01

    In this paper the novel features of the Yokoyama gaugeon formalism are stressed for the theory of perturbative quantum gravity in Einstein curved spacetime. The quantum gauge transformations for the theory of perturbative gravity are demonstrated in the framework of the gaugeon formalism. These quantum gauge transformations lead to a renormalised gauge parameter. Further, we analyse the BRST symmetric gaugeon formalism, which embeds the more acceptable Kugo–Ojima subsidiary condition. Further, the BRST symmetry is made finite and field-dependent. Remarkably, the Jacobian of the path integral under the finite and field-dependent BRST symmetry amounts to the exact gaugeon action in the effective theory of perturbative quantum gravity. -- Highlights: •We analyse perturbative gravity in the gaugeon formalism. •The generalisation of the BRST transformation is also studied in this context. •Within the generalised BRST framework we find the exact gaugeon modes in the theory

  10. A Game Theoretical Study of Generalised Trust and Reciprocation in Poland : I. Theory and Experimental Design

    Directory of Open Access Journals (Sweden)

    Urszula Markowska-Przybyła

    2014-01-01

    Full Text Available Although studies using experimental game theory have been carried out in various countries, no such major study has occurred in Poland. The study described here aims to investigate generalized trust and reciprocation among Polish students. In the literature, these traits are seen to be positively correlated with economic growth. Poland is regarded as the most successful post-soviet bloc country in transforming to a market economy but the level of generalised trust compared to other postcommunist countries is reported to be low. This study aims to see to what degree this reported level of generalised trust is visible amongst young Poles via experimental game theory, along with a questionnaire. The three games to be played have been described. Bayesian equilibria illustrating behavior observed in previous studies have been derived for two of these games and the experimental procedure has been described. (original abstract

  11. Collaborative care for panic disorder, generalised anxiety disorder and social phobia in general practice

    DEFF Research Database (Denmark)

    Curth, Nadja Kehler; Brinck-Claussen, Ursula Ødum; Davidsen, Annette Sofie

    2017-01-01

    such as cognitive behavioral therapy. A limited number of studies suggest that collaborative care has a positive effect on symptoms for people with anxiety disorders. However, most studies are carried out in the USA and none have reported results for social phobia or generalised anxiety disorder separately. Thus...... in this protocol and focus on panic disorder, generalised anxiety disorder and social phobia. The aim is to investigate whether treatment according to the Collabri model has a better effect than usual treatment on symptoms when provided to people with anxiety disorders. Methods: Three cluster-randomised, clinical...... practices located in the Capital Region of Denmark. For all trials, the primary outcome is anxiety symptoms (Beck Anxiety Inventory (BAI)) 6 months after baseline. Secondary outcomes include BAI after 15 months, depression symptoms (Beck Depression Inventory) after 6 months, level of psychosocial...

  12. Stability analysis and observational measurement in chameleonic generalised Brans-Dicke cosmology

    International Nuclear Information System (INIS)

    Farajollahi, Hossein; Salehi, Amin

    2011-01-01

    We investigate the dynamics of the chameleonic Generalised Brans-Dicke model in flat FRW cosmology. In a new approach, a framework to study stability and attractor solutions in the phase space for the model is developed by simultaneously best fitting the stability and model parameters with the observational data. The results show that for an accelerating universe the phantom crossing does not occur in the past and near future

  13. Aspects of string theory compactifications. D-brane statistics and generalised geometry

    Energy Technology Data Exchange (ETDEWEB)

    Gmeiner, F.

    2006-05-26

    In this thesis we investigate two different aspects of string theory compactifications. The first part deals with the issue of the huge amount of possible string vacua, known as the landscape. Concretely we investigate a specific well defined subset of type II orientifold compactifications. We develop the necessary tools to construct a very large set of consistent models and investigate their gauge sector on a statistical basis. In particular we analyse the frequency distributions of gauge groups and the possible amount of chiral matter for compactifications to six and four dimensions. In the phenomenologically relevant case of four-dimensional compactifications, special attention is paid to solutions with gauge groups that include those of the standard model, as well as Pati-Salam, SU(5) and flipped SU(5) models. Additionally we investigate the frequency distribution of coupling constants and correlations between the observables in the gauge sector. These results are compared with a recent study of Gepner models. Moreover, we elaborate on questions concerning the finiteness of the number of solutions and the computational complexity of the algorithm. In the second part of this thesis we consider a new mathematical framework, called generalised geometry, to describe the six-manifolds used in string theory compactifications. In particular, the formulation of T-duality and mirror symmetry for nonlinear topological sigma models is investigated. Therefore we provide a reformulation and extension of the known topological A- and B-models to the generalised framework. The action of mirror symmetry on topological D-branes in this setup is presented and the transformation of the boundary conditions is analysed. To extend the considerations to D-branes in type II string theory, we introduce the notion of generalised calibrations. We show that the known calibration conditions of supersymmetric branes in type IIA and IIB can be obtained as special cases. 
Finally we investigate

  14. Aspects of string theory compactifications. D-brane statistics and generalised geometry

    International Nuclear Information System (INIS)

    Gmeiner, F.

    2006-01-01

    In this thesis we investigate two different aspects of string theory compactifications. The first part deals with the issue of the huge amount of possible string vacua, known as the landscape. Concretely we investigate a specific well defined subset of type II orientifold compactifications. We develop the necessary tools to construct a very large set of consistent models and investigate their gauge sector on a statistical basis. In particular we analyse the frequency distributions of gauge groups and the possible amount of chiral matter for compactifications to six and four dimensions. In the phenomenologically relevant case of four-dimensional compactifications, special attention is paid to solutions with gauge groups that include those of the standard model, as well as Pati-Salam, SU(5) and flipped SU(5) models. Additionally we investigate the frequency distribution of coupling constants and correlations between the observables in the gauge sector. These results are compared with a recent study of Gepner models. Moreover, we elaborate on questions concerning the finiteness of the number of solutions and the computational complexity of the algorithm. In the second part of this thesis we consider a new mathematical framework, called generalised geometry, to describe the six-manifolds used in string theory compactifications. In particular, the formulation of T-duality and mirror symmetry for nonlinear topological sigma models is investigated. Therefore we provide a reformulation and extension of the known topological A- and B-models to the generalised framework. The action of mirror symmetry on topological D-branes in this setup is presented and the transformation of the boundary conditions is analysed. To extend the considerations to D-branes in type II string theory, we introduce the notion of generalised calibrations. We show that the known calibration conditions of supersymmetric branes in type IIA and IIB can be obtained as special cases. 
Finally we investigate

  15. A study of the one dimensional total generalised variation regularisation problem

    KAUST Repository

    Papafitsoros, Konstantinos

    2015-03-01

    © 2015 American Institute of Mathematical Sciences. In this paper we study the one dimensional second order total generalised variation regularisation (TGV) problem with L2 data fitting term. We examine the properties of this model and we calculate exact solutions using simple piecewise affine functions as data terms. We investigate how these solutions behave with respect to the TGV parameters and we verify our results using numerical experiments.
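    For orientation, the second-order TGV functional and the L2-fitted regularisation problem studied in this line of work are usually written as follows (standard definitions from the TGV literature, with Ω the one-dimensional domain, f the data, and α = (α₀, α₁) positive weights):

    ```latex
    \mathrm{TGV}_{\alpha}^{2}(u) \;=\; \min_{v}\; \alpha_{1} \int_{\Omega} |u' - v| \,\mathrm{d}x \;+\; \alpha_{0} \int_{\Omega} |v'| \,\mathrm{d}x,
    \qquad
    \min_{u}\; \tfrac{1}{2}\, \| u - f \|_{L^{2}(\Omega)}^{2} \;+\; \mathrm{TGV}_{\alpha}^{2}(u)
    ```

    The auxiliary field v lets the regulariser penalise deviations from piecewise affine behaviour rather than from piecewise constant behaviour, which is what distinguishes TGV from total variation.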

  16. A study of the one dimensional total generalised variation regularisation problem

    KAUST Repository

    Papafitsoros, Konstantinos; Bredies, Kristian

    2015-01-01

    © 2015 American Institute of Mathematical Sciences. In this paper we study the one dimensional second order total generalised variation regularisation (TGV) problem with L2 data fitting term. We examine the properties of this model and we calculate exact solutions using simple piecewise affine functions as data terms. We investigate how these solutions behave with respect to the TGV parameters and we verify our results using numerical experiments.

  17. The linear stability of the Schwarzschild solution to gravitational perturbations in the generalised wave gauge

    OpenAIRE

    Johnson, Thomas

    2018-01-01

    In a recent seminal paper \\cite{D--H--R} of Dafermos, Holzegel and Rodnianski the linear stability of the Schwarzschild family of black hole solutions to the Einstein vacuum equations was established by imposing a double null gauge. In this paper we shall prove that the Schwarzschild family is linearly stable as solutions to the Einstein vacuum equations by imposing instead a generalised wave gauge: all sufficiently regular solutions to the system of equations that result from linearising the...

  18. Zymography Principles.

    Science.gov (United States)

    Wilkesman, Jeff; Kurz, Liliana

    2017-01-01

    Zymography, the detection, identification, and even quantification of enzyme activity fractionated by gel electrophoresis, has received increasing attention in recent years, as revealed by the number of articles published. A number of enzymes, especially of clinical interest, are routinely detected by zymography. This introductory chapter reviews the major principles behind zymography. New advances of this method are focused mainly on two-dimensional zymography and transfer zymography, as will be explained in the rest of the chapters. Some general considerations for performing the experiments are outlined, as well as the major troubleshooting and safety issues necessary for correct development of the electrophoresis.

  19. Basic principles

    International Nuclear Information System (INIS)

    Wilson, P.D.

    1996-01-01

    Some basic explanations are given of the principles underlying the nuclear fuel cycle, starting with the physics of atomic and nuclear structure and continuing with nuclear energy and reactors, fuel and waste management and finally a discussion of economics and the future. An important aspect of the fuel cycle concerns the possibility of ''closing the back end'', i.e. reprocessing the waste or unused fuel in order to re-use it in reactors of various kinds. The alternative, the ''once-through'' cycle, discards the discharged fuel completely. An interim measure involves the prolonged storage of highly radioactive waste fuel. (UK)

  20. Role of information theoretic uncertainty relations in quantum theory

    Energy Technology Data Exchange (ETDEWEB)

    Jizba, Petr, E-mail: p.jizba@fjfi.cvut.cz [FNSPE, Czech Technical University in Prague, Břehová 7, 115 19 Praha 1 (Czech Republic); ITP, Freie Universität Berlin, Arnimallee 14, D-14195 Berlin (Germany); Dunningham, Jacob A., E-mail: J.Dunningham@sussex.ac.uk [Department of Physics and Astronomy, University of Sussex, Falmer, Brighton, BN1 9QH (United Kingdom); Joo, Jaewoo, E-mail: j.joo@surrey.ac.uk [Advanced Technology Institute and Department of Physics, University of Surrey, Guildford, GU2 7XH (United Kingdom)

    2015-04-15

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.
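
    For reference, the Shannon-entropy relation that such ITURs build on, together with its Rényi-entropy extension for conjugate indices, is usually stated (textbook Maassen-Uffink form, not quoted from this paper) as:

```latex
H(A) + H(B) \;\ge\; -2\log c,
\qquad
H_{\alpha}(A) + H_{\beta}(B) \;\ge\; -2\log c
\quad\text{for}\quad \frac{1}{\alpha}+\frac{1}{\beta}=2,
\qquad
c=\max_{i,j}\,|\langle a_i|b_j\rangle|,
```

    where the bound depends only on the maximal overlap c of the two observables' eigenbases, not on the state.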

  1. Role of information theoretic uncertainty relations in quantum theory

    International Nuclear Information System (INIS)

    Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo

    2015-01-01

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed

  2. Recent Fuzzy Generalisations of Rough Sets Theory: A Systematic Review and Methodological Critique of the Literature

    Directory of Open Access Journals (Sweden)

    Abbas Mardani

    2017-01-01

    Rough set theory has been used extensively in fields of complexity, cognitive sciences, and artificial intelligence, especially in numerous fields such as expert systems, knowledge discovery, information systems, inductive reasoning, intelligent systems, data mining, pattern recognition, decision-making, and machine learning. Recently proposed rough set models are developed by applying different fuzzy generalisations. Currently, there is no systematic literature review and classification of these new generalisations of rough set models. Therefore, in this review study, the attempt is made to provide a comprehensive systematic review of methodologies and applications of recent generalisations discussed in the area of fuzzy-rough set theory. For this purpose, the Web of Science database was chosen to select the relevant papers. Accordingly, the systematic review and meta-analysis approach known as "PRISMA" was applied and the selected articles were classified based on the author and year of publication, author nationalities, application field, type of study, study category, study contribution, and journal in which the articles appeared. Based on the results of this review, we found that there are many challenging issues related to the different application areas of fuzzy-rough set theory which can motivate future research studies.
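
    As background for the fuzzy generalisations the review surveys, the classical (crisp) Pawlak rough-set construction can be sketched in a few lines. This is a generic illustration, not code from the paper; the fuzzy variants replace the crisp equivalence classes below with fuzzy similarity relations.

```python
# Minimal sketch of Pawlak rough-set lower/upper approximations.

def equivalence_classes(universe, attr):
    """Group objects that are indiscernible under the attribute function."""
    classes = {}
    for x in universe:
        classes.setdefault(attr(x), set()).add(x)
    return list(classes.values())

def lower_approximation(universe, attr, target):
    """Union of indiscernibility classes wholly contained in the target set."""
    cs = [c for c in equivalence_classes(universe, attr) if c <= target]
    return set().union(*cs) if cs else set()

def upper_approximation(universe, attr, target):
    """Union of indiscernibility classes that intersect the target set."""
    cs = [c for c in equivalence_classes(universe, attr) if c & target]
    return set().union(*cs) if cs else set()

U = {1, 2, 3, 4, 5, 6}
parity = lambda x: x % 2        # objects indiscernible up to parity
X = {1, 2, 4}                   # a concept that parity cannot express exactly
print(lower_approximation(U, parity, X))  # empty: no class fits inside X
print(upper_approximation(U, parity, X))  # whole universe: both classes touch X
```

    The gap between the two approximations is exactly the "roughness" of the concept with respect to the available attributes.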

  3. Effect of lamotrigine on cerebral blood flow in patients with idiopathic generalised epilepsy

    Energy Technology Data Exchange (ETDEWEB)

    Joo, Eun Yeon [Ewha Womans University, Department of Neurology, College of Medicine, Seoul (Korea); Hong, Seung Bong; Tae, Woo Suk; Han, Sun Jung; Seo, Dae Won [Sungkyunkwan University School of Medicine, Department of Neurology, Samsung Medical Center and Center for Clinical Medicine, SBRI, Gangnam-Gu, Seoul (Korea); Lee, Kyung-Han [Sungkyunkwan University School of Medicine, Department of Nuclear Medicine, Samsung Medical Center and Center for Clinical Medicine, SBRI, Gangnam-Gu, Seoul (Korea); Lee, Mann Hyung [Catholic University of Daegu, College of Pharmacy, Gyeongbuk (Korea)

    2006-06-15

    The purpose of this study was to investigate the effects of the new anti-epileptic drug, lamotrigine, on cerebral blood flow by performing 99mTc-ethylcysteinate dimer (ECD) single-photon emission computed tomography (SPECT) before and after medication in patients with drug-naive idiopathic generalised epilepsy. Interictal 99mTc-ECD brain SPECT was performed before drug treatment started and then repeated after lamotrigine medication for 4-5 months in 30 patients with generalised epilepsy (M/F=14/16, 19.3±3.4 years). Seizure types were generalised tonic-clonic seizure in 23 patients and myoclonic seizures in seven. The mean lamotrigine dose used was 214.1±29.1 mg/day. For SPM analysis, all SPECT images were spatially normalised to the standard SPECT template and then smoothed using a 12-mm full-width at half-maximum Gaussian kernel. The paired t test was used to compare pre- and post-lamotrigine SPECT images. SPM analysis of pre- and post-lamotrigine brain SPECT images showed decreased perfusion in bilateral dorsomedial nuclei of thalami, bilateral uncus, right amygdala, left subcallosal gyrus, right superior and inferior frontal gyri, right precentral gyrus, bilateral superior and inferior temporal gyri and brainstem (pons, medulla) after lamotrigine medication at a false discovery rate-corrected p<0.05. No brain region showed increased perfusion after lamotrigine administration. (orig.)
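
    The two analysis steps named in the abstract, converting a 12-mm FWHM to a Gaussian smoothing width and comparing pre- vs post-drug values with a paired t-test, can be sketched as follows. This is a pure-Python toy version, not the SPM implementation, and the perfusion values are invented.

```python
import math

def fwhm_to_sigma(fwhm):
    """Smoothing kernels are quoted as FWHM; sigma = FWHM / (2*sqrt(2*ln 2))."""
    return fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))

def gaussian_smooth(signal, sigma):
    """1-D Gaussian smoothing with edge replication (toy stand-in for the
    3-D volumetric smoothing used in SPM)."""
    radius = max(1, int(3 * sigma))
    kern = [math.exp(-0.5 * (i / sigma) ** 2) for i in range(-radius, radius + 1)]
    norm = sum(kern)
    kern = [k / norm for k in kern]
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for offset, k in enumerate(kern, start=-radius):
            j = min(max(i + offset, 0), len(signal) - 1)
            acc += k * signal[j]
        out.append(acc)
    return out

def paired_t(pre, post):
    """Paired t statistic on the per-subject differences (pre - post)."""
    d = [a - b for a, b in zip(pre, post)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)
    return mean / math.sqrt(var / n)

pre = [50.0, 52.0, 51.0, 53.0]    # invented regional perfusion values
post = [48.0, 49.0, 50.0, 49.0]   # consistently lower after medication
print(round(paired_t(pre, post), 3))
```

    A large positive t at a voxel, surviving the false-discovery-rate correction, is what the study reports as decreased perfusion.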

  4. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

    The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and deficiencies in the understanding of the phenomena. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pre-test and post-test uncertainty.
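
    A standard way to combine independent error sources of the kind the report lists into a single instrument uncertainty is first-order root-sum-square propagation. The sensitivities and uncertainties below are invented for illustration and are not taken from the report.

```python
import math

def combined_uncertainty(terms):
    """u_y = sqrt(sum_i (df/dx_i * u_i)^2) for independent error sources,
    where each term is a (sensitivity, standard uncertainty) pair."""
    return math.sqrt(sum((sens * u) ** 2 for sens, u in terms))

# Example: P = V * I with V = 10 V +/- 0.1 V and I = 2 A +/- 0.05 A.
# Sensitivities: dP/dV = I = 2, dP/dI = V = 10.
u_P = combined_uncertainty([(2.0, 0.1), (10.0, 0.05)])
print(u_P)
```

    The same helper serves for both pre-test predictions (using expected instrument specifications) and post-test estimates (using observed calibration data).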

  5. Uncertainty in prediction and in inference

    International Nuclear Information System (INIS)

    Hilgevoord, J.; Uffink, J.

    1991-01-01

    The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in inference can be obtained by means of the so-called statistical distance between probability distributions. When applied to quantum mechanics, this distance leads to a measure of the distinguishability of quantum states, which essentially is the absolute value of the matrix element between the states. The importance of this result to the quantum mechanical uncertainty principle is noted. The second part of the paper provides a derivation of the statistical distance on the basis of the so-called method of support
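
    For discrete probability distributions, the statistical distance mentioned above is often written as the angle arccos of the Bhattacharyya overlap; identical distributions are at distance 0 and non-overlapping ones at π/2, mirroring the quantum-state distinguishability result in the abstract. A toy version (illustrative notation, not the paper's):

```python
import math

def statistical_distance(p, q):
    """Angle arccos(sum_i sqrt(p_i * q_i)) between two discrete
    distributions; the quantum analogue replaces the overlap by the
    absolute value of the matrix element between the states."""
    overlap = sum(math.sqrt(a * b) for a, b in zip(p, q))
    return math.acos(min(1.0, overlap))   # clamp against rounding error

print(statistical_distance([1.0, 0.0], [0.0, 1.0]))   # maximally distinguishable
print(statistical_distance([0.5, 0.5], [0.5, 0.5]))   # indistinguishable
```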

  6. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
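
    One of the propagation options the guide lists, Monte Carlo sampling of input uncertainties through a model, can be sketched in a few lines. The model and input values below are invented for illustration, not taken from the guide.

```python
import random
import statistics

def monte_carlo(model, input_dists, n=20000, seed=1):
    """Sample each input from N(mu, sd), push the samples through the
    model, and summarise the resulting output distribution."""
    rng = random.Random(seed)
    outs = [model(*(rng.gauss(mu, sd) for mu, sd in input_dists))
            for _ in range(n)]
    return statistics.mean(outs), statistics.stdev(outs)

# y = a * b with a ~ N(2, 0.1) and b ~ N(3, 0.2); first-order (series
# approximation) propagation predicts sd(y) ~ sqrt((3*0.1)^2 + (2*0.2)^2) = 0.5,
# so the Monte Carlo result can be cross-checked against it.
mean, sd = monte_carlo(lambda a, b: a * b, [(2.0, 0.1), (3.0, 0.2)])
print(mean, sd)
```

    Agreement between the sampled spread and the series approximation is one quick consistency check before the comparison with experiments that the guide requires for code validation.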

  7. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  8. Uncertainty and Cognitive Control

    Directory of Open Access Journals (Sweden)

    Faisal eMushtaq

    2011-10-01

    A growing trend of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we review evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  9. Quantum principles and particles

    CERN Document Server

    Wilcox, Walter

    2012-01-01

    QUANTUM PRINCIPLES: Perspective and Principles; Prelude to Quantum Mechanics; Stern-Gerlach Experiment; Idealized Stern-Gerlach Results; Classical Model Attempts; Wave Functions for Two Physical-Outcome Case; Process Diagrams, Operators, and Completeness; Further Properties of Operators/Modulation; Operator Reformulation; Operator Rotation; Bra-Ket Notation/Basis States; Transition Amplitudes; Three-Magnet Setup Example - Coherence; Hermitian Conjugation; Unitary Operators; A Very Special Operator; Matrix Representations; Matrix Wave Function Recovery; Expectation Values; Wrap-Up; Problems. Free Particles in One Dimension: Photoelectric Effect; Compton Effect; Uncertainty Relation for Photons; Stability of Ground States; Bohr Model; Fourier Transform and Uncertainty Relations; Schrödinger Equation; Schrödinger Equation Example; Dirac Delta Functions; Wave Functions and Probability; Probability Current; Time Separable Solutions; Completeness for Particle States; Particle Operator Properties; Operator Rules; Time Evolution and Expectation Values; Wrap-Up; Problems. Some One-Dimensional So...

  10. Using the generalised invariant formalism: a class of conformally flat pure radiation metrics with a negative cosmological constant

    Energy Technology Data Exchange (ETDEWEB)

    Edgar, S Brian [Department of Mathematics, Linkoepings Universitet Linkoeping, S-581 83 (Sweden); Ramos, M P Machado [Departamento de Matematica para a Ciencia e Tecnologia, Azurem 4800-058 Guimaraes, Universidade do Minho (Portugal)

    2007-05-15

    We demonstrate an integration procedure for the generalised invariant formalism by obtaining a subclass of conformally flat pure radiation spacetimes with a negative cosmological constant. The method used is a development of the methods used earlier for pure radiation spacetimes of Petrov types O and N respectively. This subclass of spacetimes turns out to have one degree of isotropy freedom, so in this paper we have extended the integration procedure for the generalised invariant formalism to spacetimes with isotropy freedom.

  11. Using the generalised invariant formalism: a class of conformally flat pure radiation metrics with a negative cosmological constant

    International Nuclear Information System (INIS)

    Edgar, S Brian; Ramos, M P Machado

    2007-01-01

    We demonstrate an integration procedure for the generalised invariant formalism by obtaining a subclass of conformally flat pure radiation spacetimes with a negative cosmological constant. The method used is a development of the methods used earlier for pure radiation spacetimes of Petrov types O and N respectively. This subclass of spacetimes turns out to have one degree of isotropy freedom, so in this paper we have extended the integration procedure for the generalised invariant formalism to spacetimes with isotropy freedom.

  12. Cosmological implications of Heisenberg's principle

    CERN Document Server

    Gonzalo, Julio A

    2015-01-01

    The aim of this book is to analyze the all-important implications of Heisenberg's Uncertainty Principle for a finite universe with very large mass-energy content such as ours. The earlier and main contributors to the formulation of Quantum Mechanics are briefly reviewed regarding the formulation of Heisenberg's Principle. After discussing "indeterminacy" versus "uncertainty", the universal constants of physics are reviewed and Planck's units are given. Next, a novel set of units, Heisenberg-Lemaitre units, are defined in terms of the large finite mass of the universe. With the help of Heisenberg's principle, the time evolution of the finite zero-point energy for the universe is investigated quantitatively. Next, taking advantage of the rigorous solutions of Einstein's cosmological equation for a flat, open and mixed universe of finite mass, the most recent and accurate data on the "age" (t0) and the expansion rate (H0) of the universe and their implications are reconsidered.

  13. Resolving uncertainty in chemical speciation determinations

    Science.gov (United States)

    Smith, D. Scott; Adams, Nicholas W. H.; Kramer, James R.

    1999-10-01

    Speciation determinations involve uncertainty in system definition and experimentation. Identification of appropriate metals and ligands from basic chemical principles, analytical window considerations, types of species and checking for consistency in equilibrium calculations are considered in system definition uncertainty. A systematic approach to system definition limits uncertainty in speciation investigations. Experimental uncertainty is discussed with an example of proton interactions with Suwannee River fulvic acid (SRFA). A Monte Carlo approach was used to estimate uncertainty in experimental data, resulting from the propagation of uncertainties in electrode calibration parameters and experimental data points. Monte Carlo simulations revealed large uncertainties present at high (>9-10) and low (monoprotic ligands. Least-squares fit the data with 21 sites, whereas linear programming fit the data equally well with 9 sites. Multiresponse fitting, involving simultaneous fluorescence and pH measurements, improved model discrimination. Deconvolution of the excitation versus emission fluorescence surface for SRFA establishes a minimum of five sites. Diprotic sites are also required for the five fluorescent sites, and one non-fluorescent monoprotic site was added to accommodate the pH data. Consistent with greater complexity, the multiresponse method had broader confidence limits than the uniresponse methods, but corresponded better with the accepted total carboxylic content for SRFA. Overall there was a 40% standard deviation in total carboxylic content for the multiresponse fitting, versus 10% and 1% for least-squares and linear programming, respectively.

  14. Objective Bayesianism and the Maximum Entropy Principle

    Directory of Open Access Journals (Sweden)

    Jon Williamson

    2013-09-01

    Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities; they should be calibrated to our evidence of physical probabilities; and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be the probability function, among all those calibrated to evidence, that has maximum entropy. However, the three norms of objective Bayesianism are usually justified in different ways. In this paper, we show that the three norms can all be subsumed under a single justification in terms of minimising worst-case expected loss. This, in turn, is equivalent to maximising a generalised notion of entropy. We suggest that requiring language invariance, in addition to minimising worst-case expected loss, motivates maximisation of standard entropy as opposed to maximisation of other instances of generalised entropy. Our argument also provides a qualified justification for updating degrees of belief by Bayesian conditionalisation. However, conditional probabilities play a less central part in the objective Bayesian account than they do under the subjective view of Bayesianism, leading to a reduced role for Bayes' Theorem.
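
    The maximum entropy principle invoked above has a classic toy instance, Jaynes' dice: among all distributions on {1..6} with a prescribed mean, the entropy maximiser has Gibbs form p_i proportional to exp(lam * i). The sketch below solves for lam by bisection; it illustrates the principle only and is not the paper's loss-minimisation argument.

```python
import math

def maxent_dice(target_mean, lo=-10.0, hi=10.0, tol=1e-12):
    """Max-entropy distribution on faces 1..6 with a fixed mean."""
    faces = range(1, 7)

    def mean_for(lam):
        weights = [math.exp(lam * x) for x in faces]
        z = sum(weights)
        return sum(x * w for x, w in zip(faces, weights)) / z

    while hi - lo > tol:                 # mean_for is increasing in lam
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    weights = [math.exp(lam * x) for x in faces]
    z = sum(weights)
    return [w / z for w in weights]

print(maxent_dice(4.5))   # skewed toward high faces
print(maxent_dice(3.5))   # mean 3.5 recovers the equivocating uniform distribution
```

    The unconstrained case (mean 3.5) returning the uniform distribution is exactly the "equivocation" norm in miniature.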

  15. DS02 uncertainty analysis

    International Nuclear Information System (INIS)

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  16. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  17. Mapping shape to visuomotor mapping: learning and generalisation of sensorimotor behaviour based on contextual information.

    Directory of Open Access Journals (Sweden)

    Loes C J van Dam

    2015-03-01

    Humans can learn and store multiple visuomotor mappings (dual-adaptation) when feedback for each is provided alternately. Moreover, learned context cues associated with each mapping can be used to switch between the stored mappings. However, little is known about the associative learning between cue and required visuomotor mapping, and how learning generalises to novel but similar conditions. To investigate these questions, participants performed a rapid target-pointing task while we manipulated the offset between visual feedback and movement end-points. The visual feedback was presented with horizontal offsets of different amounts, dependent on the target's shape. Participants thus needed to use different visuomotor mappings between target location and required motor response depending on the target shape in order to "hit" it. The target shapes were taken from a continuous set of shapes, morphed between spiky and circular shapes. After training we tested participants' performance, without feedback, on different target shapes that had not been learned previously. We compared two hypotheses. First, we hypothesised that participants could (explicitly) extract the linear relationship between target shape and visuomotor mapping and generalise accordingly. Second, using previous findings of visuomotor learning, we developed an (implicit) Bayesian learning model that predicts generalisation that is more consistent with categorisation (i.e. use one mapping or the other). The experimental results show that, although learning the associations requires explicit awareness of the cues' role, participants apply the mapping corresponding to the trained shape that is most similar to the current one, consistent with the Bayesian learning model. Furthermore, the Bayesian learning model predicts that learning should slow down with increased numbers of training pairs, which was confirmed by the present results. In short, we found a good correspondence between the

  18. Ascertaining the uncertainty relations via quantum correlations

    International Nuclear Information System (INIS)

    Li, Jun-Li; Du, Kun; Qiao, Cong-Feng

    2014-01-01

    We propose a new scheme to express the uncertainty principle in the form of an inequality of the bipartite correlation functions for a given multipartite state, which provides an experimentally feasible and model-independent way to verify various uncertainty and measurement disturbance relations. By virtue of this scheme, the implementation of experimental measurement of the measurement disturbance relation for a variety of physical systems becomes practical. The inequality, in turn, also imposes a constraint on the strength of correlation, i.e. it determines the maximum value of the correlation function for a two-body system and a monogamy relation of the bipartite correlation functions for a multipartite system. (paper)
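
    The textbook relation that such correlation-function inequalities re-express is the Robertson bound (standard form, assumed here rather than quoted from the paper):

```latex
\Delta A\,\Delta B \;\ge\; \frac{1}{2}\,\bigl|\langle [A,B]\rangle\bigr|,
```

    where ΔA, ΔB are the standard deviations of the two observables in the given state and [A,B] is their commutator.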

  19. The Heisenberg Uncertainty Principle Demonstrated with An Electron Diffraction Experiment

    Science.gov (United States)

    Matteucci, Giorgio; Ferrari, Loris; Migliori, Andrea

    2010-01-01

    An experiment analogous to the classical diffraction of light from a circular aperture has been realized with electrons. The results are used to introduce undergraduate students to the wave behaviour of electrons. The diffraction fringes produced by the circular aperture are compared to those predicted by quantum mechanics and are exploited to…

  20. Uncertainty principle and the stable interpretation of spectrometric experiment results

    International Nuclear Information System (INIS)

    Zhukovskij, E.L.

    1984-01-01

    Two stable forms of the least-squares method, used for the evaluation of parameters during automated processing and interpretation of spectra of various types, were derived on the basis of the Cramér-Rao inequality. Spectra described by linear equations are considered, for which the parameter estimates are given in closed form. It is shown that the suggested form of the interpreting functional is maintained for spectra of different nature (NMR-, IR-, UV-, RS- and mass-spectra), whose parameters depend nonlinearly on the wave number.

  1. Uncertainty principle and informational entropy for partially coherent light

    NARCIS (Netherlands)

    Bastiaans, M.J.

    1986-01-01

    It is shown that, among all partially coherent wave fields having the same informational entropy, the product of the effective widths of the intensity functions in the space and the spatial-frequency domains takes its minimum value for a wave field with a Gaussian-shaped cross-spectral density.

  2. Heisenberg, Matrix Mechanics, and the Uncertainty Principle 4-6 ...

    Indian Academy of Sciences (India)

    simple example of a point particle that is free to move on a line. An observable in this ... a continuous infinity of values, in contrast to a discrete infinite set of values such as 1, 2, 3, 4, ... ... presentation speech by H Pleijel, "Your quantum ...

  3. Uncertainty in artificial intelligence

    CERN Document Server

    Kanal, LN

    1986-01-01

    How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy.Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.
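
    One of the numeric calculi debated in the volume, the Dempster-Shafer theory, combines evidence by Dempster's rule. The sketch below is a minimal generic implementation with an invented two-hypothesis frame; mass functions map frozenset focal elements to masses.

```python
def dempster_combine(m1, m2):
    """Dempster's rule: multiply masses over intersecting focal elements
    and renormalise away the conflicting (empty-intersection) mass."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Invented example: two sources over the frame {rain, sun}. Mass on the
# full frame {rain, sun} represents ignorance, the feature that
# distinguishes belief functions from a single probability distribution.
m1 = {frozenset({"rain"}): 0.6, frozenset({"rain", "sun"}): 0.4}
m2 = {frozenset({"rain"}): 0.5, frozenset({"rain", "sun"}): 0.5}
print(dempster_combine(m1, m2))
```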

  4. Uncertainties in hydrogen combustion

    International Nuclear Information System (INIS)

    Stamps, D.W.; Wong, C.C.; Nelson, L.S.

    1988-01-01

    Three important areas of hydrogen combustion with uncertainties are identified: high-temperature combustion, flame acceleration and deflagration-to-detonation transition, and aerosol resuspension during hydrogen combustion. The uncertainties associated with high-temperature combustion may affect at least three different accident scenarios: the in-cavity oxidation of combustible gases produced by core-concrete interactions, the direct containment heating hydrogen problem, and the possibility of local detonations. How these uncertainties may affect the sequence of various accident scenarios is discussed and recommendations are made to reduce these uncertainties. 40 references

  5. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
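
    Two of the signatures named above, the runoff ratio and a point on the flow duration curve, together with the Monte Carlo style of uncertainty estimate the study proposes, can be sketched as follows. The 10% multiplicative flow-error model and all numbers are illustrative assumptions, not values from the study.

```python
import random
import statistics

def runoff_ratio(flow, rain):
    """Signature: total streamflow divided by total rainfall (same units)."""
    return sum(flow) / sum(rain)

def q_exceeded(flows, pct):
    """Flow exceeded pct% of the time (empirical flow duration curve)."""
    s = sorted(flows, reverse=True)
    i = min(len(s) - 1, int(pct / 100.0 * len(s)))
    return s[i]

def signature_uncertainty(flows, rain, rel_sd=0.10, n=5000, seed=42):
    """Monte Carlo spread of the runoff ratio under an assumed
    multiplicative flow-measurement error (here 10%, illustrative)."""
    rng = random.Random(seed)
    vals = [runoff_ratio([f * rng.gauss(1.0, rel_sd) for f in flows], rain)
            for _ in range(n)]
    return statistics.mean(vals), statistics.stdev(vals)

flows = [1.0, 2.0, 3.0]   # invented daily flows (mm)
rain = [2.0, 4.0, 6.0]    # invented daily rainfall (mm)
print(runoff_ratio(flows, rain))
m, s = signature_uncertainty(flows, rain)
print(m, s)
```

    The sampled standard deviation is the signature uncertainty; in the study the perturbations would instead come from the rating-curve and gauge-density error models.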

  6. A generalised Green-Julg theorem for proper groupoids and Banach algebras

    OpenAIRE

    Paravicini, Walther

    2009-01-01

    The Green-Julg theorem states that K_0^G(B) is isomorphic to K_0(L^1(G,B)) for every compact group G and every G-C*-algebra B. We formulate a generalisation of this result to proper groupoids and Banach algebras and deduce that the Bost assembly map is surjective for proper Banach algebras. On the way, we show that the spectral radius of an element in a C_0(X)-Banach algebra can be calculated from the spectral radius in the fibres.
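
    The last statement concerns the spectral radius in a Banach algebra, which by Gelfand's formula equals the limit of ||a^n||^(1/n). A small numerical sketch illustrates the formula in the matrix algebra M_2(R) with the row-sum operator norm; the matrix is an assumed example, not anything from the paper.

```python
import math

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def norm(A):
    # Maximum absolute row sum: a submultiplicative (Banach-algebra) norm
    return max(sum(abs(x) for x in row) for row in A)

def gelfand_radius(A, steps=60):
    """Estimate the spectral radius via Gelfand's formula r(A) = lim ||A^n||^(1/n)."""
    P = A
    for n in range(1, steps + 1):
        if n == steps:
            return norm(P) ** (1.0 / n)
        P = matmul(P, A)

A = [[0.0, 1.0],
     [-2.0, -3.0]]   # eigenvalues -1 and -2, so the true spectral radius is 2

print(gelfand_radius(A))
```

    The iterates converge slowly (as n-th roots do), but already at n = 60 the estimate is within a few percent of the true spectral radius, 2.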

  7. Utility of natural generalised inverse technique in the interpretation of dyke structures

    Digital Repository Service at National Institute of Oceanography (India)

    Rao, M.M.M.; Murty, T.V.R.; Rao, P.R.; Lakshminarayana, S.; Subrahmanyam, A.S.; Murthy, K.S.R.


  8. Generalised Category Attack—Improving Histogram-Based Attack on JPEG LSB Embedding

    Science.gov (United States)

    Lee, Kwangsoo; Westfeld, Andreas; Lee, Sangjin

    We present a generalised and improved version of the category attack on LSB steganography in JPEG images with a straddled embedding path. It detects low embedding rates more reliably and is less disturbed by double-compressed images. The proposed methods are evaluated on several thousand images, and the results are compared to both recent blind and specific attacks on JPEG embedding. The proposed attack permits more reliable detection, although it is based on first-order statistics only. Its simple structure makes it very fast.
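
    A toy sketch of the first-order statistic that such histogram attacks build on. This illustrates the general idea only, not the authors' generalised category attack; the cover model, the embedding routine and the count threshold are all assumptions made for the example.

```python
import random
from collections import Counter

random.seed(7)

def lsb_embed(values, rate):
    """Overwrite the least significant bit of a fraction `rate` of the values
    with random message bits (a toy stand-in for JPEG LSB embedding)."""
    out = []
    for v in values:
        if random.random() < rate:
            v = (v & ~1) | random.getrandbits(1)
        out.append(v)
    return out

def pair_imbalance(values, min_count=500):
    """First-order statistic: mean relative imbalance of the histogram bin
    pairs (2k, 2k+1), the pairs that LSB embedding tends to equalise."""
    hist = Counter(values)
    scores = []
    for k in range(128):
        a, b = hist[2 * k], hist[2 * k + 1]
        if a + b >= min_count:
            scores.append(abs(a - b) / (a + b))
    return sum(scores) / len(scores)

# A cover signal with a skewed histogram, so adjacent bins naturally differ
cover = [min(255, int(random.expovariate(0.2))) for _ in range(20000)]

clean = pair_imbalance(cover)
stego = pair_imbalance(lsb_embed(cover, rate=1.0))
print(round(clean, 3), round(stego, 3))
```

    Because LSB embedding flips values only within the pairs (2k, 2k+1), embedding drives the paired bins towards equality, which a detector can measure from the histogram alone.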

  9. Generalised universality of gauge thresholds in heterotic vacua with and without supersymmetry

    CERN Document Server

    Angelantonj, Carlo; Tsulaia, Mirian

    2015-01-01

    We study one-loop quantum corrections to gauge couplings in heterotic vacua with spontaneous supersymmetry breaking. Although in non-supersymmetric constructions these corrections are not protected and are typically model dependent, we show how a universal behaviour of threshold differences, typical of supersymmetric vacua, may still persist. We formulate specific conditions on the way supersymmetry should be broken for this to occur. Our analysis implies a generalised notion of threshold universality even in the case of unbroken supersymmetry, whenever extra charged massless states appear at enhancement points in the bulk of moduli space. Several examples with universality, including non-supersymmetric chiral models in four dimensions, are presented.

  10. Generalised pustular psoriasis, psoriatic arthritis and nephrotic syndrome associated with systemic amyloidosis.

    Science.gov (United States)

    David, M; Abraham, D; Weinberger, A; Feuerman, E J

    1982-09-01

    The case report is presented of a psoriatic patient with arthropathy, generalised pustular psoriasis and nephrotic syndrome, in whom systemic amyloidosis developed. The literature reports 13 cases of psoriasis associated with amyloidosis, 3 of whom suffered from pustular psoriasis as does our case. With the addition of our case, 12 of these 14 had concomitant arthropathy. This seems to suggest that arthritis is an important factor in the appearance of amyloidosis. Rectal biopsy and/or renal biopsy may be helpful in establishing the diagnosis of amyloidosis relatively early in patients with psoriatic arthritis.

  11. Multiple periodic-soliton solutions of the (3+1)-dimensional generalised shallow water equation

    Science.gov (United States)

    Li, Ye-Zhou; Liu, Jian-Guo

    2018-06-01

    Based on the extended variable-coefficient homogeneous balance method and two new ansätz functions, we construct auto-Bäcklund transformation and multiple periodic-soliton solutions of (3 {+} 1)-dimensional generalised shallow water equations. Completely new periodic-soliton solutions including periodic cross-kink wave, periodic two-solitary wave and breather type of two-solitary wave are obtained. In addition, cross-kink three-soliton and cross-kink four-soliton solutions are derived. Furthermore, propagation characteristics and interactions of the obtained solutions are discussed and illustrated in figures.

  12. H∞ state estimation of generalised neural networks with interval time-varying delays

    Science.gov (United States)

    Saravanakumar, R.; Syed Ali, M.; Cao, Jinde; Huang, He

    2016-12-01

    This paper focuses on studying the H∞ state estimation of generalised neural networks with interval time-varying delays. The integral terms in the time derivative of the Lyapunov-Krasovskii functional are handled by Jensen's inequality, the reciprocally convex combination approach and a new Wirtinger-based double integral inequality. A delay-dependent criterion is derived under which the estimation error system is globally asymptotically stable with H∞ performance. The proposed conditions are represented by linear matrix inequalities. Optimal H∞ norm bounds are obtained easily by solving convex problems in terms of linear matrix inequalities. The advantage of employing the proposed inequalities is illustrated by numerical examples.

  13. Generalisation of the test theory of special relativity to non-inertial frames

    International Nuclear Information System (INIS)

    Abolghasem, G.H.; Khajehpour, M.R.H.; Mansouri, R.

    1989-01-01

    We present a generalised test theory of special relativity, using a non-inertial frame. Within the framework of the special theory of relativity the transport and Einstein synchronisations are equivalent on a rigidly rotating disc. But in any theory with a preferred frame, such an equivalence does not hold. The time difference resulting from the two synchronisation procedures is a measurable quantity within the reach of existing clock systems on the Earth. The final result contains a term which depends on the angular velocity of the rotating system, and hence measures an absolute effect. This term is of crucial importance in our test theory of special relativity. (Author)

  14. The impact of case specificity and generalisable skills on clinical performance: a correlated traits-correlated methods approach.

    Science.gov (United States)

    Wimmers, Paul F; Fung, Cha-Chi

    2008-06-01

    The finding of case or content specificity in medical problem solving moved the focus of research away from generalisable skills towards the importance of content knowledge. However, controversy about the content dependency of clinical performance and the generalisability of skills remains. This study aimed to explore the relative impact of both perspectives (case specificity and generalisable skills) on different components (history taking, physical examination, communication) of clinical performance within and across cases. Data from a clinical performance examination (CPX) taken by 350 Year 3 students were used in a correlated traits-correlated methods (CTCM) approach using confirmatory factor analysis, whereby 'traits' refers to generalisable skills and 'methods' to individual cases. The baseline CTCM model was analysed and compared with four nested models using structural equation modelling techniques. The CPX consisted of three skills components and five cases. Comparison of the four different models with the least-restricted baseline CTCM model revealed that a model with uncorrelated generalisable skills factors and correlated case-specific knowledge factors represented the data best. The generalisable processes found in history taking, physical examination and communication were responsible for half the explained variance, in comparison with the variance related to case specificity. Conclusions: Pure knowledge-based and pure skill-based perspectives on clinical performance both seem too one-dimensional, and new evidence supports the idea that a substantial amount of variance is attributable to both aspects of performance. It could be concluded that generalisable skills and specialised knowledge go hand in hand: both are essential aspects of clinical performance.

  15. Uncertainty in social dilemmas

    NARCIS (Netherlands)

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size

  16. Uncertainty and Climate Change

    OpenAIRE

    Berliner, L. Mark

    2003-01-01

    Anthropogenic, or human-induced, climate change is a critical issue in science and in the affairs of humankind. Though the target of substantial research, the conclusions of climate change studies remain subject to numerous uncertainties. This article presents a very brief review of the basic arguments regarding anthropogenic climate change with particular emphasis on uncertainty.

  17. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig
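
    The abstract does not spell out the deterministic approach, but the standard deterministic alternative to sampling is first-order sensitivity propagation: differentiate the response with respect to each uncertain parameter and combine the sensitivities with the parameter variances. The model and the numbers below are assumed purely for illustration.

```python
import math

def model(x):
    # Hypothetical response function standing in for an expensive code run
    k, s, t = x
    return k * math.exp(-s * t)

def finite_diff_grad(f, x, h=1e-6):
    """Central-difference gradient: two extra model runs per parameter."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

x0    = [2.0, 0.5, 3.0]   # nominal parameter values (assumed)
sigma = [0.1, 0.05, 0.2]  # parameter standard deviations (assumed independent)

g = finite_diff_grad(model, x0)
var = sum((gi * si) ** 2 for gi, si in zip(g, sigma))
print(f"response = {model(x0):.4f} +/- {math.sqrt(var):.4f} (first order)")
```

    A central difference costs two runs per parameter plus the nominal one, versus the hundreds of runs a conventional statistical analysis typically needs; that economy is the point the abstract makes.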

  18. Uncertainty and simulation

    International Nuclear Information System (INIS)

    Depres, B.; Dossantos-Uzarralde, P.

    2009-01-01

    More than 150 researchers and engineers from universities and industry met to discuss the new methodologies developed for assessing uncertainty. About 20 papers were presented, and the main topics were: methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances and multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers.

  19. Uncertainty modeling process for semantic technology

    Directory of Open Access Journals (Sweden)

    Rommel N. Carvalho

    2016-08-01

    The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature on formalisms for representing uncertainty in ontologies has been growing, there remains little guidance in the knowledge engineering literature on how to design probabilistic ontologies. To address this gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST), a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.

  20. Efficacy of oral afoxolaner for the treatment of canine generalised demodicosis

    Directory of Open Access Journals (Sweden)

    Beugnet Frédéric

    2016-01-01

    The efficacy of oral treatment with a chewable tablet containing afoxolaner 2.27% w/w (NexGard®, Merial) was assessed in eight dogs diagnosed with generalised demodicosis and compared with the efficacy of a topical combination of imidacloprid/moxidectin (Advocate®, Bayer) in eight dogs. Afoxolaner was administered at the recommended dose (at least 2.5 mg/kg) on Days 0, 14, 28 and 56. The topical combination of imidacloprid/moxidectin was given at the same intervals at the recommended concentration. Clinical examinations and deep skin scrapings were performed every month in order to evaluate the effect on mite numbers and the resolution of clinical signs. The percentage reductions of mite counts were 99.2%, 99.9% and 100% on Days 28, 56 and 84, respectively, in the afoxolaner-treated group, compared to 89.8%, 85.2% and 86.6% on Days 28, 56 and 84 in the imidacloprid/moxidectin-treated group. The skin condition of the dogs also improved significantly from Day 28 to Day 84 in the afoxolaner-treated group. Mite reductions were significantly higher on Days 28, 56 and 84 in the afoxolaner-treated group than in the imidacloprid/moxidectin-treated group. The results of this study demonstrated that afoxolaner, given orally, was effective in treating dogs with generalised demodicosis within a two-month period.

  1. Navigation towards a goal position: from reactive to generalised learned control

    Energy Technology Data Exchange (ETDEWEB)

    Freire da Silva, Valdinei [Laboratorio de Tecnicas Inteligentes - LTI, Escola Politecnica da Universidade de Sao Paulo, Av. Prof. Luciano Gualberto, trav.3, n.158, Cidade Universitaria Sao Paulo (Brazil); Selvatici, Antonio Henrique [Universidade Nove de Julho, Rua Vergueiro, 235, Sao Paulo (Brazil); Reali Costa, Anna Helena, E-mail: valdinei.freire@gmail.com, E-mail: antoniohps@uninove.br, E-mail: anna.reali@poli.usp.br [Laboratorio de Tecnicas Inteligentes - LTI, Escola Politecnica da Universidade de Sao Paulo, Av. Prof. Luciano Gualberto, trav.3, n.158, Cidade Universitaria Sao Paulo (Brazil)

    2011-03-01

    The task of navigating to a target position in space is a fairly common task for a mobile robot. It is desirable that this task is performed even in previously unknown environments. One reactive architecture explored before addresses this challenge by defining a hand-coded coordination of primitive behaviours, encoded by the Potential Fields method. Our first approach to improve the performance of this architecture adds a learning step to autonomously find the best way to coordinate primitive behaviours with respect to an arbitrary performance criterion. Because of the limitations presented by the Potential Fields method, especially in relation to non-convex obstacles, we are investigating the use of Relational Reinforcement Learning as a method to not only learn to act in the current environment, but also to generalise prior knowledge to the current environment in order to achieve the goal more quickly in a non-convex structured environment. We show the results of our previous efforts in reaching goal positions along with our current research on generalised approaches.
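
    The Potential Fields method mentioned above combines an attractive potential centred on the goal with repulsive potentials around obstacles, and moves the robot down the resulting gradient. A minimal sketch follows; the gains, geometry and step size are assumed values, not taken from the paper.

```python
import math

GOAL = (9.0, 9.0)
OBSTACLE = (5.0, 5.0)
R_INFL = 2.0   # radius inside which the obstacle repels the robot

def force(p):
    """Negative gradient of the attractive plus repulsive potentials at p."""
    fx = GOAL[0] - p[0]                    # attractive term pulls towards the goal
    fy = GOAL[1] - p[1]
    dx, dy = p[0] - OBSTACLE[0], p[1] - OBSTACLE[1]
    d = math.hypot(dx, dy)
    if 1e-9 < d < R_INFL:                  # repulsive term, active near the obstacle
        k = 4.0 * (1.0 / d - 1.0 / R_INFL) / d ** 2
        fx += k * dx / d
        fy += k * dy / d
    return fx, fy

p = (0.0, 1.0)                             # start position
for _ in range(500):                       # fixed-step gradient descent
    fx, fy = force(p)
    n = math.hypot(fx, fy) or 1.0
    p = (p[0] + 0.05 * fx / n, p[1] + 0.05 * fy / n)
    if math.hypot(GOAL[0] - p[0], GOAL[1] - p[1]) < 0.1:
        break

print(p)
```

    The weakness the authors point to is visible in such models: with non-convex obstacles the summed gradient can vanish in a local minimum short of the goal, which motivates replacing the hand-coded coordination with learned control.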

  2. Sleep onset uncovers thalamic abnormalities in patients with idiopathic generalised epilepsy

    Directory of Open Access Journals (Sweden)

    Andrew P. Bagshaw

    The thalamus is crucial for sleep regulation and for the pathophysiology of idiopathic generalised epilepsy (IGE), and may serve as the underlying basis for the links between the two. We investigated this using EEG-fMRI with a specific emphasis on the role and functional connectivity (FC) of the thalamus. We defined three types of thalamic FC: thalamocortical, inter-hemispheric thalamic, and intra-hemispheric thalamic. Patients and controls differed in all three measures, during both wakefulness and sleep, indicating disorder-dependent and state-dependent modification of thalamic FC. Inter-hemispheric thalamic FC differed between patients and controls in somatosensory regions during wakefulness, and in occipital regions during sleep. Intra-hemispheric thalamic FC was significantly higher in patients than controls following sleep onset, and disorder-dependent alterations to FC were seen in several thalamic regions, always involving somatomotor and occipital regions. As interactions between thalamic sub-regions are indirect and mediated by the inhibitory thalamic reticular nucleus (TRN), the results suggest abnormal TRN function in patients with IGE, with a regional distribution which could suggest a link with the thalamocortical networks involved in the generation of alpha rhythms. Intra-thalamic FC could be a more widely applicable marker beyond patients with IGE. Keywords: Functional connectivity, Generalised epilepsy, Sleep, Thalamic reticular nucleus, Thalamus

  3. Lepton mixing predictions including Majorana phases from Δ(6n2) flavour symmetry and generalised CP

    Directory of Open Access Journals (Sweden)

    Stephen F. King

    2014-09-01

    Generalised CP transformations are the only known framework which allows one to predict Majorana phases in a flavour model purely from symmetry. For the first time, generalised CP transformations are investigated for an infinite series of finite groups, Δ(6n2)=(Zn×Zn)⋊S3. In direct models the mixing angles and Dirac CP phase are solely predicted from symmetry. The Δ(6n2) flavour symmetry provides many examples of viable predictions for mixing angles. For all groups the mixing matrix has a trimaximal middle column and the Dirac CP phase is 0 or π. The Majorana phases are predicted from residual flavour and CP symmetries, where α21 can take several discrete values for each n and the Majorana phase α31 is a multiple of π. We discuss constraints on the groups and CP transformations from measurements of the neutrino mixing angles and from neutrinoless double-beta decay, and find that predictions for mixing angles and all phases are accessible to experiments in the near future.

  4. Lepton mixing predictions including Majorana phases from Δ(6n2) flavour symmetry and generalised CP

    International Nuclear Information System (INIS)

    King, Stephen F.; Neder, Thomas

    2014-01-01

    Generalised CP transformations are the only known framework which allows one to predict Majorana phases in a flavour model purely from symmetry. For the first time, generalised CP transformations are investigated for an infinite series of finite groups, Δ(6n2)=(Zn×Zn)⋊S3. In direct models the mixing angles and Dirac CP phase are solely predicted from symmetry. The Δ(6n2) flavour symmetry provides many examples of viable predictions for mixing angles. For all groups the mixing matrix has a trimaximal middle column and the Dirac CP phase is 0 or π. The Majorana phases are predicted from residual flavour and CP symmetries, where α21 can take several discrete values for each n and the Majorana phase α31 is a multiple of π. We discuss constraints on the groups and CP transformations from measurements of the neutrino mixing angles and from neutrinoless double-beta decay, and find that predictions for mixing angles and all phases are accessible to experiments in the near future.

  5. [A magnetoencephalographic study of generalised developmental disorders. A new proposal for their classification].

    Science.gov (United States)

    Muñoz Yunta, J A; Palau Baduell, M; Salvado Salvado, B; Amo, C; Fernandez Lucas, A; Maestu, F; Ortiz, T

    2004-02-01

    Autistic spectrum disorders (ASD) is a term that is not included in DSM-IV or in ICD-10, the diagnostic tools most commonly used by clinical professionals, which can pose problems in research when it comes to finding homogeneous groups. From a neuropaediatric point of view, there is a need for a classification of the generalised disorders affecting development, and for this purpose we used Wing's triad, which defines the continuum of the autistic spectrum, and the information provided by magnetoencephalography (MEG) as grouping elements. Specific generalised developmental disorders were taken to be those syndromes that partially express some autistic trait but have a character of their own, so that they can be considered a specific disorder. ASD were classified as primary, cryptogenic or secondary. The primary disorders, in turn, express a continuum that ranges from Savant syndrome to Asperger's syndrome and the different degrees of early infantile autism. MEG is a functional neuroimaging technique that has enabled us to back up this classification.

  6. Cortical feedback signals generalise across different spatial frequencies of feedforward inputs.

    Science.gov (United States)

    Revina, Yulia; Petro, Lucy S; Muckli, Lars

    2017-09-22

    Visual processing in cortex relies on feedback projections contextualising feedforward information flow. Primary visual cortex (V1) has small receptive fields and processes feedforward information at a fine-grained spatial scale, whereas higher visual areas have larger, spatially invariant receptive fields. Therefore, feedback could provide coarse information about the global scene structure or alternatively recover fine-grained structure by targeting small receptive fields in V1. We tested if feedback signals generalise across different spatial frequencies of feedforward inputs, or if they are tuned to the spatial scale of the visual scene. Using a partial occlusion paradigm, functional magnetic resonance imaging (fMRI) and multivoxel pattern analysis (MVPA) we investigated whether feedback to V1 contains coarse or fine-grained information by manipulating the spatial frequency of the scene surround outside an occluded image portion. We show that feedback transmits both coarse and fine-grained information as it carries information about both low (LSF) and high spatial frequencies (HSF). Further, feedback signals containing LSF information are similar to feedback signals containing HSF information, even without a large overlap in spatial frequency bands of the HSF and LSF scenes. Lastly, we found that feedback carries similar information about the spatial frequency band across different scenes. We conclude that cortical feedback signals contain information which generalises across different spatial frequencies of feedforward inputs. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Navigation towards a goal position: from reactive to generalised learned control

    International Nuclear Information System (INIS)

    Freire da Silva, Valdinei; Selvatici, Antonio Henrique; Reali Costa, Anna Helena

    2011-01-01

    The task of navigating to a target position in space is a fairly common task for a mobile robot. It is desirable that this task is performed even in previously unknown environments. One reactive architecture explored before addresses this challenge by defining a hand-coded coordination of primitive behaviours, encoded by the Potential Fields method. Our first approach to improve the performance of this architecture adds a learning step to autonomously find the best way to coordinate primitive behaviours with respect to an arbitrary performance criterion. Because of the limitations presented by the Potential Fields method, especially in relation to non-convex obstacles, we are investigating the use of Relational Reinforcement Learning as a method to not only learn to act in the current environment, but also to generalise prior knowledge to the current environment in order to achieve the goal more quickly in a non-convex structured environment. We show the results of our previous efforts in reaching goal positions along with our current research on generalised approaches.

  8. Beyond the relativistic point particle: A reciprocally invariant system and its generalisation

    International Nuclear Information System (INIS)

    Pavsic, Matej

    2009-01-01

    We investigate a reciprocally invariant system proposed by Low and Govaerts et al., whose action contains both the orthogonal and the symplectic forms and is invariant under global O(2,4) ∩ Sp(2,4) transformations. We find that the general solution to the classical equations of motion has no linear term in the evolution parameter, τ, but only the oscillatory terms, and therefore cannot represent a particle propagating in spacetime. As a remedy, we consider a generalisation of the action by adopting a procedure similar to that of Bars et al., who introduced the concept of a τ derivative that is covariant under local Sp(2) transformations between the phase space variables xμ(τ) and pμ(τ). This system, in particular, is similar to a rigid particle whose action contains the extrinsic curvature of the world line, which turns out to be helical in spacetime. Another possible generalisation is the introduction of a symplectic potential proposed by Montesinos. We show how the latter approach is related to Kaluza-Klein theories and to the concept of Clifford space, a manifold whose tangent space at any point is the Clifford algebra Cl(8), a promising framework for the unification of particles and forces.

  9. The use of generalised audit software by internal audit functions in a developing country: The purpose of the use of generalised audit software as a data analytics tool

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2017-11-01

    This article explores the purpose of the use of generalised audit software (GAS) as a data analytics tool by internal audit functions in the locally controlled banking industry of South Africa. The evolution of the traditional internal audit methodology of collecting audit evidence through the conduct of interviews, the completion of questionnaires, and the testing of controls on a sample basis is long overdue, and such practice in the present technological, data-driven era will soon render an internal audit function obsolete. The research results indicate that respondents are utilising GAS for a variety of purposes, but that its frequency of use is not yet optimal and that there is still much room for improvement for tests-of-controls purposes. The top five purposes for which the respondents make use of GAS often to always during separate internal audit engagements are: (1) to identify transactions with specific characteristics or control criteria for tests of control purposes; (2) to conduct full population analysis; (3) to identify account balances over a certain amount; (4) to identify and report on the frequency of occurrence of risks or of specific events; and (5) to obtain audit evidence about control effectiveness.

  10. Physical Uncertainty Bounds (PUB)

    Energy Technology Data Exchange (ETDEWEB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  11. Image restoration, uncertainty, and information.

    Science.gov (United States)

    Yu, F T

    1969-01-01

    Some physical interpretations of image restoration are discussed. From the theory of information, the unrealizability of an inverse filter can be explained by degradation of information, which is due to distortion of the recorded image. Image restoration is a time-and-space problem, which can be recognized from the theory of relativity (the problem of image restoration is related to Heisenberg's uncertainty principle in quantum mechanics). A detailed discussion of the relationship between information and energy is given. Two general results may be stated: (1) the restoration of the image from the distorted signal is possible only if it satisfies the detectability condition; however, the restored image can, at best, only approach the maximum allowable time criterion. (2) The restoration of an image by superimposing the distorted signal (due to smearing) is a physically unrealizable method; however, this restoration procedure may be achieved by the expenditure of an infinite amount of energy.

  12. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2012-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... to supplement Optimism Bias and the associated Reference Class Forecasting (RCF) technique with a new technique that makes use of a scenario-grid. We tentatively introduce and refer to this as Reference Scenario Forecasting (RSF). The final RSF output from the CBA-DK model consists of a set of scenario......-based graphs which functions as risk-related decision support for the appraised transport infrastructure project. The presentation of RSF is demonstrated by using an appraisal case concerning a new airfield in the capital of Greenland, Nuuk....

  13. An information-theoretic basis for uncertainty analysis: application to the QUASAR severe accident study

    International Nuclear Information System (INIS)

    Unwin, S.D.; Cazzoli, E.G.; Davis, R.E.; Khatib-Rahbar, M.; Lee, M.; Nourbakhsh, H.; Park, C.K.; Schmidt, E.

    1989-01-01

    The probabilistic characterization of uncertainty can be problematic in circumstances where there is a paucity of supporting data and limited experience on which to base engineering judgement. Information theory provides a framework in which to address this issue through reliance upon entropy-related principles of uncertainty maximization. We describe an application of such principles in the United States Nuclear Regulatory Commission-sponsored program QUASAR (Quantification and Uncertainty Analysis of Source Terms for Severe Accidents in Light Water Reactors). (author)
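
    The entropy-maximisation principle mentioned above has a compact computational form: among all distributions consistent with the available information (here, a single known mean over a set of discrete states, both assumed for the example), the maximum-entropy choice is the Gibbs/exponential-family member, whose multiplier can be found by bisection.

```python
import math

STATES = [0, 1, 2, 3, 4, 5]   # assumed discrete outcomes
TARGET_MEAN = 1.2             # the only information available

def gibbs(lam):
    """Exponential-family candidate p_i proportional to exp(-lam * x_i)."""
    w = [math.exp(-lam * x) for x in STATES]
    z = sum(w)
    return [wi / z for wi in w]

def mean(p):
    return sum(x * pi for x, pi in zip(STATES, p))

# The entropy-maximising distribution with a fixed mean is the Gibbs member;
# its multiplier is found by bisection (the mean is monotone decreasing in lam).
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean(gibbs(mid)) > TARGET_MEAN:
        lo = mid
    else:
        hi = mid

p = gibbs(0.5 * (lo + hi))
entropy = -sum(pi * math.log(pi) for pi in p)
print([round(pi, 4) for pi in p], round(entropy, 4))
```

    Any other distribution with the same mean has strictly lower entropy, so this is the least-committal characterisation given only that constraint, which is the sense in which such principles supplement sparse data and engineering judgement.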

  14. Propagation of nonlinear shock waves for the generalised Oskolkov equation and its dynamic motions in the presence of an external periodic perturbation

    Science.gov (United States)

    Ak, Turgut; Aydemir, Tugba; Saha, Asit; Kara, Abdul Hamid

    2018-06-01

    Propagation of nonlinear shock waves for the generalised Oskolkov equation and the dynamic motions of the perturbed Oskolkov equation are investigated. Employing the unified method, a collection of exact shock wave solutions for the generalised Oskolkov equation is presented. The collocation finite element method is applied to the generalised Oskolkov equation to check the accuracy of the proposed method on two test problems, including the motion of a shock wave and the evolution of waves with Gaussian and undular bore initial conditions. Considering an external periodic perturbation, the dynamic motions of the perturbed generalised Oskolkov equation are studied as functions of the system parameters with the help of phase portraits and time series plots. The perturbed generalised Oskolkov equation exhibits period-3, quasiperiodic and chaotic motions for some special values of the system parameters, whereas the generalised Oskolkov equation presents shock waves in the absence of external periodic perturbation.

  15. Modeling of uncertainties in statistical inverse problems

    International Nuclear Information System (INIS)

    Kaipio, Jari

    2008-01-01

    In all real world problems, the models that tie the measurements to the unknowns of interest, are at best only approximations for reality. While moderate modeling and approximation errors can be tolerated with stable problems, inverse problems are a notorious exception. Typical modeling errors include inaccurate geometry, unknown boundary and initial data, properties of noise and other disturbances, and simply the numerical approximations of the physical models. In principle, the Bayesian approach to inverse problems, in which all uncertainties are modeled as random variables, is capable of handling these uncertainties. Depending on the type of uncertainties, however, different strategies may be adopted. In this paper we give an overview of typical modeling errors and related strategies within the Bayesian framework.

  16. Analysis of uncertainties of thermal hydraulic calculations

    International Nuclear Information System (INIS)

    Macek, J.; Vavrin, J.

    2002-12-01

    In 1993-1997 it was proposed, within OECD projects, that a common program should be set up for uncertainty analysis by a probabilistic method based on a non-parametric statistical approach for system computer codes such as RELAP, ATHLET and CATHARE and that a method should be developed for statistical analysis of experimental databases for the preparation of the input deck and statistical analysis of the output calculation results. Software for such statistical analyses would then have to be processed as individual tools independent of the computer codes used for the thermal hydraulic analysis and programs for uncertainty analysis. In this context, a method for estimation of a thermal hydraulic calculation is outlined and selected methods of statistical analysis of uncertainties are described, including methods for prediction accuracy assessment based on the discrete Fourier transformation principle. (author)
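The non-parametric statistical approach referred to here is, in practice, commonly implemented via Wilks' tolerance-limit formula (our assumption; the abstract does not name it): the smallest number of code runs n such that the maximum observed output bounds the p-quantile of the true output distribution with confidence β satisfies 1 − p**n ≥ β.

```python
def wilks_runs(coverage, confidence):
    """Smallest n with 1 - coverage**n >= confidence
    (first-order, one-sided Wilks tolerance limit)."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

n95 = wilks_runs(0.95, 0.95)   # the classic 95%/95% criterion
```

This yields the well-known result that 59 runs suffice for a one-sided 95%/95% statement regardless of the output distribution.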

  17. Uncertainty Propagation in OMFIT

    Science.gov (United States)

    Smith, Sterling; Meneghini, Orso; Sung, Choongki

    2017-10-01

    A rigorous comparison of power balance fluxes and turbulent model fluxes requires the propagation of uncertainties in the kinetic profiles and their derivatives. Making extensive use of the python uncertainties package, the OMFIT framework has been used to propagate covariant uncertainties to provide an uncertainty in the power balance calculation from the ONETWO code, as well as through the turbulent fluxes calculated by the TGLF code. The covariant uncertainties arise from fitting 1D (constant on flux surface) density and temperature profiles and associated random errors with parameterized functions such as a modified tanh. The power balance and model fluxes can then be compared with quantification of the uncertainties. No effort is made at propagating systematic errors. A case study will be shown for the effects of resonant magnetic perturbations on the kinetic profiles and fluxes at the top of the pedestal. A separate attempt at modeling the random errors with Monte Carlo sampling will be compared to the method of propagating the fitting function parameter covariant uncertainties. Work supported by US DOE under DE-FC02-04ER54698, DE-FG2-95ER-54309, DE-SC 0012656.
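The python uncertainties package used by OMFIT performs linear (delta-method) propagation of covariant parameter uncertainties. The sketch below reproduces that idea by hand for an assumed modified-tanh profile shape with an illustrative parameter covariance; it is not OMFIT's actual code.

```python
import numpy as np

def mtanh(params, x):
    """Illustrative modified-tanh pedestal profile (assumed form)."""
    height, width, center = params
    return 0.5 * height * (1.0 - np.tanh((x - center) / width))

def propagate(f, params, cov, x, eps=1e-6):
    """Delta method: sigma_f^2 = g^T C g, with g the gradient of f with
    respect to the fit parameters (estimated by central differences)."""
    p = np.asarray(params, dtype=float)
    grads = []
    for i in range(len(p)):
        dp = np.zeros_like(p)
        dp[i] = eps
        grads.append((f(p + dp, x) - f(p - dp, x)) / (2 * eps))
    g = np.array(grads)                      # shape (n_params, n_x)
    var = np.einsum('ix,ij,jx->x', g, cov, g)
    return f(p, x), np.sqrt(var)

params = [1.0, 0.1, 0.9]                         # height, width, center (assumed)
cov = np.diag([0.01**2, 0.005**2, 0.002**2])     # illustrative covariance
x = np.linspace(0.8, 1.0, 5)
val, sig = propagate(mtanh, params, cov, x)
```

Off-diagonal covariance terms from the fit enter through the same g^T C g contraction, which is why correlated fit parameters must be propagated jointly rather than term by term.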

  18. The gauge principle vs. the equivalence principle

    International Nuclear Information System (INIS)

    Gates, S.J. Jr.

    1984-01-01

    Within the context of field theory, it is argued that the role of the equivalence principle may be replaced by the principle of gauge invariance to provide a logical framework for theories of gravitation

  19. Equivalence principles and electromagnetism

    Science.gov (United States)

    Ni, W.-T.

    1977-01-01

    The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.

  20. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    , and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing...... the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between...

  1. Principle of accelerator mass spectrometry

    International Nuclear Information System (INIS)

    Matsuzaki, Hiroyuki

    2007-01-01

The principle of accelerator mass spectrometry (AMS) is described mainly from the technical side: hardware construction of AMS, measurement of isotope ratios, sensitivity of measurement (detection limit), measurement accuracy, and application of the data. The content may be summarized as follows: a rare isotope (often a long-lived radioactive isotope) can be detected by various uses of the ion energy gained in accelerating the ions; the measurable quantity is the ratio of a rare isotope to an abundant one; and a measured isotope ratio carries uncertainty with respect to the true value. These facts must be kept in mind when applying AMS data in research. (M.H.)

  2. Semantic 3d City Model to Raster Generalisation for Water Run-Off Modelling

    Science.gov (United States)

    Verbree, E.; de Vries, M.; Gorte, B.; Oude Elberink, S.; Karimlou, G.

    2013-09-01

    Water run-off modelling applied within urban areas requires an appropriate detailed surface model represented by a raster height grid. Accurate simulations at this scale level have to take into account small but important water barriers and flow channels given by the large-scale map definitions of buildings, street infrastructure, and other terrain objects. Thus, these 3D features have to be rasterised such that each cell represents the height of the object class as good as possible given the cell size limitations. Small grid cells will result in realistic run-off modelling but with unacceptable computation times; larger grid cells with averaged height values will result in less realistic run-off modelling but fast computation times. This paper introduces a height grid generalisation approach in which the surface characteristics that most influence the water run-off flow are preserved. The first step is to create a detailed surface model (1:1.000), combining high-density laser data with a detailed topographic base map. The topographic map objects are triangulated to a set of TIN-objects by taking into account the semantics of the different map object classes. These TIN objects are then rasterised to two grids with a 0.5m cell-spacing: one grid for the object class labels and the other for the TIN-interpolated height values. The next step is to generalise both raster grids to a lower resolution using a procedure that considers the class label of each cell and that of its neighbours. The results of this approach are tested and validated by water run-off model runs for different cellspaced height grids at a pilot area in Amersfoort (the Netherlands). Two national datasets were used in this study: the large scale Topographic Base map (BGT, map scale 1:1.000), and the National height model of the Netherlands AHN2 (10 points per square meter on average). 
Comparison between the original AHN2 height grid and the semantically enriched and then generalised height grids shows
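The label-aware generalisation step described above can be sketched as follows. The aggregation rules (majority class label per coarse cell, maximum height for barrier-like building cells, mean height otherwise) and the class codes are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

BUILDING = 1   # assumed class codes, for illustration only
TERRAIN = 0

def generalise(labels, heights, factor):
    """Coarsen a fine label grid and height grid by `factor`:
    each coarse cell takes the majority class of its block, and its height
    is aggregated only over fine cells of that class (max for buildings,
    mean otherwise), preserving barrier heights."""
    n = labels.shape[0] // factor
    out_lab = np.zeros((n, n), dtype=int)
    out_h = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            lab = labels[i*factor:(i+1)*factor, j*factor:(j+1)*factor]
            hgt = heights[i*factor:(i+1)*factor, j*factor:(j+1)*factor]
            cls = np.bincount(lab.ravel()).argmax()   # majority class
            vals = hgt[lab == cls]
            out_lab[i, j] = cls
            out_h[i, j] = vals.max() if cls == BUILDING else vals.mean()
    return out_lab, out_h
```

Taking the maximum over building cells is one way to keep a small water barrier from being averaged away when the cell size grows.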

  3. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
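The replicated Latin hypercube sampling mentioned above rests on a simple stratification: each input dimension is divided into n equal-probability bins, with exactly one sample point per bin. A minimal sketch (not McKay's code; replication means repeating this with fresh permutations):

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """One Latin hypercube sample of n points in d dimensions on [0, 1)^d:
    each dimension is stratified into n bins, one point per bin, with the
    bin order randomised independently per dimension."""
    u = rng.random((n, d))                               # jitter within bins
    bins = np.array([rng.permutation(n) for _ in range(d)]).T
    return (bins + u) / n

rng = np.random.default_rng(0)
sample = latin_hypercube(10, 3, rng)
```

Marginal stratification is what makes LHS estimates of output distributions more stable than plain Monte Carlo at the same sample size.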

  4. Uncertainty in oil projects

    International Nuclear Information System (INIS)

    Limperopoulos, G.J.

    1995-01-01

This report presents an oil project valuation under uncertainty by means of two well-known financial techniques: the Capital Asset Pricing Model (CAPM) and the Black-Scholes option pricing formula. CAPM gives a linear positive relationship between expected rate of return and risk, but does not take into account flexibility, which is crucial for an irreversible investment such as an oil project. Introducing investment decision flexibility by using real options can increase the oil project value substantially. Some simple tests of the importance of stock market uncertainty for oil investments are performed. Uncertainty in stock returns is correlated with aggregate product market uncertainty according to Pindyck (1991). The results of the tests are not satisfactory due to the short data series, but introducing two further explanatory variables, the interest rate and Gross Domestic Product, improves the situation. 36 refs., 18 figs., 6 tabs
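The Black-Scholes formula for a European call, C = S·N(d1) − K·e^(−rT)·N(d2), used here to value the real option, can be computed directly; the project numbers below are illustrative, not taken from the report.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    """Black-Scholes value of a European call option:
    C = S*N(d1) - K*exp(-r*T)*N(d2)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Treat an undeveloped reserve as a call on the developed project value
# (illustrative inputs: project value 100, development cost 100,
# risk-free rate 5%, volatility 20%, one year to decide)
option_value = black_scholes_call(S=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0)
```

Note that the option value increases with volatility, which is exactly the flexibility effect CAPM misses: more uncertainty makes the right (but not obligation) to invest more valuable.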

  5. Uncertainties and climatic change

    International Nuclear Information System (INIS)

    De Gier, A.M.; Opschoor, J.B.; Van de Donk, W.B.H.J.; Hooimeijer, P.; Jepma, J.; Lelieveld, J.; Oerlemans, J.; Petersen, A.

    2008-01-01

Which processes in the climate system are misunderstood? How are scientists dealing with uncertainty about climate change? What will be done with the conclusions of the recently published synthesis report of the IPCC? These and other questions were answered during the meeting 'Uncertainties and climate change' that was held on Monday 26 November 2007 at the KNAW in Amsterdam. This report is a compilation of all the presentations and provides some conclusions resulting from the discussions during this meeting.

  6. Mechanics and uncertainty

    CERN Document Server

    Lemaire, Maurice

    2014-01-01

    Science is a quest for certainty, but lack of certainty is the driving force behind all of its endeavors. This book, specifically, examines the uncertainty of technological and industrial science. Uncertainty and Mechanics studies the concepts of mechanical design in an uncertain setting and explains engineering techniques for inventing cost-effective products. Though it references practical applications, this is a book about ideas and potential advances in mechanical science.

  7. Uncertainty: lotteries and risk

    OpenAIRE

    Ávalos, Eloy

    2011-01-01

    In this paper we develop the theory of uncertainty in a context where the risks assumed by the individual are measurable and manageable. We primarily use the definition of lottery to formulate the axioms of the individual's preferences, and its representation through the utility function von Neumann - Morgenstern. We study the expected utility theorem and its properties, the paradoxes of choice under uncertainty and finally the measures of risk aversion with monetary lotteries.
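The von Neumann-Morgenstern machinery described here is easy to make concrete: the sketch below computes the expected utility, certainty equivalent, and risk premium of a monetary lottery under log utility (all numbers illustrative).

```python
import math

def expected_utility(lottery, u):
    """lottery: list of (probability, payoff) pairs."""
    return sum(p * u(x) for p, x in lottery)

def certainty_equivalent(lottery, u, u_inv):
    """The sure payoff the agent finds indifferent to the lottery."""
    return u_inv(expected_utility(lottery, u))

# 50/50 lottery over 100 and 25, evaluated with log utility (risk-averse)
lottery = [(0.5, 100.0), (0.5, 25.0)]
ce = certainty_equivalent(lottery, math.log, math.exp)     # geometric mean: 50
ev = sum(p * x for p, x in lottery)                        # expected value: 62.5
risk_premium = ev - ce
```

A positive risk premium (here 12.5) is the signature of risk aversion: the concavity of u pulls the certainty equivalent below the expected value, while a linear u makes them coincide.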

  8. Uncertainty calculations made easier

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-07-01

The results are presented of a neutron cross section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy integrated flux at the beginning of the cryostat in the no-gap geometry, compared to an uncertainty of only 5% in the gap geometry. Therefore, it is essential to take the exact geometry into account in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)

  9. The Langevin and generalised Langevin approach to the dynamics of atomic, polymeric and colloidal systems

    CERN Document Server

    Snook, Ian

    2007-01-01

    The Langevin and Generalised Langevin Approach To The Dynamics Of Atomic, Polymeric And Colloidal Systems is concerned with the description of aspects of the theory and use of so-called random processes to describe the properties of atomic, polymeric and colloidal systems in terms of the dynamics of the particles in the system. It provides derivations of the basic equations, the development of numerical schemes to solve them on computers and gives illustrations of application to typical systems.Extensive appendices are given to enable the reader to carry out computations to illustrate many of the points made in the main body of the book.* Starts from fundamental equations* Gives up-to-date illustration of the application of these techniques to typical systems of interest* Contains extensive appendices including derivations, equations to be used in practice and elementary computer codes
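As a minimal illustration of the numerical schemes the book develops, the sketch below integrates a Langevin velocity equation (an Ornstein-Uhlenbeck process), dv = −γv dt + √(2γkT/m) dW, with the Euler-Maruyama method; the parameters are assumed for illustration. The stationary velocity variance should approach kT/m, the equipartition value.

```python
import numpy as np

def langevin_velocity(gamma, kT_over_m, dt, n_steps, rng):
    """Euler-Maruyama integration of dv = -gamma*v dt + sqrt(2*gamma*kT/m) dW:
    drift step plus a Gaussian kick of variance 2*gamma*(kT/m)*dt per step."""
    amp = np.sqrt(2.0 * gamma * kT_over_m * dt)
    v = np.empty(n_steps)
    v[0] = 0.0
    xi = rng.standard_normal(n_steps - 1)
    for i in range(n_steps - 1):
        v[i + 1] = v[i] - gamma * v[i] * dt + amp * xi[i]
    return v

rng = np.random.default_rng(42)
v = langevin_velocity(gamma=1.0, kT_over_m=1.0, dt=0.01, n_steps=200_000, rng=rng)
```

The balance between dissipation (the drift term) and noise (the kick) is the fluctuation-dissipation relation in its simplest discrete form.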

  10. The two ∇6R4 type invariants and their higher order generalisation

    International Nuclear Information System (INIS)

    Bossard, Guillaume; Verschinin, Valentin

    2015-01-01

We show that there are two distinct classes of ∇6R4-type supersymmetry invariants in maximal supergravity. The second class includes a coupling in F2∇4R4 that generalises to 1/8-BPS-protected F2k∇4R4 couplings. We work out the supersymmetry constraints on the corresponding threshold functions, and argue that the functions in the second class satisfy homogeneous differential equations for arbitrary k≥1, such that the corresponding exact threshold functions in type II string theory should be proportional to Eisenstein series, which we identify. This analysis explains in particular that the exact ∇6R4 threshold function is the sum of an Eisenstein function and a solution to an inhomogeneous Poisson equation in string theory.

  11. Generalised chronic musculoskeletal pain as a rational reaction to a life situation?

    Science.gov (United States)

    Steen, E; Haugli, L

    2000-11-01

    While the biomedical model is still the leading paradigm within modern medicine and health care, and people with generalised chronic musculoskeletal pain are frequent users of health care services, their diagnoses are rated as having the lowest prestige among health care personnel. An epistemological framework for understanding relations between body, emotions, mind and meaning is presented. An approach based on a phenomenological epistemology is discussed as a supplement to actions based on the biomedical model. Within the phenomenological frame of understanding, the body is viewed as a subject and carrier of meaning, and therefore chronic pain can be interpreted as a rational reaction to the totality of a person's life situation. Search for possible hidden individual meanings in painful muscles presupposes meeting health personnel who view the person within a holistic frame of reference.

  12. Radiation pneumonitis: generalised lung changes detected by radionuclide imaging following focal lung irradiation

    International Nuclear Information System (INIS)

    Ball, D.; Sephton, R.; Irving, L.; Crennan, E.

    1992-01-01

The usefulness of a nuclear imaging technique as a means of detecting radiation-induced lung injury is examined. The technique involves the patient inhaling modified Technegas™, a gas-like radiotracer which is an ultrafine particulate dispersion. This crosses the alveolar-capillary membrane, and the clearance rate of the tracer from the lungs is presumed to reflect membrane permeability. A case is reported of a patient who, after receiving localised radiotherapy and chemotherapy for lung cancer, developed symptoms and signs of radiation pneumonitis. Pre- and post-radiotherapy investigations using the nuclear technique showed acceleration of rates of tracer clearance from both lungs, consistent with generalised changes in alveolar-capillary membrane permeability. It is suggested that the symptoms of radiation pneumonitis may in part result from pathophysiologic changes in nonirradiated lung which may appear radiologically normal. 4 refs., 2 figs

  13. Generalised tetanus in a 2-week-old foal: use of physiotherapy to aid recovery.

    Science.gov (United States)

    Mykkänen, A K; Hyytiäinen, H K; McGowan, C M

    2011-11-01

    A 2-week-old Estonian Draft foal presented with signs of severe generalised tetanus, recumbency and inability to drink. The suspected source of infection was the umbilicus. Medical treatment was administered, including tetanus antitoxin, antimicrobial therapy and phenobarbital to control tetanic spasms. In addition, an intensive physiotherapy program was carried out during the recovery period. Techniques designed for syndromes involving upper motor neuron spasticity in humans were applied. Exercises aimed at weight-bearing and mobility were executed with the help of a walking-frame. The foal made a complete recovery. To our knowledge, this is the first report of the use of physiotherapy in the treatment of tetanus in horses. © 2011 The Authors. Australian Veterinary Journal © 2011 Australian Veterinary Association.

  14. Decomposition of almost-Poisson structure of generalised Chaplygin's nonholonomic systems

    International Nuclear Information System (INIS)

    Chang, Liu; Peng, Chang; Shi-Xing, Liu; Yong-Xin, Guo

    2010-01-01

    This paper constructs an almost-Poisson structure for the non-self-adjoint dynamical systems, which can be decomposed into a sum of a Poisson bracket and the other almost-Poisson bracket. The necessary and sufficient condition for the decomposition of the almost-Poisson bracket to be two Poisson ones is obtained. As an application, the almost-Poisson structure for generalised Chaplygin's systems is discussed in the framework of the decomposition theory. It proves that the almost-Poisson bracket for the systems can be decomposed into the sum of a canonical Poisson bracket and another two noncanonical Poisson brackets in some special cases, which is useful for integrating the equations of motion

  15. QCD amplitudes with 2 initial spacelike legs via generalised BCFW recursion

    Energy Technology Data Exchange (ETDEWEB)

    Kutak, Krzysztof; Hameren, Andreas van; Serino, Mirko [The H. Niewodniczański Institute of Nuclear Physics, Polish Academy of Sciences, ul. Radzikowskiego 152, 31-342, Cracow (Poland)

    2017-02-02

    We complete the generalisation of the BCFW recursion relation to the off-shell case, allowing for the computation of tree level scattering amplitudes for full High Energy Factorisation (HEF), i.e. with both incoming partons having a non-vanishing transverse momentum. We provide explicit results for color-ordered amplitudes with two off-shell legs in massless QCD up to 4 point, continuing the program begun in two previous papers. For the 4-fermion amplitudes, which are not BCFW-recursible, we perform a diagrammatic computation, so as to offer a complete set of expressions. We explicitly show and discuss some plots of the squared 2→2 matrix elements as functions of the differences in rapidity and azimuthal angle of the final state particles.

  16. Study and development of a generalised input-output system for data base management systems

    International Nuclear Information System (INIS)

    Zidi, Noureddine

    1975-01-01

This thesis reports a study which aimed at designing and developing software for the management and execution of all input-output actions of data base management systems. This software is also an interface between data base management systems and the various operating systems. After a recall of the general characteristics of database management systems, the author presents the previously developed GRISBI system (rational management of information stored in an integrated database), and describes difficulties faced in adapting this system to the new access method (VSAM, virtual sequential access method). This led to the search for a more general solution, the development of which is presented in the second part of this thesis: environment of the generalised input-output system, architecture, internal specifications. The last part presents flowcharts and statements of the various routines.

  17. Geometric Generalisation of Surrogate Model-Based Optimisation to Combinatorial and Program Spaces

    Directory of Open Access Journals (Sweden)

    Yong-Hyuk Kim

    2014-01-01

Surrogate models (SMs) can profitably be employed, often in conjunction with evolutionary algorithms, in optimisation in which it is expensive to test candidate solutions. The spatial intuition behind SMs makes them naturally suited to continuous problems, and the only combinatorial problems that have been previously addressed are those with solutions that can be encoded as integer vectors. We show how radial basis functions can provide a generalised SM for combinatorial problems which have a geometric solution representation, through the conversion of that representation to a different metric space. This approach allows an SM to be cast in a natural way for the problem at hand, without ad hoc adaptation to a specific representation. We test this adaptation process on problems involving binary strings, permutations, and tree-based genetic programs.
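The core idea, a radial-basis-function surrogate built on a non-Euclidean metric, can be sketched for binary strings using Hamming distance. The toy objective (onemax), kernel width, and training points below are assumptions for illustration, not the paper's benchmarks.

```python
import numpy as np

def hamming(a, b):
    """Hamming distance between two equal-length tuples of bits."""
    return sum(x != y for x, y in zip(a, b))

def rbf_fit(X, y, eps=1.0):
    """Solve K w = y for the interpolation weights, where the Gaussian
    kernel K is built from an arbitrary metric (here Hamming distance)."""
    K = np.array([[np.exp(-(eps * hamming(a, b)) ** 2) for b in X] for a in X])
    return np.linalg.solve(K, np.asarray(y, dtype=float))

def rbf_predict(X, w, q, eps=1.0):
    """Surrogate value at query point q: weighted sum of kernel responses."""
    k = np.array([np.exp(-(eps * hamming(q, a)) ** 2) for a in X])
    return float(k @ w)

# Toy problem: onemax (count of ones) on 4-bit strings, three evaluated points
X = [(0, 0, 0, 0), (1, 1, 0, 0), (1, 1, 1, 1)]
y = [sum(s) for s in X]
w = rbf_fit(X, y)
```

Because only pairwise distances enter the kernel, the same construction carries over unchanged to permutations (e.g., swap distance) or trees, which is the generalisation the paper exploits.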

  18. Generalised Adaptive Harmony Search: A Comparative Analysis of Modern Harmony Search

    Directory of Open Access Journals (Sweden)

    Jaco Fourie

    2013-01-01

Harmony search (HS) was introduced in 2001 as a heuristic population-based optimisation algorithm. Since then HS has become a popular alternative to other heuristic algorithms like simulated annealing and particle swarm optimisation. However, some flaws, like the need for parameter tuning, were identified and have been a topic of study for much research over the last 10 years. Many variants of HS were developed to address some of these flaws, and most of them have made substantial improvements. In this paper we compare the performance of three recent HS variants: exploratory harmony search, self-adaptive harmony search, and dynamic local-best harmony search. We compare the accuracy of these algorithms, using a set of well-known optimisation benchmark functions that include both unimodal and multimodal problems. Observations from this comparison led us to design a novel hybrid that combines the best attributes of these modern variants into a single optimiser called generalised adaptive harmony search.

  19. Issues of validity and generalisability in the Grade 12 English Home Language examination

    Directory of Open Access Journals (Sweden)

    du Plessis, Colleen Lynne

    2014-12-01

Very little research has been devoted to evaluating the national English Home Language (HL) curriculum and assessment system. Not only is there a lack of clarity on whether the language subject is being offered at an adequately high level to meet the declared objectives of the curriculum, but the reliability of the results obtained by Grade 12 learners in the exit-level examination has been placed under suspicion. To shed some light on the issue, this study takes a close look at the language component of the school-leaving examination covering the period 2008-2012, to see whether evidence of high language ability can be generated through the current selection of task types and whether the inferred ability can be generalised to non-examination contexts. Of primary interest here are the validity of the construct on which the examination is built and the sub-abilities that are being measured, as well as the validity of the scoring. One of the key findings of the study is that the language papers cannot be considered indicators of advanced and differential language ability, only of basic and general proficiency. The lack of specifications in the design of the examination items and the construction of the marking memoranda undermine the validity and reliability of the assessment. As a consequence, the inferences made on the basis of the scores obtained by examinees are highly subjective and cannot be generalised to other domains of language use. The study hopes to draw attention to the importance of the format and design of the examination papers in maintaining educational standards.

  20. Psychosocial work factors, major depressive and generalised anxiety disorders: results from the French national SIP study.

    Science.gov (United States)

    Murcia, Marie; Chastang, Jean-François; Niedhammer, Isabelle

    2013-04-25

    Anxiety and depression are prevalent mental disorders in working populations. The risk factors of these disorders are not completely well known. Developing knowledge on occupational risk factors for mental disorders appears crucial. This study investigates the association between various classical and emergent psychosocial work factors and major depressive and generalised anxiety disorders in the French working population. The study was based on a national random sample of 3765 men and 3944 women of the French working population (SIP 2006 survey). Major Depressive Disorder (MDD) and Generalised Anxiety Disorder (GAD) were measured using a standardised diagnostic interview (MINI). Occupational factors included psychosocial work factors as well as biomechanical, physical, and chemical exposures. Adjustment variables included age, occupation, marital status, social support, and life events. Multivariate analysis was performed using logistic regression analysis. Low decision latitude, overcommitment, and emotional demands were found to be risk factors for both MDD-GAD among both genders. Other risk factors were observed: high psychological demands, low reward, ethical conflict, and job insecurity, but differences were found according to gender and outcome. Significant interaction terms were observed suggesting that low decision latitude, high psychological demands, and job insecurity had stronger effects on mental disorders for men than for women. Given the cross-sectional study design, no causal conclusion could be drawn. This study showed significant associations between classical and emergent psychosocial work factors and MDD-GAD. Preventive actions targeting various psychosocial work factors, including emergent factors, may help to reduce mental disorders at the workplace. Copyright © 2012 Elsevier B.V. All rights reserved.

  1. Estimating incidence from prevalence in generalised HIV epidemics: methods and validation.

    Directory of Open Access Journals (Sweden)

    Timothy B Hallett

    2008-04-01

HIV surveillance of generalised epidemics in Africa primarily relies on prevalence at antenatal clinics, but estimates of incidence in the general population would be more useful. Repeated cross-sectional measures of HIV prevalence are now becoming available for general populations in many countries, and we aim to develop and validate methods that use these data to estimate HIV incidence. Two methods were developed that decompose observed changes in prevalence between two serosurveys into the contributions of new infections and mortality. Method 1 uses cohort mortality rates, and method 2 uses information on survival after infection. The performance of these two methods was assessed using simulated data from a mathematical model and actual data from three community-based cohort studies in Africa. Comparison with simulated data indicated that these methods can accurately estimate incidence rates and changes in incidence in a variety of epidemic conditions. Method 1 is simple to implement but relies on locally appropriate mortality data, whilst method 2 can make use of the same survival distribution in a wide range of scenarios. The estimates from both methods are within the 95% confidence intervals of almost all actual measurements of HIV incidence in adults and young people, and the patterns of incidence over age are correctly captured. It is possible to estimate incidence from cross-sectional prevalence data with sufficient accuracy to monitor the HIV epidemic. Although these methods will theoretically work in any context, we have been able to test them only in southern and eastern Africa, where HIV epidemics are mature and generalised. The choice of method will depend on the local availability of HIV mortality data.
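The decomposition underlying these methods can be illustrated with a simple closed-cohort accounting identity (our own sketch, not the paper's estimator; it ignores migration, deaths among the uninfected, and uses a single time unit): the infected count at the second survey equals the first count, minus deaths among the infected, plus new infections, so the implied incidence is the inferred new infections per susceptible.

```python
def incidence_from_prevalence(p1, p2, n, deaths_infected):
    """Back out an incidence rate from two prevalence measurements in a
    closed cohort of size n, given deaths among the infected in between:
        infected_2 = infected_1 - deaths_infected + new_infections."""
    infected_1 = p1 * n
    infected_2 = p2 * n
    new_infections = infected_2 - infected_1 + deaths_infected
    susceptible = n - infected_1
    return new_infections / susceptible

# Illustrative numbers: prevalence rising from 10% to 12% in a cohort of
# 10,000 with 50 deaths among the infected over the interval
rate = incidence_from_prevalence(p1=0.10, p2=0.12, n=10_000, deaths_infected=50)
```

The key point the abstract makes is visible here: without the mortality term, a flat prevalence (p2 = p1) would wrongly suggest zero incidence even when infections are replacing deaths.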

  2. Information theoretic quantification of diagnostic uncertainty.

    Science.gov (United States)

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
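The Bayesian step the essay builds on can be made concrete: Bayes' rule converts a pre-test probability into a post-test probability, and the residual diagnostic uncertainty can be expressed as the binary entropy of that probability (the information-theoretic quantity the essay advocates). The sensitivities and probabilities below are illustrative.

```python
import math

def post_test_probability(pretest, sensitivity, specificity, positive):
    """Bayes' rule for a binary diagnostic test result."""
    if positive:
        p_result_d = sensitivity            # P(T+ | disease)
        p_result_nd = 1.0 - specificity     # P(T+ | no disease)
    else:
        p_result_d = 1.0 - sensitivity      # P(T- | disease)
        p_result_nd = specificity           # P(T- | no disease)
    num = p_result_d * pretest
    return num / (num + p_result_nd * (1.0 - pretest))

def binary_entropy(p):
    """Diagnostic uncertainty in bits: 0 = certain, 1 = maximally uncertain."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

pre = 0.30
post = post_test_probability(pre, sensitivity=0.90, specificity=0.80, positive=True)
```

Note that a test result does not always reduce entropy: here the positive result moves the probability from 0.30 toward 0.5 territory (about 0.66), so the entropy measure makes explicit that the diagnosis is, in this information-theoretic sense, less settled than before.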

  3. Uncertainty Regarding Waste Handling in Everyday Life

    Directory of Open Access Journals (Sweden)

    Susanne Ewert

    2010-09-01

    Full Text Available According to our study, based on interviews with households in a residential area in Sweden, uncertainty is a cultural barrier to improved recycling. Four causes of uncertainty are identified. Firstly, professional categories not matching cultural categories—people easily discriminate between certain categories (e.g., materials such as plastic and paper) but not between others (e.g., packaging and “non-packaging”). Thus a frequent cause of uncertainty is that the basic categories of the waste recycling system do not coincide with the basic categories used in everyday life. Secondly, challenged habits—source separation in everyday life is habitual, but when a habit is challenged, by a particular element or feature of the waste system, uncertainty can arise. Thirdly, lacking fractions—some kinds of items cannot be left for recycling, and this makes waste collection incomplete from the user’s point of view and in turn lowers the credibility of the system. Fourthly, missing or contradictory rules of thumb—the above causes seem to be particularly relevant if no motivating principle or rule of thumb (within the context of use) is successfully conveyed to the user. This paper discusses how reducing uncertainty can improve recycling.

  4. Comments on 'On a proposed new test of Heisenberg's principle'

    International Nuclear Information System (INIS)

    Home, D.; Sengupta, S.

    1981-01-01

    A logical fallacy is pointed out in Robinson's analysis (J. Phys. A.; 13:877 (1980)) of a thought experiment purporting to show violation of Heisenberg's uncertainty principle. The real problem concerning the interpretation of Heisenberg's principle is precisely stated. (author)

  5. Justification for recommended uncertainties

    International Nuclear Information System (INIS)

    Pronyaev, V.G.; Badikov, S.A.; Carlson, A.D.

    2007-01-01

    The uncertainties obtained in an earlier standards evaluation were considered to be unrealistically low by experts of the US Cross Section Evaluation Working Group (CSEWG). Therefore, the CSEWG Standards Subcommittee replaced the covariance matrices of evaluated uncertainties by expanded percentage errors that were assigned to the data over wide energy groups. There are a number of reasons that might lead to low uncertainties of the evaluated data: Underestimation of the correlations existing between the results of different measurements; The presence of unrecognized systematic uncertainties in the experimental data can lead to biases in the evaluated data as well as to underestimations of the resulting uncertainties; Uncertainties for correlated data cannot be characterized by percentage uncertainties or variances alone. Covariances between the evaluated value at 0.2 MeV and other points obtained in model (RAC R matrix and PADE2 analytical expansion) and non-model (GMA) fits of the 6 Li(n,t) TEST1 data and the correlation coefficients are presented, and covariances between the evaluated value at 0.045 MeV and other points (along the line or column of the matrix) as obtained in EDA and RAC R matrix fits of the data available for reactions that pass through the formation of the 7 Li system are discussed. The GMA fit with the GMA database is shown for comparison. The following diagrams are discussed: Percentage uncertainties of the evaluated cross section for the 6 Li(n,t) reaction and for the 235 U(n,f) reaction; estimation given by CSEWG experts; GMA result with full GMA database, including experimental data for the 6 Li(n,t), 6 Li(n,n) and 6 Li(n,total) reactions; uncertainties in the GMA combined fit for the standards; EDA and RAC R matrix results, respectively. Uncertainties of absolute and 252 Cf fission spectrum averaged cross section measurements, and deviations between measured and evaluated values for 235 U(n,f) cross-sections in the neutron energy range 1

  6. Nuclear Data Uncertainties in 2004: A Perspective

    International Nuclear Information System (INIS)

    Smith, Donald L.

    2005-01-01

    Interest in nuclear data uncertainties is growing robustly after having languished for several years. Renewed attention to this topic is being motivated by the practical need for assuring that nuclear systems will be safe, reliable, and cost effective, according to the individual requirements of each specific nuclear technology. Furthermore, applications are emerging in certain areas of basic nuclear science, e.g., in astrophysics, where, until recently, attention has focused mainly on understanding basic concepts and physics principles rather than on dealing with detailed quantitative information. The availability of fast computers and the concurrent development of sophisticated software enable nuclear data uncertainty information to be used more effectively than ever before. For example, data uncertainties and associated methodologies play useful roles in advanced data measurement, analysis, and evaluation procedures. Unfortunately, the current inventory of requisite uncertainty information is rather limited when measured against these evolving demands. Consequently, there is a real need to generate more comprehensive and reasonable nuclear data uncertainty information, and to make this available relatively soon in suitable form for use in the computer codes employed for nuclear analyses and the development of advanced nuclear energy systems. This conference contribution discusses several conceptual and technical issues that need to be addressed in meeting this demand during the next few years. The role of data uncertainties in several areas of nuclear science will also be mentioned briefly. Finally, the opportunities that ultimately will be afforded by the availability of more extensive and reasonable uncertainty information, and some technical challenges to master, will also be explored in this paper

  7. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, C. J.

    2017-12-01

    How to derive and present uncertainty in climate data records (CDRs) has been debated within the European Space Agency Climate Change Initiative, in search of common principles applicable across a range of essential climate variables. Various points of consensus have been reached, including the importance of improving provision of uncertainty information and the benefit of adopting international norms of metrology for language around the distinct concepts of uncertainty and error. Providing an estimate of standard uncertainty per datum (or the means to readily calculate it) emerged as baseline good practice, and should be highly relevant to users of CDRs when the uncertainty in data is variable (the usual case). Given this baseline, the role of quality flags is clarified as being complementary to and not repetitive of uncertainty information. Data with high uncertainty are not poor quality if a valid estimate of the uncertainty is available. For CDRs and their applications, the error correlation properties across spatio-temporal scales present important challenges that are not fully solved. Error effects that are negligible in the uncertainty of a single pixel may dominate uncertainty in the large-scale and long-term. A further principle is that uncertainty estimates should themselves be validated. The concepts of estimating and propagating uncertainty are generally acknowledged in geophysical sciences, but less widely practised in Earth observation and development of CDRs. Uncertainty in a CDR depends in part (and usually significantly) on the error covariance of the radiances and auxiliary data used in the retrieval. Typically, error covariance information is not available in the fundamental CDR (FCDR) (i.e., with the level-1 radiances), since provision of adequate level-1 uncertainty information is not yet standard practice. Those deriving CDRs thus cannot propagate the radiance uncertainty to their geophysical products. 
The FIDUCEO project (www.fiduceo.eu) is
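The point about error correlation across scales can be made concrete with the standard propagation formula for the mean of N equally uncertain, equally correlated measurements: sigma_mean = sigma * sqrt(1/N + rho*(N-1)/N). The values of sigma and rho below are illustrative only, not taken from any CDR.

```python
import math

# Why error correlation matters when averaging: uncorrelated noise averages
# away with 1/sqrt(N), but a correlated (systematic-like) error component
# survives averaging almost undiminished at large spatio-temporal scales.

def sigma_of_mean(sigma, n, rho):
    """Standard uncertainty of the mean of n measurements, each with
    uncertainty sigma and pairwise error correlation rho."""
    return sigma * math.sqrt(1.0 / n + rho * (n - 1) / n)

independent = sigma_of_mean(0.5, 10000, 0.0)   # ~0.005: negligible
correlated  = sigma_of_mean(0.5, 10000, 0.04)  # ~0.1: dominates the average
```

This is exactly the effect described above: an error contribution negligible per pixel (here 4% correlation) ends up dominating the uncertainty of the large-scale mean.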

  8. Statistical Approaches Accomodating Uncertainty in Modern Genomic Data

    DEFF Research Database (Denmark)

    Skotte, Line

    ...... the potential of the technological advances. The first of the four papers included in this thesis describes a new method for association mapping that accommodates uncertain genotypes from low-coverage re-sequencing data. The method allows uncertain genotypes using a score statistic based on the joint likelihood of the observed phenotypes and the observed sequencing data. This joint likelihood accounts for the genotype uncertainties via the posterior probabilities of each genotype given the observed sequencing data and the phenotype distributions are modelled using a generalised linear model framework which makes the contributed method applicable to case-control studies as well as mapping of quantitative traits. The contributed method provides a needed association test for quantitative traits in the presence of uncertain genotypes and it further allows correction for population structure in association tests for disease......

  9. Analytical Solution of the Schrödinger Equation with Spatially Varying Effective Mass for Generalised Hylleraas Potential

    International Nuclear Information System (INIS)

    Debnath, S.; Maji, Smarajit; Meyur, Sanjib

    2014-01-01

    We have obtained exact solution of the effective mass Schrödinger equation for the generalised Hylleraas potential. The exact bound state energy eigenvalues and corresponding eigenfunctions are presented. The bound state eigenfunctions are obtained in terms of the hypergeometric functions. Results are also given for the special case of potential parameter.

  10. Arthropathy in long-term cured acromegaly is characterised by osteophytes without joint space narrowing: a comparison with generalised osteoarthritis

    NARCIS (Netherlands)

    Wassenaar, M. J. E.; Biermasz, N. R.; Bijsterbosch, J.; Pereira, A. M.; Meulenbelt, I.; Smit, J. W. A.; Roelfsema, F.; Kroon, H. M.; Romijn, J. A.; Kloppenburg, M.

    2011-01-01

    To compare the distribution of osteophytes and joint space narrowing (JSN) between patients with acromegaly and primary generalised osteoarthritis to gain insight into the pathophysiological process of growth hormone (GH) and insulin-like growth factor type I (IGF-I)-mediated osteoarthritis. We

  11. The generalised Marchenko equation and the canonical structure of the A.K.N.S.-Z.S. inverse method

    International Nuclear Information System (INIS)

    Dodd, R.K.; Bullough, R.K.

    1979-01-01

    A generalised Marchenko equation is derived for a 2 X 2 matrix inverse method and it is used to show that, for the subset of equations solvable by the method which can be constructed as defining the flows of Hamiltonians, the inverse transform is a canonical (homogeneous contact) transformation. Baecklund transformations are re-examined from this point of view. (Auth.)

  12. Generalisation of the method of images for the calculation of inviscid potential flow past several arbitrarily moving parallel circular cylinders

    Czech Academy of Sciences Publication Activity Database

    Kharlamov, Alexander A.; Filip, Petr

    2012-01-01

    Roč. 77, č. 1 (2012), s. 77-85 ISSN 0022-0833 Institutional research plan: CEZ:AV0Z20600510 Keywords : circular cylinders * cylinder between two walls * generalised method of images * ideal fluid * potential flow Subject RIV: BK - Fluid Dynamics Impact factor: 1.075, year: 2012

  13. Mutations in THAP1 (DYT6) and generalised dystonia with prominent spasmodic dysphonia: a genetic screening study

    DEFF Research Database (Denmark)

    Djarmati, Ana; Schneider, Susanne A; Lohmann, Katja

    2009-01-01

    -onset generalised dystonia with spasmodic dysphonia. This combination of symptoms might be a characteristic feature of DYT6 dystonia and could be useful in the differential diagnosis of DYT1, DYT4, DYT12, and DYT17 dystonia. In addition to the identified mutations, a rare non-coding substitution in THAP1 might...

  14. Uncertainty in water resources availability in the Okavango River basin as a result of climate change

    Directory of Open Access Journals (Sweden)

    D. A. Hughes

    2011-03-01

    Full Text Available This paper assesses the hydrological response to scenarios of climate change in the Okavango River catchment in Southern Africa. Climate scenarios are constructed representing different changes in global mean temperature from an ensemble of 7 climate models assessed in the IPCC AR4. The results show a substantial change in mean flow associated with a global warming of 2 °C. However, there is considerable uncertainty in the sign and magnitude of the projected changes between different climate models, implying that the ensemble mean is not an appropriate generalised indicator of impact. The uncertainty in response between different climate model patterns is considerably greater than the range due to uncertainty in hydrological model parameterisation. There is also a clear need to evaluate the physical mechanisms associated with the model projected changes in this region. The implications for water resource management policy are considered.

  15. Birthplace Diversity, Income Inequality and Education Gradients in Generalised Trust: The Relevance of Cognitive Skills in 29 Countries. OECD Education Working Papers, No. 164

    Science.gov (United States)

    Borgonovi, Francesca; Pokropek, Artur

    2017-01-01

    The paper examines between-country differences in the mechanisms through which education could promote generalised trust using data from 29 countries participating in the OECD's Survey of Adult Skills (PIAAC). Results indicate that education is strongly associated with generalised trust and that a large part of this association is mediated by…

  16. Dealing with exploration uncertainties

    International Nuclear Information System (INIS)

    Capen, E.

    1992-01-01

    Exploration for oil and gas should fulfill the most adventurous in their quest for excitement and surprise. This paper tries to cover that tall order. The authors will touch on the magnitude of the uncertainty (which is far greater than in most other businesses), the effects of not knowing target sizes very well, how to build uncertainty into analyses naturally, how to tie reserves and chance estimates to economics, and how to look at the portfolio effect of an exploration program. With no apologies, the authors will be using a different language for some readers - the language of uncertainty, which means probability and statistics. These tools allow one to combine largely subjective exploration information with the more analytical data from the engineering and economic side

  17. The Bohr--Einstein ''weighing-of-energy'' debate and the principle of equivalence

    International Nuclear Information System (INIS)

    Hughes, R.J.

    1990-01-01

    The Bohr--Einstein debate over the ''weighing of energy'' and the validity of the time--energy uncertainty relation is reexamined in the context of gravitation theories that do not respect the equivalence principle. Bohr's use of the equivalence principle is shown to be sufficient, but not necessary, to establish the validity of this uncertainty relation in Einstein's ''weighing-of-energy'' gedanken experiment. The uncertainty relation is shown to hold in any energy-conserving theory of gravity, and so a failure of the equivalence principle does not engender a failure of quantum mechanics. The relationship between the gravitational redshift and the equivalence principle is reviewed

  18. Uncertainty in artificial intelligence

    CERN Document Server

    Levitt, TS; Lemmer, JF; Shachter, RD

    1990-01-01

    Clearly illustrated in this volume is the current relationship between Uncertainty and AI.It has been said that research in AI revolves around five basic questions asked relative to some particular domain: What knowledge is required? How can this knowledge be acquired? How can it be represented in a system? How should this knowledge be manipulated in order to provide intelligent behavior? How can the behavior be explained? In this volume, all of these questions are addressed. From the perspective of the relationship of uncertainty to the basic questions of AI, the book divides naturally i

  19. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  20. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    error distribution, and the propagation of the uncertainty to the geophysical variable in the CDR accounting for its error correlation properties. Uncertainty estimates can and should be validated as part of CDR validation when possible. These principles are quite general, but the approach to providing uncertainty information appropriate to different ECVs is varied, as confirmed by a brief review across different ECVs in the CCI. User requirements for uncertainty information can conflict with each other, and a variety of solutions and compromises are possible. The concept of an ensemble CDR as a simple means of communicating rigorous uncertainty information to users is discussed. Our review concludes by providing eight concrete recommendations for good practice in providing and communicating uncertainty in EO-based climate data records.

  1. Energy and Uncertainty in General Relativity

    Science.gov (United States)

    Cooperstock, F. I.; Dupre, M. J.

    2018-03-01

    The issue of energy and its potential localizability in general relativity has challenged physicists for more than a century. Many non-invariant measures were proposed over the years but an invariant measure was never found. We discovered the invariant localized energy measure by expanding the domain of investigation from space to spacetime. We note from relativity that the finiteness of the velocity of propagation of interactions necessarily induces indefiniteness in measurements. This is because the elements of actual physical systems being measured as well as their detectors are characterized by entire four-velocity fields, which necessarily leads to information from a measured system being processed by the detector in a spread of time. General relativity adds additional indefiniteness because of the variation in proper time between elements. The uncertainty is encapsulated in a generalized uncertainty principle, in parallel with that of Heisenberg, which incorporates the localized contribution of gravity to energy. This naturally leads to a generalized uncertainty principle for momentum as well. These generalized forms and the gravitational contribution to localized energy would be expected to be of particular importance in the regimes of ultra-strong gravitational fields. We contrast our invariant spacetime energy measure with the standard 3-space energy measure which is familiar from special relativity, appreciating why general relativity demands a measure in spacetime as opposed to 3-space. We illustrate the misconceptions by certain authors of our approach.

  2. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called ''conservative'' assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the ''reasonable assurance'' approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  3. The VTTVIS line imaging spectrometer - principles, error sources, and calibration

    DEFF Research Database (Denmark)

    Jørgensen, R.N.

    2002-01-01

    Hyperspectral imaging with a spatial resolution of a few mm2 has proved to have a great potential within crop and weed classification and also within nutrient diagnostics. A commonly used hyperspectral imaging system is based on the Prism-Grating-Prism (PGP) principles produced by Specim Ltd...... work describing the basic principles, potential error sources, and/or adjustment and calibration procedures. This report fulfils the need for such documentation with special focus on the system at KVL. The PGP based system has several severe error sources, which should be removed prior to any analysis...... in off-axis transmission efficiencies, diffraction efficiencies, and image distortion have a significant impact on the instrument performance. Procedures removing or minimising these systematic error sources are developed and described for the system built at KVL but can be generalised to other PGP...

  4. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  5. Uncertainties in repository modeling

    International Nuclear Information System (INIS)

    Wilson, J.R.

    1996-01-01

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling

  6. Risks, uncertainty, vagueness

    International Nuclear Information System (INIS)

    Haefele, W.; Renn, O.; Erdmann, G.

    1990-01-01

    The notion of 'risk' is discussed in its social and technological contexts, leading to an investigation of the terms factuality, hypotheticality, uncertainty, and vagueness, and to the problems of acceptance and acceptability especially in the context of political decision finding. (DG) [de

  7. Strategy under uncertainty.

    Science.gov (United States)

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  8. Asymmetric Uncertainty Expression for High Gradient Aerodynamics

    Science.gov (United States)

    Pinier, Jeremy T

    2012-01-01

    When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase-uncertainty, the magnitude of which is emphasized in the high-gradient aerodynamic regions, is proposed in this paper. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.

  9. The equivalence principle in a quantum world

    DEFF Research Database (Denmark)

    Bjerrum-Bohr, N. Emil J.; Donoghue, John F.; El-Menoufi, Basem Kamal

    2015-01-01

    We show how modern methods can be applied to quantum gravity at low energy. We test how quantum corrections challenge the classical framework behind the equivalence principle (EP), for instance through introduction of nonlocality from quantum physics, embodied in the uncertainty principle. When the energy is small, we now have the tools to address this conflict explicitly. Despite the violation of some classical concepts, the EP continues to provide the core of the quantum gravity framework through the symmetry - general coordinate invariance - that is used to organize the effective field theory......

  10. Generalised and Fractional Langevin Equations-Implications for Energy Balance Models

    Science.gov (United States)

    Watkins, N. W.; Chapman, S. C.; Chechkin, A.; Ford, I.; Klages, R.; Stainforth, D. A.

    2017-12-01

    Energy Balance Models (EBMs) have a long heritage in climate science, including their use in modelling anomalies in global mean temperature. Many types of EBM have now been studied, and this presentation concerns the stochastic EBMs, which allow direct treatment of climate fluctuations and noise. Some recent stochastic EBMs (e.g. [1]) map on to Langevin's original form of his equation, with temperature anomaly replacing velocity, and other corresponding replacements being made. Considerable sophistication has now been reached in the application of multivariate stochastic Langevin modelling in many areas of climate. Our work is complementary in intent and investigates the Mori-Kubo "Generalised Langevin Equation" (GLE) which incorporates non-Markovian noise and response in a univariate framework, as a tool for modelling GMT [2]. We show how, if it is present, long memory simplifies the GLE to a fractional Langevin equation (FLE). Evidence for long range memory in global temperature, and the success of fractional Gaussian noise in its prediction [5] has already motivated investigation of a power law response model [3,4,5]. We go beyond this work to ask whether an EBM of FLE-type exists, and what its solutions would be. [l] Padilla et al, J. Climate (2011); [2] Watkins, GRL (2013); [3] Rypdal, JGR (2012); [4] Rypdal and Rypdal, J. Climate (2014); [5] Lovejoy et al, ESDD (2015).

  11. Adaptive dynamic programming for discrete-time linear quadratic regulation based on multirate generalised policy iteration

    Science.gov (United States)

    Chun, Tae Yoon; Lee, Jae Young; Park, Jin Bae; Choi, Yoon Ho

    2018-06-01

    In this paper, we propose two multirate generalised policy iteration (GPI) algorithms applied to discrete-time linear quadratic regulation problems. The proposed algorithms are extensions of the existing GPI algorithm that consists of the approximate policy evaluation and policy improvement steps. The two proposed schemes, named heuristic dynamic programming (HDP) and dual HDP (DHP), based on multirate GPI, use multi-step estimation (M-step Bellman equation) at the approximate policy evaluation step for estimating the value function and its gradient called costate, respectively. Then, we show that these two methods with the same update horizon can be considered equivalent in the iteration domain. Furthermore, monotonically increasing and decreasing convergences, so called value iteration (VI)-mode and policy iteration (PI)-mode convergences, are proved to hold for the proposed multirate GPIs. Further, general convergence properties in terms of eigenvalues are also studied. The data-driven online implementation methods for the proposed HDP and DHP are demonstrated and finally, we present the results of numerical simulations performed to verify the effectiveness of the proposed methods.
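For context, the baseline that these multirate GPI schemes generalise is ordinary value iteration on the discrete-time LQR Bellman (Riccati) equation. The sketch below assumes known system matrices (A, B, Q, R); the paper's data-driven, model-free implementation is not reproduced, and the example system is made up.

```python
import numpy as np

# Value iteration (the "VI-mode" baseline) for discrete-time LQR:
# alternate a greedy policy-improvement step with a one-step policy
# evaluation of the Bellman equation until the cost matrix P converges.

def lqr_value_iteration(A, B, Q, R, iters=500):
    P = np.zeros_like(Q)
    for _ in range(iters):
        # Policy improvement: greedy gain for the current value estimate
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        # One-step policy evaluation: P <- Q + A'P A - A'P B K
        P = Q + A.T @ P @ (A - B @ K)
    return P, K

# Illustrative discretised double-integrator, not from the paper
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
Q = np.eye(2)
R = np.array([[1.0]])
P, K = lqr_value_iteration(A, B, Q, R)
```

At convergence P satisfies the discrete algebraic Riccati equation, which is the fixed point that both the VI-mode and PI-mode convergence results in the paper target.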

  12. Statistical inference for the lifetime performance index based on generalised order statistics from exponential distribution

    Science.gov (United States)

    Vali Ahmadi, Mohammad; Doostparast, Mahdi; Ahmadi, Jafar

    2015-04-01

    In manufacturing industries, the lifetime of an item is usually characterised by a random variable X and considered to be satisfactory if X exceeds a given lower lifetime limit L. The probability of a satisfactory item is then ηL := P(X ≥ L), called conforming rate. In industrial companies, however, the lifetime performance index, proposed by Montgomery and denoted by CL, is widely used as a process capability index instead of the conforming rate. Assuming a parametric model for the random variable X, we show that there is a connection between the conforming rate and the lifetime performance index. Consequently, the statistical inferences about ηL and CL are equivalent. Hence, we restrict ourselves to statistical inference for CL based on generalised order statistics, which contains several ordered data models such as usual order statistics, progressively Type-II censored data and records. Various point and interval estimators for the parameter CL are obtained and optimal critical regions for the hypothesis testing problems concerning CL are proposed. Finally, two real data-sets on the lifetimes of insulating fluid and ball bearings, due to Nelson (1982) and Caroni (2002), respectively, and a simulated sample are analysed.
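
    For the exponential model the connection between the conforming rate and the index is explicit. A sketch assuming Montgomery's definition C_L = (mu - L)/sigma (for Exp(lambda), mean and standard deviation are both 1/lambda); the parameter values are invented:

```python
import math

def lifetime_index(lam, L):
    # C_L = (mu - L) / sigma, with mu = sigma = 1/lam for Exp(lam)
    return 1.0 - lam * L

def conforming_rate(lam, L):
    # eta_L = P(X >= L) = exp(-lam * L)
    return math.exp(-lam * L)

lam, L = 0.5, 0.8
cl = lifetime_index(lam, L)        # 0.6
eta = conforming_rate(lam, L)      # exp(-0.4)
# the one-to-one link between the two quantities: eta_L = exp(C_L - 1),
# which is why inference about eta_L and C_L is equivalent here
assert abs(eta - math.exp(cl - 1.0)) < 1e-12
```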

  13. Extending the Generalised Pareto Distribution for Novelty Detection in High-Dimensional Spaces.

    Science.gov (United States)

    Clifton, David A; Clifton, Lei; Hugueny, Samuel; Tarassenko, Lionel

    2014-01-01

    Novelty detection involves the construction of a "model of normality", and then classifies test data as being either "normal" or "abnormal" with respect to that model. For this reason, it is often termed one-class classification. The approach is suitable for cases in which examples of "normal" behaviour are commonly available, but in which cases of "abnormal" data are comparatively rare. When performing novelty detection, we are typically most interested in the tails of the normal model, because it is in these tails that a decision boundary between "normal" and "abnormal" areas of data space usually lies. Extreme value statistics provides an appropriate theoretical framework for modelling the tails of univariate (or low-dimensional) distributions, using the generalised Pareto distribution (GPD), which can be demonstrated to be the limiting distribution for data occurring within the tails of most practically-encountered probability distributions. This paper provides an extension of the GPD, allowing the modelling of probability distributions of arbitrarily high dimension, such as occurs when using complex, multimodal, multivariate distributions for performing novelty detection in most real-life cases. We demonstrate our extension to the GPD using examples from patient physiological monitoring, in which we have acquired data from hospital patients in large clinical studies of high-acuity wards, and in which we wish to determine "abnormal" patient data, such that early warning of patient physiological deterioration may be provided.
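
    A minimal univariate peaks-over-threshold sketch of the GPD machinery the paper extends (the synthetic scores and the method-of-moments fit are assumptions of this illustration, not the authors' estimator):

```python
import random

def fit_gpd_moments(excesses):
    """Method-of-moments fit of a generalised Pareto distribution
    (shape xi, scale sigma) to threshold excesses."""
    n = len(excesses)
    mean = sum(excesses) / n
    var = sum((y - mean) ** 2 for y in excesses) / (n - 1)
    xi = 0.5 * (1.0 - mean ** 2 / var)
    sigma = 0.5 * mean * (mean ** 2 / var + 1.0)
    return xi, sigma

random.seed(0)
scores = [random.expovariate(1.0) for _ in range(20000)]   # toy novelty scores
u = sorted(scores)[int(0.9 * len(scores))]                 # 90th-percentile threshold
excesses = [s - u for s in scores if s > u]                # peaks over threshold
xi, sigma = fit_gpd_moments(excesses)
# exponential tails sit at the xi = 0 boundary of the GPD family, so the
# fitted shape should be near zero and the scale near 1
```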

  14. Assessing ozone and nitrogen impact on net primary productivity with a Generalised non-Linear Model

    International Nuclear Information System (INIS)

    De Marco, Alessandra; Screpanti, Augusto; Attorre, Fabio; Proietti, Chiara; Vitale, Marcello

    2013-01-01

    Some studies suggest that in Europe the majority of forest growth increment can be accounted for by N deposition and very little by elevated CO2. High ozone (O3) concentrations cause reductions in carbon fixation in native plants by offsetting the effects of elevated CO2 or N deposition. The cause-effect relationships between primary productivity (NPP) of Quercus cerris, Q. ilex and Fagus sylvatica plant species and climate and pollutants (O3 and N deposition) in Italy have been investigated by application of a Generalised Linear/non-Linear regression model (GLZ model). The GLZ model highlighted: i) the cumulative O3 concentration-based indicator (AOT40F) did not significantly affect NPP; ii) a differential action of oxidised and reduced nitrogen depositions on NPP was linked to the geographical location; iii) the species-specific variation of NPP caused by the combination of pollutants and climatic variables could be a potentially important driving factor for the plant species' shift in response to future climate change. - Highlights: ► GLZ Models emphasized the role of combinations of variables affecting NPP. ► A differential action of ox-N and red-N deposition on NPP was observed for plants. ► Different responses to climate and pollutants could affect the plant species' shift. - Ozone and nitrogen depositions have non-linear effects on primary productivity of tree species differently distributed in Italy.

  15. Interference effects of neutral MSSM Higgs bosons with a generalised narrow-width approximation

    International Nuclear Information System (INIS)

    Fuchs, Elina

    2014-11-01

    Mixing effects in the MSSM Higgs sector can give rise to a sizeable interference between the neutral Higgs bosons. On the other hand, factorising a more complicated process into production and decay parts by means of the narrow-width approximation (NWA) simplifies the calculation. The standard NWA, however, does not account for interference terms. Therefore, we introduce a generalisation of the NWA (gNWA) which allows for a consistent treatment of interference effects between nearly mass-degenerate particles. Furthermore, we apply the gNWA at the tree and 1-loop level to an example process where the neutral Higgs bosons h and H are produced in the decay of a heavy neutralino and subsequently decay into a fermion pair. The h-H propagator mixing is found to agree well with the approximation of Breit-Wigner propagators times finite wave-function normalisation factors, both leading to a significant interference contribution. The factorisation of the interference term based on on-shell matrix elements reproduces the full interference result within a precision of better than 1% for the considered process. The gNWA also enables the inclusion of contributions beyond the 1-loop order into the most precise prediction.

  16. Generalised extreme value distributions provide a natural hypothesis for the shape of seed mass distributions.

    Directory of Open Access Journals (Sweden)

    Will Edwards

    Full Text Available Among co-occurring species, values for functionally important plant traits span orders of magnitude, are uni-modal, and generally positively skewed. Such data are usually log-transformed "for normality" but no convincing mechanistic explanation for a log-normal expectation exists. Here we propose a hypothesis for the distribution of seed masses based on generalised extreme value distributions (GEVs), a class of probability distributions used in climatology to characterise the impact of event magnitudes and frequencies, events that impose strong directional selection on biological traits. In tests involving datasets from 34 locations across the globe, GEVs described log10 seed mass distributions as well or better than conventional normalising statistics in 79% of cases, and revealed a systematic tendency for an overabundance of small seed sizes associated with low latitudes. GEVs characterise disturbance events experienced in a location to which individual species' life histories could respond, providing a natural, biological explanation for trait expression that is lacking from all previous hypotheses attempting to describe trait distributions in multispecies assemblages. We suggest that GEVs could provide a mechanistic explanation for plant trait distributions and potentially link biology and climatology under a single paradigm.
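
    For reference, the GEV family invoked above in its standard parameterisation (a sketch; the paper fits these distributions to log10 seed masses):

```python
import math

def gev_cdf(x, mu=0.0, sigma=1.0, xi=0.0):
    """CDF of the generalised extreme value distribution.

    xi < 0: Weibull type (bounded upper tail), xi = 0: Gumbel,
    xi > 0: Frechet type (heavy upper tail).
    """
    z = (x - mu) / sigma
    if abs(xi) < 1e-12:                   # Gumbel limit as xi -> 0
        return math.exp(-math.exp(-z))
    t = 1.0 + xi * z
    if t <= 0.0:                          # outside the distribution's support
        return 0.0 if xi > 0 else 1.0
    return math.exp(-t ** (-1.0 / xi))

# at x = mu every member of the family gives F(mu) = exp(-1) ~ 0.368
```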

  17. Generalised pustular psoriasis – a case report and review of therapeutic approaches

    Directory of Open Access Journals (Sweden)

    Agnieszka Osmola-Mańkowska

    2014-11-01

    Full Text Available Introduction. Generalised pustular psoriasis (GPP is regarded as a rare clinical subtype of psoriasis. The severity of GPP varies from a benign, chronic course to severe and widespread, life-threatening disease. Due to the uncommon nature of GPP, establishing treatment guidelines for this variant of psoriasis is challenging. Objective. To present the case of a patient with GPP with a short review of therapeutic approaches. Case report. Our patient was treated with standard methods such as acitretin, then cyclosporine and methotrexate as well as with biologic drug – infliximab. During the last and the most severe flare, combination therapy with systemic retinoid and PUVA (Re-PUVA was used. Conclusions. The treatment in GPP should be determined individually according to the severity of the disease, age, gender and comorbidities, as well as according to the physician’s experience with particular methods and their side effects. In addition, the availability of therapeutic options should be taken into consideration.

  18. A generalised porous medium approach to study thermo-fluid dynamics in human eyes.

    Science.gov (United States)

    Mauro, Alessandro; Massarotti, Nicola; Salahudeen, Mohamed; Romano, Mario R; Romano, Vito; Nithiarasu, Perumal

    2018-03-22

    The present work describes the application of the generalised porous medium model to study heat and fluid flow in healthy and glaucomatous eyes of different subject specimens, considering the presence of ocular cavities and porous tissues. The 2D computational model, implemented into the open-source software OpenFOAM, has been verified against benchmark data for mixed convection in domains partially filled with a porous medium. The verified model has been employed to simulate the thermo-fluid dynamic phenomena occurring in the anterior section of four patient-specific human eyes, considering the presence of anterior chamber (AC), trabecular meshwork (TM), Schlemm's canal (SC), and collector channels (CC). The computational domains of the eye are extracted from tomographic images. The dependence of TM porosity and permeability on intraocular pressure (IOP) has been analysed in detail, and the differences between healthy and glaucomatous eye conditions have been highlighted, proving that the different physiological conditions of patients have a significant influence on the thermo-fluid dynamic phenomena. The influence of different eye positions (supine and standing) on thermo-fluid dynamic variables has been also investigated: results are presented in terms of velocity, pressure, temperature, friction coefficient and local Nusselt number. The results clearly indicate that porosity and permeability of TM are two important parameters that affect eye pressure distribution. Graphical abstract: Velocity contours and vectors for healthy eyes (top) and glaucomatous eyes (bottom) for standing position.
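
    At the heart of a generalised porous medium model is a Darcy-type resistance in the porous tissues. A minimal sketch of that relation; all numerical values are invented placeholders, not the paper's patient-specific data:

```python
def darcy_flow(k, area, dp, mu, thickness):
    """Volumetric flow through a porous layer by Darcy's law:
    Q = k * A * dP / (mu * L)."""
    return k * area * dp / (mu * thickness)

# placeholder values, for illustration only (not patient data)
k = 1.0e-15         # permeability, m^2
area = 1.0e-5       # cross-sectional area, m^2
dp = 600.0          # pressure drop across the layer, Pa (~4.5 mmHg)
mu = 7.0e-4         # dynamic viscosity of aqueous humour, Pa s
thickness = 1.0e-4  # layer thickness, m
q = darcy_flow(k, area, dp, mu, thickness)
# doubling the permeability doubles the flow at fixed pressure drop,
# which is why TM permeability so strongly shapes the IOP distribution
```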

  19. Modelled temperature-dependent excitability behaviour of a generalised human peripheral sensory nerve fibre.

    Science.gov (United States)

    Smit, Jacoba E; Hanekom, Tania; Hanekom, Johan J

    2009-08-01

    The objective of this study was to determine if a recently developed human Ranvier node model, which is based on a modified version of the Hodgkin-Huxley model, could predict the excitability behaviour in human peripheral sensory nerve fibres with diameters ranging from 5.0 to 15.0 μm. The Ranvier node model was extended to include a persistent sodium current and was incorporated into a generalised single cable nerve fibre model. Parameter temperature dependence was included. All calculations were performed in Matlab. Sensory nerve fibre excitability behaviour characteristics predicted by the new nerve fibre model at different temperatures and fibre diameters compared well with measured data. Absolute refractory periods deviated from measured data, while relative refractory periods were similar to measured data. Conduction velocities showed both fibre diameter and temperature dependence and were underestimated in fibres thinner than 12.5 μm. Calculated strength-duration time constants ranged from 128.5 to 183.0 μs at 37 °C over the studied nerve fibre diameter range, with chronaxie times about 30% shorter than strength-duration time constants. Chronaxie times exhibited temperature dependence, with values overestimated by a factor 5 at temperatures lower than body temperature. Possible explanations include the deviated absolute refractory period trend and inclusion of a nodal strangulation relationship.
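
    The "about 30% shorter" relation between chronaxie and strength-duration time constant is what the Lapicque (exponential) strength-duration model predicts, since ln 2 ≈ 0.69. A sketch under that assumed model (the paper's fibre model is far more detailed):

```python
import math

def lapicque_threshold(t, rheobase, tau):
    """Threshold current for a pulse of duration t under the Lapicque
    strength-duration model: I(t) = I_rh / (1 - exp(-t / tau))."""
    return rheobase / (1.0 - math.exp(-t / tau))

def chronaxie(tau):
    """Pulse duration at which the threshold is twice the rheobase.
    Solve I(t) = 2 I_rh  =>  1 - exp(-t/tau) = 1/2  =>  t = tau ln 2."""
    return tau * math.log(2.0)

tau = 150.0e-6          # strength-duration time constant, s (invented value)
ch = chronaxie(tau)     # ~104 us: about 30% shorter than tau
assert abs(lapicque_threshold(ch, 1.0, tau) - 2.0) < 1e-9
```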

  20. CSAU (Code Scaling, Applicability and Uncertainty)

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.

    1989-01-01

    Best Estimate computer codes have been accepted by the U.S. Nuclear Regulatory Commission as an optional tool for performing safety analysis related to the licensing and regulation of current nuclear reactors producing commercial electrical power, providing their uncertainty is quantified. In support of this policy change, the NRC and its contractors and consultants have developed and demonstrated an uncertainty quantification methodology called CSAU. The primary use of the CSAU methodology is to quantify safety margins for existing designs; however, the methodology can also serve an equally important role in advanced reactor research for plants not yet built. This paper describes the CSAU methodology, at the generic process level, and provides the general principles whereby it may be applied to evaluations of advanced reactor designs

  1. Generalised and abdominal adiposity are important risk factors for chronic disease in older people: results from a nationally representative survey.

    Science.gov (United States)

    Hirani, V

    2011-06-01

    To look at the trends in prevalence of generalised (body mass index (BMI) ≥ 25 kg/m2) and abdominal obesity (waist circumference (WC) >102 cm, men; > 88 cm, women) among older people from 1993 to 2008, prevalence of chronic disease by overweight/obesity and WC categories in England 2005 and evaluate the association of these measures with chronic diseases. Analyses of nationally representative cross-sectional population surveys, the Health Survey for England (HSE). Non-institutionalised men and women aged ≥ 65 years (in HSE 2005, 1512 men and 1747 women). Height, weight, waist circumference, blood pressure measurements were taken according to standardised HSE protocols. Information collected on socio-demographic, health behaviour and doctor diagnosed health conditions. Generalised obesity and abdominal obesity increased among men and women from 1993 to 2008. In 2005, the HSE 2005 focussed on older people. 72% of men and 68% of women aged over 65 were either overweight or obese. Prevalence of raised WC was higher in women (58%) than in men (46%). The prevalence of diabetes and arthritis was higher in people with generalised obesity in both sexes. Men were more likely to have had a joint replacement and had a higher prevalence of stroke if they were overweight only but women were more likely to have had a joint replacement only if they were obese (13%) and had a higher risk of falls with generalised obesity. The pattern was similar for the prevalence of chronic diseases by raised WC. Multivariate analysis showed that generalised and abdominal obesity was independently associated with risk of hypertension, diabetes and arthritis in both men and women. In women only, there was an association between generalised obesity and having a fall in the last year (OR: 1.5), and between abdominal obesity and having a joint replacement (OR: 1.9, p=0.01). Complications of obesity such as diabetes, hypertension and arthritis, are more common in men and women aged over 65 who are
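
    The survey's cut-offs can be encoded directly (an illustrative helper; the labels follow the definitions quoted above):

```python
def adiposity_flags(bmi, waist_cm, sex):
    """Apply the cut-offs used in the survey above:
    generalised adiposity: BMI >= 25 kg/m^2 (overweight or obese);
    abdominal obesity (raised WC): > 102 cm for men, > 88 cm for women."""
    generalised = bmi >= 25.0
    abdominal = waist_cm > (102.0 if sex == "male" else 88.0)
    return generalised, abdominal

assert adiposity_flags(27.0, 90.0, "female") == (True, True)
assert adiposity_flags(24.0, 100.0, "male") == (False, False)
```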

  2. Handbook of management under uncertainty

    CERN Document Server

    2001-01-01

    A mere few years ago it would have seemed odd to propose a Handbook on the treatment of management problems within a sphere of uncertainty. Even today, on the threshold of the third millennium, this statement may provoke a certain wariness. In fact, to resort to exact or random data, that is, probable data, is quite normal and convenient, as we then know where we are going best, where we are proposing to go if all occurs as it is conceived and hoped for. To treat uncertain information, to accept a new principle and from there determined criteria, without being sure of oneself and confiding only in the will to better understand objects and phenomena, constitutes a compromise with a new form of understanding the behaviour of current beings that goes even further than simple rationality. Economic Science and particularly the use of its elements of configuration in the world of management, has imbued several generations with an analytical spirit that has given rise to the elaboration of theories widely accept...

  3. Perceptual uncertainty supports design reasoning

    Science.gov (United States)

    Tseng, Winger S. W.

    2018-06-01

    The unstructured, ambiguous figures used as design cues in the experiment were classified as being at high, moderate, and low ambiguity. Participants were required to use the ideas suggested by the visual cues to design a novel table. Results showed that different levels of ambiguity within the cues significantly influenced the quantity of idea development of expert designers, but not novice designers, whose idea generation remained relatively low across all levels of ambiguity. For experts, as the level of ambiguity in the cue increased so did the number of design ideas that were generated. Most design interpretations created by both experts and novices were affected by geometric contours within the figures. In addition, when viewing cues of high ambiguity, experts produced more interpretative transformations than when viewing cues of moderate or low ambiguity. Furthermore, experts produced significantly more new functions or meanings than novices. We claim that increased ambiguity within presented visual cues engenders uncertainty in designers that facilitates flexible transformations and interpretations that prevent premature commitment to uncreative solutions. Such results could be applied in design learning and education, focused on differences between experts and novices, to generalize the principles and strategies of interpretation used by experts during concept sketching, to train novices when facing design problems, and in the development of CACD tools to support designers.

  4. Is the Precautionary Principle Really Incoherent?

    Science.gov (United States)

    Boyer-Kassem, Thomas

    2017-11-01

    The Precautionary Principle has been an increasingly important principle in international treaties since the 1980s. Through varying formulations, it states that when an activity can lead to a catastrophe for human health or the environment, measures should be taken to prevent it even if the cause-and-effect relationship is not fully established scientifically. The Precautionary Principle has been critically discussed from many sides. This article concentrates on a theoretical argument by Peterson (2006) according to which the Precautionary Principle is incoherent with other desiderata of rational decision making, and thus cannot be used as a decision rule that selects an action among several ones. I claim here that Peterson's argument fails to establish the incoherence of the Precautionary Principle, by attacking three of its premises. I argue (i) that Peterson's treatment of uncertainties lacks generality, (ii) that his Archimedian condition is problematic for incommensurability reasons, and (iii) that his explication of the Precautionary Principle is not adequate. This leads me to conjecture that the Precautionary Principle can be envisaged as a coherent decision rule, again. © 2017 Society for Risk Analysis.

  5. Radiation protection principles

    International Nuclear Information System (INIS)

    Ismail Bahari

    2007-01-01

    The presentation outlines the aspects of radiation protection principles. It discussed the following subjects; radiation hazards and risk, the objectives of radiation protection, three principles of the system - justification of practice, optimization of protection and safety, dose limit

  6. Principles of project management

    Science.gov (United States)

    1982-01-01

    The basic principles of project management as practiced by NASA management personnel are presented. These principles are given as ground rules and guidelines to be used in the performance of research, development, construction or operational assignments.

  7. Uncertainties of flood frequency estimation approaches based on continuous simulation using data resampling

    Science.gov (United States)

    Arnaud, Patrick; Cantet, Philippe; Odry, Jean

    2017-11-01

    the use of a statistical law with two parameters (here generalised extreme value Type I distribution) and clearly lower than those associated with the use of a three-parameter law (here generalised extreme value Type II distribution). For extreme flood quantiles, the uncertainties are mostly due to the rainfall generator because of the progressive saturation of the hydrological model.
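
    The two- and three-parameter laws contrasted above are the Gumbel (generalised extreme value Type I) and Fréchet (Type II) families. A return-level sketch under the standard GEV parameterisation, with invented parameter values:

```python
import math

def return_level(T, mu, sigma, xi=0.0):
    """Flood quantile with return period T (years) for a GEV fit:
    xi = 0 gives the two-parameter Gumbel (GEV Type I) law,
    xi > 0 the heavier-tailed three-parameter Frechet (Type II) law."""
    y = -math.log(1.0 - 1.0 / T)      # Gumbel reduced-variate argument
    if abs(xi) < 1e-12:
        return mu - sigma * math.log(y)
    return mu + sigma / xi * (y ** (-xi) - 1.0)

gumbel_q100 = return_level(100.0, mu=0.0, sigma=1.0)           # ~4.60
frechet_q100 = return_level(100.0, mu=0.0, sigma=1.0, xi=0.2)  # ~7.55
# the extra shape parameter inflates extreme quantiles, one reason the
# three-parameter law carries larger estimation uncertainty
```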

  8. Uncertainty in adaptive capacity

    International Nuclear Information System (INIS)

    Neil Adger, W.; Vincent, K.

    2005-01-01

    The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, and literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa. (authors)

  9. Uncertainties about climate

    International Nuclear Information System (INIS)

    Laval, Katia; Laval, Guy

    2013-01-01

    Like meteorology, climatology is not an exact science: climate change forecasts necessarily include a share of uncertainty. It is precisely this uncertainty which is brandished and exploited by the opponents of the global warming theory to call into question the estimations of its future consequences. Is it legitimate to predict the future using the past climate data (well documented up to 100000 years BP) or the climates of other planets, taking into account the impreciseness of the measurements and the intrinsic complexity of the Earth's machinery? How is it possible to model such a huge and interwoven system, for which any exact description has become impossible? Why do water and precipitation play such an important role in local and global forecasts, and how should they be treated? This book, written by two physicists, answers these delicate questions simply, so that anyone can form their own opinion about global warming and the need to act rapidly

  10. Uncertainty in artificial intelligence

    CERN Document Server

    Shachter, RD; Henrion, M; Lemmer, JF

    1990-01-01

    This volume, like its predecessors, reflects the cutting edge of research on the automation of reasoning under uncertainty.A more pragmatic emphasis is evident, for although some papers address fundamental issues, the majority address practical issues. Topics include the relations between alternative formalisms (including possibilistic reasoning), Dempster-Shafer belief functions, non-monotonic reasoning, Bayesian and decision theoretic schemes, and new inference techniques for belief nets. New techniques are applied to important problems in medicine, vision, robotics, and natural language und

  11. Decision Making Under Uncertainty

    Science.gov (United States)

    2010-11-01

    A sound approach to rational decision making requires a decision maker to establish decision objectives, identify alternatives, and evaluate those...often violate the axioms of rationality when making decisions under uncertainty. The systematic description of such observations may lead to the...which leads to “anchoring” on the initial value. The fact that individuals have been shown to deviate from rationality when making decisions

  12. Citizen Candidates Under Uncertainty

    OpenAIRE

    Eguia, Jon X.

    2005-01-01

    In this paper we make two contributions to the growing literature on "citizen-candidate" models of representative democracy. First, we add uncertainty about the total vote count. We show that in a society with a large electorate, where the outcome of the election is uncertain and where winning candidates receive a large reward from holding office, there will be a two-candidate equilibrium and no equilibria with a single candidate. Second, we introduce a new concept of equilibrium, which we te...

  13. Investigation of Free Particle Propagator with Generalized Uncertainty Problem

    International Nuclear Information System (INIS)

    Hassanabadi, H.; Ghobakhloo, F.

    2016-01-01

    We consider the Schrödinger equation with a generalized uncertainty principle for a free particle. We then transform the problem into a second-order ordinary differential equation and thereby obtain the corresponding propagator. The result of ordinary quantum mechanics is recovered for vanishing minimal length parameter.
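
    A common starting point for such generalised-uncertainty-principle calculations is the deformed commutator below (a sketch under assumed conventions, which the paper may vary):

```latex
% A widely used minimal-length GUP (conventions assumed; the paper's may differ):
[\hat x,\hat p] = i\hbar\left(1+\beta\hat p^{2}\right)
\quad\Rightarrow\quad
\Delta x\,\Delta p \ge \frac{\hbar}{2}\left(1+\beta(\Delta p)^{2}\right),
\qquad \Delta x_{\min}=\hbar\sqrt{\beta}
% With the representation \hat p = \hat p_{0}(1+\beta\hat p_{0}^{2}), the
% free-particle Hamiltonian acquires a quartic correction to first order in \beta:
\hat H = \frac{\hat p_{0}^{2}}{2m} + \frac{\beta}{m}\,\hat p_{0}^{4} + \mathcal{O}(\beta^{2})
```

    The correction vanishes as β → 0, which makes explicit why the ordinary propagator is recovered for a vanishing minimal length parameter.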

  14. Calibration Under Uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
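
    The "squared difference" formulation described above has a closed form for a linear model; a sketch (the model and data are invented for illustration):

```python
def calibrate_least_squares(xs, ys):
    """Deterministic calibration as described above: choose theta
    minimising sum((theta * x - y)^2) for a linear model y = theta * x.
    Setting the derivative to zero gives theta = sum(x*y) / sum(x*x)."""
    num = sum(x * y for x, y in zip(xs, ys))
    den = sum(x * x for x in xs)
    return num / den

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]          # noisy observations of y = 2 x
theta = calibrate_least_squares(xs, ys)
```

    The CUU approach discussed in the report generalises this by putting error bars on both the model and the data, rather than treating the model as the exact representation of reality.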

  15. Participation under Uncertainty

    International Nuclear Information System (INIS)

    Boudourides, Moses A.

    2003-01-01

    This essay reviews a number of theoretical perspectives about uncertainty and participation in the present-day knowledge-based society. After discussing the on-going reconfigurations of science, technology and society, we examine how appropriate for policy studies are various theories of social complexity. Post-normal science is such an example of a complexity-motivated approach, which justifies civic participation as a policy response to an increasing uncertainty. But there are different categories and models of uncertainties implying a variety of configurations of policy processes. A particular role in all of them is played by expertise whose democratization is an often-claimed imperative nowadays. Moreover, we discuss how different participatory arrangements are shaped into instruments of policy-making and framing regulatory processes. As participation necessitates and triggers deliberation, we proceed to examine the role and the barriers of deliberativeness. Finally, we conclude by referring to some critical views about the ultimate assumptions of recent European policy frameworks and the conceptions of civic participation and politicization that they invoke

  16. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting Performance Assessments, as well as their propagation to dose and risk results is discussed. The analysis is focused essentially on the uncertainties introduced by the input parameters, the values of which may range over some orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median as well as their ratios. The report concludes that provisionally, due to its better robustness, an estimate such as the 90th percentile may be substituted for the arithmetic mean for comparison of the estimated doses with acceptance criteria. In any case, the results obtained through Uncertainty Analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site
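
    A minimal sketch of the sampling-and-percentile workflow described above (the "dose" model here is an invented stand-in, not the clay-repository model):

```python
import random

def percentile(samples, q):
    """Empirical q-quantile (0 < q < 1) by sorting."""
    s = sorted(samples)
    return s[min(int(q * len(s)), len(s) - 1)]

random.seed(42)
doses = []
for _ in range(10000):
    # sample uncertain input parameters spanning orders of magnitude
    a = random.lognormvariate(0.0, 1.0)
    b = random.lognormvariate(0.0, 0.5)
    doses.append(a * b)           # toy "dose" response

mean_dose = sum(doses) / len(doses)
p90 = percentile(doses, 0.90)
# for skewed outputs the mean and the 90th percentile differ markedly,
# which is why the choice of summary statistic matters for compliance checks
```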

  17. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-12-01

    This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
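
    The core of the derivative-based idea can be sketched in a few lines: propagate input variances through first-order sensitivity information instead of sampling. The response and numbers below are invented stand-ins, not the borehole model:

```python
def propagate_first_order(grads, variances):
    """First-order (derivative-based) uncertainty propagation:
    Var(y) ~ sum_i (dy/dx_i)^2 * Var(x_i), using the kind of
    derivative data that adjoint sensitivity analysis supplies."""
    return sum(g * g * v for g, v in zip(grads, variances))

# toy response y = 3*x1 + 0.5*x2, so the gradients are constant
grads = [3.0, 0.5]                 # dy/dx1, dy/dx2 at the reference point
variances = [0.04, 1.0]            # Var(x1), Var(x2)
var_y = propagate_first_order(grads, variances)   # 9*0.04 + 0.25*1 = 0.61
```

    For a linear response this single evaluation reproduces what a large Monte Carlo sample would estimate, which is the source of the run-count savings claimed above.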

  18. Managing Measurement Uncertainty in Building Acoustics

    Directory of Open Access Journals (Sweden)

    Chiara Scrosati

    2015-12-01

    Full Text Available In general, uncertainties should preferably be determined following the principles laid down in ISO/IEC Guide 98-3, the Guide to the expression of uncertainty in measurement (GUM:1995. According to current knowledge, it seems impossible to formulate these models for the different quantities in building acoustics. Therefore, the concepts of repeatability and reproducibility are necessary to determine the uncertainty of building acoustics measurements. This study shows the uncertainty of field measurements of a lightweight wall, a heavyweight floor, a façade with a single glazing window and a façade with double glazing window that were analyzed by a Round Robin Test (RRT, conducted in a full-scale experimental building at ITC-CNR (Construction Technologies Institute of the National Research Council of Italy. The single number quantities and their uncertainties were evaluated in both narrow and enlarged range and it was shown that including or excluding the low frequencies leads to very significant differences, except in the case of the sound insulation of façades with single glazing window. The results obtained in these RRTs were compared with other results from literature, which confirm the increase of the uncertainty of single number quantities due to the low frequencies extension. Having stated the measurement uncertainty for a single measurement, in building acoustics, it is also very important to deal with sampling for the purposes of classification of buildings or building units. Therefore, this study also shows an application of the sampling included in the Italian Standard on the acoustic classification of building units on a serial type building consisting of 47 building units. It was found that the greatest variability is observed in the façade and it depends on both the great variability of window’s typologies and on workmanship. Finally, it is suggested how to manage the uncertainty in building acoustics, both for one single
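
    The repeatability/reproducibility route mentioned above combines within-laboratory and between-laboratory scatter in the usual ISO 5725 manner; a sketch with invented values:

```python
import math

def reproducibility_sd(s_r, s_L):
    """ISO 5725-style combination: the reproducibility standard
    deviation s_R merges within-lab repeatability s_r and
    between-lab variability s_L via s_R^2 = s_r^2 + s_L^2."""
    return math.sqrt(s_r ** 2 + s_L ** 2)

# illustrative values in dB (not the RRT's actual results)
s_r, s_L = 0.5, 1.0
s_R = reproducibility_sd(s_r, s_L)   # ~1.12 dB
```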

  19. The precautionary principle as a rational decision criterion

    International Nuclear Information System (INIS)

    Hovi, Jon

    2001-12-01

    The paper asks if the precautionary principle may be seen as a rational decision criterion. Six main questions are discussed. 1. Does the principle basically represent a particular set of political options or is it a genuine decision criterion? 2. If it is the latter, can it be reduced to any of the existing criteria for decision making under uncertainty? 3. In what kinds of situation is the principle applicable? 4. What is the relation between the precautionary principle and other principles for environmental regulation? 5. How plausible is the principle's claim that the burden of proof should be reversed? 6. Do the proponents of environmental regulation carry no burden of proof at all? A main conclusion is that, for now at least, the principle contains too many unclear elements to satisfy the requirements of precision and consistency that should reasonably be satisfied by a rational decision criterion. (author)

  20. Safety and efficacy of eculizumab in anti-acetylcholine receptor antibody-positive refractory generalised myasthenia gravis (REGAIN)

    DEFF Research Database (Denmark)

    Howard, James F; Utsugisawa, Kimiaki; Benatar, Michael

    2017-01-01

    BACKGROUND: Complement is likely to have a role in refractory generalised myasthenia gravis, but no approved therapies specifically target this system. Results from a phase 2 study suggested that eculizumab, a terminal complement inhibitor, produced clinically meaningful improvements in patients with anti-acetylcholine receptor antibody-positive refractory generalised myasthenia gravis. We further assessed the efficacy and safety of eculizumab in this patient population in a phase 3 trial. METHODS: We did a phase 3, randomised, double-blind, placebo-controlled, multicentre study (REGAIN) in 76 hospitals and specialised clinics in 17 countries across North America, Latin America, Europe, and Asia. Eligible patients were aged at least 18 years, with a Myasthenia Gravis-Activities of Daily Living (MG-ADL) score of 6 or more, Myasthenia Gravis Foundation of America (MGFA) class II-IV disease...

  1. Isotropic LQC and LQC-inspired models with a massless scalar field as generalised Brans-Dicke theories

    Science.gov (United States)

    Rama, S. Kalyana

    2018-06-01

    We explore whether generalised Brans-Dicke theories, which have a scalar field Φ and a function ω(Φ), can be the effective actions leading to the effective equations of motion of the LQC and the LQC-inspired models, which have a massless scalar field σ and a function f(m). We find that this is possible for isotropic cosmology. We relate the pairs (σ, f) and (Φ, ω) and, using examples, illustrate these relations. We find that near the bounce of the LQC evolutions for which f(m) = sin m, the corresponding field Φ → 0 and the function ω(Φ) ∝ Φ². We also find that the class of generalised Brans-Dicke theories, which we had found earlier to lead to non-singular isotropic evolutions, may be written as an LQC-inspired model. The relations found here in the isotropic cases do not apply to the anisotropic cases, which perhaps require more general effective actions.

  2. The use of generalised audit software by internal audit functions in a developing country: A maturity level assessment

    OpenAIRE

    D.P. van der Nest; Louis Smidt; Dave Lubbe

    2017-01-01

    This article explores the existing practices of internal audit functions in the locally controlled South African banking industry regarding the use of Generalised Audit Software (GAS), against a benchmark developed from recognised data analytic maturity models, in order to assess the current maturity levels of the locally controlled South African banks in the use of this software for tests of controls. The literature review indicates that the use of GAS by internal audit functions is still at...

  3. Generalised and abdominal adiposity are important risk factors for chronic disease in older people: results from a nationally representative survey

    OpenAIRE

    Vasant Hirani

    2010-01-01

    Objective: To look at the trends in the prevalence of generalised (body mass index (BMI) ≥ 25 kg/m²) and abdominal obesity (waist circumference (WC) > 102 cm, men; > 88 cm, women) among older people from 1993 to 2008, the prevalence of chronic disease by overweight/obesity and waist circumference categories in England in 2005, and to evaluate the association of these measures with chronic diseases.
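    The two obesity definitions quoted above translate directly into code; the helper below is illustrative only and not part of the survey's methodology:

```python
def adiposity_flags(bmi, waist_cm, sex):
    """Classify generalised and abdominal adiposity using the thresholds
    quoted in the survey: BMI >= 25 kg/m^2 (generalised); waist circumference
    > 102 cm for men, > 88 cm for women (abdominal)."""
    generalised = bmi >= 25.0
    waist_cutoff = 102.0 if sex == "male" else 88.0
    abdominal = waist_cm > waist_cutoff
    return generalised, abdominal

print(adiposity_flags(27.3, 105.0, "male"))    # both flags raised
print(adiposity_flags(23.5, 90.0, "female"))   # abdominal adiposity only
```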

  4. New exact travelling wave solutions of generalised sinh-Gordon and (2+1)-dimensional ZK-BBM equations

    Directory of Open Access Journals (Sweden)

    Sachin Kumar

    2012-10-01

    Exact travelling wave solutions have been established for the generalised sinh-Gordon and generalised (2+1)-dimensional ZK-BBM equations by using the (G′/G)-expansion method, where G = G(ξ) satisfies a second-order linear ordinary differential equation. The travelling wave solutions are expressed by hyperbolic, trigonometric and rational functions.
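    For context, the (G′/G)-expansion method named above has a standard general form; the sketch below states that textbook ansatz, not the paper's specific solutions:

```latex
% Travelling-wave ansatz: u = u(\xi), \xi = x - ct, expanded as a finite
% power series in G'/G:
\[
  u(\xi) = \sum_{i=0}^{N} a_i \left( \frac{G'(\xi)}{G(\xi)} \right)^{\!i},
  \qquad a_N \neq 0,
\]
% where G satisfies the second-order linear ODE
\[
  G''(\xi) + \lambda\, G'(\xi) + \mu\, G(\xi) = 0 .
\]
% The sign of \lambda^{2} - 4\mu determines whether the solutions come out
% hyperbolic, trigonometric or rational, matching the three families above.
```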

  5. Generalising via the Case Studies and Adapting the Oil and Gas Industry's Project Execution Concepts to the Construction Industry

    OpenAIRE

    Mejlænder-Larsen, Øystein

    2015-01-01

    The aim of this paper is to explore whether it is possible to generalise findings on project execution in the oil and gas industry related to the use of project execution models and a 3D design environment, based on case study research. Besides, sufficient similarities between the two industries were assessed, and the applicability of the findings from the cases in the oil and gas industry was assessed. The selected cases (the ongoing ...

  6. The neurobiology of uncertainty: implications for statistical learning.

    Science.gov (United States)

    Hasson, Uri

    2017-01-05

    The capacity for assessing the degree of uncertainty in the environment relies on estimating statistics of temporally unfolding inputs. This, in turn, allows calibration of predictive and bottom-up processing, and signalling of changes in temporally unfolding environmental features. In the last decade, several studies have examined how the brain codes for and responds to input uncertainty. Initial neurobiological experiments implicated frontoparietal and hippocampal systems, based largely on paradigms that manipulated distributional features of visual stimuli. However, later work in the auditory domain pointed to different systems, whose activation profiles have interesting implications for computational and neurobiological models of statistical learning (SL). This review begins by briefly recapping the historical development of ideas pertaining to the sensitivity to uncertainty in temporally unfolding inputs. It then discusses several issues at the interface of studies of uncertainty and SL. Following this, it presents several current treatments of the neurobiology of uncertainty and reviews recent findings that point to principles that serve as important constraints on future neurobiological theories of uncertainty and, relatedly, SL. This review suggests it may be useful to establish closer links between neurobiological research on uncertainty and SL, considering particularly mechanisms sensitive to local and global structure in inputs, the degree of input uncertainty, the complexity of the system generating the input, learning mechanisms that operate on different temporal scales and the use of learnt information for online prediction. This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).

  7. Electricity restructuring : acting on principles

    International Nuclear Information System (INIS)

    Down, E.; Hoover, G.; Howatson, A.; Rheaume, G.

    2003-01-01

    In the second briefing of this series, the authors explored public policy decisions and political intervention, and their effect on electricity restructuring. Continuous and vigilant regulatory oversight of the electricity industry in Canada is required. The need for improved public policy to reduce uncertainty for private investors who wish to enter the market was made clear using case studies from the United Kingdom, California, Alberta, and Ontario. Clarity and consistency must be the two guiding principles for public policy decisions and political intervention in the sector. By clarity, the authors meant that rules, objectives, and timelines of the restructuring process are clear to all market participants. Market rules, implementation, and consumer expectations must be consistent. refs., 3 figs

  8. Dimensional cosmological principles

    International Nuclear Information System (INIS)

    Chi, L.K.

    1985-01-01

    The dimensional cosmological principles proposed by Wesson require that the density, pressure, and mass of cosmological models be functions of the dimensionless variables which are themselves combinations of the gravitational constant, the speed of light, and the spacetime coordinates. The space coordinate is not the comoving coordinate. In this paper, the dimensional cosmological principle and the dimensional perfect cosmological principle are reformulated by using the comoving coordinate. The dimensional perfect cosmological principle is further modified to allow the possibility that mass creation may occur. Self-similar spacetimes are found to be models obeying the new dimensional cosmological principle

  9. Strengthen forensic entomology in court--the need for data exploration and the validation of a generalised additive mixed model.

    Science.gov (United States)

    Baqué, Michèle; Amendt, Jens

    2013-01-01

    Developmental data of juvenile blow flies (Diptera: Calliphoridae) are typically used to calculate the age of immature stages found on or around a corpse and thus to estimate a minimum post-mortem interval (PMI(min)). However, many of those data sets do not take into account that immature blow flies grow in a non-linear fashion. Linear models do not provide sufficiently reliable age estimates and may even lead to an erroneous determination of the PMI(min). In line with the Daubert standard and the need for improvements in forensic science, new statistical tools such as smoothing methods and mixed models allow the modelling of non-linear relationships and expand the field of statistical analyses. The present study introduces the background and application of these statistical techniques by analysing a model which describes the development of the forensically important blow fly Calliphora vicina at different temperatures. The comparison of three statistical methods (linear regression, generalised additive modelling and generalised additive mixed modelling) clearly demonstrates that only the latter provided regression parameters that reflect the data adequately. We focus explicitly on both the exploration of the data, to assure their quality and to show the importance of checking them carefully prior to conducting the statistical tests, and the validation of the resulting models. Hence, we present a common method for evaluating and testing forensic entomological data sets by using, for the first time, generalised additive mixed models.
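    The core point, that a straight line underfits non-linear insect growth while a smoother does not, can be illustrated without the authors' data. The sketch below simulates a saturating growth curve and compares a linear fit with a smoothing spline (a simple stand-in for the smooth terms of a GAM; not the paper's GAMM):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Illustrative (simulated) data only, not the Calliphora vicina data set:
# immature insect length following a saturating, non-linear growth curve.
rng = np.random.default_rng(0)
age_h = np.linspace(0, 120, 60)                      # age in hours
length = 18.0 * (1 - np.exp(-age_h / 30.0))          # non-linear growth
length += rng.normal(0.0, 0.3, size=age_h.size)      # measurement noise

# Straight-line fit, as used by many published development data sets.
slope, intercept = np.polyfit(age_h, length, 1)
sse_linear = np.sum((length - (slope * age_h + intercept)) ** 2)

# Smoothing spline: one simple stand-in for the additive (smooth) part of
# the model class the authors advocate.
spline = UnivariateSpline(age_h, length, s=age_h.size * 0.3 ** 2)
sse_spline = np.sum((length - spline(age_h)) ** 2)

print(f"SSE linear: {sse_linear:.1f}, SSE spline: {sse_spline:.1f}")
```

    The gap between the two sums of squared errors is exactly the kind of systematic lack of fit that turns into a biased PMI(min) when a linear model is used for non-linear growth.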

  10. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  11. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  12. Heterogeneous contribution of microdeletions in the development of common generalised and focal epilepsies

    Science.gov (United States)

    Pérez-Palma, Eduardo; Helbig, Ingo; Klein, Karl Martin; Anttila, Verneri; Horn, Heiko; Reinthaler, Eva Maria; Gormley, Padhraig; Ganna, Andrea; Byrnes, Andrea; Pernhorst, Katharina; Toliat, Mohammad R; Saarentaus, Elmo; Howrigan, Daniel P; Hoffman, Per; Miquel, Juan Francisco; De Ferrari, Giancarlo V; Nürnberg, Peter; Lerche, Holger; Zimprich, Fritz; Neubauer, Bern A; Becker, Albert J; Rosenow, Felix; Perucca, Emilio; Zara, Federico; Weber, Yvonne G; Lal, Dennis

    2017-01-01

    Background Microdeletions are known to confer risk to epilepsy, particularly at genomic rearrangement ‘hotspot’ loci. However, microdeletion burden not overlapping these regions or within different epilepsy subtypes has not been ascertained. Objective To decipher the role of microdeletions outside hotspot loci and risk assessment by epilepsy subtype. Methods We assessed the burden, frequency and genomic content of rare, large microdeletions found in a previously published cohort of 1366 patients with genetic generalised epilepsy (GGE) in addition to two sets of additional unpublished genome-wide microdeletions found in 281 patients with rolandic epilepsy (RE) and 807 patients with adult focal epilepsy (AFE), totalling 2454 cases. Microdeletions were assessed in combined and subtype-specific approaches against 6746 controls. Results When hotspots are considered, we detected an enrichment of microdeletions in the combined epilepsy analysis (adjusted p=1.06×10−6, OR 1.89, 95% CI 1.51 to 2.35). Epilepsy subtype-specific analyses showed that hotspot microdeletions in the GGE subgroup contribute most of the overall signal (adjusted p=9.79×10−12, OR 7.45, 95% CI 4.20–13.5). Outside hotspots, microdeletions were enriched in the GGE cohort for neurodevelopmental genes (adjusted p=9.13×10−3, OR 2.85, 95% CI 1.62–4.94). No additional signal was observed for RE and AFE. Still, gene-content analysis identified known (NRXN1, RBFOX1 and PCDH7) and novel (LOC102723362) candidate genes across epilepsy subtypes that were not deleted in controls. Conclusions Our results show a heterogeneous effect of recurrent and non-recurrent microdeletions as part of the genetic architecture of GGE and a minor contribution in the aetiology of RE and AFE. PMID:28756411
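    Burden comparisons of this kind usually reduce to a 2×2 contingency test. The sketch below uses hypothetical carrier counts (chosen only to land near the reported OR of 1.89; these are not the study's actual data) with SciPy's Fisher's exact test:

```python
from scipy.stats import fisher_exact

# Hypothetical counts (not the study's data): carriers of a hotspot
# microdeletion among cases and controls.
case_carriers, case_noncarriers = 60, 1306         # 1366 GGE cases
control_carriers, control_noncarriers = 160, 6586  # 6746 controls

table = [[case_carriers, case_noncarriers],
         [control_carriers, control_noncarriers]]
odds_ratio, p_value = fisher_exact(table, alternative="greater")

print(f"OR = {odds_ratio:.2f}, one-sided p = {p_value:.2e}")
```

    In practice such burden p-values are then adjusted for the multiple subtype-specific analyses, as the abstract's "adjusted p" values indicate.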

  13. Gender-based generalisations in school nurses' appraisals of and interventions addressing students' mental health.

    Science.gov (United States)

    Rosvall, Per-Åke; Nilsson, Stefan

    2016-08-30

    There has been an increase in reports describing mental health problems in adolescents, especially girls. School nurses play an important role in supporting young people with health problems. Few studies have considered how the nurses' gender norms may influence their discussions. To investigate this issue, semi-structured interviews focusing on school nurses' work with students who have mental health problems were conducted. Transcripts of interviews with Swedish school nurses (n = 15) from the Help overcoming pain early project (HOPE) were analysed using theories on gender as a theoretical framework and then organised into themes related to the school nurses' provision of contact and intervention. The interviewees were all women, aged 42-63 years, who had worked as nurses for 13-45 years, and as school nurses for 2-28 years. Five worked in upper secondary schools (for students aged 16-19) and 10 in secondary schools (for students aged 12-16). The results show that school nurses more commonly associated mental health problems with girls. When the school nurses discussed students who were difficult to reach, boys in particular were mentioned. However, very few nurses mentioned specific interventions to address students' mental health problems, and all of the mentioned interventions were focused on girls. Some of the school nurses reported that it was more difficult to initiate a health dialogue with boys, yet none of the nurses had organised interventions for the boys. We conclude that generalisations can sometimes be analytically helpful, facilitating, for instance, the identification of problems in school nurses' work methods and interventions. However, the most important conclusion from our research, which applied a design that is not commonly used, is that more varied approaches, as well as a greater awareness of potential gender stereotype pitfalls, are necessary to meet the needs of diverse student groups.

  14. Musical training generalises across modalities and reveals efficient and adaptive mechanisms for reproducing temporal intervals.

    Science.gov (United States)

    Aagten-Murphy, David; Cappagli, Giulia; Burr, David

    2014-03-01

    Expert musicians are able to time their actions accurately and consistently during a musical performance. We investigated how musical expertise influences the ability to reproduce auditory intervals and how this generalises across different techniques and sensory modalities. We first compared various reproduction strategies and interval lengths, to examine the effects in general and to optimise experimental conditions for testing the effect of music, and found that the effects were robust and consistent across different paradigms. Focussing on a 'ready-set-go' paradigm, subjects reproduced time intervals drawn from distributions varying in total length (176, 352 or 704 ms) or in the number of discrete intervals within the total length (3, 5, 11 or 21 discrete intervals). Overall, Musicians performed more veridically than Non-Musicians, and all subjects reproduced auditory-defined intervals more accurately than visually-defined intervals. However, Non-Musicians, particularly with visual stimuli, consistently exhibited a substantial and systematic regression towards the mean interval. When subjects judged intervals from distributions of longer total length they tended to regress more towards the mean, while the ability to discriminate between discrete intervals within the distribution had little influence on subject error. These results are consistent with a Bayesian model that minimizes reproduction errors by incorporating a central tendency prior weighted by the subject's own temporal precision relative to the current distribution of intervals. Finally, a strong correlation was observed between all durations of formal musical training and total reproduction errors in both modalities (accounting for 30% of the variance). Taken together these results demonstrate that formal musical training improves temporal reproduction, and that this improvement transfers from audition to vision. They further demonstrate the flexibility of sensorimotor mechanisms in adapting to
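    The Bayesian account described here corresponds to the standard Gaussian posterior-mean estimator: the reproduced interval is a precision-weighted average of the noisy sensory measurement and the mean of the current interval distribution, so noisier observers regress more towards that mean. A small illustrative sketch (all parameter values invented):

```python
def reproduced_interval(observed_ms, prior_mean_ms, sigma_obs, sigma_prior):
    """Posterior mean of a Gaussian observer model: a precision-weighted
    average of the noisy sensory measurement and the prior mean (the mean
    of the current interval distribution)."""
    w_obs = 1.0 / sigma_obs ** 2      # sensory precision
    w_prior = 1.0 / sigma_prior ** 2  # prior precision
    return (w_obs * observed_ms + w_prior * prior_mean_ms) / (w_obs + w_prior)

# A precise observer (e.g. a musician, or audition): small sensory noise,
# weak pull towards the mean of the distribution.
print(reproduced_interval(700, 550, sigma_obs=30, sigma_prior=100))
# A noisy observer (e.g. vision in a non-musician): strong regression
# towards the mean interval.
print(reproduced_interval(700, 550, sigma_obs=90, sigma_prior=100))
```

    Both estimates fall between the prior mean (550 ms) and the observation (700 ms), with the noisier observer pulled further towards the mean, which is exactly the central-tendency pattern the abstract reports.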

  15. Heterogeneous contribution of microdeletions in the development of common generalised and focal epilepsies.

    Science.gov (United States)

    Pérez-Palma, Eduardo; Helbig, Ingo; Klein, Karl Martin; Anttila, Verneri; Horn, Heiko; Reinthaler, Eva Maria; Gormley, Padhraig; Ganna, Andrea; Byrnes, Andrea; Pernhorst, Katharina; Toliat, Mohammad R; Saarentaus, Elmo; Howrigan, Daniel P; Hoffman, Per; Miquel, Juan Francisco; De Ferrari, Giancarlo V; Nürnberg, Peter; Lerche, Holger; Zimprich, Fritz; Neubauer, Bern A; Becker, Albert J; Rosenow, Felix; Perucca, Emilio; Zara, Federico; Weber, Yvonne G; Lal, Dennis

    2017-09-01

    Microdeletions are known to confer risk to epilepsy, particularly at genomic rearrangement 'hotspot' loci. However, microdeletion burden not overlapping these regions or within different epilepsy subtypes has not been ascertained. To decipher the role of microdeletions outside hotspot loci and risk assessment by epilepsy subtype, we assessed the burden, frequency and genomic content of rare, large microdeletions found in a previously published cohort of 1366 patients with genetic generalised epilepsy (GGE) in addition to two sets of additional unpublished genome-wide microdeletions found in 281 patients with rolandic epilepsy (RE) and 807 patients with adult focal epilepsy (AFE), totalling 2454 cases. Microdeletions were assessed in combined and subtype-specific approaches against 6746 controls. When hotspots are considered, we detected an enrichment of microdeletions in the combined epilepsy analysis (adjusted p=1.06×10−6, OR 1.89, 95% CI 1.51 to 2.35). Epilepsy subtype-specific analyses showed that hotspot microdeletions in the GGE subgroup contribute most of the overall signal (adjusted p=9.79×10−12, OR 7.45, 95% CI 4.20–13.5). Outside hotspots, microdeletions were enriched in the GGE cohort for neurodevelopmental genes (adjusted p=9.13×10−3, OR 2.85, 95% CI 1.62–4.94). No additional signal was observed for RE and AFE. Still, gene-content analysis identified known (NRXN1, RBFOX1 and PCDH7) and novel (LOC102723362) candidate genes across epilepsy subtypes that were not deleted in controls. Our results show a heterogeneous effect of recurrent and non-recurrent microdeletions as part of the genetic architecture of GGE and a minor contribution in the aetiology of RE and AFE. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  16. Generalised Einstein mass-variation formulae: II Superluminal relative frame velocities

    Directory of Open Access Journals (Sweden)

    James M. Hill

    In part I of this paper we deduced generalised Einstein mass-variation formulae assuming relative frame velocities v < c. We again use the notion of the residual mass m0(v), which for v > c is defined by the equation m(v) = m0(v)[(v/c)² − 1]^(−1/2) for the actual mass m(v). The residual mass is essentially the actual mass with the Einstein factor removed, and we emphasise that we make no restrictions on m0(v). Using this formal device we deduce corresponding new mass-variation formulae applicable to superluminal relative frame velocities, assuming only the extended Lorentz transformations and their consequences, and two invariants that are known to apply in special relativity. The present authors have previously speculated a dual framework such that both the rest mass m0* and the residual mass at infinite velocity m∞* (by which we mean p∞*/c, assuming finite momentum at infinity) are equally important parameters in the specification of mass as a function of its velocity, and the two arbitrary constants can be so determined. The new formulae involving two arbitrary constants may also be exploited so that the mass remains finite at the speed of light, and two distinct mass profiles are determined as functions of their velocity with the rest mass assumed to be alternatively prescribed at the origin of either frame. The two profiles so obtained, (M(U), m(u)) and (M*(U), m*(u)), although distinct, have a common ratio M(U)/M*(U) = m(u)/m*(u) that is a function of v > c, indicating that observable mass depends upon the frame in which the rest mass is prescribed. Keywords: Special relativity, Einstein mass variation, New formulae
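    The defining relation m(v) = m0(v)[(v/c)² − 1]^(−1/2) for v > c can be evaluated numerically; the helper below illustrates only that single factor, not the paper's full two-constant framework:

```python
import math

def superluminal_factor(v, c=1.0):
    """The factor [(v/c)^2 - 1]^(-1/2) relating the actual mass m(v) to the
    residual mass m0(v) for v > c, as in m(v) = m0(v) * factor."""
    if v <= c:
        raise ValueError("this formula applies only for v > c")
    return 1.0 / math.sqrt((v / c) ** 2 - 1.0)

# At v = c*sqrt(2) the factor equals exactly 1, so m(v) = m0(v) there;
# as v -> c+ the factor diverges, mirroring the subluminal Einstein factor.
print(superluminal_factor(math.sqrt(2.0)))
```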

  17. Generalising better: Applying deep learning to integrate deleteriousness prediction scores for whole-exome SNV studies.

    Science.gov (United States)

    Korvigo, Ilia; Afanasyev, Andrey; Romashchenko, Nikolay; Skoblov, Mikhail

    2018-01-01

    Many automatic classifiers have been introduced to aid the inference of phenotypical effects of uncategorised nsSNVs (nonsynonymous Single Nucleotide Variations) in theoretical and medical applications. Lately, several meta-estimators have been proposed that combine different predictors, such as PolyPhen and SIFT, to integrate more information in a single score. Although many advances have been made in feature design and in the machine learning algorithms used, the shortage of high-quality reference data, along with the bias towards intensively studied in vitro models, calls for improved generalisation ability in order to further increase classification accuracy and handle records with insufficient data. Since a meta-estimator basically combines different scoring systems with highly complicated nonlinear relationships, we investigated how deep learning (supervised and unsupervised), which is particularly efficient at discovering hierarchies of features, can improve classification performance. While it is believed that one should only use deep learning for high-dimensional input spaces and other models (logistic regression, support vector machines, Bayesian classifiers, etc.) for simpler inputs, we still believe that the ability of neural networks to discover intricate structure in highly heterogeneous datasets can aid a meta-estimator. We compare the performance with various popular predictors, many of which are recommended by the American College of Medical Genetics and Genomics (ACMG), as well as available deep learning-based predictors. Thanks to hardware acceleration we were able to use a computationally expensive genetic algorithm to stochastically optimise hyper-parameters over many generations. Overfitting was hindered by noise injection and dropout, limiting coadaptation of hidden units.
Although we stress that this work was not conceived as a tool comparison, but rather an exploration of the possibilities of deep learning application in ensemble scores, our results show that
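    The ensemble idea, combining heterogeneous per-tool scores into one meta-score, can be illustrated with a toy numpy sketch (synthetic scores standing in for tools such as PolyPhen/SIFT, and a single logistic unit in place of the paper's deep network):

```python
import numpy as np

# Toy stand-in for a meta-estimator: combine two deleteriousness scores
# (synthetic, NOT real PolyPhen/SIFT outputs) with a logistic combiner
# trained by plain gradient descent.
rng = np.random.default_rng(1)
n = 400
labels = rng.integers(0, 2, n)                       # 0 benign, 1 damaging
score_a = labels * 0.6 + rng.normal(0.2, 0.15, n)    # noisy predictor A
score_b = labels * 0.4 + rng.normal(0.3, 0.20, n)    # noisy predictor B
X = np.column_stack([score_a, score_b, np.ones(n)])  # add a bias column

w = np.zeros(3)
for _ in range(2000):                                # gradient descent
    p = 1.0 / (1.0 + np.exp(-X @ w))                 # sigmoid output
    w -= 0.1 * X.T @ (p - labels) / n                # logistic-loss gradient

meta = (1.0 / (1.0 + np.exp(-X @ w)) > 0.5).astype(int)
acc_meta = np.mean(meta == labels)
acc_a = np.mean((score_a > 0.5) == labels)
print(f"predictor A alone: {acc_a:.2f}, meta-estimator: {acc_meta:.2f}")
```

    A deep network replaces the single logistic unit with a hierarchy of such units, which is what lets it capture the nonlinear relationships between base scores that the abstract describes.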

  18. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    Science.gov (United States)

    Liang, D.; Liu, X.

    2017-12-01

    A 3D geological model can describe complicated geological phenomena in an intuitive way, but its application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyse each source separately, ignoring the comprehensive impact of multi-source uncertainties. To evaluate this synthetical uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework of uncertainty integration. With this framework, we integrated data errors, spatial randomness and cognitive information into a posterior distribution to evaluate the synthetical uncertainty of the geological model. Uncertainties propagate and accumulate in the modelling process, so the gradual integration of multi-source uncertainty is a kind of simulation of this propagation. Bayesian inference accomplishes the uncertainty updating during modelling. The maximum entropy principle is effective for estimating the prior probability distribution, ensuring that the prior is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtained a posterior distribution that evaluates the synthetical uncertainty of the geological model; this posterior distribution represents the synthetical impact of all the uncertain factors on the spatial structure of the model. The framework provides a solution for evaluating the synthetical impact of multi-source uncertainties on a geological model and an approach to studying the uncertainty propagation mechanism in geological modelling.
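    The prior-to-posterior step described here can be sketched minimally: a uniform prior (the maximum-entropy choice when only the admissible range is known) updated by a single Gaussian-error observation. All numbers below are invented for illustration:

```python
import numpy as np

# Three candidate depths (m) for a geological interface; with no information
# beyond the admissible range, the maximum-entropy prior is uniform.
depths = np.array([100.0, 110.0, 120.0])
prior = np.full(3, 1.0 / 3.0)

# A borehole measures 112 m with Gaussian error (sigma = 5 m): likelihoods.
sigma = 5.0
likelihood = np.exp(-0.5 * ((112.0 - depths) / sigma) ** 2)

# Bayes' rule: posterior is proportional to prior times likelihood,
# then normalised to sum to one.
posterior = prior * likelihood
posterior /= posterior.sum()

print(posterior.round(3))  # mass concentrates near the 110 m hypothesis
```

    Repeating this update as each new data source arrives (data errors, spatial randomness, cognitive information) is the gradual integration of multi-source uncertainty that the framework describes.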

  19. Hybrid variational principles and synthesis method for finite element neutron transport calculations

    International Nuclear Information System (INIS)

    Ackroyd, R.T.; Nanneh, M.M.

    1990-01-01

    A family of hybrid variational principles is derived using a generalised least-squares method. Neutron conservation is automatically satisfied for the hybrid principles employing two trial functions. No interface or reflection conditions need to be imposed on the independent even-parity trial function. For some hybrid principles a single trial function can be employed by relating one parity trial function to the other, using one of the parity transport equations in relaxed form. For other hybrid principles the trial functions can be employed sequentially. Synthesis of transport solutions, starting with the diffusion theory approximation, has been used as a way of reducing the scale of the computation that arises with established finite element methods for neutron transport. (author)

  20. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    NARCIS (Netherlands)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G.; Ring, David; Parisien, Robert

    2016-01-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if