Quantum theory of the generalised uncertainty principle
Bruneton, Jean-Philippe; Larena, Julien
2017-04-01
We extend significantly previous works on the Hilbert space representations of the generalized uncertainty principle (GUP) in 3 + 1 dimensions of the form [X_i, P_j] = i F_{ij}, where F_{ij} = f(P^2) δ_{ij} + g(P^2) P_i P_j for any functions f and g. However, we restrict our study to the case of commuting X's. We focus in particular on the symmetries of the theory, and on the minimal length that emerges in some cases. We first show that, at the algebraic level, there exists an unambiguous mapping between the GUP with a deformed quantum algebra and a quadratic Hamiltonian, and a standard Heisenberg algebra of operators with an aquadratic Hamiltonian, provided the boost sector of the symmetries is modified accordingly. The theory can also be mapped to a completely standard quantum mechanics with standard symmetries, but with momentum-dependent position operators. Next, we investigate the Hilbert space representations of these algebraically equivalent models, and focus specifically on whether they exhibit a minimal length. We carry out the functional analysis of the various operators involved, and show that the appearance of a minimal length critically depends on the relationship between the generators of translations and the physical momenta. In particular, because this relationship is preserved by the algebraic mapping presented in this paper, when a minimal length is present in the standard GUP, it is also present in the corresponding aquadratic Hamiltonian formulation, despite the perfectly standard algebra of this model. In general, a minimal length requires bounded generators of translations, i.e. a specific kind of quantization of space, and this depends on the precise shape of the function f defined previously. This result provides an elegant and unambiguous classification of which universal quantum gravity corrections lead to the emergence of a minimal length.
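For orientation (an illustration of mine, not part of the abstract), the best-known one-dimensional special case of such a deformed algebra is the choice f(P²) = 1 + βP² with g = 0, for which the uncertainty relation bounds the position resolution from below:

```latex
[X, P] = i\hbar\left(1 + \beta P^{2}\right)
\;\Longrightarrow\;
\Delta X\,\Delta P \;\ge\; \frac{\hbar}{2}\left(1 + \beta(\Delta P)^{2} + \beta\langle P\rangle^{2}\right)
\;\Longrightarrow\;
\Delta X_{\min} = \hbar\sqrt{\beta}.
```

Minimizing the right-hand side over ΔP gives the minimal length ħ√β, the prototypical example of the classification discussed in the abstract.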
Schrodinger's Uncertainty Principle?
Indian Academy of Sciences (India)
correlation between x and p. The virtue of Schrodinger's version (5) is that it accounts for this correlation. In special cases like the free particle and the harmonic oscillator, the 'Schrodinger uncertainty product' even remains constant with time, whereas Heisenberg's does not. The glory of giving the uncertainty principle to ...
Schrodinger's Uncertainty Principle?
Indian Academy of Sciences (India)
Schrödinger's Uncertainty Principle? - Lilies can be Painted. Rajaram Nityananda. General Article, Resonance – Journal of Science Education, Volume 4, Issue 2, February 1999, pp. 24-26.
Principles of Uncertainty
Kadane, Joseph B
2011-01-01
An intuitive and mathematical introduction to subjective probability and Bayesian statistics. An accessible, comprehensive guide to the theory of Bayesian statistics, Principles of Uncertainty presents the subjective Bayesian approach, which has played a pivotal role in game theory, economics, and the recent boom in Markov Chain Monte Carlo methods. Both rigorous and friendly, the book contains: Introductory chapters examining each new concept or assumption Just-in-time mathematics -- the presentation of ideas just before they are applied Summary and exercises at the end of each chapter Discus
On the uncertainty principle. V
International Nuclear Information System (INIS)
Halpern, O.
1976-01-01
The treatment of ideal experiments connected with the uncertainty principle is continued. The author analyzes successively measurements of momentum and position, and discusses the common reason why the results in all cases differ from the conventional ones. A similar difference exists for the measurement of field strengths. The interpretation given by Weizsaecker, who tried to interpret Bohr's complementarity principle by introducing a multi-valued logic, is analyzed. The treatment of the uncertainty principle ΔE Δt is deferred to a later paper, as is the interpretation of the method of variation of constants. Every ideal experiment discussed shows various lower limits for the value of the uncertainty product, which limits depend on the experimental arrangement and are always (considerably) larger than h. (Auth.)
Uncertainty Principles and Fourier Analysis
Indian Academy of Sciences (India)
Uncertainty Principles and Fourier Analysis. Alladi Sitaram. General Article, Resonance – Journal of Science Education, Volume 4, Issue 2, February 1999, pp. 20-23. Permanent link: http://www.ias.ac.in/article/fulltext/reso/004/02/0020-0023
Extended uncertainty from first principles
Energy Technology Data Exchange (ETDEWEB)
Costa Filho, Raimundo N., E-mail: rai@fisica.ufc.br [Departamento de Física, Universidade Federal do Ceará, Caixa Postal 6030, Campus do Pici, 60455-760 Fortaleza, Ceará (Brazil); Braga, João P.M., E-mail: philipe@fisica.ufc.br [Instituto de Ciências Exatas e da Natureza-ICEN, Universidade da Integração Internacional da Lusofonia Afro-Brasileira-UNILAB, Campus dos Palmares, 62785-000 Acarape, Ceará (Brazil); Lira, Jorge H.S., E-mail: jorge.lira@mat.ufc.br [Departamento de Matemática, Universidade Federal do Ceará, Caixa Postal 6030, Campus do Pici, 60455-760 Fortaleza, Ceará (Brazil); Andrade, José S., E-mail: soares@fisica.ufc.br [Departamento de Física, Universidade Federal do Ceará, Caixa Postal 6030, Campus do Pici, 60455-760 Fortaleza, Ceará (Brazil)
2016-04-10
A translation operator acting in a space with a diagonal metric is introduced to describe the motion of a particle in a quantum system. We show that the momentum operator and, as a consequence, the uncertainty relation now depend on the metric. It is also shown that, for any metric expanded up to second order, this formalism naturally leads to an extended uncertainty principle (EUP) with a minimum momentum dispersion. The Ehrenfest theorem is modified to include an additional term related to a tidal force arriving from the space curvature introduced by the metric. For one-dimensional systems, we show how to map a harmonic potential to an effective potential in Euclidean space using different metrics.
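Schematically, an EUP with a minimum momentum dispersion of the kind described above takes the following one-dimensional form (α > 0 is a constant fixed by the second-order metric expansion; the coefficients here are illustrative, not the paper's):

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \alpha(\Delta x)^{2}\right)
\;\Longrightarrow\;
\Delta p_{\min} = \hbar\sqrt{\alpha},
```

the mirror image of the GUP case, where large position spreads (rather than large momenta) make the bound grow.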
An Uncertainty Principle for Quaternion Fourier Transform
BAHRI, Mawardi; HITZER, Eckhard S. M; HAYASHI, Akihisa; ASHINO, Ryuichi
2008-01-01
We review the quaternionic Fourier transform (QFT). Using the properties of the QFT we establish an uncertainty principle for the right-sided QFT. This uncertainty principle prescribes a lower bound on the product of the effective widths of quaternion-valued signals in the spatial and frequency domains. It is shown that only a Gaussian quaternion signal minimizes the uncertainty.
Heisenberg's principle of uncertainty and the uncertainty relations
International Nuclear Information System (INIS)
Redei, Miklos
1987-01-01
The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous, and different interpretations are used in the literature. Recently a renewed interest has appeared in reinterpreting and reformulating the precise meaning of Heisenberg's principle and in finding an adequate mathematical form. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs
Directory of Open Access Journals (Sweden)
Fu Yuhua
2015-03-01
Full Text Available The most famous contribution of Heisenberg is the uncertainty principle. But the original uncertainty principle is improper. Considering all the possible situations (including the case that people can create laws) and applying Neutrosophy and the Quad-stage Method, this paper presents "certainty-uncertainty principles" with a general form and a variable-dimension fractal form. According to the classification of Neutrosophy, the "certainty-uncertainty principles" can be divided into three principles under different conditions: the "certainty principle", namely that a particle's position and momentum can be known simultaneously; the "uncertainty principle", namely that a particle's position and momentum cannot be known simultaneously; and the neutral (fuzzy) "indeterminacy principle", namely that whether or not a particle's position and momentum can be known simultaneously is undetermined. The special cases of the "certainty-uncertainty principles" include the original uncertainty principle and the Ozawa inequality. In addition, according to the original uncertainty principle, discussing a high-speed particle's speed and track with Newtonian mechanics is unreasonable; but according to the "certainty-uncertainty principles", Newtonian mechanics can be used to discuss the problem of the gravitational deflection of a photon orbit around the Sun (it gives the same deflection angle as general relativity). Finally, since principles, laws and the like in physics that disregard the principle (law) of conservation of energy may be invalid, the "certainty-uncertainty principles" should be restricted (or constrained) by the principle (law) of conservation of energy, so that they satisfy it.
Entropic uncertainty relation based on generalized uncertainty principle
Hsu, Li-Yi; Kawamoto, Shoichi; Wen, Wen-Yu
2017-09-01
We explore the modification of the entropic formulation of the uncertainty principle in quantum mechanics, which measures the incompatibility of measurements in terms of Shannon entropy. The deformation in question is of the type of the so-called generalized uncertainty principle, which is motivated by thought experiments in quantum gravity and string theory and is characterized by a parameter of Planck scale. The corrections are evaluated for small deformation parameters by use of the Gaussian wave function and numerical calculation. As the generalized uncertainty principle has proven to be useful in the study of the quantum nature of black holes, this study would be a step toward introducing an information theory viewpoint to black hole physics.
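The undeformed baseline being modified here is the standard entropic uncertainty relation: in the Maassen–Uffink form for a pair of observables, and in the Białynicki-Birula–Mycielski form for position and momentum (the paper's GUP corrections are not reproduced here):

```latex
H(A) + H(B) \;\ge\; -2\log c,\qquad c = \max_{a,b}\bigl|\langle a \mid b\rangle\bigr|;
\qquad
H(x) + H(p) \;\ge\; \ln\!\left(e\pi\hbar\right).
```

Unlike the variance-based Heisenberg relation, these bounds are state-independent, which is what makes the entropic formulation attractive for the deformation studied above.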
Uncertainty Analysis Principles and Methods
2007-09-01
total systematic uncertainties be combined in RSS (root-sum-square). In many instances, the Student's t-statistic, t95, is set equal to 2 and U_RSS is replaced by U95. ...GUM, the total uncertainty (U_ADD, U_RSS or U95) was offered as a type of confidence limit: x − U95 ≤ (true value) ≤ x + U95. In some respects, these limits
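Restored to plain form, the fragment's recipe is: combine systematic components in root-sum-square (RSS), then expand by the coverage factor t95 ≈ 2 to obtain U95. A minimal sketch (function and variable names are mine, not the report's):

```python
import math

def u_rss(components):
    """Root-sum-square combination of uncertainty components."""
    return math.sqrt(sum(u * u for u in components))

def u95(components, t95=2.0):
    """Expanded uncertainty: coverage factor t95 (Student's t at 95%) times u_RSS."""
    return t95 * u_rss(components)

# Example: three systematic components, in the same units as the measurement
components = [0.3, 0.4, 1.2]
U = u95(components)
x = 10.0  # measured value
interval = (x - U, x + U)  # x - U95 <= true value <= x + U95
print(round(u_rss(components), 3))  # 1.3
print(round(U, 3))                  # 2.6
```

Setting t95 = 2 is the large-sample shortcut the fragment mentions; for few degrees of freedom the true t-statistic is larger.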
Human perception and the uncertainty principle
International Nuclear Information System (INIS)
Harney, R.C.
1976-01-01
The concept of the uncertainty principle, that position and momentum cannot be simultaneously specified to arbitrary accuracy, is somewhat difficult to reconcile with experience. This note describes order-of-magnitude calculations which quantify the inadequacy of human perception with regard to direct observation of the breakdown of the trajectory concept implied by the uncertainty principle. Even with the best optical microscope, human vision is inadequate by three orders of magnitude. 1 figure
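To make the scale of the mismatch concrete, here is a back-of-envelope computation in the spirit of the note (the grain mass and resolution figures are my illustrative choices, not Harney's):

```python
HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def min_velocity_uncertainty(mass_kg, position_uncertainty_m):
    """Lower bound on velocity spread implied by dx * dp >= hbar / 2."""
    return HBAR / (2.0 * mass_kg * position_uncertainty_m)

# A ~1 micron dust grain (mass ~ 1e-15 kg) localized to the optical
# resolution limit of a good microscope (~0.5 micron):
dv = min_velocity_uncertainty(1e-15, 0.5e-6)
print(f"{dv:.1e} m/s")  # on the order of 1e-13 m/s: far below anything perceivable
```

Even for the smallest object an optical microscope can resolve, the quantum bound on velocity spread is some ten orders of magnitude below perceptible motion, which is the point the note quantifies.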
Quantum wells and the generalized uncertainty principle
International Nuclear Information System (INIS)
Blado, Gardo; Owens, Constance; Meyers, Vincent
2014-01-01
The finite and infinite square wells are potentials typically discussed in undergraduate quantum mechanics courses. In this paper, we discuss these potentials in the light of the recent studies of the modification of the Heisenberg uncertainty principle into a generalized uncertainty principle (GUP) as a consequence of attempts to formulate a quantum theory of gravity. The fundamental concepts of the minimal length scale and the GUP are discussed and the modified energy eigenvalues and transmission coefficient are derived. (paper)
Dilaton cosmology and the modified uncertainty principle
International Nuclear Information System (INIS)
Majumder, Barun
2011-01-01
Very recently Ali et al. (2009) proposed a new generalized uncertainty principle (with a linear term in the Planck length) which is consistent with doubly special relativity and string theory. The classical and quantum effects of this generalized uncertainty principle (termed the modified uncertainty principle or MUP) are investigated on the phase space of a dilatonic cosmological model with an exponential dilaton potential in a flat Friedmann-Robertson-Walker background. Interestingly, as a consequence of the MUP, we found that it is possible to get a late time acceleration for this model. For the quantum mechanical description in both the commutative and the MUP framework, we found the analytical solutions of the Wheeler-DeWitt equation for the early universe and compare our results. We have used an approximation method in the case of the MUP.
Uncertainty Principles for the Cherednik Transform
Indian Academy of Sciences (India)
We shall investigate two uncertainty principles for the Cherednik transform on the Euclidean space 𝔞: Miyachi's theorem and Beurling's theorem. We give an analogue of Miyachi's theorem for the Cherednik transform and, under the assumption that 𝔞 has a hypergroup structure, an analogue of Beurling's theorem for the ...
A review of the generalized uncertainty principle
International Nuclear Information System (INIS)
Tawfik, Abdel Nasser; Diab, Abdel Magied
2015-01-01
Based on string theory, black hole physics, doubly special relativity and some ‘thought’ experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed. (review)
On a principle of cosmological uncertainty
Romano, Antonio Enea
2012-01-01
We show that cosmological observations are subject to an intrinsic uncertainty which can be expressed in the form of an uncertainty relation similar to the Heisenberg principle. This is a consequence of the fact that the four-dimensional space-time metric information is projected into the one-dimensional observational red-shift space, implying a limit on the amount of information which can be extracted about the underlying geometry. Since multiple space-time configurations can lead to the same red-shift, there is an unavoidable uncertainty about the determination of the space-time geometry. This suggests the existence of a limit on the amount of information that cosmological observations can reveal about our Universe that no experiment could ever overcome, conceptually similar to what happens in quantum mechanics.
Open Timelike Curves Violate Heisenberg's Uncertainty Principle
Pienaar, J. L.; Ralph, T. C.; Myers, C. R.
2013-02-01
Toy models for quantum evolution in the presence of closed timelike curves have gained attention in the recent literature due to the strange effects they predict. The circuits that give rise to these effects appear quite abstract and contrived, as they require nontrivial interactions between the future and past that lead to infinitely recursive equations. We consider the special case in which there is no interaction inside the closed timelike curve, referred to as an open timelike curve (OTC), for which the only local effect is to increase the time elapsed by a clock carried by the system. Remarkably, circuits with access to OTCs are shown to violate Heisenberg’s uncertainty principle, allowing perfect state discrimination and perfect cloning of coherent states. The model is extended to wave packets and smoothly recovers standard quantum mechanics in an appropriate physical limit. The analogy with general relativistic time dilation suggests that OTCs provide a novel alternative to existing proposals for the behavior of quantum systems under gravity.
Science 101: What, Exactly, Is the Heisenberg Uncertainty Principle?
Robertson, Bill
2016-01-01
Bill Robertson is the author of the NSTA Press book series, "Stop Faking It! Finally Understanding Science So You Can Teach It." In this month's issue, Robertson describes and explains the Heisenberg Uncertainty Principle. The Heisenberg Uncertainty Principle was discussed on "The Big Bang Theory," the lead character in…
Uncertainty as organizing principle of action
DEFF Research Database (Denmark)
Winther-Lindqvist, Ditte Alexandra
2014-01-01
Uncertainty as a condition of teenage life when confronted with serious parental illness is presented as the main challenge characterising this situation. Based on 26 semi-structured interviews, everyday life with an ill parent is described and analysed. A model of uncertainty is suggested which
Limited entropic uncertainty as new principle of quantum physics
International Nuclear Information System (INIS)
Ion, D.B.; Ion, M.L.
2001-01-01
The Uncertainty Principle (UP) of quantum mechanics discovered by Heisenberg, which constitutes the corner-stone of quantum physics, asserts that there is an irreducible lower bound on the uncertainty in the result of a simultaneous measurement of non-commuting observables. In order to avoid this state-dependence, many authors proposed to use the information entropy as a measure of the uncertainty instead of the above standard quantitative formulation of the Heisenberg uncertainty principle. In this paper the Principle of Limited Entropic Uncertainty (LEU-Principle), as a new principle in quantum physics, is proved. Then, consistent experimental tests of the LEU-Principle, obtained by using the available 49 sets of the pion-nucleus phase shifts, are presented for both the extensive (q=1) and nonextensive (q=0.5 and q=2.0) cases. Some results obtained by the application of the LEU-Principle to diffraction phenomena are also discussed. The main results and conclusions of our paper can be summarized as follows: (i) We introduced a new principle in quantum physics, namely the Principle of Limited Entropic Uncertainty (LEU-Principle). This new principle includes in a more general and exact form not only the old Heisenberg uncertainty principle but also introduces an upper limit on the magnitude of the uncertainty in quantum physics. The LEU-Principle asserts that 'there is an irreducible lower bound as well as an upper bound on the uncertainty in the result of a simultaneous measurement of non-commuting observables for any extensive and nonextensive (q ≥ 0) quantum systems'; (ii) Two important concrete realizations of the LEU-Principle are explicitly obtained in this paper, namely: (a) the LEU-inequalities for the quantum scattering of spinless particles and (b) the LEU-inequalities for diffraction on a single slit of width 2a. In particular, from our general results, in the limit y → +1 we recover in an exact form all the results previously reported.
Directory of Open Access Journals (Sweden)
Kostas Kouvaris
2017-04-01
Full Text Available One of the most intriguing questions in evolution is how organisms exhibit suitable phenotypic variation to rapidly adapt in novel selective environments. Such variability is crucial for evolvability, but poorly understood. In particular, how can natural selection favour developmental organisations that facilitate adaptive evolution in previously unseen environments? Such a capacity suggests foresight that is incompatible with the short-sighted concept of natural selection. A potential resolution is provided by the idea that evolution may discover and exploit information not only about the particular phenotypes selected in the past, but their underlying structural regularities: new phenotypes, with the same underlying regularities, but novel particulars, may then be useful in new environments. If true, we still need to understand the conditions in which natural selection will discover such deep regularities rather than exploiting 'quick fixes' (i.e., fixes that provide adaptive phenotypes in the short term, but limit future evolvability). Here we argue that the ability of evolution to discover such regularities is formally analogous to learning principles, familiar in humans and machines, that enable generalisation from past experience. Conversely, natural selection that fails to enhance evolvability is directly analogous to the learning problem of over-fitting and the subsequent failure to generalise. We support the conclusion that evolving systems and learning systems are different instantiations of the same algorithmic principles by showing that existing results from the learning domain can be transferred to the evolution domain. Specifically, we show that conditions that alleviate over-fitting in learning systems successfully predict which biological conditions (e.g., environmental variation, regularity, noise or a pressure for developmental simplicity) enhance evolvability. This equivalence provides access to a well-developed theoretical
Hawking effect and Unruh effect from the uncertainty principle
Giné, Jaume
2018-01-01
The exact expressions of the Hawking temperature and Unruh temperature are deduced using the uncertainty principle of Heisenberg and the vacuum fluctuations that cause the appearance of a particle-antiparticle pair close to an event horizon.
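A common heuristic version of that deduction (the standard back-of-envelope argument; the exact 1/(8π) coefficient requires the full quantum field theory calculation) localizes the virtual pair within the Schwarzschild radius:

```latex
\Delta x \sim r_{s} = \frac{2GM}{c^{2}},\qquad
\Delta E \sim \frac{\hbar c}{\Delta x} \sim \frac{\hbar c^{3}}{2GM}
\quad\Longrightarrow\quad
k_{B} T_{H} = \frac{\hbar c^{3}}{8\pi G M},
```

with the Unruh temperature obtained analogously by replacing the horizon scale with the acceleration scale c²/a.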
Uncertainty Principles on Two Step Nilpotent Lie Groups
Indian Academy of Sciences (India)
Abstract. We extend an uncertainty principle due to Cowling and Price to two step nilpotent Lie groups, which generalizes a classical theorem of Hardy. We also prove an analogue of Heisenberg inequality on two step nilpotent Lie groups.
The role of general relativity in the uncertainty principle
International Nuclear Information System (INIS)
Padmanabhan, T.
1986-01-01
The role played by general relativity in quantum mechanics (especially as regards the uncertainty principle) is investigated. It is confirmed that the validity of time-energy uncertainty does depend on gravitational time dilation. It is also shown that there exists an intrinsic lower bound to the accuracy with which acceleration due to gravity can be measured. The notion of the equivalence principle in quantum mechanics is clarified. (author)
Heisenberg, Matrix Mechanics, and the Uncertainty Principle
Indian Academy of Sciences (India)
These investigations climaxed with the advent of quantum mechanics in the 1920s. Under the leadership of ... discouraged, not to say repelled, ... by the lack of visualizability in matrix mechanics. (Schrodinger's formalism deals with the ... because of its profound consequences. The Uncertainty Principle. The Uncertainty ...
Uncertainty principle for angular position and angular momentum
International Nuclear Information System (INIS)
Franke-Arnold, Sonja; Barnett, Stephen M; Yao, Eric; Leach, Jonathan; Courtial, Johannes; Padgett, Miles
2004-01-01
The uncertainty principle places fundamental limits on the accuracy with which we are able to measure the values of different physical quantities (Heisenberg 1949 The Physical Principles of the Quantum Theory (New York: Dover); Robertson 1929 Phys. Rev. 34 127). This has profound effects not only on the microscopic but also on the macroscopic level of physical systems. The most familiar form of the uncertainty principle relates the uncertainties in position and linear momentum. Other manifestations include those relating uncertainty in energy to uncertainty in time duration, phase of an electromagnetic field to photon number and angular position to angular momentum (Vaccaro and Pegg 1990 J. Mod. Opt. 37 17; Barnett and Pegg 1990 Phys. Rev. A 41 3427). In this paper, we report the first observation of the last of these uncertainty relations and derive the associated states that satisfy the equality in the uncertainty relation. We confirm the form of these states by detailed measurement of the angular momentum of a light beam after passage through an appropriate angular aperture. The angular uncertainty principle applies to all physical systems and is particularly important for systems with cylindrical symmetry
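For reference, the angular uncertainty relation observed in this work takes roughly the form below (as I recall it from the associated publication, so treat the boundary term as a pointer to the paper rather than a quotation), where P(θ₀) denotes the angular probability density at the edge of the chosen 2π window:

```latex
\Delta\varphi\,\Delta L_{z} \;\ge\; \frac{\hbar}{2}\,\bigl|\,1 - 2\pi P(\theta_{0})\,\bigr|.
```

The state-dependent right-hand side is what distinguishes the periodic angular case from the familiar position-momentum relation, and the intelligent states saturating it are the ones measured through the angular aperture.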
Polar Wavelet Transform and the Associated Uncertainty Principles
Shah, Firdous A.; Tantary, Azhar Y.
2018-02-01
The polar wavelet transform, a generalized form of the classical wavelet transform, has been extensively used in science and engineering for finding directional representations of signals in higher dimensions. The aim of this paper is to establish new uncertainty principles associated with the polar wavelet transforms in L2(R2). Firstly, we study some basic properties of the polar wavelet transform and then derive the associated generalized version of the Heisenberg-Pauli-Weyl inequality. Finally, following the idea of Beckner (Proc. Amer. Math. Soc. 123, 1897-1905, 1995), we derive the logarithmic version of the uncertainty principle for the polar wavelet transforms in L2(R2).
Lacunary Fourier Series and a Qualitative Uncertainty Principle for ...
Indian Academy of Sciences (India)
We define lacunary Fourier series on a compact connected semisimple Lie group G. If f ∈ L1(G) has lacunary Fourier series and vanishes on a non-empty open subset of G, then we prove that f vanishes identically. This result can be viewed as a qualitative uncertainty principle.
Uncertainty principles on two step nilpotent Lie groups
Indian Academy of Sciences (India)
As a meta-theorem in harmonic analysis, the uncertainty principles can be summarized as: a nonzero function ... localization is interpreted as very rapid decay, this meta-theorem becomes the following theorem due to ... material leading to a proof of the Plancherel theorem via the description of the Hilbert-Schmidt norm of ...
“Stringy” coherent states inspired by generalized uncertainty principle
International Nuclear Information System (INIS)
Ghosh, Subir; Roy, Pinaki
2012-01-01
Coherent states with the fractional revival property, which explicitly satisfy the Generalized Uncertainty Principle (GUP), have been constructed in the context of the generalized harmonic oscillator. The existence of such states is essential in motivating the GUP-based phenomenological results present in the literature, which otherwise would be of purely academic interest. The effective phase space is non-canonical (or non-commutative in popular terminology). Our results have a smooth commutative limit, equivalent to the Heisenberg uncertainty principle. The fractional revival time analysis yields an independent bound on the GUP parameter. Using this and similar bounds obtained here, we derive the largest possible value of the (GUP-induced) minimum length scale. Mandel parameter analysis shows that the statistics are sub-Poissonian. The correspondence principle is deformed in an interesting way. Our computational scheme is very simple, as it requires only first-order corrected energy values and undeformed basis states.
Gauge theories under incorporation of a generalized uncertainty principle
International Nuclear Information System (INIS)
Kober, Martin
2010-01-01
We consider an extension of gauge theories based on the assumption of a generalized uncertainty principle, which implies a minimal length scale. A modification of the usual uncertainty principle implies an extended shape of matter field equations such as the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field, as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, since the gauge principle is a constitutive ingredient of the Standard Model, and since even gravity can be described as a gauge theory of local translations or Lorentz transformations, the extension of gauge theories presented here appears to be a very important consideration.
International Nuclear Information System (INIS)
Romanowicz, Renata; Young, Peter C.
2003-01-01
Stochastic Transfer Function (STF) and Generalised Likelihood Uncertainty Estimation (GLUE) techniques are outlined and applied to an environmental problem concerned with marine dose assessment. The goal of both methods in this application is the estimation and prediction of the environmental variables, together with their associated probability distributions. In particular, they are used to estimate the amount of radionuclides transferred to marine biota from a given source: the British Nuclear Fuels Ltd (BNFL) repository plant in Sellafield, UK. The complexity of the processes involved, together with the large dispersion and scarcity of observations regarding radionuclide concentrations in the marine environment, requires efficient data assimilation techniques. In this regard, the basic STF methods search for identifiable, linear model structures that capture the maximum amount of information contained in the data with a minimal parameterisation. They can be extended for on-line use, based on recursively updated Bayesian estimation and, although applicable only to constant or time-variable parameter (non-stationary) linear systems in the form used in this paper, they have the potential for application to non-linear systems using recently developed State Dependent Parameter (SDP) non-linear STF models. The GLUE-based methods, on the other hand, formulate the problem of estimation using a more general Bayesian approach, usually without prior statistical identification of the model structure. As a result, they are applicable to almost any linear or non-linear stochastic model, although they are much less efficient both computationally and in their use of the information contained in the observations. As expected in this particular environmental application, it is shown that the STF methods give much narrower confidence limits for the estimates due to their more efficient use of the information contained in the data. Exploiting Monte Carlo Simulation (MCS) analysis ...
The 'Herbivory Uncertainty Principle': application in a cerrado site
Directory of Open Access Journals (Sweden)
CA Gadotti
Researchers may alter the ecology of the organisms they study, even when carrying out apparently beneficial activities, as in herbivory studies, where they may alter herbivory damage. We tested whether visit frequency altered herbivory damage, as predicted by the 'Herbivory Uncertainty Principle'. In a cerrado site, we established 80 quadrats, in which we sampled all woody individuals. We used four visit frequencies (high, medium, low, and control), quantifying, at the end of three months, herbivory damage for each species in each treatment. We did not corroborate the 'Herbivory Uncertainty Principle', since visiting frequency did not alter herbivory damage, at least when the whole plant community was taken into account. However, when we analysed each species separately, four out of 11 species presented significant differences in herbivory damage, suggesting that researchers are not independent of their measurements. The principle could be tested in other ecological studies in which it may occur, such as those on animal behaviour, human ecology, population dynamics, and conservation.
Universal uncertainty principle in the measurement operator formalism
International Nuclear Information System (INIS)
Ozawa, Masanao
2005-01-01
Heisenberg's uncertainty principle has been understood to set a limitation on measurements; however, the long-standing mathematical formulation established by Heisenberg, Kennard, and Robertson does not allow such an interpretation. Recently, a new relation was found to give a universally valid relation between noise and disturbance in general quantum measurements, and it has become clear that the new relation plays the role of a first principle for deriving various quantum limits on measurement and information processing in a unified treatment. This paper examines the above development on the noise-disturbance uncertainty principle in the model-independent approach based on the measurement operator formalism, which is widely accepted to describe a class of generalized measurements in the field of quantum information. We obtain explicit formulae for the noise and disturbance of measurements given by measurement operators, and show that projective measurements do not satisfy the Heisenberg-type noise-disturbance relation that is typical in the gamma-ray microscope thought experiments. We also show that the disturbance on a Pauli operator caused by a projective measurement of another Pauli operator is constantly equal to √2, and examine how this measurement violates the Heisenberg-type relation but satisfies the new noise-disturbance relation.
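The "new relation" referred to here is Ozawa's universally valid noise-disturbance inequality, which for observables A and B reads:

\[
\varepsilon(A)\,\eta(B) \;+\; \varepsilon(A)\,\sigma(B) \;+\; \sigma(A)\,\eta(B) \;\ge\; \frac{1}{2}\,\bigl|\langle [A,B] \rangle\bigr|,
\]

where \(\varepsilon(A)\) is the measurement noise on A, \(\eta(B)\) the disturbance on B, and \(\sigma\) the pre-measurement standard deviations. The Heisenberg-type relation keeps only the first product, \(\varepsilon(A)\eta(B) \ge \tfrac{1}{2}|\langle[A,B]\rangle|\), which is the form the projective measurements in the abstract violate.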
The Precautionary Principle and statistical approaches to uncertainty
DEFF Research Database (Denmark)
Keiding, Niels; Budtz-Jørgensen, Esben
2004-01-01
The central challenge from the Precautionary Principle to statistical methodology is to help delineate (preferably quantitatively) the possibility that some exposure is hazardous, even in cases where this is not established beyond reasonable doubt. The classical approach to hypothesis testing is unhelpful, because lack of significance can be due either to uninformative data or to genuine lack of effect (the Type II error problem). Its inversion, bioequivalence testing, might sometimes be a model for the Precautionary Principle in its ability to "prove the null hypothesis". Current procedures for setting safe exposure levels are essentially derived from these classical statistical ideas, and we outline how uncertainties in the exposure and response measurements affect the no observed adverse effect level, the Benchmark approach and the "Hockey Stick" model. A particular problem concerns model uncertainty: usually these procedures assume that the class of models describing dose/response is known with certainty; this assumption is, however, often violated, perhaps particularly often when epidemiological data form the source of the risk assessment, and regulatory authorities have occasionally ...
Uncertainty principles on two step nilpotent Lie groups
Indian Academy of Sciences (India)
Springer Verlag Heidelberg #4 2048 1996 Dec 15 10:16:45
Uncertainty principles. 297 coadjoint action. Given any l ∈ ∗ there exist a subalgebra l of which is maximal with respect to the property l ([l , l ]) = 0. (2.3). Thus we have a character χl : exp( l ) → T given by χl (exp X) = e2πil (X),X ∈ l . Let πl = indG exp(hl )χl . Then. (1) πl is an irreducible unitary representation of G. (2) If.
Reducing Uncertainty: Implementation of Heisenberg Principle to Measure Company Performance
Directory of Open Access Journals (Sweden)
Anna Svirina
2015-08-01
The paper addresses the problem of uncertainty reduction in the estimation of future company performance, which results from the wide range of probable efficiencies of an enterprise's intangible assets. To address this problem, the paper suggests using quantum economy principles, i.e. implementation of the Heisenberg principle to measure the efficiency and potential of the company's intangible assets. It is proposed that for intangibles it is not possible to estimate both potential and efficiency at a certain time point. To provide a proof of this thesis, data on resource potential and efficiency from mid-Russian companies was evaluated within a deterministic approach, which did not allow evaluation of the probability of achieving a certain resource efficiency, and a quantum approach, which allowed estimation of the central point around which the probable efficiency of resources is concentrated. Visualization of these approaches was performed by means of LabView software. It was proven that for tangible assets performance estimation a deterministic approach should be used, while for intangible assets the quantum approach allows better quality of future performance prediction. On the basis of these findings we propose a holistic approach towards estimation of company resource efficiency in order to reduce uncertainty in modelling company performance.
Directory of Open Access Journals (Sweden)
Karl Friston
2010-01-01
We describe a Bayesian filtering scheme for nonlinear state-space models in continuous time. This scheme is called Generalised Filtering and furnishes posterior (conditional) densities on hidden states and unknown parameters generating observed data. Crucially, the scheme operates online, assimilating data to optimize the conditional density on time-varying states and time-invariant parameters. In contrast to Kalman and Particle smoothing, Generalised Filtering does not require a backwards pass. In contrast to variational schemes, it does not assume conditional independence between the states and parameters. Generalised Filtering optimises the conditional density with respect to a free-energy bound on the model's log-evidence. This optimisation uses the generalised motion of hidden states and parameters, under the prior assumption that the motion of the parameters is small. We describe the scheme, present comparative evaluations with a fixed-form variational version, and conclude with an illustrative application to a nonlinear state-space model of brain imaging time-series.
Malgieri, Massimiliano; Tenni, Antonio; Onorato, Pasquale; De Ambrosis, Anna
2016-09-01
In this paper we present a reasoning line for introducing the Pauli exclusion principle in the context of an introductory course on quantum theory based on the sum over paths approach. We start from the argument originally introduced by Feynman in 'QED: The Strange Theory of Light and Matter' and improve it by discussing with students modern experimental evidence from the famous Hong-Ou-Mandel experiment with indistinguishable photons and its generalised version using electrons. The experiments can be analysed in a rather simple way using Feynman's method of 'arrow multiplication' for treating processes involving more than one quantum object. The approach described is especially relevant for introducing high school physics teachers to the basics of modern physics.
Black hole complementarity with the generalized uncertainty principle in Gravity's Rainbow
Gim, Yongwan; Um, Hwajin; Kim, Wontae
2018-02-01
When gravitation is combined with quantum theory, the Heisenberg uncertainty principle may be extended to the generalized uncertainty principle, which is accompanied by a minimal length. To see how the generalized uncertainty principle works in the context of black hole complementarity, we calculate the energy required to duplicate information for the Schwarzschild black hole. It shows that the duplication of information is not allowed, and black hole complementarity remains valid even assuming the generalized uncertainty principle. On the other hand, the generalized uncertainty principle with the minimal length could lead to a modification of the conventional dispersion relation in light of Gravity's Rainbow, where the minimal length is invariant, as is the speed of light. Revisiting the gedanken experiment, we show that the no-cloning theorem for black hole complementarity can be made valid in the regime of Gravity's Rainbow for a certain combination of parameters.
The Precautionary Principle and statistical approaches to uncertainty
DEFF Research Database (Denmark)
Keiding, Niels; Budtz-Jørgensen, Esben
2003-01-01
Bayesian model averaging; Benchmark approach to safety standards in toxicology; dose-response relationship; environmental standards; exposure measurement uncertainty; Popper falsification.
Generalized uncertainty principle as a consequence of the effective field theory
Energy Technology Data Exchange (ETDEWEB)
Faizal, Mir, E-mail: mirfaizalmir@gmail.com [Irving K. Barber School of Arts and Sciences, University of British Columbia – Okanagan, Kelowna, British Columbia V1V 1V7 (Canada); Department of Physics and Astronomy, University of Lethbridge, Lethbridge, Alberta T1K 3M4 (Canada); Ali, Ahmed Farag, E-mail: ahmed.ali@fsc.bu.edu.eg [Department of Physics, Faculty of Science, Benha University, Benha, 13518 (Egypt); Netherlands Institute for Advanced Study, Korte Spinhuissteeg 3, 1012 CG Amsterdam (Netherlands); Nassar, Ali, E-mail: anassar@zewailcity.edu.eg [Department of Physics, Zewail City of Science and Technology, 12588, Giza (Egypt)
2017-02-10
We will demonstrate that the generalized uncertainty principle exists because of the derivative expansion in the effective field theories. This is because in the framework of the effective field theories, the minimum measurable length scale has to be integrated away to obtain the low energy effective action. We will analyze the deformation of a massive free scalar field theory by the generalized uncertainty principle, and demonstrate that the minimum measurable length scale corresponds to a second more massive scale in the theory, which has been integrated away. We will also analyze CFT operators dual to this deformed scalar field theory, and observe that scaling of the new CFT operators indicates that they are dual to this more massive scale in the theory. We will use holographic renormalization to explicitly calculate the renormalized boundary action with counter terms for this scalar field theory deformed by generalized uncertainty principle, and show that the generalized uncertainty principle contributes to the matter conformal anomaly.
Li, Ziyi
2017-12-01
The generalized uncertainty principle (GUP), also known as the generalized uncertainty relationship, is a modified form of the classical Heisenberg uncertainty principle in special cases. When we apply quantum gravity theories such as string theory, the theoretical results suggest that there should be a "minimum length of observation", of about the size of the Planck scale (10^-35 m). Taking into account this basic scale of existence, we need to fix a new common form of the Heisenberg uncertainty principle in the thermodynamic system and make effective corrections to statistical physics questions concerning the quantum density of states. Especially at high temperatures and high energy levels, generalized uncertainty calculations have a disruptive impact on classical statistical physics theories, yet the present theory of femtosecond lasers is still built on the classical Heisenberg uncertainty principle. In order to improve the detection accuracy and temporal resolution of femtosecond lasers, we applied the modified form of the generalized uncertainty principle to the wavelength, energy and pulse time of femtosecond lasers in our work. We designed three typical systems, from micro to macro size, to estimate the feasibility of our theoretical model and method, respectively in a chemical solution, a crystal lattice and a nuclear fission reactor.
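For reference, the quadratic GUP usually meant by "generalized uncertainty relationship", with deformation parameter β, and the minimal length it implies:

\[
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^2\right)
\quad\Longrightarrow\quad
\Delta x_{\min} = \hbar\sqrt{\beta},
\]

obtained by minimising the right-hand side over \(\Delta p\): the bound \(\Delta x \ge \tfrac{\hbar}{2}(1/\Delta p + \beta\,\Delta p)\) attains its minimum at \(\Delta p = 1/\sqrt{\beta}\). String-inspired models set \(\sqrt{\beta}\) near the Planck scale, which is the origin of the 10^-35 m "minimum length of observation" quoted above.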
Generalized uncertainty principle and entropy of three-dimensional rotating acoustic black hole
International Nuclear Information System (INIS)
Zhao, HuiHua; Li, GuangLiang; Zhang, LiChun
2012-01-01
Using the new equation of state density obtained from the generalized uncertainty principle, we investigate the statistical entropy of a 3-dimensional rotating acoustic black hole. When the parameter λ introduced in the generalized uncertainty principle takes a specific value, we obtain an area entropy and a correction term associated with the acoustic black hole. In this method there is no divergence, and the small mass approximation of the original brick-wall model is not needed. -- Highlights: ► The statistical entropy of a 3-dimensional rotating acoustic black hole is studied. ► We obtain an area entropy and a correction term associated with it. ► We make λ, introduced in the generalized uncertainty principle, take a specific value. ► No divergence appears in this method.
DEFF Research Database (Denmark)
Nærland, Kristoffer; Müller-Bloch, Christoph; Beck, Roman
2017-01-01
Many decentralized, inter-organizational environments such as supply chains are characterized by high transactional uncertainty and risk. At the same time, blockchain technology promises to mitigate these issues by introducing certainty into economic transactions. This paper discusses the findings [...] Based on our insights from the project, we provide first evidence for preliminary design principles for applications that aim to mitigate transactional risk and uncertainty in decentralized environments using blockchain. Both the artifact and the first evidence for emerging design principles are novel, contributing to the discourse on the implications that the advent of blockchain technology poses for governing economic activity.
Verification of the uncertainty principle by using diffraction of light waves
Energy Technology Data Exchange (ETDEWEB)
Nikolic, D [Grammar School Pirot, 18 300 Pirot (Serbia); Nesic, Lj, E-mail: gisanikolic@yahoo.com [Faculty of Sciences and Mathematics, University of Nis, 18 000 Nis (Serbia)
2011-03-15
We describe a simple idea for experimental verification of the uncertainty principle for light waves. We used single-slit diffraction of a laser beam to measure the angular width of the zero-order diffraction maximum and obtained the corresponding wave-number uncertainty, taking the uncertainty in position to be the slit width. A computer was used for the acquisition of the experimental data and their further analysis. Because of its simplicity, this experiment is very suitable for demonstration, as well as for a quantitative exercise, at universities and in the final year of high school.
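The estimate described above can be sketched numerically. The laser wavelength and slit width below are assumed illustrative values (a He-Ne laser and a 100 μm slit), not the experiment's actual parameters:

```python
import math

# Single-slit estimate of the uncertainty product Δx·Δk_x.
wavelength = 632.8e-9   # m, assumed He-Ne laser line
slit_width = 100e-6     # m, taken as the position uncertainty Δx

# First diffraction minimum of a single slit: sin(theta) = wavelength / slit_width.
sin_theta = wavelength / slit_width

# Transverse wave-number spread of the central maximum:
# Δk_x ≈ k·sin(theta) = (2π/λ)·(λ/a) = 2π/a.
k = 2 * math.pi / wavelength
delta_k = k * sin_theta

product = slit_width * delta_k  # Δx · Δk_x
print(product)                  # = 2π with this estimate, well above the bound 1/2
```

With Δx identified with the slit width and Δk with the half-width of the central maximum, the product comes out at 2π regardless of the chosen wavelength and slit width, which is why the experiment verifies the bound Δx·Δk ≥ 1/2 only as an inequality.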
Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie
2011-01-01
Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an…
International Nuclear Information System (INIS)
Tawfik, A.
2013-01-01
We investigate the impacts of the Generalized Uncertainty Principle (GUP), proposed by some approaches to quantum gravity such as string theory and doubly special relativity, on black hole thermodynamics and the Salecker-Wigner inequalities. Utilizing the Heisenberg uncertainty principle, the Hawking temperature, Bekenstein entropy, specific heat, emission rate and decay time are calculated. As the evaporation entirely eats up the black hole mass, the specific heat vanishes and the temperature approaches infinity with an infinite radiation rate. It is found that the GUP approach prevents the black hole from evaporating entirely, implying the existence of remnants at which the specific heat vanishes. The same role is played by the Heisenberg uncertainty principle in constructing the hydrogen atom. We discuss how the linear GUP approach solves the entire-evaporation problem. Furthermore, the black hole lifetime can be estimated using another approach, the Salecker-Wigner inequalities. Assuming that the quantum position uncertainty is limited to the minimum wavelength of the measuring signal, Wigner's second inequality can be obtained. If the spread of the quantum clock is limited to some minimum value, then the modified black hole lifetime can be deduced. Based on the linear GUP approach, the resulting lifetime difference depends on the black hole's relative mass, and the difference between the black hole mass with and without the GUP is not negligible.
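The divergence described above can be illustrated with the standard (non-GUP) Hawking temperature, T = ħc³/(8πGMk_B); this is the textbook formula, not the paper's GUP-corrected one, which halts the divergence at a remnant:

```python
import math

# CODATA-style constants (SI units).
hbar = 1.054571817e-34    # J s
c = 2.99792458e8          # m/s
G = 6.67430e-11           # m^3 kg^-1 s^-2
k_B = 1.380649e-23        # J/K
M_sun = 1.989e30          # kg

def hawking_temperature(mass_kg):
    """Standard Hawking temperature of a Schwarzschild black hole."""
    return hbar * c ** 3 / (8 * math.pi * G * mass_kg * k_B)

# Temperature grows without bound as the mass is radiated away.
print(hawking_temperature(M_sun))         # ~6e-8 K for a solar-mass black hole
print(hawking_temperature(M_sun / 1e10))  # hotter as the hole gets lighter
```

The GUP analysis in the abstract replaces this 1/M behaviour near the Planck mass, so the temperature stays finite and evaporation stops at a remnant.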
Harbola, Varun
2011-01-01
In this paper, we accurately estimate the ground-state energy and the atomic radius of the helium atom and a helium-like Hookean atom by employing the uncertainty principle in conjunction with the variational approach. We show that with the use of the uncertainty principle, electrons are found to be spread over a radial region, giving an electron…
Directory of Open Access Journals (Sweden)
Syed Masood
2016-12-01
In this paper, we will propose the most general form of the deformation of Heisenberg algebra motivated by the generalized uncertainty principle. This deformation of the Heisenberg algebra will deform all quantum mechanical systems. The form of the generalized uncertainty principle used to motivate these results will be motivated by the space fractional quantum mechanics, and non-locality in quantum mechanical systems. We also analyse a specific limit of this generalized deformation for one dimensional system, and in that limit, a nonlocal deformation of the momentum operator generates a local deformation of all one dimensional quantum mechanical systems. We analyse the low energy effects of this deformation on a harmonic oscillator, Landau levels, Lamb shift, and potential barrier. We also demonstrate that this deformation leads to a discretization of space.
Constraining the generalized uncertainty principle with the gravitational wave event GW150914
Directory of Open Access Journals (Sweden)
Zhong-Wen Feng
2017-05-01
In this letter, we show that the dimensionless parameters in the generalized uncertainty principle (GUP) can be constrained by the gravitational wave event GW150914, which was discovered by the LIGO Scientific and Virgo Collaborations. Firstly, according to the Heisenberg uncertainty principle (HUP) and the data of the gravitational wave event GW150914, we derive the standard energy-momentum dispersion relation and calculate the difference between the propagation speed of gravitons and the speed of light, i.e., Δυ. Next, using two proposals regarding the GUP, we generalize our study to the quantum gravity case and obtain the modified speed of gravitons. Finally, based on the modified speed of gravitons and Δυ, improved upper bounds on the GUP parameters are obtained. The results show that the upper limits of the GUP parameters β0 and α0 are 2.3×10^60 and 1.8×10^20, respectively.
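As a rough illustration of the Δυ logic (a sketch under stated assumptions, not the paper's actual computation): a massive dispersion relation E² = p²c² + m²c⁴ gives, for E ≫ mc², a fractional speed deficit of order (mc²/E)²/2. The GW frequency and graviton Compton wavelength below are assumed round numbers of the magnitude probed by LIGO:

```python
c = 2.99792458e8          # m/s
f_gw = 100.0              # Hz, a typical GW150914 frequency (assumed)
lambda_g = 1.0e16         # m, assumed graviton Compton wavelength

# For E = h f and m c^2 = h c / lambda_g, the deficit is
#   Δv/c ≈ (1/2) (m c^2 / E)^2 = (1/2) (c / (f λ_g))^2.
delta_v_over_c = 0.5 * (c / (f_gw * lambda_g)) ** 2
print(delta_v_over_c)     # a few times 1e-20 with these assumed numbers
```

The tiny size of Δυ/c is what makes GW150914 such a sensitive probe: the GUP-modified graviton speed must stay within this window, which translates into the large but finite upper bounds on β0 and α0 quoted in the abstract.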
Energy Technology Data Exchange (ETDEWEB)
Masood, Syed [Department of Physics, International Islamic University, H-10 Sector, Islamabad (Pakistan); Faizal, Mir, E-mail: mirfaizalmir@gmail.com [Irving K. Barber School of Arts and Sciences, University of British Columbia – Okanagan, Kelowna, BC V1V 1V7 (Canada); Department of Physics and Astronomy, University of Lethbridge, Lethbridge, AB T1K 3M4 (Canada); Zaz, Zaid [Department of Electronics and Communication Engineering, University of Kashmir, Srinagar, Kashmir, 190006 (India); Ali, Ahmed Farag [Department of Physics, Faculty of Science, Benha University, Benha, 13518 (Egypt); Raza, Jamil [Department of Physics, International Islamic University, H-10 Sector, Islamabad (Pakistan); Shah, Mushtaq B. [Department of Physics, National Institute of Technology, Srinagar, Kashmir, 190006 (India)
2016-12-10
In this paper, we will propose the most general form of the deformation of Heisenberg algebra motivated by the generalized uncertainty principle. This deformation of the Heisenberg algebra will deform all quantum mechanical systems. The form of the generalized uncertainty principle used to motivate these results will be motivated by the space fractional quantum mechanics, and non-locality in quantum mechanical systems. We also analyse a specific limit of this generalized deformation for one dimensional system, and in that limit, a nonlocal deformation of the momentum operator generates a local deformation of all one dimensional quantum mechanical systems. We analyse the low energy effects of this deformation on a harmonic oscillator, Landau levels, Lamb shift, and potential barrier. We also demonstrate that this deformation leads to a discretization of space.
The Quark-Gluon Plasma Equation of State and the Generalized Uncertainty Principle
Directory of Open Access Journals (Sweden)
L. I. Abou-Salem
2015-01-01
The quark-gluon plasma (QGP) equation of state within a minimal length scenario, or Generalized Uncertainty Principle (GUP), is studied. The GUP is implemented in deriving the thermodynamics of an ideal QGP at vanishing chemical potential. We find a significant effect of the GUP term. The main features of QCD lattice results were quantitatively reproduced for n_f = 0, n_f = 2, and n_f = 2+1 flavors for the energy density, the pressure, and the interaction measure. A notable point is the large value of the bag pressure, especially for n_f = 2+1 flavors, which reflects the strong correlation between quarks in this bag, as is already expected. One can notice that the asymptotic behavior, characterized by the Stefan-Boltzmann limit, is satisfied.
Generalized uncertainty principle and the maximum mass of ideal white dwarfs
Energy Technology Data Exchange (ETDEWEB)
Rashidi, Reza, E-mail: reza.rashidi@srttu.edu
2016-11-15
The effects of a generalized uncertainty principle on the structure of an ideal white dwarf star are investigated. The equation describing the equilibrium configuration of the star is a generalized form of the Lane-Emden equation. It is proved that the star always has a finite size. It is then argued that the maximum mass of such an ideal white dwarf tends to infinity, as opposed to the conventional case where it has a finite value.
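For comparison with the generalized equation mentioned above, a minimal numerical sketch of the standard (non-GUP) Lane-Emden equation θ'' + (2/ξ)θ' + θⁿ = 0 with θ(0) = 1, θ'(0) = 0; the integrator and step size are arbitrary choices, and the GUP-corrected equation of the paper is not reproduced here:

```python
def lane_emden_first_zero(n, h=1e-4):
    """Integrate the Lane-Emden equation of index n outward until theta
    crosses zero; the crossing xi_1 sets the (finite) stellar radius."""
    xi = 1e-6
    theta = 1.0 - xi ** 2 / 6.0   # series expansion near the centre
    dtheta = -xi / 3.0
    while theta > 0.0:
        # theta'' = -(2/xi) theta' - theta^n  (clamp theta at 0 for safety)
        ddtheta = -2.0 / xi * dtheta - max(theta, 0.0) ** n
        dtheta += h * ddtheta
        theta += h * dtheta       # semi-implicit Euler: stable enough here
        xi += h
    return xi

print(lane_emden_first_zero(3))   # close to the known xi_1 ≈ 6.897 for n = 3
```

The n = 3 polytrope is the relativistic white dwarf of the conventional Chandrasekhar analysis; the paper's point is that the GUP-generalized version of this equation still yields a finite first zero (finite size) but no longer caps the mass.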
Heisenberg uncertainty principle and light squeezing in quantum nanoantennas and electric circuits
Slepyan, Gregory Ya.
2016-10-01
The Heisenberg uncertainty principle is one of the cornerstones of quantum mechanics. We show that the observable values of the electromagnetic field in the far- and near-field zones emitted by a quantum nanoantenna are coupled via uncertainty relations of the Heisenberg type. Similar uncertainty inequalities are obtained for the electric currents in the different branches of quantum networks. Based on these, we predict a mechanism of high-level squeezing of light in quantum antennas. We show that this mechanism is highly directive: strong squeezing is reached in the narrow directions of high emission (the tops of the main lobes of the radiation pattern). We also discuss the manifestation of quantum noise in nanoelectronics and nanophotonics from the point of view of the electromagnetic compatibility of nanoelectronic devices densely placed in limited areas of space.
Generalized Uncertainty Principle and Black Hole Entropy of Higher-Dimensional de Sitter Spacetime
International Nuclear Information System (INIS)
Zhao Haixia; Hu Shuangqi; Zhao Ren; Li Huaifan
2007-01-01
Recently, much attention has been devoted to resolving the quantum corrections to the Bekenstein-Hawking black hole entropy. In particular, many researchers have taken a keen interest in the coefficient of the logarithmic term of the black hole entropy correction. In this paper, we calculate the correction to the black hole entropy by utilizing the generalized uncertainty principle and obtain the correction term it induces. Because our calculation assumes that the Bekenstein-Hawking area theorem remains valid once the generalized uncertainty principle is taken into account, we find that the coefficient of the logarithmic correction term is positive. This result differs from results known at present. Our method is valid not only for four-dimensional spacetimes but also for higher-dimensional spacetimes. Throughout, the physical idea is clear and the calculation is simple, offering a new way to study the entropy corrections of complicated spacetimes.
International Nuclear Information System (INIS)
Bosyk, G M; Portesi, M; Holik, F; Plastino, A
2013-01-01
We revisit the connection between the complementarity and uncertainty principles of quantum mechanics within the framework of Mach–Zehnder interferometry. We focus our attention on the trade-off relation between complementary path information and fringe visibility. This relation is equivalent to the uncertainty relation of Schrödinger and Robertson for a suitably chosen pair of observables. We show that it is equivalent as well to the uncertainty inequality provided by Landau and Pollak. We also study the relationship of this trade-off relation with a family of entropic uncertainty relations based on Rényi entropies. There is no equivalence in this case, but the different values of the entropic parameter do define regimes that provide us with a tool to discriminate between non-trivial states of minimum uncertainty. The existence of such regimes agrees with previous results of Luis (2011 Phys. Rev. A 84 034101), although their meaning was not sufficiently clear. We discuss the origin of these regimes with the intention of gaining a deeper understanding of entropic measures. (paper)
The Heisenberg Uncertainty Principle and the Nyquist-Shannon Sampling Theorem
Directory of Open Access Journals (Sweden)
Millette P. A.
2013-07-01
Full Text Available The derivation of the Heisenberg Uncertainty Principle (HUP) from the Uncertainty Theorem of Fourier Transform theory demonstrates that the HUP arises from the dependency of momentum on a wave number that exists at the quantum level. It also establishes that the HUP is purely a relationship between the effective widths of Fourier transform pairs of variables (i.e. conjugate variables). We note that the HUP is not a quantum mechanical measurement principle per se. We introduce the Quantum Mechanical equivalent of the Nyquist-Shannon Sampling Theorem of Fourier Transform theory, and show that it is a better principle to describe the measurement limitations of Quantum Mechanics. We show that Brillouin zones in Solid State Physics are a manifestation of the Nyquist-Shannon Sampling Theorem at the quantum level. By comparison with other fields where Fourier Transform theory is used, we propose that we need to discern between measurement limitations and inherent limitations when interpreting the impact of the HUP on the nature of the quantum level. We further propose that while measurement limitations result in our perception of indeterminism at the quantum level, there is no evidence that there are any inherent limitations at the quantum level, based on the Nyquist-Shannon Sampling Theorem.
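The Fourier-theoretic bound the abstract invokes can be checked numerically: a Gaussian saturates the effective-width product σ_x σ_k = 1/2. The sketch below uses the known Gaussian transform pair analytically rather than an FFT to stay dependency-free; the grid sizes and width parameter are arbitrary illustrative choices.

```python
import math

# Numerical check of the Fourier uncertainty theorem for a Gaussian,
# which saturates the effective-width bound sigma_x * sigma_k = 1/2.

N = 4096          # number of spatial grid points (illustrative)
L = 40.0          # spatial extent [-L/2, L/2)
dx = L / N
a = 1.5           # Gaussian width: psi(x) ~ exp(-x^2 / (2 a^2))

xs = [-L / 2 + i * dx for i in range(N)]
psi = [math.exp(-x * x / (2 * a * a)) for x in xs]

def moments(grid, density, step):
    """Return (mean, std) of a sampled, unnormalized density."""
    norm = sum(density) * step
    mean = sum(g * d for g, d in zip(grid, density)) * step / norm
    var = sum((g - mean) ** 2 * d for g, d in zip(grid, density)) * step / norm
    return mean, math.sqrt(var)

# Position spread from |psi(x)|^2
_, sigma_x = moments(xs, [p * p for p in psi], dx)

# The Fourier transform of this Gaussian is Gaussian with width 1/a,
# so |phi(k)|^2 ~ exp(-a^2 k^2); sample it on a k-grid.
dk = 0.01
ks = [-10 + i * dk for i in range(2001)]
phi2 = [math.exp(-k * k * a * a) for k in ks]
_, sigma_k = moments(ks, phi2, dk)

print(sigma_x * sigma_k)   # ~0.5, the Fourier-theoretic minimum
```

Any non-Gaussian profile would give a product strictly above 1/2, which is exactly the "effective widths of conjugate variables" statement of the theorem.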
Trans-Planckian Effects in Inflationary Cosmology and the Modified Uncertainty Principle
DEFF Research Database (Denmark)
F. Hassan, S.; Sloth, Martin Snoager
2002-01-01
There are good indications that fundamental physics gives rise to a modified space-momentum uncertainty relation that implies the existence of a minimum length scale. We implement this idea in the scalar field theory that describes density perturbations in flat Robertson-Walker space-time. This leads to a non-linear time-dependent dispersion relation that encodes the effects of Planck scale physics in the inflationary epoch. Unruh type dispersion relations naturally emerge in this approach, while unbounded ones are excluded by the minimum length principle. We also find red-shift induced...
Alonso-Serrano, Ana; Dąbrowski, Mariusz P.; Gohar, Hussain
2018-02-01
We investigate the generalized uncertainty principle (GUP) corrections to the entropy content and the information flux of black holes, as well as the corrections to the sparsity of the Hawking radiation at the late stages of evaporation. We find that due to these quantum gravity motivated corrections, the entropy flow per particle reduces its value on the approach to the Planck scale due to a better accuracy in counting the number of microstates. We also show that the radiation flow is no longer sparse when the mass of a black hole approaches Planck mass which is not the case for non-GUP calculations.
Completeness, special functions and uncertainty principles over q-linear grids
International Nuclear Information System (INIS)
Abreu, LuIs Daniel
2006-01-01
We derive completeness criteria for sequences of functions of the form f(xλ_n), where λ_n is the nth zero of a suitably chosen entire function. Using these criteria, we construct complete nonorthogonal systems of Fourier-Bessel functions and their q-analogues, as well as other complete sets of q-special functions. We discuss connections with uncertainty principles over q-linear grids, and the completeness of certain sets of q-Bessel functions is used to prove that, if a function f and its q-Hankel transform both vanish at the points {q^(-n)}, n = 1, 2, ..., with 0 < q < 1, then f must vanish on the whole q-linear grid {q^n}, n = -∞, ..., ∞
International Nuclear Information System (INIS)
Kim, Wontae; Oh, John J.
2008-01-01
We derive the formula of the black hole entropy with a minimal length of the Planck size by counting quantum modes of scalar fields in the vicinity of the black hole horizon, taking into account the generalized uncertainty principle (GUP). This formula is applied to some intriguing examples of black holes - the Schwarzschild black hole, the Reissner-Nordstrom black hole, and the magnetically charged dilatonic black hole. As a result, it is shown that the GUP parameter can be determined by imposing the black hole entropy-area relationship, which has a Planck length scale and a universal form within the near-horizon expansion
Before and beyond the precautionary principle: Epistemology of uncertainty in science and law
International Nuclear Information System (INIS)
Tallacchini, Mariachiara
2005-01-01
The precautionary principle has become, in European regulation of science and technology, a general principle for the protection of the health of human beings, animals, plants, and the environment. It requires that '[w]here there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation'. By focusing on situations of scientific uncertainty where data are lacking, insufficient, or inconclusive, the principle introduced a shift from a neutral legal attitude towards science to a bias in favor of safety, and a shift from the paradigm of science certain and objective to the awareness that the legal regulation of science involves decisions about values and interests. Implementation of the precautionary principle is highly variable. A crucial question still needs to be answered regarding the assumption that scientific certainty is a 'normal' characteristic of scientific knowledge. The relationship between technoscience and society has moved into a situation where uncertain knowledge is the rule. From this perspective, a more general framework for a democratic governance of science is needed. In democratic society, science may still have a special authoritative voice, but it cannot be the ultimate word on decisions that only the broader society may make. Therefore, the precautionary model of scientific regulation needs to be informed by an 'extended participatory model' of the relationship between science and society
The statistical fluctuation study of quantum key distribution in means of uncertainty principle
Liu, Dunwei; An, Huiyao; Zhang, Xiaoyu; Shi, Xuemei
2018-03-01
Laser defects in emitting single photons, photon signal attenuation, and propagation of error have long caused serious difficulties in practical long-distance quantum key distribution (QKD) experiments. In this paper, we study the uncertainty principle in metrology and use this tool to analyze the statistical fluctuation of the number of received single photons, the yield of single photons, and the quantum bit error rate (QBER). We then calculate the error between the measured value and the real value of every parameter, and account for the propagation of error among all the measured values. We rephrase the Gottesman-Lo-Lutkenhaus-Preskill (GLLP) formula in consideration of those parameters and generate the QKD simulation result. In this study, the secure distribution distance increases with the coding photon length. When the coding photon length is N = 10^{11}, the secure distribution distance reaches almost 118 km, a lower bound on the safe transmission distance compared with the 127 km obtained without the uncertainty principle. Our study is thus in line with established theory, while making it more realistic.
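For orientation, the textbook-style GLLP lower bound on the secret key rate that such analyses start from can be sketched as below. This is the standard formula, not the paper's fluctuation-corrected version, and every numerical value is illustrative.

```python
import math

def h2(p):
    """Binary Shannon entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def gllp_rate(q, Q_mu, E_mu, Q_1, e_1, f_ec=1.16):
    """GLLP-style lower bound on the secret key rate per pulse.

    q     : protocol efficiency (1/2 for standard BB84 basis sifting)
    Q_mu  : overall gain of signal states
    E_mu  : overall QBER
    Q_1   : estimated gain of single-photon pulses
    e_1   : estimated single-photon error rate
    f_ec  : error-correction inefficiency
    All parameter values used below are illustrative, not from the paper.
    """
    return q * (-Q_mu * f_ec * h2(E_mu) + Q_1 * (1.0 - h2(e_1)))

# An illustrative operating point: the rate is positive, so secure key
# can still be distilled despite the estimated errors.
rate = gllp_rate(q=0.5, Q_mu=1e-3, E_mu=0.02, Q_1=8e-4, e_1=0.03)
print(rate > 0)
```

The paper's contribution, in these terms, is replacing the point estimates Q_1 and e_1 with uncertainty-principle-based statistical bounds, which shortens the distance at which `rate` stays positive.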
International Nuclear Information System (INIS)
Haritz, M.M.
2011-01-01
There is increasing evidence to suggest that adaptation to the inevitable is as relevant to climate change policymaking as mitigation efforts. Both mitigation and adaptation, as well as the unavoidable damage occurring both now and that is predicted to occur, all involve costs at the expense of diverse climate change victims. The allocation of responsibilities - implicit in terms of the burden-sharing mechanisms that currently exist in public and private governance - demands recourse under liability law, especially as it has become clear that most companies will only start reducing emissions if verifiable costs of the economic consequences of climate change, including the likelihood of liability, outweigh the costs of taking precautionary measures. This vitally important book asks: Can the precautionary principle make uncertainty judiciable in the context of liability for the consequences of climate change, and, if so, to what extent? Drawing on the full range of pertinent existing literature and case law, the author examines the precautionary principle both in terms of its content and application and in the context of liability law. She analyses the indirect means offered by existing legislation being used by environmental groups and affected individuals before the courts to challenge both companies and regulators as responsible agents of climate change damage. In the process of responding to its fundamental question, the analysis explores such further questions as the following: (a) What is the role of the precautionary principle in resolving uncertainty in scientific risk assessment when faced with inconclusive evidence, and how does it affect decision-making, particularly in the regulatory choices concerning climate change? To this end, what is the concrete content of the precautionary principle?; (b) How does liability law generally handle scientific uncertainty? What different types of liability exist, and how are they equipped to handle a climate change
Energy Technology Data Exchange (ETDEWEB)
Feng, Z.W.; Zu, X.T. [University of Electronic Science and Technology of China, School of Physical Electronics, Chengdu (China); Li, H.L. [University of Electronic Science and Technology of China, School of Physical Electronics, Chengdu (China); Shenyang Normal University, College of Physics Science and Technology, Shenyang (China); Yang, S.Z. [China West Normal University, Physics and Space Science College, Nanchong (China)
2016-04-15
We investigate the thermodynamics of the Schwarzschild-Tangherlini black hole in the context of the generalized uncertainty principle (GUP). The corrections to the Hawking temperature, entropy, and heat capacity are obtained via the modified Hamilton-Jacobi equation. These modifications show that the GUP changes the evolution of the Schwarzschild-Tangherlini black hole. In particular, the GUP effect becomes significant when the radius or mass of the black hole approaches the order of the Planck scale: the black hole stops radiating, leaving a black hole remnant. The Planck scale remnant can be confirmed through an analysis of the heat capacity. These phenomena imply that the GUP may offer a way to solve the information paradox. Besides, we also investigate the possibility of observing the black hole at the Large Hadron Collider (LHC), and the results demonstrate that such a black hole cannot be produced at the current LHC. (orig.)
Directory of Open Access Journals (Sweden)
Xiang-Qian Li
2016-12-01
Full Text Available This study considers the generalized uncertainty principle, which incorporates the central idea of large extra dimensions, to investigate the processes involved when massive spin-1 particles tunnel from Reissner-Nordstrom and Kerr black holes under the effects of quantum gravity. For these black holes, the quantum gravity correction decelerates the increase in temperature. Up to O(1/Mf^2), the corrected temperatures are affected by the mass and angular momentum of the emitted vector bosons. In addition, the temperature of the Kerr black hole becomes uneven due to rotation. When the mass of the black hole approaches the order of the higher-dimensional Planck mass Mf, it stops radiating and yields a black hole remnant.
Effect of Generalized Uncertainty Principle on Main-Sequence Stars and White Dwarfs
Directory of Open Access Journals (Sweden)
Mohamed Moussa
2015-01-01
Full Text Available This paper addresses the effect of the generalized uncertainty principle, which emerges from different approaches to quantum gravity at the Planck scale, on the thermodynamic properties of photons, nonrelativistic ideal gases, and degenerate fermions. Modifications in pressure, particle number, and energy density are calculated. Astrophysical objects such as main-sequence stars and white dwarfs are examined and discussed as an application. A modification of the Lane-Emden equation, due to a change in the polytropic relation caused by the presence of quantum gravity, is investigated. The applicable range of the quantum gravity parameters is estimated. The bounds on the perturbed parameters are relatively large, but they may be considered reasonable values in the astrophysical regime.
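The Lane-Emden equation that gets modified here is the classical polytrope equation; its first zero ξ₁ sets the stellar radius. The sketch below integrates only the standard (uncorrected) equation for index n = 3, the case relevant to relativistic white dwarfs; the GUP-perturbed version in the paper would add correction terms to it.

```python
import math

# Classical Lane-Emden equation for a polytrope of index n:
#   theta'' + (2/xi) theta' = -theta^n,  theta(0)=1, theta'(0)=0.
# Its first zero xi_1 gives the dimensionless stellar radius.

def lane_emden_first_zero(n, h=1e-4):
    """Integrate with RK4 until theta crosses zero; return xi_1."""
    def rhs(xi, theta, v):               # v = dtheta/dxi
        t = max(theta, 0.0)              # guard tiny negative substeps
        return v, -t ** n - 2.0 * v / xi
    xi, theta, v = 1e-6, 1.0, 0.0        # start just off the origin
    while theta > 0.0:
        k1 = rhs(xi, theta, v)
        k2 = rhs(xi + h / 2, theta + h / 2 * k1[0], v + h / 2 * k1[1])
        k3 = rhs(xi + h / 2, theta + h / 2 * k2[0], v + h / 2 * k2[1])
        k4 = rhs(xi + h, theta + h * k3[0], v + h * k3[1])
        theta += h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        v += h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
        xi += h
    return xi

xi1 = lane_emden_first_zero(3)
print(round(xi1, 3))   # ~6.897, the well-known n=3 value
```

A quantum-gravity modification of the polytropic relation shifts the effective n and hence ξ₁, which is how the paper's pressure corrections feed into stellar structure.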
Quantum speed limits: from Heisenberg’s uncertainty principle to optimal quantum control
International Nuclear Information System (INIS)
Deffner, Sebastian; Campbell, Steve
2017-01-01
One of the most widely known building blocks of modern physics is Heisenberg’s indeterminacy principle. Among the different statements of this fundamental property of the full quantum mechanical nature of physical reality, the uncertainty relation for energy and time has a special place. Its interpretation and its consequences have inspired continued research efforts for almost a century. In its modern formulation, the uncertainty relation is understood as setting a fundamental bound on how fast any quantum system can evolve. In this topical review we describe important milestones, such as the Mandelstam–Tamm and the Margolus–Levitin bounds on the quantum speed limit, and summarise recent applications in a variety of current research fields—including quantum information theory, quantum computing, and quantum thermodynamics amongst several others. To bring order and to provide an access point into the many different notions and concepts, we have grouped the various approaches into the minimal time approach and the geometric approach, where the former relies on quantum control theory, and the latter arises from measuring the distinguishability of quantum states. Due to the volume of the literature, this topical review can only present a snapshot of the current state-of-the-art and can never be fully comprehensive. Therefore, we highlight but a few works, hoping that our selection can serve as a representative starting point for the interested reader. (topical review)
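The Mandelstam-Tamm bound mentioned above states that a state with energy spread ΔE needs at least τ = πħ/(2ΔE) to evolve into an orthogonal state. A minimal sketch (with ħ = E = 1 for simplicity): for a two-level system with H = diag(0, E), the equal superposition has ΔE = E/2 and actually saturates the bound.

```python
import cmath, math

# Mandelstam-Tamm quantum speed limit, two-level example.
# State (|0> + |1>)/sqrt(2) under H = diag(0, E) has energy spread
# dE = E/2, so tau_MT = pi*hbar/(2*dE) = pi*hbar/E.

hbar, E = 1.0, 1.0
amp0, amp1 = 1 / math.sqrt(2), 1 / math.sqrt(2)

def overlap(t):
    """|<psi(0)|psi(t)>| under U(t) = diag(1, exp(-i E t / hbar))."""
    return abs(amp0 * amp0 + amp1 * amp1 * cmath.exp(-1j * E * t / hbar))

dE = E / 2                               # energy standard deviation
tau_mt = math.pi * hbar / (2 * dE)       # Mandelstam-Tamm bound

# At t = tau_mt the overlap vanishes: the state has become orthogonal
# in exactly the minimal time allowed by the bound.
print(round(tau_mt, 6), round(overlap(tau_mt), 6))
```

The Margolus-Levitin bound plays the same role with the mean energy above the ground state in place of ΔE; for this particular state the two bounds coincide.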
Generalised twisted partition functions
Petkova, V B
2001-01-01
We consider the set of partition functions that result from the insertion of twist operators compatible with conformal invariance in a given 2D Conformal Field Theory (CFT). A consistency equation, which gives a classification of twists, is written and solved in particular cases. This generalises old results on twisted torus boundary conditions, gives a physical interpretation of Ocneanu's algebraic construction, and might offer a new route to the study of properties of CFT.
Gale, Christopher K; Millichamp, Jane
2011-01-01
Generalised anxiety disorder is characterised by persistent, excessive and difficult-to-control worry, which may be accompanied by several psychic and somatic symptoms, including suicidality. Generalised anxiety disorder is the most common psychiatric disorder in primary care, although it is often under-recognised and undertreated. Generalised anxiety disorder is typically a chronic condition with low short- and medium-term remission rates. Clinical presentations often include depression, ...
Chen, Lingshen; Cheng, Hongbo
2018-03-01
The Parikh-Kraus-Wilczek tunneling radiation of a black hole involving an f(R) global monopole is considered based on the generalized uncertainty principle. The influences of the global monopole, f(R) gravity, and the corrections to the uncertainty appear in the expression for the black hole entropy difference. It is found that the global monopole and the revision of general relativity both hinder the black hole from emitting photons. The two correction terms to the uncertainty make the entropy difference of this kind of black hole larger or smaller, respectively.
Indian Academy of Sciences (India)
The imperfect understanding of some of the processes and physics in the carbon cycle and chemistry models generates uncertainties in the conversion of emissions to concentrations. To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and chemistry of all the ...
International Nuclear Information System (INIS)
Youinou, G.; Palmiotti, G.; Salvatorre, M.; Imel, G.; Pardo, R.; Kondev, F.; Paul, M.
2010-01-01
An integral reactor physics experiment devoted to inferring higher actinide (Am, Cm, Bk, Cf) neutron cross sections will take place in the US. This report presents the principle of the planned experiment as well as a first exercise aimed at quantifying the uncertainties related to the inferred quantities. It has been funded in part by the DOE Office of Science in the framework of the Recovery Act and has been given the name MANTRA, for Measurement of Actinides Neutron TRAnsmutation. The principle is to irradiate different pure actinide samples in a test reactor like INL's Advanced Test Reactor and, after a given time, determine the amounts of the different transmutation products. The precise characterization of the nuclide densities before and after neutron irradiation allows the energy-integrated neutron cross-sections to be inferred, since the relation between the two is given by the well-known neutron-induced transmutation equations. This approach has been used in the past, and the principal novelty of this experiment is that the atom densities of the different transmutation products will be determined with the Accelerator Mass Spectroscopy (AMS) facility located at ANL. AMS facilities traditionally have been limited to the assay of low-to-medium atomic mass materials, i.e., A < 200. The detection limit of AMS being orders of magnitude lower than that of standard mass spectroscopy techniques, more transmutation products can be measured and, potentially, more cross-sections can be inferred from the irradiation of a single sample. Furthermore, measurements will be carried out at the INL using more standard methods in order to have another set of totally uncorrelated information.
DEFF Research Database (Denmark)
Carbone, Marco; Lindley, Sam; Montesi, Fabrizio
2016-01-01
Wadler introduced Classical Processes (CP), a calculus based on a propositions-as-types correspondence between propositions of classical linear logic and session types. Carbone et al. introduced Multiparty Classical Processes, a calculus that generalises CP to multiparty session types, by replacing the duality of classical linear logic (relating two types) with a more general notion of coherence (relating an arbitrary number of types). This paper introduces variants of CP and MCP, plus a new intermediate calculus of Globally-governed Classical Processes (GCP). We show a tight relation between...
Magnusson, Bertil; Ossowicki, Haakan; Rienitz, Olaf; Theodorsson, Elvar
2012-05-01
Healthcare laboratories are increasingly joining into larger laboratory organizations encompassing several physical laboratories. This creates important new opportunities for re-defining the concept of a 'laboratory' to encompass all laboratories and measurement methods measuring the same measurand for a population of patients. In order to make measurement results comparable, bias should be minimized or eliminated and measurement uncertainty properly evaluated for all methods used for a particular patient population. The measurement as well as the diagnostic uncertainty can be evaluated from internal and external quality control results using GUM principles. In this paper the uncertainty evaluations are described in detail using only two main components, within-laboratory reproducibility and the uncertainty of the bias component, according to a Nordtest guideline. The evaluation is exemplified for the determination of creatinine in serum for a conglomerate of laboratories, expressed both in absolute units (μmol/L) and in relative terms (%). An expanded measurement uncertainty of 12 μmol/L was estimated for concentrations of creatinine below 120 μmol/L, and of 10% for concentrations above 120 μmol/L. The diagnostic uncertainty encompasses both measurement uncertainty and biological variation, and can be estimated for a single value and for a difference. The diagnostic uncertainty for the difference between two samples from the same patient was determined to be 14 μmol/L for concentrations of creatinine below 100 μmol/L and 14% for concentrations above 100 μmol/L.
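The two-component combination the abstract describes can be sketched directly: combine within-laboratory reproducibility and bias uncertainty in quadrature, then expand with a coverage factor. The input values below are illustrative (chosen to land near the paper's quoted 12 μmol/L), not the paper's actual creatinine data.

```python
import math

def expanded_uncertainty(u_rw, u_bias, k=2.0):
    """Nordtest-style combined and expanded measurement uncertainty.

    u_rw   : within-laboratory reproducibility (from internal QC)
    u_bias : uncertainty of the bias component (from external QA)
    k      : coverage factor (k = 2 gives ~95% coverage)
    """
    u_c = math.sqrt(u_rw ** 2 + u_bias ** 2)   # combine in quadrature
    return k * u_c

# Illustrative inputs, e.g. u_Rw = 4.0 umol/L and u_bias = 4.5 umol/L:
U = expanded_uncertainty(4.0, 4.5)
print(round(U, 2))   # 12.04 umol/L, comparable to the quoted 12 umol/L
```

The same quadrature applies to the diagnostic uncertainty, with biological variation entering as an additional component.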
Generalised Brown Clustering and Roll-up Feature Generation
DEFF Research Database (Denmark)
Derczynski, Leon; Chester, Sean
2016-01-01
Brown clustering is an established technique, used in hundreds of computational linguistics papers each year, to group word types that have similar distributional information. It is unsupervised and can be used to create powerful word representations for machine learning. Despite its improbable...... active set size. Moreover, the generalisation permits a novel approach to feature selection from Brown clusters: we show that the standard approach of shearing the Brown clustering output tree at arbitrary bitlengths is lossy, and that features should be chosen instead by rolling up Generalised Brown hierarchies. The generalisation and corresponding feature generation is more principled, challenging the way Brown clustering is currently understood and applied.
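The "shearing" baseline the abstract criticises is easy to make concrete: each word's Brown cluster is a bit path in a binary merge tree, and the standard feature set takes fixed-length prefixes of that path at a few depths. The cluster paths below are invented for illustration; real ones come from running Brown clustering on a corpus.

```python
# Standard bit-prefix ("shearing") feature generation from Brown
# cluster paths. The paper argues this is lossy compared with rolling
# up a Generalised Brown hierarchy; this sketch shows only the
# conventional baseline. The paths here are hypothetical.

brown_paths = {
    "monday":  "001100",
    "tuesday": "001101",
    "paris":   "011010",
    "london":  "011011",
}

def prefix_features(word, lengths=(2, 4, 6)):
    """Return one bit-prefix feature per requested tree depth."""
    path = brown_paths.get(word)
    if path is None:
        return []
    return ["bp%d=%s" % (n, path[:n]) for n in lengths if n <= len(path)]

print(prefix_features("monday"))   # ['bp2=00', 'bp4=0011', 'bp6=001100']
```

Note how "monday" and "tuesday" share the `bp4=0011` feature while the city names do not; the choice of cut depths (2, 4, 6) is exactly the arbitrary shearing the paper proposes to replace.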
International Nuclear Information System (INIS)
Smith, Graham; Sneve, Malgorzata K.
2008-01-01
Full text: Radiological protection has a long and distinguished history in taking a balanced approach to optimization. Both utilitarian and individual interests and perspectives are addressed through a process of constrained optimisation, with optimisation intended to lead to the most benefit to the most people, and constraints being operative to limit the degree of inequity among the individuals exposed. At least, expressed simplistically, that is what the recommendations on protection are intended to achieve. This paper examines the difficulties in achieving that objective, based on consideration of the active role of optimisation in regulatory supervision of the historic nuclear legacy. This example is chosen because the application of the ALARA principle has important implications for some very major projects whose objective is remediation of existing legacy facilities. But it is also relevant because timely, effective and cost efficient completion of those projects has implications for confidence in the future development of nuclear power and other uses of radioactive materials. It is also an interesting example because legacy management includes mitigation of some major short and long term hazards, but those mitigating measures themselves involve operations with their own risk, cost and benefit profiles. Like any other complex activity, a legacy management project has to be broken down into logistically feasible parts. However, from a regulatory perspective, simultaneous application of ALARA to worker protection, major accident risk mitigation and long-term environmental and human health protection presents its own challenges. Major uncertainties which exacerbate the problem arise from ill-characterised source terms, estimation of the likelihood of unlikely failures in operational processes, and prospective assessment of radiological impacts over many hundreds of years and longer. The projects themselves are set to run over decades, during which time the
Generalising the staircase models
International Nuclear Information System (INIS)
Dorey, P.; Ravanini, F.
1993-01-01
Systems of integral equations are proposed which generalise those previously encountered in connection with the so-called staircase models. Under the assumption that these equations describe the finite-size effects of relativistic field theories via the thermodynamic Bethe ansatz, analytical and numerical evidence is given for the existence of a variety of new roaming renormalisation group trajectories. For each positive integer k and s = 0, ..., k-1, there is a one-parameter family of trajectories, passing close by the coset conformal field theories G(k) × G(nk+s)/G((n+1)k+s) before finally flowing to a massive theory for s = 0, or to another coset model for s ≠ 0. (orig.)
Indian Academy of Sciences (India)
To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and chemistry of all the substances is needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle, and its feedback into climate, to be ...
A violation of the uncertainty principle implies a violation of the second law of thermodynamics.
Hänggi, Esther; Wehner, Stephanie
2013-01-01
Uncertainty relations state that there exist certain incompatible measurements whose outcomes cannot be simultaneously predicted. While the exact incompatibility of quantum measurements dictated by such uncertainty relations can be inferred from the mathematical formalism of quantum theory, the question remains whether there is any more fundamental reason for the uncertainty relations to have this exact form. What, if any, would be the operational consequences if we were able to go beyond any of these uncertainty relations? Here we give a strong argument that justifies uncertainty relations in quantum theory by showing that violating them implies that it is also possible to violate the second law of thermodynamics. More precisely, we show that violating the uncertainty relations in quantum mechanics leads to a thermodynamic cycle with positive net work gain, which is very unlikely to exist in nature.
When the uncertainty principle goes up to 11 or how to explain quantum physics with heavy metal
Moriarty, Philip
2018-01-01
There are deep and fascinating links between heavy metal and quantum physics. No, there are. Really. While teaching at the University of Nottingham, physicist Philip Moriarty noticed something odd--a surprising number of his students were heavily into metal music. Colleagues, too: a Venn diagram of physicists and metal fans would show a shocking amount of overlap. What's more, it turns out that heavy metal music is uniquely well-suited to explaining quantum principles. In When the Uncertainty Principle Goes Up to Eleven, Moriarty explains the mysteries of the universe's inner workings via drum beats and feedback: You'll discover how the Heisenberg uncertainty principle comes into play with every chugging guitar riff, what wave interference has to do with Iron Maiden, and why metalheads in mosh pits behave just like molecules in a gas. If you're a metal fan trying to grasp the complexities of quantum physics, a quantum physicist baffled by heavy metal, or just someone who'd like to know how the fundamental sci...
Applying the conservativeness principle to REDD to deal with the uncertainties of the estimates
International Nuclear Information System (INIS)
Grassi, Giacomo; Monni, Suvi; Achard, Frederic; Mollicone, Danilo; Federici, Sandro
2008-01-01
A common paradigm when the reduction of emissions from deforestation is estimated for the purpose of promoting it as a mitigation option in the context of the United Nations Framework Convention on Climate Change (UNFCCC) is that high uncertainties in input data (i.e., area change and C stock change per area) may seriously undermine the credibility of the estimates and therefore of reduced deforestation as a mitigation option. In this paper, we show how a series of concepts and methodological tools, already existing in UNFCCC decisions and IPCC guidance documents, may greatly help to deal with the uncertainties of the estimates of reduced emissions from deforestation.
International Nuclear Information System (INIS)
Dupuy, J.P.; Grinbaum, A.
2005-01-01
The analysis of our epistemic situation regarding singular events, such as abrupt climate change, shows essential limitations in the traditional modes of dealing with uncertainty. Typical cognitive barriers lead to the paralysis of action. What is needed is taking seriously the reality of the future. We argue for the application of the methodology of ongoing normative assessment. We show that it is, paradoxically, a matter of forming a project on the basis of a fixed future which one does not want, and this in a coordinated way at the level of social institutions. Ongoing assessment may be viewed as a prescription to live with uncertainty, in a particular sense of the term, in order for a future catastrophe not to occur. The assessment is necessarily normative in that it must include the anticipation of a retrospective ethical judgment on present choices (notion of moral luck). (authors)
The Generalised Phase Contrast Method
DEFF Research Database (Denmark)
Glückstad, Jesper
An analytic framework and a complete description for the design and optimisation of on-axis centred spatially filtering common path systems are presented. The Generalised Phase Contrast method is derived and introduced as the common denominator for these systems basically extending Zernike......’s original phase contrast scheme into a much wider range of operation and application. It is demonstrated that the Generalised Phase Contrast method can be successfully applied to the interpretation and subsequent optimisation of a number of different, commonly applied spatially filtering architectures...... designs and parameter settings. Finally, a number of original applications facilitated by the parallel light-beam encoding of the Generalised Phase Contrast method are briefly outlined. These include among others, wavefront sensing and generation, advanced usercontrolled optical micro...
Khan, Sobia; Vandermorris, Ashley; Shepherd, John; Begun, James W; Lanham, Holly Jordan; Uhl-Bien, Mary; Berta, Whitney
2018-03-21
Complexity thinking is increasingly being embraced in healthcare, which is often described as a complex adaptive system (CAS). Applying CAS to healthcare as an explanatory model for understanding the nature of the system, and to stimulate changes and transformations within the system, is valuable. A seminar series on systems and complexity thinking hosted at the University of Toronto in 2016 offered a number of insights on applications of CAS perspectives to healthcare that we explore here. We synthesized topics from this series into a set of six insights on how complexity thinking fosters a deeper understanding of accepted ideas in healthcare, applications of CAS to actors within the system, and paradoxes in applications of complexity thinking that may require further debate: 1) a complexity lens helps us better understand the nebulous term "context"; 2) concepts of CAS may be applied differently when actors are cognizant of the system in which they operate; 3) actor responses to uncertainty within a CAS are a mechanism for emergent and intentional adaptation; 4) acknowledging complexity supports patient-centred intersectional approaches to patient care; 5) complexity perspectives can support ways that leaders manage change (and transformation) in healthcare; and 6) complexity demands different ways of implementing ideas and assessing the system. To enhance our exploration of key insights, we augmented the knowledge gleaned from the series with key articles on complexity in the literature. Ultimately, complexity thinking acknowledges the "messiness" that we seek to control in healthcare and encourages us to embrace it. This means seeing challenges as opportunities for adaptation, stimulating innovative solutions to ensure positive adaptation, leveraging the social system to enable ideas to emerge and spread across the system, and, even more importantly, acknowledging that these adaptive actions are part of system behaviour just as much as periods of stability are. By
Generalised compositionality in graph transformation
Ghamarian, A.H.; Rensink, Arend; Ehrig, H; Engels, G.; Kreowski, H.J.; Rozenberg, G.
We present a notion of composition applying both to graphs and to rules, based on graph and rule interfaces along which they are glued. The current paper generalises a previous result in two different ways. Firstly, rules do not have to form pullbacks with their interfaces; this enables graph
Dyads, a generalisation of monads
Fokkinga, M.M.
The concept of dyad is defined as the least common generalisation of monads and co-monads. So, taking some of the ingredients to be the identity, the concept specialises to the concept of monad, and taking other ingredients to be the identity it specialises to co-monads. Except for one axiom, all
International Nuclear Information System (INIS)
Perez, J.F.; Coutinho, F.A.B.; Malta, C.P.
1985-01-01
It is shown that the critical long-distance behaviour of a two-body potential, which determines whether the number of negative eigenvalues of Schrödinger operators in ν dimensions is finite or infinite, is given by V_k(r) = −[(ν−2)/(2r)]² − 1/(2r ln r)² − ⋯ − 1/(2r ln r · ln ln r ⋯ ln_k r)², where k = 0,1,… for ν ≠ 2 and k = 1,2,… for ν = 2. This result is a consequence of logarithmic corrections to an inequality known as the uncertainty principle. If the continuum threshold in the N-body problem is defined by a two-cluster break-up, our results generate corrections to the existing sufficient conditions for the existence of infinitely many bound states. (Author)
McLeod, David; McLeod, Roger
2008-04-01
The electron model used in our other joint paper here requires revision of some foundational physics. That electron model followed from comparing the experimentally proved results of human vision models using spatial Fourier transformations, SFTs, of pincushion and Hermann grids. Visual systems detect "negative" electric field values for darker so-called "illusory" diagonals that are physical consequences of the lens SFT of the Hermann grid, distinguishing this from light "illusory" diagonals. This indicates that oppositely directed vectors of the separate illusions are discretely observable, constituting another foundational fault in quantum mechanics, QM. The SFT of human vision is merely the scaled SFT of QM. Reciprocal space results of wavelength and momentum mimic reciprocal relationships between space variable x and spatial frequency variable p, by the experiment mentioned. Nobel laureate physicist von Békésy (physiology of hearing, 1961) performed pressure inputs of Rect x that the brain always reports as truncated Sinc p, showing again that the brain is an adjunct built by sight, preserves the sign sense of EMF vectors, and is hard-wired as an inverse SFT. These require vindication of Schrödinger's actual, but incomplete, wave model of the electron as having physical extent over the wave, and question Heisenberg's uncertainty proposal.
Energy Technology Data Exchange (ETDEWEB)
Marchiolli, M.A., E-mail: marcelo_march@bol.com.br [Avenida General Osório 414, Centro, 14.870-100 Jaboticabal, SP (Brazil); Mendonça, P.E.M.F., E-mail: pmendonca@gmail.com [Academia da Força Aérea, C.P. 970, 13.643-970 Pirassununga, SP (Brazil)
2013-09-15
We introduce a self-consistent theoretical framework associated with the Schwinger unitary operators whose basic mathematical rules embrace a new uncertainty principle that generalizes and strengthens the Massar–Spindel inequality. Among other remarkable virtues, this quantum-algebraic approach exhibits a sound connection with the Wiener–Khinchin theorem for signal processing, which permits us to determine an effective tighter bound that not only imposes a new subtle set of restrictions upon the selective process of signals and wavelet bases, but also represents an important complement for property testing of unitary operators. Moreover, we establish a hierarchy of tighter bounds, which interpolates between the tightest bound and the Massar–Spindel inequality, as well as its respective link with the discrete Weyl function and tomographic reconstructions of finite quantum states. We also show how the Harper Hamiltonian and discrete Fourier operators can be combined to construct finite ground states which yield the tightest bound of a given finite-dimensional state vector space. Such results touch on some fundamental questions inherent to quantum mechanics and their implications in quantum information theory. -- Highlights: •Conception of a quantum-algebraic framework embracing a new uncertainty principle for unitary operators. •Determination of new restrictions upon the selective process of signals and wavelet bases. •Demonstration of looser bounds interpolating between the tightest bound and the Massar–Spindel inequality. •Construction of finite ground states properly describing the tightest bound. •Establishment of an important connection with the discrete Weyl function.
Directory of Open Access Journals (Sweden)
Yan-Gang Miao
2015-01-01
As a generalized uncertainty principle (GUP) leads to the effects of the minimal length of the order of the Planck scale and UV/IR mixing, some significant physical concepts and quantities are modified or corrected correspondingly. On the one hand, we derive the maximally localized states, i.e. the physical states displaying the minimal length uncertainty associated with a new GUP proposed in our previous work. On the other hand, in the framework of this new GUP we calculate quantum corrections to the thermodynamic quantities of the Schwarzschild black hole, such as the Hawking temperature, the entropy, and the heat capacity, and give a remnant mass of the black hole at the end of the evaporation process. Moreover, we compare our results with those obtained in the frameworks of several other GUPs. In particular, we observe a significant difference between the situations with and without the consideration of the UV/IR mixing effect in the quantum corrections to the evaporation rate and the decay time. That is, the decay time can be greatly prolonged in the former case, which implies that the quantum correction from the UV/IR mixing effect may give rise to a radical rather than a tiny influence on the Hawking radiation.
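The minimal length invoked in this record can be made concrete with a small numerical sketch. This assumes the widely used quadratic GUP, Δx·Δp ≥ (ħ/2)(1 + β·Δp²), not the specific GUP of the article above; β and the units are purely illustrative.

```python
import numpy as np

HBAR = 1.0  # natural units (illustrative)
BETA = 1.0  # hypothetical GUP deformation parameter

def dx_lower_bound(dp):
    # Δx ≥ (ħ/2)(1/Δp + β·Δp), obtained by dividing the GUP by Δp
    return 0.5 * HBAR * (1.0 / dp + BETA * dp)

dp_grid = np.linspace(0.01, 10.0, 200_001)
bounds = dx_lower_bound(dp_grid)
i_min = np.argmin(bounds)

# Analytically the bound is minimised at Δp = 1/√β, giving Δx_min = ħ√β
print(bounds[i_min], dp_grid[i_min])
```

For β = 1 the numerical minimum sits at Δp ≈ 1 with Δx ≈ ħ: no state can be localised below ħ√β, which is the hallmark of a minimal length.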
Döpking, Sandra; Plaisance, Craig P.; Strobusch, Daniel; Reuter, Karsten; Scheurer, Christoph; Matera, Sebastian
2018-01-01
In the last decade, first-principles-based microkinetic modeling has been developed into an important tool for a mechanistic understanding of heterogeneous catalysis. A commonly known, but hitherto barely analyzed issue in this kind of modeling is the presence of sizable errors from the use of approximate Density Functional Theory (DFT). We here address the propagation of these errors to the catalytic turnover frequency (TOF) by global sensitivity and uncertainty analysis. Both analyses require the numerical quadrature of high-dimensional integrals. To achieve this efficiently, we utilize and extend an adaptive sparse grid approach and exploit the confinement of the strongly non-linear behavior of the TOF to local regions of the parameter space. We demonstrate the methodology on a model of the oxygen evolution reaction at the Co3O4 (110)-A surface, using a maximum entropy error model that imposes nothing but reasonable bounds on the errors. For this setting, the DFT errors lead to an absolute uncertainty of several orders of magnitude in the TOF. We nevertheless find that it is still possible to draw conclusions from such uncertain models about the atomistic aspects controlling the reactivity. A comparison with derivative-based local sensitivity analysis instead reveals that this more established approach provides incomplete information. Since the adaptive sparse grids allow for the evaluation of the integrals with only a modest number of function evaluations, this approach opens the way for a global sensitivity analysis of more complex models, for instance, models based on kinetic Monte Carlo simulations.
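The error-propagation idea in the abstract above can be illustrated with a much simpler stand-in for its adaptive-sparse-grid machinery: brute-force Monte Carlo sampling of a bounded DFT error on a toy Arrhenius-type rate. All numbers here (barrier, temperature, error bound, prefactor) are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
KB_T = 0.052       # k_B·T in eV at roughly 600 K (illustrative)
PREFACTOR = 1e13   # generic attempt frequency in 1/s (illustrative)

def tof(barrier_ev):
    # Toy Arrhenius-type turnover frequency for a single rate-limiting step
    return PREFACTOR * np.exp(-barrier_ev / KB_T)

nominal_barrier = 0.8                          # hypothetical DFT barrier, eV
errors = rng.uniform(-0.2, 0.2, size=100_000)  # bounded DFT error model
samples = tof(nominal_barrier + errors)

spread = samples.max() / samples.min()
print(f"TOF varies by a factor of {spread:.1e} across the error ensemble")
```

Even a modest ±0.2 eV error makes the sampled TOF span roughly three orders of magnitude, echoing the abstract's point that DFT errors propagate to very large TOF uncertainties.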
Primary small bowel anastomosis in generalised peritonitis
deGraaf, JS; van Goor, Harry; Bleichrodt, RP
Objective: To find out if primary anastomosis of the small bowel is safe in patients with generalised peritonitis who are treated by planned relaparotomies. Design: Retrospective study. Setting: University hospital, The Netherlands. Subjects: 10 patients with generalised purulent peritonitis
Generalised hypercementosis: a case report.
Seed, Rachel; Nixon, Paul P
2004-10-01
The following case report describes the clinical and radiographical presentation of a female who attended a general dental practice as a new patient. The patient was diagnosed with generalised hypercementosis, possibly attributable to oral neglect. Hypercementosis is associated with a number of aetiological factors, which may be local or systemic in nature. It is important that the general dental practitioner is aware of these factors and is able to distinguish presentation due to a local cause from that of a systemic disease process. The aims of this paper are to illustrate an unusual presentation of hypercementosis and to discuss the radiographic differentiation that led to diagnosis.
An economic uncertainty principle
Czech Academy of Sciences Publication Activity Database
Vošvrda, Miloslav
2000-01-01
Roč. 8, č. 2 (2000), s. 79-87 ISSN 0572-3043 R&D Projects: GA ČR GA402/97/0007; GA ČR GA402/97/0770 Institutional research plan: AV0Z1075907 Subject RIV: BB - Applied Statistics, Operational Research
Schrodinger's Uncertainty Principle?
Indian Academy of Sciences (India)
φ(x). The hat sign over x and p reminds us that they are operators. We have dropped the suffix x on the momentum p, but from now on we are only looking at its x-component. Even though we know nothing about ψ(x) except that it is an allowed wave function, we can be sure that ∫ φ*φ dx ≥ 0. In terms of ψ, this reads.
Schrodinger's Uncertainty Principle?
Indian Academy of Sciences (India)
Research Institute, mainly on applications of optical and statistical ... deserves to be better known in the classroom. Let us recall the basic algebraic steps in the textbook proof. We consider the wave function (which has a free real parameter a) (x̂ + iap̂)ψ ≡ xψ(x) + ia(−iℏ∂ψ/∂x) ≡ φ(x). The hat sign over x and p reminds ...
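The textbook argument sketched in the two snippets above leads to Δx·Δp ≥ ħ/2, with equality for a Gaussian wave packet. A small numeric check (my own illustration, not from the article): for a real normalised ψ with ⟨p⟩ = 0, one can use ⟨p²⟩ = ħ² ∫ (dψ/dx)² dx.

```python
import numpy as np

HBAR = 1.0
sigma = 1.3                        # arbitrary packet width
x = np.linspace(-12.0, 12.0, 4001)
dx = x[1] - x[0]
psi = (np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (2 * sigma**2))

# Δx from the position distribution (⟨x⟩ = 0 by symmetry of the Gaussian)
delta_x = np.sqrt(np.sum(x**2 * psi**2) * dx)
# Δp for a real ψ with ⟨p⟩ = 0: ⟨p²⟩ = ħ² ∫ (dψ/dx)² dx
dpsi = np.gradient(psi, dx)
delta_p = HBAR * np.sqrt(np.sum(dpsi**2) * dx)

print(delta_x * delta_p)  # approaches ħ/2 = 0.5 as the grid is refined
```

The product comes out at ħ/2 to within discretisation error, confirming that the Gaussian saturates the bound derived from ∫ φ*φ dx ≥ 0.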
Generalised shot noise Cox processes
DEFF Research Database (Denmark)
Møller, Jesper; Torrisi, Giovanni Luca
We introduce a new class of Cox cluster processes called generalised shot-noise processes (GSNCPs), which extends the definition of shot noise Cox processes (SNCPs) in two directions: the point process which drives the shot noise is not necessarily Poisson, and the kernel of the shot noise can...... be random. Thereby a very large class of models for aggregated or clustered point patterns is obtained. Due to the structure of GSNCPs, a number of useful results can be established. We focus first on deriving summary statistics for GSNCPs and next on how to make simulation for GSNCPs. Particularly, results...... for first and second order moment measures, reduced Palm distributions, the -function, simulation with or without edge effects, and conditional simulation of the intensity function driving a GSNCP are given. Our results are exemplified for special important cases of GSNCPs, and we discuss the relation...
Generalised shot noise Cox processes
DEFF Research Database (Denmark)
Møller, Jesper; Torrisi, Giovanni Luca
2005-01-01
We introduce a class of Cox cluster processes called generalised shot noise Cox processes (GSNCPs), which extends the definition of shot noise Cox processes (SNCPs) in two directions: the point process that drives the shot noise is not necessarily Poisson, and the kernel of the shot noise can...... be random. Thereby, a very large class of models for aggregated or clustered point patterns is obtained. Due to the structure of GSNCPs, a number of useful results can be established. We focus first on deriving summary statistics for GSNCPs and, second, on how to simulate such processes. In particular......, results on first- and second-order moment measures, reduced Palm distributions, the J-function, simulation with or without edge effects, and conditional simulation of the intensity function driving a GSNCP are given. Our results are exemplified in important special cases of GSNCPs, and we discuss
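A shot-noise Cox process can be simulated directly when the driving process is Poisson and the kernel is a fixed Gaussian, which is the classical Thomas-process special case; the generalisation in the two records above relaxes exactly these two choices. The parameters below are arbitrary, and no edge correction is applied.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_sncp(parent_rate, mean_offspring, kernel_sd, window=1.0):
    """Shot-noise Cox process on [0, window]^2 with Poisson parents and a
    Gaussian offspring kernel (a Thomas process). Offspring near the window
    boundary may land outside it; edge effects are ignored for brevity."""
    n_parents = rng.poisson(parent_rate * window**2)
    parents = rng.uniform(0.0, window, size=(n_parents, 2))
    clusters = [p + rng.normal(0.0, kernel_sd, size=(rng.poisson(mean_offspring), 2))
                for p in parents]
    return np.vstack(clusters) if clusters else np.empty((0, 2))

pts = simulate_sncp(parent_rate=20, mean_offspring=10, kernel_sd=0.02)
print(pts.shape)
```

With these settings one expects around parent_rate × mean_offspring = 200 points, grouped into tight clusters of scale kernel_sd.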
Miller, Jodie
2014-01-01
This paper explores how young Indigenous students' (Year 2 and 3) generalise growing patterns. Piagetian clinical interviews were conducted to determine how students articulated growing pattern generalisations. Two case studies are presented displaying how students used gesture to support and articulate their generalisations of growing patterns.…
Wagner’s theory of generalised heaps
Hollings, Christopher D
2017-01-01
The theories of V. V. Wagner (1908-1981) on abstractions of systems of binary relations are presented here within their historical and mathematical contexts. This book contains the first translation from Russian into English of a selection of Wagner’s papers, the ideas of which are connected to present-day mathematical research. Along with a translation of Wagner’s main work in this area, his 1953 paper ‘Theory of generalised heaps and generalised groups,’ the book also includes translations of three short precursor articles that provide additional context for his major work. Researchers and students interested in both algebra (in particular, heaps, semiheaps, generalised heaps, semigroups, and groups) and differential geometry will benefit from the techniques offered by these translations, owing to the natural connections between generalised heaps and generalised groups, and the role played by these concepts in differential geometry. This book gives examples from present-day mathematics where ideas r...
GENERALISATION OF SUBMARINE FEATURES ON NAUTICAL CHARTS
Directory of Open Access Journals (Sweden)
E. Guilbert
2012-07-01
On most large scale and middle scale maps, relief is represented by contours and spot heights. In order to adapt the representation to the scale, the terrain is generalised either by smoothing or filtering the terrain model or by simplifying the contours. However this approach is not applicable to nautical chart construction where terrain features are selected according to their importance for navigation. This paper presents an approach for the consideration of feature attributes in the generalisation of a set of contours with respect to nautical chart constraints. Features are defined by sets of contours and a set of generalisation operators applied to features is presented. The definitions are introduced in a multi-agent system in order to perform automatic generalisation of a contour set. Results are discussed on a case study and directions for future work are presented.
Cloverleaf skull with generalised bone dysplasia
Energy Technology Data Exchange (ETDEWEB)
Kozlowski, K.; Warren, P.S.; Fisher, C.C.
1985-09-01
A case of cloverleaf skull with generalised bone dysplasia is reported. The authors believe that the bone dysplasia associated with cloverleaf skull is identical neither with thanatophoric dysplasia nor with achondroplasia. Until the identity of thanatophoric dysplasia and cloverleaf skull with generalised bone dysplasia is proved, the diseases should be looked upon as separate entities and the wording "thanatophoric dysplasia with cloverleaf skull" should be abolished.
Vezzaro, Luca; Grum, Morten
2014-01-01
An innovative and generalised approach to the integrated Real Time Control of urban drainage systems is presented. The Dynamic Overflow Risk Assessment (DORA) strategy aims to minimise the expected Combined Sewer Overflow (CSO) risk by considering (i) the water volume presently stored in the drainage network, (ii) the expected runoff volume (calculated by radar-based nowcast models) and – most important – (iii) the estimated uncertainty of the runoff forecasts. The inclusion of uncertainty al...
Generalisability of a composite student selection programme
DEFF Research Database (Denmark)
O'Neill, Lotte Dyhrberg; Korsholm, Lars; Wallstedt, Birgitta
2009-01-01
OBJECTIVES The reliability of individual non-cognitive admission criteria in medical education is controversial. Nonetheless, non-cognitive admission criteria appear to be widely used in selection to medicine to supplement the grades of qualifying examinations. However, very few studies have...... examined the overall test generalisability of composites of non-cognitive admission variables in medical education. We examined the generalisability of a composite process for selection to medicine, consisting of four variables: qualifications (application form information); written motivation (in essay...... format); general knowledge (multiple-choice test), and a semi-structured admission interview. The aim of this study was to estimate the generalisability of a composite selection. METHODS: Data from 307 applicants who participated in the admission to medicine in 2007 were available for analysis. Each...
Generalised Computability and Applications to Hybrid Systems
DEFF Research Database (Denmark)
Korovina, Margarita V.; Kudinov, Oleg V.
2001-01-01
We investigate the concept of generalised computability of operators and functionals defined on the set of continuous functions, firstly introduced in [9]. By working in the reals, with equality and without equality, we study properties of generalised computable operators and functionals. Also we...... propose an interesting application to formalisation of hybrid systems. We obtain some class of hybrid systems, which trajectories are computable in the sense of computable analysis. This research was supported in part by the RFBR (grants N 99-01-00485, N 00-01- 00810) and by the Siberian Branch of RAS (a...... grant for young researchers, 2000)...
Exactly marginal deformations from exceptional generalised geometry
Energy Technology Data Exchange (ETDEWEB)
Ashmore, Anthony [Merton College, University of Oxford,Merton Street, Oxford, OX1 4JD (United Kingdom); Mathematical Institute, University of Oxford,Andrew Wiles Building, Woodstock Road, Oxford, OX2 6GG (United Kingdom); Gabella, Maxime [Institute for Advanced Study,Einstein Drive, Princeton, NJ 08540 (United States); Graña, Mariana [Institut de Physique Théorique, CEA/Saclay,91191 Gif-sur-Yvette (France); Petrini, Michela [Sorbonne Université, UPMC Paris 05, UMR 7589, LPTHE,75005 Paris (France); Waldram, Daniel [Department of Physics, Imperial College London,Prince Consort Road, London, SW7 2AZ (United Kingdom)
2017-01-27
We apply exceptional generalised geometry to the study of exactly marginal deformations of N=1 SCFTs that are dual to generic AdS_5 flux backgrounds in type IIB or eleven-dimensional supergravity. In the gauge theory, marginal deformations are parametrised by the space of chiral primary operators of conformal dimension three, while exactly marginal deformations correspond to quotienting this space by the complexified global symmetry group. We show how the supergravity analysis gives a geometric interpretation of the gauge theory results. The marginal deformations arise from deformations of generalised structures that solve moment maps for the generalised diffeomorphism group and have the correct charge under the generalised Reeb vector, generating the R-symmetry. If this is the only symmetry of the background, all marginal deformations are exactly marginal. If the background possesses extra isometries, there are obstructions that come from fixed points of the moment maps. The exactly marginal deformations are then given by a further quotient by these extra isometries. Our analysis holds for any N=2 AdS_5 flux background. Focussing on the particular case of type IIB Sasaki-Einstein backgrounds we recover the result that marginal deformations correspond to perturbing the solution by three-form flux at first order. In various explicit examples, we show that our expression for the three-form flux matches those in the literature and the obstruction conditions match the one-loop beta functions of the dual SCFT.
On Generalisation of Polynomials in Complex Plane
Directory of Open Access Journals (Sweden)
Maslina Darus
2010-01-01
The generalised Bell and Laguerre polynomials of fractional order in the complex z-plane are defined. Some properties are studied. Moreover, we prove that these polynomials are univalent solutions of second-order differential equations. Also, the Laguerre type of some special functions is introduced.
Acute generalised exanthematous pustulosis secondary to ...
African Journals Online (AJOL)
2012-11-02
Superficial dermal vessels were mildly dilated and contained marginated neutrophils. Special stains for fungi and acid-fast bacilli were negative and no granulomas, dysplastic or malignant cells were found. A histopathological diagnosis of acute generalised exanthematous pustulosis (AGEP) was made.
The oculocerebral syndrome in association with generalised ...
African Journals Online (AJOL)
A 14-year-old girl with generalised hypopigmentation, mental retardation, abnormal movements, and ocular anomalies is described. It is suggested that she represents a further case of oculocerebral albinism, a rare autosomal recessive condition. Reference is made to previous similar cases.
Exactly marginal deformations from exceptional generalised geometry
International Nuclear Information System (INIS)
Ashmore, Anthony; Gabella, Maxime; Graña, Mariana; Petrini, Michela; Waldram, Daniel
2017-01-01
We apply exceptional generalised geometry to the study of exactly marginal deformations of N=1 SCFTs that are dual to generic AdS 5 flux backgrounds in type IIB or eleven-dimensional supergravity. In the gauge theory, marginal deformations are parametrised by the space of chiral primary operators of conformal dimension three, while exactly marginal deformations correspond to quotienting this space by the complexified global symmetry group. We show how the supergravity analysis gives a geometric interpretation of the gauge theory results. The marginal deformations arise from deformations of generalised structures that solve moment maps for the generalised diffeomorphism group and have the correct charge under the generalised Reeb vector, generating the R-symmetry. If this is the only symmetry of the background, all marginal deformations are exactly marginal. If the background possesses extra isometries, there are obstructions that come from fixed points of the moment maps. The exactly marginal deformations are then given by a further quotient by these extra isometries. Our analysis holds for any N=2 AdS 5 flux background. Focussing on the particular case of type IIB Sasaki-Einstein backgrounds we recover the result that marginal deformations correspond to perturbing the solution by three-form flux at first order. In various explicit examples, we show that our expression for the three-form flux matches those in the literature and the obstruction conditions match the one-loop beta functions of the dual SCFT.
Generalised phase contrast: microscopy, manipulation and more
DEFF Research Database (Denmark)
Palima, Darwin; Glückstad, Jesper
2010-01-01
Generalised phase contrast (GPC) not only leads to more accurate phase imaging beyond thin biological samples, but serves as an enabling framework in developing tools over a wide spectrum of contemporary applications in optics and photonics, including optical trapping and micromanipulation, optic...
Hyperscaling violating solutions in generalised EMD theory
Directory of Open Access Journals (Sweden)
Li Li
2017-04-01
This short note is devoted to deriving scaling but hyperscaling violating solutions in a generalised Einstein–Maxwell-Dilaton theory with an arbitrary number of scalars and vectors. We obtain analytic solutions in some special case and discuss the physical constraints on the allowed parameter range in order to have a well-defined holographic ground-state solution.
Hyperscaling violating solutions in generalised EMD theory
Energy Technology Data Exchange (ETDEWEB)
Li, Li, E-mail: lil416@lehigh.edu [Crete Center for Theoretical Physics, Institute for Theoretical and Computational Physics, Department of Physics, University of Crete, 71003 Heraklion (Greece); Crete Center for Quantum Complexity and Nanotechnology, Department of Physics, University of Crete, 71003 Heraklion (Greece); Department of Physics, Lehigh University, Bethlehem, PA, 18018 (United States)
2017-04-10
This short note is devoted to deriving scaling but hyperscaling violating solutions in a generalised Einstein–Maxwell-Dilaton theory with an arbitrary number of scalars and vectors. We obtain analytic solutions in some special case and discuss the physical constraints on the allowed parameter range in order to have a well-defined holographic ground-state solution.
Directory of Open Access Journals (Sweden)
Vernon Cooray
2016-11-01
The paper describes the net momentum transported by the transient electromagnetic radiation field of a long transient dipole in free space. In the dipole, a current is initiated at one end and propagates towards the other end, where it is absorbed. The results show that the net momentum transported by the radiation is directed along the axis of the dipole along which the currents are propagating. In general, the net momentum P transported by the electromagnetic radiation of the dipole is less than the quantity U/c, where U is the total energy radiated by the dipole and c is the speed of light in free space. In the case of a Hertzian dipole, the net momentum transported by the radiation field is zero because of the spatial symmetry of the radiation field. As the effective wavelength of the current decreases with respect to the length of the dipole (or the duration of the current decreases with respect to the travel time of the current along the dipole), the net momentum transported by the radiation field becomes closer and closer to U/c, and for effective wavelengths much shorter than the length of the dipole, P ≈ U/c. The results show that when the condition P ≈ U/c is satisfied, the radiated fields satisfy the condition ΔtΔU ≥ h/4π, where Δt is the duration of the radiation, ΔU is the uncertainty in the dissipated energy and h is the Planck constant.
Mezzasalma, Stefano A
2007-03-15
The theoretical basis of a recent theory of Brownian relativity for polymer solutions is deepened and reexamined. After the problem of relative diffusion in polymer solutions is addressed, its two postulates are formulated in all generality. The former builds a statistical equivalence between (uncorrelated) timelike and shapelike reference frames, that is, among dynamical trajectories of liquid molecules and static configurations of polymer chains. The latter defines the "diffusive horizon" as the invariant quantity to work with in the special version of the theory. Particularly, the concept of universality in polymer physics corresponds in Brownian relativity to that of covariance in the Einstein formulation. Here, a "universal" law consists of a privileged observation, performed from the laboratory rest frame and agreeing with any diffusive reference system. From the joint lack of covariance and simultaneity implied by the Brownian Lorentz-Poincaré transforms, a relative uncertainty arises, in a certain analogy with quantum mechanics. It is driven by the difference between local diffusion coefficients in the liquid solution. The same transformation class can be used to infer Fick's second law of diffusion, playing here the role of a gauge invariance preserving covariance of the spacetime increments. An overall, noteworthy conclusion emerging from this view concerns the statistics of (i) static macromolecular configurations and (ii) the motion of liquid molecules, which would be much more related than expected.
Quantum mechanics of a generalised rigid body
International Nuclear Information System (INIS)
Gripaios, Ben; Sutherland, Dave
2016-01-01
We consider the quantum version of Arnold’s generalisation of a rigid body in classical mechanics. Thus, we quantise the motion on an arbitrary Lie group manifold of a particle whose classical trajectories correspond to the geodesics of any one-sided-invariant metric. We show how the derivation of the spectrum of energy eigenstates can be simplified by making use of automorphisms of the Lie algebra and (for groups of type I) by methods of harmonic analysis. We show how the method can be extended to cosets, generalising the linear rigid rotor. As examples, we consider all connected and simply connected Lie groups up to dimension 3. This includes the universal cover of the archetypical rigid body, along with a number of new exactly solvable models. We also discuss a possible application to the topical problem of quantising a perfect fluid. (paper)
Support vector machines and generalisation in HEP
Bevan, Adrian; Gamboa Goñi, Rodrigo; Hays, Jon; Stevenson, Tom
2017-10-01
We review the concept of Support Vector Machines (SVMs) and discuss examples of their use in a number of scenarios. Several SVM implementations have been used in HEP, and we exemplify this algorithm using the Toolkit for Multivariate Analysis (TMVA) implementation. We discuss examples relevant to HEP, including background suppression for H → τ⁺τ⁻ at the LHC with several different kernel functions. Performance benchmarking leads to the issue of generalisation of hyper-parameter selection. The avoidance of fine tuning (over-training or over-fitting) in MVA hyper-parameter optimisation, i.e. the ability to ensure generalised performance of an MVA that is independent of the training, validation and test samples, is of utmost importance. We discuss this issue and compare and contrast the performance of hold-out and k-fold cross-validation. We have extended the SVM functionality and introduced tools to facilitate cross-validation in TMVA, and we present results based on these improvements.
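The hold-out versus k-fold comparison discussed in this abstract can be illustrated with a minimal, library-free sketch of k-fold index splitting (illustrative only; the paper itself works within TMVA, not Python):

```python
import random

def k_fold_indices(n, k, seed=0):
    """Partition indices 0..n-1 into k shuffled, near-equal folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def cross_validation_splits(n, k, seed=0):
    """Yield (train, test) index pairs: each fold is held out exactly once."""
    folds = k_fold_indices(n, k, seed)
    for i in range(k):
        test = folds[i]
        train = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        yield train, test

# Every sample appears in exactly one test fold, so the performance
# estimate uses all the data, unlike a single hold-out split.
splits = list(cross_validation_splits(100, 5))
assert sorted(i for _, test in splits for i in test) == list(range(100))
```

This is why k-fold estimates generalisation performance with less dependence on one particular train/test partition.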
Open quantum generalisation of Hopfield neural networks
Rotondo, P.; Marcuzzi, M.; Garrahan, J. P.; Lesanovsky, I.; Müller, M.
2018-03-01
We propose a new framework to understand how quantum effects may impact on the dynamics of neural networks. We implement the dynamics of neural networks in terms of Markovian open quantum systems, which allows us to treat thermal and quantum coherent effects on the same footing. In particular, we propose an open quantum generalisation of the Hopfield neural network, the simplest toy model of associative memory. We determine its phase diagram and show that quantum fluctuations give rise to a qualitatively new non-equilibrium phase. This novel phase is characterised by limit cycles corresponding to high-dimensional stationary manifolds that may be regarded as a generalisation of storage patterns to the quantum domain.
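For orientation, the classical Hopfield associative memory that this entry generalises can be sketched in a few lines (a purely classical baseline with Hebbian weights and sign updates; the open quantum, Markovian dynamics of the paper goes well beyond this):

```python
def train_hopfield(patterns):
    """Hebbian weights w[i][j] = (1/P) * sum_p x_i x_j, zero diagonal."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, state, sweeps=5):
    """Deterministic asynchronous sign updates towards a stored pattern."""
    s = list(state)
    for _ in range(sweeps):
        for i in range(len(s)):
            h = sum(w[i][j] * s[j] for j in range(len(s)))
            s[i] = 1 if h >= 0 else -1
    return s

# Store one pattern, corrupt a bit, and recover it (associative memory)
pattern = [1, -1, 1, 1, -1, -1, 1, -1]
w = train_hopfield([pattern])
noisy = list(pattern)
noisy[0] = -noisy[0]
assert recall(w, noisy) == pattern
```

The stored patterns here are fixed points; the quantum generalisation replaces them with higher-dimensional stationary manifolds.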
Quantum field theory in generalised Snyder spaces
International Nuclear Information System (INIS)
Meljanac, S.; Meljanac, D.; Mignemi, S.; Štrajn, R.
2017-01-01
We discuss the generalisation of the Snyder model that includes all possible deformations of the Heisenberg algebra compatible with Lorentz invariance and investigate its properties. We calculate perturbatively the law of addition of momenta and the star product in the general case. We also undertake the construction of a scalar field theory on these noncommutative spaces showing that the free theory is equivalent to the commutative one, like in other models of noncommutative QFT.
Quantum field theory in generalised Snyder spaces
Energy Technology Data Exchange (ETDEWEB)
Meljanac, S.; Meljanac, D. [Rudjer Bošković Institute, Bijenička cesta 54, 10002 Zagreb (Croatia); Mignemi, S., E-mail: smignemi@unica.it [Dipartimento di Matematica e Informatica, Università di Cagliari, viale Merello 92, 09123 Cagliari (Italy); INFN, Sezione di Cagliari, Cittadella Universitaria, 09042 Monserrato (Italy); Štrajn, R. [Dipartimento di Matematica e Informatica, Università di Cagliari, viale Merello 92, 09123 Cagliari (Italy); INFN, Sezione di Cagliari, Cittadella Universitaria, 09042 Monserrato (Italy)
2017-05-10
We discuss the generalisation of the Snyder model that includes all possible deformations of the Heisenberg algebra compatible with Lorentz invariance and investigate its properties. We calculate perturbatively the law of addition of momenta and the star product in the general case. We also undertake the construction of a scalar field theory on these noncommutative spaces showing that the free theory is equivalent to the commutative one, like in other models of noncommutative QFT.
DEFF Research Database (Denmark)
Andersen, Per Kragh; Klein, John P.; Rosthøj, Susanne
2003-01-01
Generalised estimating equation; Generalised linear model; Jackknife pseudo-value; Logistic regression; Markov model; Multi-state model.
Generalising the coupling between spacetime and matter
Energy Technology Data Exchange (ETDEWEB)
Carloni, Sante, E-mail: sante.carloni@gmail.com
2017-03-10
We explore the idea that the coupling between matter and spacetime is more complex than the one originally envisioned by Einstein. We propose that such coupling takes the form of a new fundamental tensor in the Einstein field equations. We then show that the introduction of this tensor can account for dark phenomenology in General Relativity, maintaining a weak field limit compatible with standard Newtonian gravitation. The same paradigm can be applied to any other theory of gravitation. We show, as an example, that in the context of conformal gravity a generalised coupling is able to solve compatibility issues between the matter and the gravitational sector.
Generalising the coupling between spacetime and matter
Directory of Open Access Journals (Sweden)
Sante Carloni
2017-03-01
Full Text Available We explore the idea that the coupling between matter and spacetime is more complex than the one originally envisioned by Einstein. We propose that such coupling takes the form of a new fundamental tensor in the Einstein field equations. We then show that the introduction of this tensor can account for dark phenomenology in General Relativity, maintaining a weak field limit compatible with standard Newtonian gravitation. The same paradigm can be applied to any other theory of gravitation. We show, as an example, that in the context of conformal gravity a generalised coupling is able to solve compatibility issues between the matter and the gravitational sector.
Generalised model for anisotropic compact stars
Energy Technology Data Exchange (ETDEWEB)
Maurya, S.K. [University of Nizwa, Department of Mathematical and Physical Sciences College of Arts and Science, Nizwa (Oman); Gupta, Y.K. [Raj Kumar Goel Institute of Technology, Department of Mathematics, Ghaziabad, Uttar Pradesh (India); Ray, Saibal [Government College of Engineering and Ceramic Technology, Department of Physics, Kolkata, West Bengal (India); Deb, Debabrata [Indian Institute of Engineering Science and Technology, Shibpur, Department of Physics, Howrah, West Bengal (India)
2016-12-15
In the present investigation an exact generalised model for anisotropic compact stars of embedding class 1 is sought with a general relativistic background. The generic solutions are verified by exploring different physical aspects, viz. energy conditions, mass-radius relation, stability of the models, in connection to their validity. It is observed that the model presented here for compact stars is compatible with all these physical tests and thus physically acceptable as far as the compact star candidates RXJ 1856-37, SAX J 1808.4-3658 (SS1) and SAX J 1808.4-3658 (SS2) are concerned. (orig.)
Generalised empirical method for predicting surface subsidence
International Nuclear Information System (INIS)
Zhang, M.; Bhattacharyya, A.K.
1994-01-01
Based on a simplified strata parameter, i.e. the ratio of the total thickness of the strong rock beds in an overburden to the overall thickness of the overburden, a Generalised Empirical Method (GEM) is described for predicting the maximum subsidence and the shape of a complete transverse subsidence profile due to a single completely extracted longwall panel. In the method, a nomogram for predicting the maximum surface subsidence is first developed from the data collected from subsidence measurements worldwide. Then, a method is developed for predicting the shapes of complete transverse subsidence profiles for a horizontal seam and ground surface and is verified by case studies. 13 refs., 9 figs., 2 tabs
Uncertainty Principles and Fourier Analysis
Indian Academy of Sciences (India)
... consider the quantity ∫_{−∞}^{∞} (x − a)² |f(x)|² dx. (To convince herself that the more concentrated f is around a, the smaller this quantity will be, the reader is encouraged to solve the following easy exercise: suppose ∫_{−∞}^{∞} |f(x)|² dx = 1 and f is ...
Uncertainty Principles and Fourier Analysis
Indian Academy of Sciences (India)
analysis on the part of the reader. Those who are not familiar with Fourier analysis are encouraged to look up Box 1 along with [3]. (A) Heisenberg's inequality: let us measure concentration in terms of standard deviation, i.e. for a square integrable function defined on ℝ and normalized so that ∫_{−∞}^{∞} |f(x)|² dx = 1, ...
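The concentration integral ∫ (x − a)² |f(x)|² dx discussed in these two entries can be checked numerically for a Gaussian profile (an illustrative sketch; the grid, bounds and unit width are arbitrary choices):

```python
from math import pi, exp, sqrt

def gaussian_density(x, sigma=1.0):
    """|f(x)|^2 for a normalised Gaussian profile centred at 0."""
    return exp(-x * x / (2 * sigma * sigma)) / (sigma * sqrt(2 * pi))

def concentration(a, sigma=1.0, lo=-10.0, hi=10.0, n=4000):
    """Midpoint Riemann sum for the integral of (x - a)^2 |f(x)|^2 dx."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        total += (x - a) ** 2 * gaussian_density(x, sigma)
    return total * dx

# The quantity is smallest when a sits at the centre of the profile (a = 0),
# where it equals the variance sigma^2 = 1; it grows as a moves away.
assert concentration(0.0) < concentration(0.5) < concentration(1.0)
assert abs(concentration(0.0) - 1.0) < 1e-3
```

This makes concrete the remark that the more concentrated f is around a, the smaller the integral becomes.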
Generalised structures for N=1 AdS backgrounds
Energy Technology Data Exchange (ETDEWEB)
Coimbra, André [Institut für Theoretische Physik & Center for Quantum Engineering and Spacetime Research,Leibniz Universität Hannover,Appelstraße 2, 30167 Hannover (Germany); Strickland-Constable, Charles [Institut de physique théorique, Université Paris Saclay, CEA, CNRS, Orme des Merisiers, F-91191 Gif-sur-Yvette (France)
2016-11-16
We expand upon a claim made in a recent paper [http://arxiv.org/abs/1411.5721] that generic minimally supersymmetric AdS backgrounds of warped flux compactifications of Type II and M theory can be understood as satisfying a straightforward weak integrability condition in the language of E{sub d(d)}×ℝ{sup +} generalised geometry. Namely, they are spaces admitting a generalised G-structure set by the Killing spinor and with constant singlet generalised intrinsic torsion.
DEFF Research Database (Denmark)
Vezzaro, Luca; Grum, Morten
2014-01-01
An innovative and generalised approach to the integrated Real Time Control of urban drainage systems is presented. The Dynamic Overflow Risk Assessment (DORA) strategy aims to minimise the expected Combined Sewer Overflow (CSO) risk by considering (i) the water volume presently stored in the drainage system ... and their uncertainty contributed to further improving the performance of drainage systems. The results of this paper will contribute to the wider usage of global RTC methods in the management of urban drainage networks.
Asymptotic Behaviour of Total Generalised Variation
Papafitsoros, Konstantinos
2015-01-01
© Springer International Publishing Switzerland 2015. The recently introduced second order total generalised variation functional TGV²_{β,α} has been a successful regulariser for image processing purposes. Its definition involves two positive parameters α and β whose values determine the amount and the quality of the regularisation. In this paper we report on the behaviour of TGV²_{β,α} in the cases where the parameters α, β, as well as their ratio β/α, become very large or very small. Among others, we prove that for sufficiently symmetric two-dimensional data and large ratio β/α, TGV²_{β,α} regularisation coincides with total variation (TV) regularisation.
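The total variation functional that TGV²_{β,α} reduces to in the large-β/α limit has a one-line discrete form (a 1-D anisotropic sketch, illustrative only; the paper works with the full two-dimensional functionals):

```python
def total_variation(u):
    """Discrete 1-D total variation: sum of absolute neighbour differences."""
    return sum(abs(b - a) for a, b in zip(u, u[1:]))

# A monotone ramp and an oscillating version of it with the same endpoints
ramp = [0, 1, 2, 3, 4]
noisy = [0, 2, 1, 4, 3]
assert total_variation(ramp) == 4    # smooth: TV equals the total rise
assert total_variation(noisy) == 7   # oscillation inflates TV
```

Penalising this quantity is what suppresses oscillations while allowing jumps, which is the behaviour the regulariser parameters α, β tune.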
Threshold corrections, generalised prepotentials and Eichler integrals
Angelantonj, Carlo; Pioline, Boris
2015-06-12
We continue our study of one-loop integrals associated to BPS-saturated amplitudes in N=2 heterotic vacua. We compute their large-volume behaviour, and express them as Fourier series in the complexified volume, with Fourier coefficients given in terms of Niebur–Poincaré series in the complex structure modulus. The closure of Niebur–Poincaré series under modular derivatives implies that such integrals derive from holomorphic prepotentials f_n, generalising the familiar prepotential of N=2 supergravity. These holomorphic prepotentials transform anomalously under T-duality, in a way characteristic of Eichler integrals. We use this observation to compute their quantum monodromies under the duality group. We extend the analysis to modular integrals with respect to Hecke congruence subgroups, which naturally arise in compactifications on non-factorisable tori and freely-acting orbifolds. In this case, we derive new explicit results including closed-form expressions for integrals involv...
Acute generalised exanthematous pustulosis: An update
Directory of Open Access Journals (Sweden)
Abhishek De
2018-01-01
Full Text Available Acute generalised exanthematous pustulosis (AGEP) is a severe cutaneous adverse reaction and is attributed to drugs in more than 90% of cases. It is a rare disease, with an estimated incidence of 1–5 patients per million per year. The clinical manifestations are characterised by the rapid development of sterile pustular lesions, fever and leucocytosis. A number of drugs have been reported to be associated with AGEP, the most common being antibiotics. Histopathologically there are intraepidermal pustules and papillary dermal oedema with neutrophilic and eosinophilic infiltrations. Systemic involvement can be present in more severe cases. Early diagnosis with withdrawal of the causative drug is the most important step in the management. Treatment includes supportive care, avoidance of antibiotics and use of a potent topical steroid.
A generalised groundwater flow equation using the concept of non ...
African Journals Online (AJOL)
head. This generalised law and the law of conservation of mass are then used to derive a new equation for groundwater flow. Numerical solutions of this equation for various fractional orders of the derivatives are compared with experimental data and the Barker generalised radial flow model for which a fractal dimension for ...
Location of collinear equilibrium points in the generalised ...
African Journals Online (AJOL)
We have discussed the location of collinear equilibrium points in the generalised photogravitational restricted three-body problem. The problem is generalised in the sense that both primaries are oblate spheroids. They are sources of radiation as well. We have found the solution for the location of the collinear point L1. We found ...
Li, Heling; Ren, Jinxiu; Wang, Wenwei; Yang, Bin; Shen, Hongjun
2018-02-01
Using the semi-classical (Thomas-Fermi) approximation, the thermodynamic properties of ideal Fermi gases in a harmonic potential in an n-dimensional space are studied under the generalized uncertainty principle (GUP). The mean particle number, internal energy, heat capacity and other thermodynamic variables of the Fermi system are calculated analytically. Then, analytical expressions of the mean particle number, internal energy, heat capacity, chemical potential, Fermi energy, ground state energy and amendments of the GUP are obtained at low temperatures. The influence of both the GUP and the harmonic potential on the thermodynamic properties of a copper-electron gas and other systems with higher electron densities are studied numerically at low temperatures. We find: (1) When the GUP is considered, the influence of the harmonic potential is very much larger, and the amendments produced by the GUP increase by eight to nine orders of magnitude compared to when no external potential is applied to the electron gas. (2) The larger the particle density, or the smaller the particle masses, the bigger the influence of the GUP. (3) The effect of the GUP increases with the increase in the spatial dimensions. (4) The amendments of the chemical potential, Fermi energy and ground state energy increase with an increase in temperature, while the heat capacity decreases. T F0 is the Fermi temperature of the ideal Fermi system in a harmonic potential. When the temperature is lower than a certain value (0.22 times T F0 for the copper-electron gas, and this value decreases with increasing electron density), the amendment to the internal energy is positive, however, the amendment decreases with increasing temperature. When the temperature increases to the value, the amendment is zero, and when the temperature is higher than the value, the amendment to the internal energy is negative and the absolute value of the amendment increases with increasing temperature. (5) When electron
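As a baseline for the quantities discussed in this abstract, the mean particle number of an ideal (GUP-free) Fermi gas in a one-dimensional harmonic trap can be sketched by summing the Fermi-Dirac occupations over oscillator levels; this is illustrative only, and the paper's n-dimensional, GUP-corrected expressions go beyond it (units with ħω = k_B = 1 are an assumption of the sketch):

```python
from math import exp

def fermi_dirac(e, mu, kT):
    """Mean occupation of a single-particle level at energy e."""
    x = (e - mu) / kT
    if x > 500:          # avoid overflow deep in the empty tail
        return 0.0
    return 1.0 / (exp(x) + 1.0)

def mean_particle_number(mu, kT, hw=1.0, levels=10000):
    """Ideal Fermi gas in a 1-D harmonic trap: levels e_k = hw * (k + 1/2)."""
    return sum(fermi_dirac(hw * (k + 0.5), mu, kT) for k in range(levels))

# Low temperature: N approaches the number of levels below mu (here 100),
# i.e. the trap fills up to the Fermi level as in the degenerate limit.
N_cold = mean_particle_number(mu=100.0, kT=0.01)
assert abs(N_cold - 100) < 0.5
```

GUP and dimensionality corrections would modify the level density entering this sum, which is how the amendments discussed above arise.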
Deformations of the generalised Picard bundle
International Nuclear Information System (INIS)
Biswas, I.; Brambila-Paz, L.; Newstead, P.E.
2004-08-01
Let X be a nonsingular algebraic curve of genus g ≥ 3, and let M_ξ denote the moduli space of stable vector bundles of rank n ≥ 2 and degree d with fixed determinant ξ over X such that n and d are coprime. We assume that if g = 3 then n ≥ 4 and if g = 4 then n ≥ 3, and suppose further that n₀, d₀ are integers such that n₀ ≥ 1 and n d₀ + n₀ d > n n₀ (2g − 2). Let E be a semistable vector bundle over X of rank n₀ and degree d₀. The generalised Picard bundle W_ξ(E) is by definition the vector bundle over M_ξ defined by the direct image p_{M_ξ *}(U_ξ ⊗ p_X^* E), where U_ξ is a universal vector bundle over X × M_ξ. We obtain an inversion formula allowing us to recover E from W_ξ(E) and show that the space of infinitesimal deformations of W_ξ(E) is isomorphic to H¹(X, End(E)). This construction gives a locally complete family of vector bundles over M_ξ parametrised by the moduli space M(n₀, d₀) of stable bundles of rank n₀ and degree d₀ over X. If gcd(n₀, d₀) = 1 and W_ξ(E) is stable for all E ∈ M(n₀, d₀), the construction determines an isomorphism from M(n₀, d₀) to a connected component M₀ of a moduli space of stable sheaves over M_ξ. This applies in particular when n₀ = 1, in which case M₀ is isomorphic to the Jacobian J of X as a polarised variety. The paper as a whole is a generalisation of results of Kempf and Mukai on Picard bundles over J, and is also related to a paper of Tyurin on the geometry of moduli of vector bundles. (author)
Generalised derived limits for radioisotopes of plutonium
International Nuclear Information System (INIS)
Simmonds, J.R.; Harrison, N.T.; Linsley, G.S.
1982-01-01
Generalised Derived Limits (GDLs) are evaluated for plutonium isotopes in materials from the terrestrial and aquatic environments and for discharge to atmosphere. They are intended for use as convenient reference levels against which the results of environmental monitoring can be compared and atmospheric discharges assessed. GDLs are calculated using assumptions concerning the habits and location of the critical group of exposed individuals in the population. They are intended for use when the environmental contamination or discharge to atmosphere is less than about 5% of the GDL. If the level of environmental contamination or discharge to the atmosphere exceeds this percentage of the GDL it does not necessarily mean that the dose equivalents to members of the public are approaching the dose equivalent limit. It is rather an indication that it may be appropriate to obtain a more specific derived limit for the particular situation by reviewing the values of the parameters involved in the calculation. GDL values are specified for plutonium radionuclides in air, water, soil, sediments and various foodstuffs derived from the terrestrial and aquatic environments. GDLs are also given for plutonium radionuclides on terrestrial surfaces and for their discharge to atmosphere. (author)
Generalised derived limits for radioisotopes of iodine
International Nuclear Information System (INIS)
Hughes, J.S.; Haywood, S.M.; Simmonds, J.R.
1984-04-01
Generalised Derived Limits (GDLs) are evaluated for iodine-125,129,131,132,133,134,135 in selected materials from the terrestrial and aquatic environments and for discharge to atmosphere. They are intended for use as convenient reference levels against which the results of environmental monitoring can be compared and atmospheric discharges assessed. GDLs are intended for use when the environmental contamination or discharge to atmosphere is less than about 5% of the GDL. If the level of environmental contamination or discharge to the atmosphere exceeds this percentage of the GDL it does not necessarily mean that the dose equivalents to members of the public are approaching the dose equivalent limit. It is rather an indication that it may be appropriate to obtain a more specific derived limit for the particular situation by reviewing the values of the parameters involved in the calculation. GDL values are specified for iodine radionuclides in water, soil, grass, sediments and various foodstuffs derived from the terrestrial and aquatic environments. GDLs are also given for iodine radionuclides on terrestrial surfaces and for their discharge to atmosphere. (author)
Threshold corrections, generalised prepotentials and Eichler integrals
Directory of Open Access Journals (Sweden)
Carlo Angelantonj
2015-08-01
Full Text Available We continue our study of one-loop integrals associated to BPS-saturated amplitudes in N=2 heterotic vacua. We compute their large-volume behaviour, and express them as Fourier series in the complexified volume, with Fourier coefficients given in terms of Niebur–Poincaré series in the complex structure modulus. The closure of Niebur–Poincaré series under modular derivatives implies that such integrals derive from holomorphic prepotentials fn, generalising the familiar prepotential of N=2 supergravity. These holomorphic prepotentials transform anomalously under T-duality, in a way characteristic of Eichler integrals. We use this observation to compute their quantum monodromies under the duality group. We extend the analysis to modular integrals with respect to Hecke congruence subgroups, which naturally arise in compactifications on non-factorisable tori and freely-acting orbifolds. In this case, we derive new explicit results including closed-form expressions for integrals involving the Γ₀(N) Hauptmodul, a full characterisation of holomorphic prepotentials including their quantum monodromies, as well as concrete formulæ for holomorphic Yukawa couplings.
Accept & Reject Statement-Based Uncertainty Models
E. Quaeghebeur (Erik); G. de Cooman; F. Hermans (Felienne)
2015-01-01
We develop a framework for modelling and reasoning with uncertainty based on accept and reject statements about gambles. It generalises the frameworks found in the literature based on statements of acceptability, desirability, or favourability and clarifies their relative position.
Abad, A
2015-09-15
The purpose of this paper is to introduce an environmental generalised productivity indicator and its ratio-based counterpart. The innovative environmental generalised total factor productivity measures inherit the basic structure of both Hicks-Moorsteen productivity index and Luenberger-Hicks-Moorsteen productivity indicator. This methodological contribution shows that these new environmental generalised total factor productivity measures yield the earlier standard Hicks-Moorsteen index and Luenberger-Hicks-Moorsteen indicator, as well as environmental performance index, as special cases. Copyright © 2015 Elsevier Ltd. All rights reserved.
Supersymmetric backgrounds, the Killing superalgebra, and generalised special holonomy
Energy Technology Data Exchange (ETDEWEB)
Coimbra, André [Institut des Hautes Études Scientifiques, Le Bois-Marie,35 route de Chartres, F-91440 Bures-sur-Yvette (France); Strickland-Constable, Charles [Institut des Hautes Études Scientifiques, Le Bois-Marie,35 route de Chartres, F-91440 Bures-sur-Yvette (France); Institut de physique théorique, Université Paris Saclay, CEA, CNRS,Orme des Merisiers, F-91191 Gif-sur-Yvette (France)
2016-11-10
We prove that, for M theory or type II, generic Minkowski flux backgrounds preserving N supersymmetries in dimensions D≥4 correspond precisely to integrable generalised G{sub N} structures, where G{sub N} is the generalised structure group defined by the Killing spinors. In other words, they are the analogues of special holonomy manifolds in E{sub d(d)}×ℝ{sup +} generalised geometry. In establishing this result, we introduce the Kosmann-Dorfman bracket, a generalisation of Kosmann’s Lie derivative of spinors. This allows us to write down the internal sector of the Killing superalgebra, which takes a rather simple form and whose closure is the key step in proving the main result. In addition, we find that the eleven-dimensional Killing superalgebra of these backgrounds is necessarily the supertranslational part of the N-extended super-Poincaré algebra.
Exceptional generalised geometry for massive IIA and consistent reductions
Energy Technology Data Exchange (ETDEWEB)
Cassani, Davide; Felice, Oscar de; Petrini, Michela [LPTHE, Sorbonne Universités UPMC Paris 06, CNRS,4 place Jussieu, F-75005, Paris (France); Strickland-Constable, Charles [Institut de physique théorique, Université Paris Saclay, CEA, CNRS,Orme des Merisiers, F-91191 Gif-sur-Yvette (France); Institut des Hautes Études Scientifiques, Le Bois-Marie,35 route de Chartres, F-91440 Bures-sur-Yvette (France); Waldram, Daniel [Department of Physics, Imperial College London,Prince Consort Road, London, SW7 2AZ (United Kingdom)
2016-08-10
We develop an exceptional generalised geometry formalism for massive type IIA supergravity. In particular, we construct a deformation of the generalised Lie derivative, which generates the type IIA gauge transformations as modified by the Romans mass. We apply this new framework to consistent Kaluza-Klein reductions preserving maximal supersymmetry. We find a generalised parallelisation of the exceptional tangent bundle on S{sup 6}, and from this reproduce the consistent truncation ansatz and embedding tensor leading to dyonically gauged ISO(7) supergravity in four dimensions. We also discuss closely related hyperboloid reductions, yielding a dyonic ISO(p,7−p) gauging. Finally, while for vanishing Romans mass we find a generalised parallelisation on S{sup d}, d=4,3,2, leading to a maximally supersymmetric reduction with gauge group SO(d+1) (or larger), we provide evidence that an analogous reduction does not exist in the massive theory.
Generalised Scherk-Schwarz reductions from gauged supergravity
Inverso, Gianluca
2017-12-01
A procedure is described to construct generalised Scherk-Schwarz uplifts of gauged supergravities. The internal manifold, fluxes, and consistent truncation Ansatz are all derived from the embedding tensor of the lower-dimensional theory. We first describe the procedure to construct generalised Leibniz parallelisable spaces where the vector components of the frame are embedded in the adjoint representation of the gauge group, as specified by the embedding tensor. This allows us to recover the generalised Scherk-Schwarz reductions known in the literature and to prove a no-go result for the uplift of ω-deformed SO( p, q) gauged maximal supergravities. We then extend the construction to arbitrary generalised Leibniz parallelisable spaces, which turn out to be torus fibrations over manifolds in the class above.
Rational first integrals of geodesic equations and generalised hidden symmetries
International Nuclear Information System (INIS)
Aoki, Arata; Houri, Tsuyoshi; Tomoda, Kentaro
2016-01-01
We discuss novel generalisations of Killing tensors, which are introduced by considering rational first integrals of geodesic equations. We introduce the notion of inconstructible generalised Killing tensors, which cannot be constructed from ordinary Killing tensors. Moreover, we introduce inconstructible rational first integrals, which are constructed from inconstructible generalised Killing tensors, and provide a method for checking the inconstructibility of a rational first integral. Using the method, we show that the rational first integral of the Collinson–O’Donnell solution is not inconstructible. We also provide several examples of metrics admitting an inconstructible rational first integral in two and four dimensions, by using the Maciejewski–Przybylska system. Furthermore, we attempt to generalise other hidden symmetries such as Killing–Yano tensors. (paper)
Towards a 'pointless' generalisation of Yang-Mills theory
International Nuclear Information System (INIS)
Chan Hongmo; Tsou Sheungtsun
1989-05-01
We examine some generalisations in physical concepts of gauge theories, leading towards a scenario corresponding to non-commutative geometry, where the concept of locality loses its usual meaning of being associated with points on a base manifold and becomes intertwined with the concept of internal symmetry, suggesting thereby a gauge theory of extended objects. Examples are given where such generalised gauge structures can be realised, in particular that of string theory. (author)
Directory of Open Access Journals (Sweden)
Vernon Cooray
2017-02-01
Full Text Available Recently, we published two papers in this journal. One of the papers dealt with the action of the radiation fields generated by a traveling-wave element and the other dealt with the momentum transferred by the same radiation fields and their connection to the time energy uncertainty principle. The traveling-wave element is defined as a conductor through which a current pulse propagates with the speed of light in free space from one end of the conductor to the other without attenuation. The goal of this letter is to combine the information provided in these two papers together and make conclusive statements concerning the connection between the energy dissipated by the radiation fields, the time energy uncertainty principle and the elementary charge. As we will show here, the results presented in these two papers, when combined together, show that the time energy uncertainty principle can be applied to the classical radiation emitted by a traveling-wave element and it results in the prediction that the smallest charge associated with the current that can be detected using radiated energy as a vehicle is on the order of the elementary charge. Based on the results, an expression for the fine structure constant is obtained. This is the first time that an order of magnitude estimation of the elementary charge based on electromagnetic radiation fields is obtained. Even though the results obtained in this paper have to be considered as order of magnitude estimations, a strict interpretation of the derived equations shows that the fine structure constant or the elementary charge may change as the size or the age of the universe increases.
Kolen, B.
2013-01-01
Evacuation is a measure taken to potentially reduce the loss of life and damage to movable goods. This thesis focuses on the Netherlands as a representative urbanized delta and flood risk management. The central element of this thesis is uncertainty. Evacuation has benefits but can be costly.
International Nuclear Information System (INIS)
1997-10-01
Plans for disposing of radioactive wastes have raised a number of unique and mostly philosophical problems, mainly due to the very long time-scales which have to be considered. While there is general agreement on disposal concepts and on many aspects of a safety philosophy, consensus on a number of issues remains to be achieved. The IAEA established a subgroup under the International Radioactive Waste Management Advisory Committee (INWAC). The subgroup started its work in 1991 as the ''INWAC Subgroup on Principles and Criteria for Radioactive Waste Disposal''. With the reorganization in 1995 of IAEA senior advisory committees in the nuclear safety area, the title of the group was changed to ''Working Group on Principles and Criteria for Radioactive Waste Disposal''. The working group is intended to provide an open forum for: (1) the discussion and resolution of contentious issues, especially those with an international component, in the area of principles and criteria for safe disposal of waste; (2) the review and analysis of new ideas and concepts in the subject area; (3) establishing areas of consensus; (4) the consideration of issues related to safety principles and criteria in the IAEA's Radioactive Waste Safety Standards (RADWASS) programme; (5) the exchange of information on national safety criteria and policies for radioactive waste disposal. This is the third report of the working group and it deals with the subject of regulatory decision making under conditions of uncertainty which is a matter of concern with respect to disposal of radioactive wastes underground. 14 refs
Loibl, Wolfgang; Peters-Anders, Jan; Züger, Johann
2010-05-01
To achieve public awareness and a thorough understanding of expected climate changes and their future implications, ways have to be found to communicate model outputs to the public in a scientifically sound and easily understandable way. The newly developed Climate Twins tool tries to fulfil these requirements via an intuitively usable web application, which compares spatial patterns of current climate with future climate patterns derived from regional climate model results. To get a picture of the implications of future climate in an area of interest, users may click on a certain location within an interactive map with underlying future climate information. A second map depicts the matching Climate Twin areas according to current climate conditions. In this way scientific output can be communicated to the public, allowing climate change to be experienced through comparison with well-known real-world conditions. Identifying climatic coincidence seems a simple exercise, but the accuracy and applicability of the similarity identification depend very much on the selection of climate indicators, similarity conditions and uncertainty ranges. Too many indicators representing various climate characteristics and too narrow uncertainty ranges will identify little or no area as regions with similar climate, while too few indicators and too wide uncertainty ranges will flag overly large regions as climatically similar, which may not be correct. Similarity cannot be explored simply by comparing mean values or by calculating correlation coefficients. As climate change triggers an alteration of various indicators, such as maxima, minima, variation magnitude, frequency of extreme events etc., the identification of appropriate similarity conditions is a crucial question to be solved. For Climate Twins identification, it is necessary to find the right balance of indicators, similarity conditions and uncertainty ranges; otherwise the results will be too vague when conducting a…
Collision entropy and optimal uncertainty
Bosyk, G. M.; Portesi, M.; Plastino, A.
2011-01-01
We propose an alternative measure of quantum uncertainty for pairs of arbitrary observables in the two-dimensional case, in terms of collision entropies. We derive the optimal lower bound for this entropic uncertainty relation, which results in an analytic function of the overlap of the corresponding eigenbases. We also obtain the minimum-uncertainty states. We compare our relation with other formulations of the uncertainty principle.
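The collision (Rényi-2) entropy at the heart of this relation is easy to explore numerically. The sketch below is not from the paper; the basis, angle and state grid are illustrative choices. It scans pure qubit states for the minimum of the entropy sum for a pair of two-dimensional observables:

```python
import numpy as np

def collision_entropy(p):
    """Renyi-2 (collision) entropy in bits: H2(p) = -log2(sum_i p_i^2)."""
    return -np.log2(np.sum(np.asarray(p) ** 2))

# Two qubit observables: the computational basis and a basis rotated by theta.
theta = np.pi / 4  # illustrative choice of eigenbasis overlap
basis_a = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
c, s = np.cos(theta / 2), np.sin(theta / 2)
basis_b = [np.array([c, s]), np.array([-s, c])]

# Scan pure states |psi> = (cos t, e^{i phi} sin t) for the minimum of
# H2(A|psi) + H2(B|psi); the true optimal bound is the infimum over states.
best = np.inf
for t in np.linspace(0, np.pi, 200):
    for phi in np.linspace(0, 2 * np.pi, 200):
        psi = np.array([np.cos(t), np.exp(1j * phi) * np.sin(t)])
        pa = [abs(np.vdot(a, psi)) ** 2 for a in basis_a]
        pb = [abs(np.vdot(b, psi)) ** 2 for b in basis_b]
        best = min(best, collision_entropy(pa) + collision_entropy(pb))

print(f"numerical minimum of H2(A) + H2(B): {best:.4f} bits")
```

The grid minimum approximates the state-independent bound the abstract refers to; a finer scan or analytic minimisation would tighten it.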
Reducing the generalised Sudoku problem to the Hamiltonian cycle problem
Directory of Open Access Journals (Sweden)
Michael Haythorpe
2016-12-01
The generalised Sudoku problem with N symbols is known to be NP-complete, and hence is equivalent to any other NP-complete problem, even for the standard restricted version where N is a perfect square. In particular, generalised Sudoku is equivalent to the classical Hamiltonian cycle problem. A constructive algorithm is given that reduces generalised Sudoku to the Hamiltonian cycle problem, where the resultant instance of the Hamiltonian cycle problem is sparse and has O(N³) vertices. The Hamiltonian cycle problem instance so constructed is a directed graph, and so a (known) conversion to the undirected Hamiltonian cycle problem is also provided so that it can be submitted to the best heuristics. A simple algorithm for obtaining the valid Sudoku solution from the Hamiltonian cycle is provided. Techniques to reduce the size of the resultant graph are also discussed.
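The directed-to-undirected conversion mentioned in the abstract is, in its standard textbook form, the vertex-splitting gadget. A minimal sketch of that gadget follows (my own illustration, not the paper's construction), with a brute-force Hamiltonicity check usable only on tiny graphs:

```python
from itertools import permutations

def directed_to_undirected(arcs):
    """Classical gadget: split each vertex v into (v,'in'), (v,'mid'), (v,'out');
    a directed arc u->v becomes the undirected edge (u,'out')-(v,'in')."""
    edges = set()
    verts = {u for a in arcs for u in a}
    for v in verts:
        edges.add(frozenset([(v, 'in'), (v, 'mid')]))
        edges.add(frozenset([(v, 'mid'), (v, 'out')]))
    for u, v in arcs:
        edges.add(frozenset([(u, 'out'), (v, 'in')]))
    return edges

def has_ham_cycle(edges):
    """Brute-force Hamiltonian cycle test (exponential; tiny graphs only)."""
    nodes = sorted({v for e in edges for v in e})
    adj = {v: set() for v in nodes}
    for e in edges:
        a, b = tuple(e)
        adj[a].add(b)
        adj[b].add(a)
    start = nodes[0]
    for perm in permutations(nodes[1:]):
        cyc = [start, *perm, start]
        if all(cyc[i + 1] in adj[cyc[i]] for i in range(len(cyc) - 1)):
            return True
    return False

# A directed 3-cycle has a Hamiltonian cycle; its undirected image must too.
assert has_ham_cycle(directed_to_undirected([('a', 'b'), ('b', 'c'), ('c', 'a')]))
# Reversing one arc destroys the directed cycle, and the undirected one.
assert not has_ham_cycle(directed_to_undirected([('a', 'b'), ('c', 'b'), ('c', 'a')]))
```

The gadget preserves Hamiltonicity in both directions, which is why the paper can hand its directed instance to undirected-cycle heuristics.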
Thoracic involvement in generalised lymphatic anomaly (or lymphangiomatosis
Directory of Open Access Journals (Sweden)
Francesca Luisi
2016-06-01
Generalised lymphatic anomaly (GLA), also known as lymphangiomatosis, is a rare disease caused by congenital abnormalities of lymphatic development. It usually presents in childhood but can also be diagnosed in adults. GLA encompasses a wide spectrum of clinical manifestations ranging from single-organ involvement to generalised disease. Given the rarity of the disease, most of the information regarding it comes from case reports. To date, no clinical trials concerning treatment are available. This review focuses on thoracic GLA and summarises possible diagnostic and therapeutic approaches.
Uncertainty and complementarity in axiomatic quantum mechanics
International Nuclear Information System (INIS)
Lahti, P.J.
1980-01-01
An investigation of the uncertainty principle and the complementarity principle is carried out. The physical content of these principles and their representation in the conventional Hilbert space formulation of quantum mechanics forms a natural starting point. Thereafter, a more general axiomatic framework for quantum mechanics is presented, namely a probability function formulation of the theory. Two extra axioms are stated, reflecting the ideas of the uncertainty principle and the complementarity principle, respectively. The quantal features of these axioms are explicated. (author)
A note on a generalisation of Weyl's theory of gravitation
International Nuclear Information System (INIS)
Dereli, T.; Tucker, R.W.
1982-01-01
A scale-invariant gravitational theory due to Bach and Weyl is generalised by the inclusion of space-time torsion. The difference between the arbitrary and zero torsion constrained variations of the Weyl action is elucidated. Conformal rescaling properties of the gravitational fields are discussed. A new class of classical solutions with torsion is presented. (author)
A ten-year histopathological study of generalised lymphadenopathy ...
African Journals Online (AJOL)
This study was undertaken to examine the histopathology of generalised lymphadenopathy in India, as well as the demographics of the study population. Method: This study was conducted over a period of 10 years (August 1997-July 2007), of which eight years were retrospective, from August 1997-July 2005, and two years ...
A ten-year histopathological study of generalised lymphadenopathy ...
African Journals Online (AJOL)
2010-07-31
… non-Hodgkin's lymphoma, and 18 cases (7.37%) were metastatic malignancy. Conclusion: In this study, the most common cause of generalised lymphadenopathy was granulomatous lymphadenitis, followed by reactive lymphadenitis. Among the neoplastic lesions, metastatic malignancy accounted for ...
Gait analysis of adults with generalised joint hypermobility
DEFF Research Database (Denmark)
Simonsen, Erik B; Tegner, Heidi; Alkjær, Tine
2012-01-01
BACKGROUND: The majority of adults with Generalised Joint Hypermobility experience symptoms such as pain and joint instability, which is likely to influence their gait pattern. Accordingly, the purpose of the present project was to perform a biomechanical gait analysis on a group of patients...
Generalisation of language and knowledge models for corpus analysis
Loss, Anton
2012-01-01
This paper takes a new look at language and knowledge modelling for corpus linguistics. Using ideas of Chaitin, a line of argument is made against language/knowledge separation in Natural Language Processing. A simple model that generalises approaches to language and knowledge is proposed. One hypothetical consequence of this model is Strong AI.
Travelling wave solutions of the (2 + 1)-dimensional generalised time ...
Indian Academy of Sciences (India)
Youwei Zhang
2018-02-09
Keywords: time-fractional Hirota equation; fractional complex transform; complete discrimination system; tanh-expansion; travelling wave. PACS Nos 02.30.Jr; 05.45.Yv; 04.20.Jb. We consider the solution of the (2 + 1)-dimensional generalised time-fractional Hirota equation i ∂_t^α u + u_xy + ...
Equilibrium points in the generalised photogravitational non-planar ...
African Journals Online (AJOL)
We generalised the photogravitational non-planar restricted three body problem by considering the smaller primary as an oblate spheroid. With both the primaries radiating, we located the equilibrium points which lie outside the orbital plane, contrary to the classical case. Besides finding the equations of motion of the ...
Compactification of generalised Jacobians
Indian Academy of Sciences (India)
Next we study the infinitesimal deformation of torsion-free sheaves. Let X be a projective integral Gorenstein curve (a curve X as in Proposition III.1.7 above is easily seen to be Gorenstein). Let F be a torsion-free coherent O_X-module and F. an infinitesimal deformation of F ...
Adapting Metacognitive Therapy to Children with Generalised Anxiety Disorder
DEFF Research Database (Denmark)
Esbjørn, Barbara Hoff; Normann, Nicoline; Reinholdt-Dunne, Marie Louise
2015-01-01
… (MCT-c) for children with generalised anxiety disorder (GAD), and to create suggestions for an adapted manual. The adaptation was based on the structure and techniques used in MCT for adults with GAD. However, the developmental limitations of children were taken into account. For instance, therapy was aided with worksheets, practical...
Young Indigenous Students en Route to Generalising Growing Patterns
Miller, Jodie
2016-01-01
This paper presents a hypothesised learning trajectory for a Year 3 Indigenous student en route to generalising growing patterns. The trajectory emerged from data collected across a teaching experiment (students n = 18; including a pre-test and three 45-minute mathematics lessons) and clinical interviews (n = 3). A case study of one student is…
Generalised time functions and finiteness of the Lorentzian distance
Rennie, Adam; Whale, Ben E.
2014-01-01
We show that finiteness of the Lorentzian distance is equivalent to the existence of generalised time functions with gradient uniformly bounded away from light cones. To derive this result we introduce new techniques to construct and manipulate achronal sets. As a consequence of these techniques we obtain a functional description of the Lorentzian distance extending the work of Franco and Moretti.
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
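The Monte Carlo principle for simulating experiments, as described above, can be sketched with a toy measurement model (the model and all numerical values are invented for illustration; this is not an example taken from the book):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # number of Monte Carlo draws

# Toy measurement model (illustrative values): resistance R = V / I, with
# V and I measured with independent Gaussian standard uncertainties.
V = rng.normal(5.0, 0.02, N)     # volts
I = rng.normal(0.100, 0.001, N)  # amperes
R = V / I

# A '95 percent interval of measurement uncertainty' read directly off the
# simulated distribution of the measurand.
lo, hi = np.percentile(R, [2.5, 97.5])
print(f"R = {R.mean():.2f} ohm, 95% interval [{lo:.2f}, {hi:.2f}] ohm")
```

The interval falls out of the empirical distribution rather than a linearised propagation formula, which is the attraction of the Monte Carlo approach.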
Generalisability of an online randomised controlled trial: an empirical analysis.
Wang, Cheng; Mollan, Katie R; Hudgens, Michael G; Tucker, Joseph D; Zheng, Heping; Tang, Weiming; Ling, Li
2018-02-01
Investigators increasingly use online methods to recruit participants for randomised controlled trials (RCTs). However, the extent to which participants recruited online represent populations of interest is unknown. We evaluated how generalisable an online RCT sample is to men who have sex with men in China. Inverse probability of sampling weights (IPSW) and the G-formula were used to examine the generalisability of an online RCT using model-based approaches. Online RCT data and national cross-sectional study data from China were analysed to illustrate the process of quantitatively assessing generalisability. The RCT (identifier NCT02248558) randomly assigned participants to a crowdsourced or health marketing video for promotion of HIV testing. The primary outcome was self-reported HIV testing within 4 weeks, with a non-inferiority margin of -3%. In the original online RCT analysis, the estimated difference in proportions of HIV tested between the two arms (crowdsourcing and health marketing) was 2.1% (95% CI, -5.4% to 9.7%). The hypothesis that the crowdsourced video was not inferior to the health marketing video in promoting HIV testing was not demonstrated. The IPSW and G-formula estimated differences were -2.6% (95% CI, -14.2 to 8.9) and 2.7% (95% CI, -10.7 to 16.2), with both approaches also not establishing non-inferiority. Conducting a generalisability analysis of an online RCT is feasible. Examining the generalisability of online RCTs is an important step before an intervention is scaled up. Trial registration: NCT02248558.
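The IPSW idea used in this abstract can be sketched on synthetic data (all numbers are invented; this is not the trial's data): sampling probabilities that depend on a covariate bias the naive sample mean, and weighting by their inverses recovers the population quantity.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# Hypothetical target population: covariate z ~ Bernoulli(0.3); a binary
# outcome (e.g. tested for HIV) whose probability depends on z.
z_pop = rng.binomial(1, 0.3, N)
y_pop = rng.binomial(1, np.where(z_pop == 1, 0.7, 0.4))

# The online sample oversamples the z == 1 stratum (illustrative rates).
p_sample = np.where(z_pop == 1, 0.02, 0.005)
in_trial = rng.binomial(1, p_sample).astype(bool)
y, w = y_pop[in_trial], 1.0 / p_sample[in_trial]

naive = y.mean()                  # biased toward the oversampled stratum
ipsw = np.sum(w * y) / np.sum(w)  # reweighted back to the population
truth = y_pop.mean()
print(f"naive {naive:.3f}  IPSW {ipsw:.3f}  population {truth:.3f}")
```

The naive estimate lands near the oversampled stratum's rate, while the IPSW estimate tracks the population mean, which is the sense in which such weights assess and correct generalisability.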
Change and uncertainty in quantum systems
International Nuclear Information System (INIS)
Franson, J.D.
1996-01-01
A simple inequality shows that any change in the expectation value of an observable quantity must be associated with some degree of uncertainty. This inequality is often more restrictive than the Heisenberg uncertainty principle. copyright 1996 The American Physical Society
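The inequality described, a Robertson-type bound on the rate of change of an expectation value, can be checked on a qubit (hbar = 1; the state and operators below are an illustrative choice, not taken from the paper):

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
H, A = sx, sz                              # Hamiltonian and observable
psi = np.array([1.0, 1.0j]) / np.sqrt(2)   # illustrative pure state

def expval(op, s):
    return np.real(np.vdot(s, op @ s))

def spread(op, s):
    return np.sqrt(expval(op @ op, s) - expval(op, s) ** 2)

# Ehrenfest theorem (hbar = 1): d<A>/dt = <i[H, A]>
rate = expval(1j * (H @ A - A @ H), psi)

# Robertson's inequality applied to H and A bounds the rate of change:
# |d<A>/dt| <= 2 * (Delta H) * (Delta A)
bound = 2 * spread(H, psi) * spread(A, psi)
print(f"|d<A>/dt| = {abs(rate):.3f} <= 2*dH*dA = {bound:.3f}")
```

For this particular state the bound is saturated, illustrating that any change in an expectation value is necessarily accompanied by uncertainty in both the energy and the observable.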
A Generalised Fault Protection Structure Proposed for Uni-grounded Low-Voltage AC Microgrids
Bui, Duong Minh; Chen, Shi-Lin; Lien, Keng-Yu; Jiang, Jheng-Lun
2016-04-01
This paper presents three main configurations of uni-grounded low-voltage AC microgrids. Transient situations of a uni-grounded low-voltage (LV) AC microgrid (MG) are simulated through various fault tests and operation transition tests between grid-connected and islanded modes. Based on the transient simulation results, available fault protection methods are proposed for the main and back-up protection of a uni-grounded AC microgrid. In addition, the concept of a generalised fault protection structure for uni-grounded LVAC MGs is introduced. The main contributions of the paper are: (i) the definition of different uni-grounded LVAC MG configurations; (ii) an analysis of the transient responses of a uni-grounded LVAC microgrid through line-to-line faults, line-to-ground faults, three-phase faults and a microgrid operation transition test; (iii) proposed fault protection methods for uni-grounded microgrids, such as non-directional or directional overcurrent protection, under/over-voltage protection, differential current protection, voltage-restrained overcurrent protection, and other fault protection principles not based on phase currents and voltages (e.g. total harmonic distortion detection of currents and voltages, use of sequence components of current and voltage, 3I0 or 3V0 components); and (iv) a generalised fault protection structure with six individual protection zones, suitable for different uni-grounded AC MG configurations.
Free Fall and the Equivalence Principle Revisited
Pendrill, Ann-Marie
2017-01-01
Free fall is commonly discussed as an example of the equivalence principle, in the context of a homogeneous gravitational field, which is a reasonable approximation for small test masses falling moderate distances. Newton's law of gravity provides a generalisation to larger distances, and also brings in an inhomogeneity in the gravitational field.…
Directory of Open Access Journals (Sweden)
Joan R. Villalbí
2007-04-01
The exercise of health authority is a basic public health service. Part of the responsibility of public health managers is to ensure compliance with regulations. These are developed when circumstances lead certain risks to be considered inadmissible. Mostly, the exercise of health authority deals with the relatively systematic application of detailed reference norms, although there is always some uncertainty, as shown by the frequent adoption of cautionary measures by health inspectors applying the precautionary principle. However, epidemiological surveillance periodically raises situations in which human health is affected and there is no reference regulation; in these situations, health authorities must act according to their own criteria, weighing the risks of intervention against those of non-intervention. In this article, we present 3 such scenarios: the use of coercion in the treatment of patients with smear-positive tuberculosis; the regulation of activities involving soybeans that pose asthma risks; and the limitation of the professional practice of a physician infected with the human immunodeficiency virus.
Generalised solutions for fully nonlinear PDE systems and existence-uniqueness theorems
Katzourakis, Nikos
2017-07-01
We introduce a new theory of generalised solutions which applies to fully nonlinear PDE systems of any order and allows for merely measurable maps as solutions. This approach bypasses the standard problems arising by the application of Distributions to PDEs and is not based on either integration by parts or on the maximum principle. Instead, our starting point builds on the probabilistic representation of derivatives via limits of difference quotients in the Young measures over a toric compactification of the space of jets. After developing some basic theory, as a first application we consider the Dirichlet problem and we prove existence-uniqueness-partial regularity of solutions to fully nonlinear degenerate elliptic 2nd order systems and also existence of solutions to the ∞-Laplace system of vectorial Calculus of Variations in L∞.
Generalising the logistic map through the q-product
International Nuclear Information System (INIS)
Pessoa, R W S; Borges, E P
2011-01-01
We investigate a generalisation of the logistic map of the form x_{n+1} = 1 - a x_n ⊗_{q_map} x_n (-1 ≤ x_n ≤ 1), where ⊗_{q_map} denotes the q-product; the usual logistic map is recovered in the limit q_map → 1. The generalisation of this (and other) algebraic operators has been widely used within the nonextensive statistical mechanics context (see C. Tsallis, Introduction to Nonextensive Statistical Mechanics, Springer, NY, 2009). We focus the analysis on q_map > 1 at the edge of chaos, particularly at the first critical point a_c, which depends on the value of q_map. Bifurcation diagrams, sensitivity to initial conditions, fractal dimension and rate of entropy growth are evaluated at a_c(q_map), and connections with nonextensive statistical mechanics are explored.
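A hedged sketch of the q-product and the generalised map follows. The sign convention (applying the q-product to |x_n|) and the parameter values are my assumptions for illustration; the paper should be consulted for the exact definition.

```python
def q_product(x, y, q):
    """Tsallis q-product for x, y > 0; reduces to the ordinary product as q -> 1."""
    if x <= 0.0 or y <= 0.0:
        return 0.0
    if abs(q - 1.0) < 1e-12:
        return x * y
    base = x ** (1.0 - q) + y ** (1.0 - q) - 1.0
    if base <= 0.0:
        return 0.0  # cutoff convention (not reached for the parameters used here)
    return base ** (1.0 / (1.0 - q))

# q -> 1 recovers the ordinary product
assert abs(q_product(0.5, 0.8, 1.0) - 0.4) < 1e-12

# Sketch of the generalised map x_{n+1} = 1 - a * (x_n (x)_q x_n); applying
# the q-product to |x_n| is an assumed sign convention for illustration.
def gmap(x, a, q):
    return 1.0 - a * q_product(abs(x), abs(x), q)

a, q = 1.7, 1.1  # illustrative parameters with q_map > 1
x, traj = 0.1, []
for _ in range(200):
    x = gmap(x, a, q)
    traj.append(x)
print("last values:", [round(v, 4) for v in traj[-3:]])
```

For these parameters the orbit stays inside [-1, 1], mirroring the bounded interval on which the generalised map is defined.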
Object recognition and generalisation during habituation in horses
DEFF Research Database (Denmark)
Christensen, Janne Winther; Zharkikh, Tjatjana; Chovaux, Elodie
2011-01-01
The ability of horses to habituate to frightening stimuli greatly increases safety in the horse–human relationship. A recent experiment suggested, however, that habituation to frightening visual stimuli is relatively stimulus-specific in horses and that shape and colour are important factors for object generalisation (Christensen et al., 2008). In a series of experiments, we aimed to further explore the ability of horses (n = 30, 1- and 2-year-old mares) to recognise and generalise between objects during habituation. TEST horses (n = 15) were habituated to a complex object, composed of five simple objects of varying shape and colour, whereas CONTROL horses (n = 15) were habituated to the test arena, but not to the complex object. In the first experiment, we investigated whether TEST horses subsequently reacted less to i) simple objects that were previously part of the complex object (i…
Learning and Generalisation in Neural Networks with Local Preprocessing
Kutsia, Merab
2007-01-01
We study the learning and generalisation ability of a specific two-layer feed-forward neural network and compare its properties to those of a simple perceptron. The input patterns are mapped nonlinearly onto a hidden layer, much larger than the input layer, and this mapping is either fixed or may result from an unsupervised learning process. Such preprocessing of initially uncorrelated random patterns results in correlated patterns in the hidden layer. The hidden-to-output mapping of the net...
Rare case of generalised aggressive periodontitis in the primary dentition
Spoerri, A; Signorelli, C; Erb, J; van Waes, H; Schmidlin, P R
2014-01-01
BACKGROUND Generalised aggressive periodontitis (AP) in the prepubescent age is an exceptionally rare disease in the primary dentition of otherwise healthy children. Characteristics of AP are gingival inflammation, deep periodontal pockets, bone loss, tooth mobility and even tooth loss. The most common way of treating this disease is the extraction of all the involved primary teeth. CASE REPORT A 4-year-old girl presented with signs of severe gingival inflammation. Clinical examination rev...
Generalisation of geographic information cartographic modelling and applications
Mackaness, William A; Sarjakoski, L Tiina
2011-01-01
Theoretical and Applied Solutions in Multi-Scale Mapping. Users have come to expect instant access to up-to-date geographical information, with global coverage, presented at widely varying levels of detail, as digital and paper products; customisable data that can readily be combined with other geographic information. These requirements present an immense challenge to those supporting the delivery of such services (National Mapping Agencies (NMAs), government departments, and private businesses). Generalisation of Geographic Information: Cartographic Modelling and Applications provides a detailed review…
A study of idiopathic generalised epilepsy in an Irish population.
LENUS (Irish Health Repository)
Mullins, G M
2012-02-03
Idiopathic generalised epilepsy (IGE) is subdivided into syndromes based on clinical and EEG features. PURPOSE: The aim of this study was to characterise all cases of IGE with supportive EEG abnormalities in terms of gender differences, seizure types reported, IGE syndromes, family history of epilepsy and EEG findings. We also calculated the limited-duration prevalence of IGE in our cohort. METHODS: Data on abnormal EEGs were collected retrospectively from two EEG databases at two tertiary referral centres for neurology. Clinical information was obtained from EEG request forms, standardised EEG questionnaires and the medical notes of patients. RESULTS: Two hundred and twenty-three patients met our inclusion criteria, 89 (39.9%) male and 134 (60.1%) female. Tonic-clonic seizures were the most common seizure type reported, with 162 (72.65%) having a generalised tonic-clonic seizure (GTCS) at some time. IGE with GTCS only (EGTCSA) was the most common syndrome in our cohort, being present in 94 patients (34 male, 60 female), with 42 (15 male, 27 female) patients diagnosed with juvenile myoclonic epilepsy (JME), 23 (9 male, 14 female) with juvenile absence epilepsy (JAE) and 20 (9 male, 11 female) with childhood absence epilepsy (CAE). EEG studies in all patients showed generalised epileptiform activity. CONCLUSIONS: More women than men were diagnosed with generalised epilepsy. Tonic-clonic seizures were the most common seizure type reported. EGTCSA was the most frequent syndrome seen. Gender differences were evident for JAE and JME, as previously reported, and for EGTCSA, which had not been reported to date, and reached statistical significance for EGTCSA and JME.
Decommissioning funding: ethics, implementation, uncertainties
International Nuclear Information System (INIS)
2006-01-01
This status report on Decommissioning Funding: Ethics, Implementation, Uncertainties also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). The report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems. (authors)
A Generalised Approach to Petri Nets and Algebraic Specifications
International Nuclear Information System (INIS)
Sivertsen, Terje
1998-02-01
The present report represents a continuation of the work on Petri nets and algebraic specifications. The reported research has focused on generalising the approach introduced in HWR-454, with the aim of facilitating the translation of a wider class of Petri nets into algebraic specifications. This includes autonomous Petri nets with increased descriptive power, as well as non-autonomous Petri nets allowing the modelling of systems (1) involving extensive data processing; (2) with transitions synchronized on external events; (3) whose evolutions are time dependent. The generalised approach has the important property of being modular, in the sense that the translated specifications can be gradually extended to include data processing, synchronization, and timing. The report also discusses the relative merits of state-based and transition-based specifications, and includes a non-trivial case study involving automated proofs of a large number of interrelated theorems. The examples in the report illustrate the use of the new HRP Prover. Of particular importance in this context is the automatic transformation between state-based and transition-based specifications. It is expected that the approach introduced in HWR-454 and generalised in the present report will prove useful in future work on the combination of a wide variety of specification techniques
Uncertainty and Complementarity in Axiomatic Quantum Mechanics
Lahti, Pekka J.
1980-11-01
In this work an investigation of the uncertainty principle and the complementarity principle is carried out. A study of the physical content of these principles and their representation in the conventional Hilbert space formulation of quantum mechanics forms a natural starting point for this analysis. Thereafter, a more general axiomatic framework for quantum mechanics is presented, namely a probability function formulation of the theory. In this general framework two extra axioms are stated, reflecting the ideas of the uncertainty principle and the complementarity principle, respectively. The quantal features of these axioms are explicated. The sufficiency of the state system guarantees that the observables satisfying the uncertainty principle are unbounded and noncompatible. The complementarity principle implies a non-Boolean proposition structure for the theory. Moreover, nonconstant complementary observables are always noncompatible. The uncertainty principle and the complementarity principle, as formulated in this work, are mutually independent. Some order is thus brought into the confused discussion about the interrelations of these two important principles. A comparison of the present formulations of the uncertainty principle and the complementarity principle with the Jauch formulation of the superposition principle is also given. The mutual independence of the three fundamental principles of the quantum theory is thereby revealed.
Understanding uncertainty
Lindley, Dennis V
2013-01-01
Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." (Journal of Applied Statistics). The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.
Uncertainty principles for the Cherednik transform
Indian Academy of Sciences (India)
Department of Mathematics, Faculty of Sciences of Ain Chock, University of Hassan II, Casablanca, Morocco; Department of Mathematics, Keio University at Fujisawa, 5322 Endo, Kanagawa 252-8520, Japan; School of Science and Technology, Kwansei Gakuin University, 2-1 Gakuen, Sanda-City 6691337, Japan. MS received 9 March 2011.
On quantization, the generalised Schroedinger equation and classical mechanics
International Nuclear Information System (INIS)
Jones, K.R.W.
1991-01-01
A ψ-dependent linear functional operator was defined which solves the problem of quantization in non-relativistic quantum mechanics. Weyl ordering is implemented automatically and permits the derivation of many of the quantum-to-classical correspondences. The parameter λ provides a natural C^∞ deformation of the dynamical structure of quantum mechanics via a non-linear integro-differential 'Generalised Schroedinger Equation', admitting an infinite family of soliton solutions. All these solutions are presented, and it is shown that this equation gives an exact dynamic and energetic reproduction of classical mechanics with the correct measurement-theoretic limit. 23 refs
Generalised boundary terms for higher derivative theories of gravity
Energy Technology Data Exchange (ETDEWEB)
Teimouri, Ali; Talaganis, Spyridon; Edholm, James [Consortium for Fundamental Physics, Lancaster University, North West Drive, Lancaster, LA1 4YB (United Kingdom)]; Mazumdar, Anupam [Consortium for Fundamental Physics, Lancaster University, North West Drive, Lancaster, LA1 4YB (United Kingdom); Kapteyn Astronomical Institute, University of Groningen, 9700 AV Groningen (Netherlands)]
2016-08-24
In this paper we wish to find the corresponding Gibbons-Hawking-York term for the most general quadratic in curvature gravity by using Coframe slicing within the Arnowitt-Deser-Misner (ADM) decomposition of spacetime in four dimensions. In order to make sure that the higher derivative gravity is ghost and tachyon free at a perturbative level, one requires infinite covariant derivatives, which yields a generalised covariant infinite derivative theory of gravity. We will be exploring the boundary term for such a covariant infinite derivative theory of gravity.
Generalised extreme value statistics and sum of correlated variables
Bertin, Eric; Clusel, Maxime
2006-01-01
To appear in J. Phys. A. We show that generalised extreme value statistics - the statistics of the k-th largest value among a large set of random variables - can be mapped onto a problem of random sums. This allows us to identify classes of non-identical and (generally) correlated random variables with a sum distributed according to one of the three (k-dependent) asymptotic distributions of extreme value statistics, namely the Gumbel, Fréchet and Weibull distributions. These classes, as well as t...
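The Gumbel member of this family is easy to demonstrate numerically: the centred maximum of many iid exponentials converges to a standard Gumbel distribution (a classical textbook example, not specific to this paper; sample sizes are illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 500, 20_000

# Maximum of n iid Exp(1) variables, centred by log(n): this converges to
# the standard Gumbel distribution, one of the three asymptotic laws.
maxima = rng.exponential(1.0, size=(trials, n)).max(axis=1) - np.log(n)

# Standard Gumbel: mean = Euler-Mascheroni constant, variance = pi^2 / 6.
print(f"sample mean {maxima.mean():.3f} (Gumbel: 0.577)")
print(f"sample variance {maxima.var():.3f} (Gumbel: {np.pi**2 / 6:.3f})")
```

The sample moments land close to the Gumbel values, illustrating the extreme-value limit that the paper maps onto random sums.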
Building Abelian Functions with Generalised Baker-Hirota Operators
Directory of Open Access Journals (Sweden)
Matthew England
2012-06-01
We present a new systematic method to construct Abelian functions on Jacobian varieties of plane algebraic curves. The main tool used is a symmetric generalisation of the bilinear operator defined in the work of Baker and Hirota. We give explicit formulae for multiple applications of the operators, use them to define infinite sequences of Abelian functions of a prescribed pole structure, and deduce the key properties of these functions. We apply the theory to the two canonical curves of genus three, presenting new explicit examples of vector space bases of Abelian functions. These reveal previously unseen similarities between the theories of functions associated to curves of the same genus.
Generalised Hermite–Gaussian beams and mode transformations
International Nuclear Information System (INIS)
Wang, Yi; Chen, Yujie; Zhang, Yanfeng; Chen, Hui; Yu, Siyuan
2016-01-01
Generalised Hermite–Gaussian modes (gHG modes), an extended notion of Hermite–Gaussian modes (HG modes), are formed by the summation of normal HG modes with a characteristic function α, which can be used to unite conventional HG modes and Laguerre–Gaussian modes (LG modes). An infinite number of normalised orthogonal modes can thus be obtained by modulation of the function α. The gHG mode notion provides a useful tool in the analysis of the deformation and transformation phenomena occurring in the propagation of HG and LG modes with astigmatic perturbation.
Limits of the generalised Tomimatsu-Sato gravitational fields
International Nuclear Information System (INIS)
Cosgrove, C.M.
1977-01-01
In a previous paper (Cosgrove, J. Phys. A 10:1481 (1977)), the author presented a new three-parameter family of exact asymptotically flat stationary axisymmetric vacuum solutions of Einstein's equations which contains the solutions of Kerr and Tomimatsu-Sato (TS) as special cases. In this paper, two interesting special cases of the previous family which must be constructed by a limiting process are considered. These are interpreted as a 'rotating Curzon metric' and a 'generalised extreme Kerr metric'. In addition, approximate forms of the original metrics are given for the cases of slow rotation and small deformation.
Effect Displays in R for Generalised Linear Models
Directory of Open Access Journals (Sweden)
John Fox
2003-07-01
This paper describes the implementation in R of a method for tabular or graphical display of terms in a complex generalised linear model. By complex, I mean a model that contains terms related by marginality or hierarchy, such as polynomial terms, or main effects and interactions. I call these tables or graphs effect displays. Effect displays are constructed by identifying high-order terms in a generalised linear model. Fitted values under the model are computed for each such term. The lower-order "relatives" of a high-order term (e.g., main effects marginal to an interaction) are absorbed into the term, allowing the predictors appearing in the high-order term to range over their values. The values of other predictors are fixed at typical values: for example, a covariate could be fixed at its mean or median, and a factor at its proportional distribution in the data or at equal proportions across its several levels. Variations of effect displays are also described, including representation of terms higher-order than any appearing in the model.
Cosmological principles. II. Physical principles
International Nuclear Information System (INIS)
Harrison, E.R.
1974-01-01
The discussion of cosmological principles covers the uniformity principle of the laws of physics; the gravitation and cognizability principles; and the Dirac creation, chaos, and bootstrap principles.
Quantifying soil hydraulic properties and their uncertainties by modified GLUE method
Yan, Yifan; Liu, Jianli; Zhang, Jiabao; Li, Xiaopeng; Zhao, Yongchao
2017-07-01
The nonlinear least squares algorithm is commonly used to fit evaporation experiment data and to obtain the 'optimal' soil hydraulic model parameters. But the major defects of the nonlinear least squares algorithm are the non-uniqueness of the solution to inverse problems and its inability to quantify uncertainties associated with the simulation model. In this study, these issues are examined by applying a modified generalised likelihood uncertainty estimation (GLUE) method to the calibration of a soil water retention model. Results show that nonlinear least squares gives good fits to the soil water retention curve and unsaturated hydraulic conductivity based on data observed by the Wind method. Meanwhile, the application of generalised likelihood uncertainty estimation clearly demonstrates that a much wider range of parameters can fit the observations well, so using the 'optimal' solution to predict soil water content and conductivity is very risky. In contrast, the 95% confidence interval generated by generalised likelihood uncertainty estimation quantifies well the uncertainty of the observed data. With a decrease of water content, the maximum Nash–Sutcliffe value generated by generalised likelihood uncertainty estimation performs increasingly better than that of nonlinear least squares. The 95% confidence interval quantifies the uncertainties well and provides preliminary sensitivities of the parameters.
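A minimal sketch of the GLUE recipe referenced in this record, assuming only its generic form (Monte Carlo sampling, a Nash–Sutcliffe likelihood, a behavioural threshold, and likelihood-weighted prediction bounds); the toy linear model and all thresholds are illustrative, not the study's soil hydraulic model:

```python
import random

def model(theta, x):
    """Toy linear stand-in for the simulation model: y = a*x + b."""
    a, b = theta
    return a * x + b

# Synthetic 'observations' from true parameters (2.0, 1.0) plus noise.
rng = random.Random(42)
xs = [i / 10.0 for i in range(20)]
obs = [2.0 * x + 1.0 + rng.gauss(0.0, 0.1) for x in xs]
mean_obs = sum(obs) / len(obs)

def nash_sutcliffe(sim):
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

# GLUE: Monte Carlo sampling, keep 'behavioural' parameter sets only.
behavioural = []
for _ in range(5000):
    theta = (rng.uniform(0.0, 4.0), rng.uniform(-1.0, 3.0))
    ns = nash_sutcliffe([model(theta, x) for x in xs])
    if ns > 0.9:                     # behavioural threshold (illustrative)
        behavioural.append((ns, theta))

# Likelihood-weighted 95% band for the prediction at x = 1.0 (truth: 3.0).
total = sum(ns for ns, _ in behavioural)
preds = sorted((model(t, 1.0), ns) for ns, t in behavioural)
acc, lo, hi = 0.0, None, None
for pred, ns in preds:
    acc += ns / total
    if lo is None and acc >= 0.025:
        lo = pred
    if hi is None and acc >= 0.975:
        hi = pred
print(lo, hi)
```

The spread of the band, rather than a single 'optimal' fit, is what the abstract argues decision-makers should see.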
Control configuration selection for bilinear systems via generalised Hankel interaction index array
DEFF Research Database (Denmark)
Shaker, Hamid Reza; Tahavori, Maryamsadat
2015-01-01
Decentralised and partially decentralised control strategies are very popular in practice. To come up with a suitable decentralised or partially decentralised control structure, it is important to select the appropriate input and output pairs for control design. This procedure is called control … way, an iterative method for solving the generalised Sylvester equation is proposed. The generalised cross-gramian is used to form the generalised Hankel interaction index array. The generalised Hankel interaction index array is used for control configuration selection of MIMO bilinear processes. Most importantly, since for each element of the generalised Hankel interaction index array just one generalised Sylvester equation needs to be solved, the proposed control configuration selection method is computationally more efficient than its gramian-based counterparts.
Liu, Baoding
2015-01-01
When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...
Optimising, generalising and integrating educational practice using neuroscience
Colvin, Robert
2016-07-01
Practical collaboration at the intersection of education and neuroscience research is difficult because the combined discipline encompasses both the activity of microscopic neurons and the complex social interactions of teachers and students in a classroom. Taking a pragmatic view, this paper discusses three education objectives to which neuroscience can be effectively applied: optimising, generalising and integrating instructional techniques. These objectives are characterised by: (1) being of practical importance; (2) building on existing education and cognitive research; and (3) being infeasible to address based on behavioural experiments alone. The focus of the neuroscientific aspect of collaborative research should be on the activity of the brain before, during and after learning a task, as opposed to performance of a task. The objectives are informed by literature that highlights possible pitfalls with educational neuroscience research, and are described with respect to the static and dynamic aspects of brain physiology that can be measured by current technology.
Generalised pruritus as a presentation of Graves' disease
Directory of Open Access Journals (Sweden)
Tan CE
2013-05-01
Pruritus is a lesser known symptom of hyperthyroidism, particularly in autoimmune thyroid disorders. This is a case report of a 27-year-old woman who presented with generalised pruritus at a primary care clinic. Incidental findings of tachycardia and a goiter led to the investigations of her thyroid status. The thyroid function test revealed elevated serum free T4 and suppressed thyroid stimulating hormone levels. The anti-thyroid antibodies were positive. She was diagnosed with Graves' disease and treated with carbimazole until her symptoms subsided. Graves' disease should be considered as an underlying cause for patients presenting with pruritus. A thorough history and complete physical examination are crucial in making an accurate diagnosis. Underlying causes must be determined before treating the symptoms.
Generalised two target localisation using passive monopulse radar
Jardak, Seifallah
2017-04-07
The simultaneous lobing technique, also known as the monopulse technique, has been widely used for fast target localisation and tracking purposes. Many works focused on accurately localising one or two targets lying within a narrow beam centred around the monopulse antenna boresight. In this study, a new approach is proposed, which uses the outputs of four antennas to rapidly localise two point targets present in the hemisphere. If both targets have the same elevation angle, the proposed scheme cannot detect them. To detect such targets, a second set of antennas is required. In this study, to detect two targets at generalised locations, the antenna array is divided into multiple overlapping sets, each of four antennas. Two algorithms are proposed to combine the outputs from multiple sets and improve the detection performance. Simulation results show that the algorithm is able to localise both targets with <2° mean square error in azimuth and elevation.
Visceral obesity and psychosocial stress: a generalised control theory model
Wallace, Rodrick
2016-07-01
The linking of control theory and information theory via the Data Rate Theorem and its generalisations allows for the construction of necessary-conditions statistical models of body mass regulation in the context of interaction with a complex dynamic environment. By focusing on the stress-related induction of central obesity via failure of HPA axis regulation, we explore implications for strategies of prevention and treatment. It rapidly becomes evident that individual-centred biomedical reductionism is an inadequate paradigm. Without mitigation of HPA axis or related dysfunctions arising from social pathologies of power imbalance, economic insecurity, and so on, it is unlikely that permanent changes in visceral obesity for individuals can be maintained without constant therapeutic effort, an expensive - and likely unsustainable - public policy.
The second critical density and anisotropic generalised condensation
Directory of Open Access Journals (Sweden)
M. Beau
2010-01-01
In this letter we discuss the relevance of 3D Perfect Bose gas (PBG) condensation in extremely elongated vessels for the study of anisotropic condensate coherence and the "quasi-condensate". To this end we analyse the case of exponentially anisotropic (van den Berg) boxes, where there are two critical densities ρc < ρm for a generalised Bose–Einstein Condensation (BEC). Here ρc is the standard critical density for the PBG. We consider three examples of anisotropic geometry, slabs, squared beams and "cigars", to demonstrate that the "quasi-condensate" which exists in the domain ρc < ρ < ρm is in fact the van den Berg–Lewis–Pulé generalised condensation (vdBLP-GC) of type III, with no macroscopic occupation of any mode. We show that for the slab geometry the second critical density ρm is a threshold between the quasi-two-dimensional (quasi-2D) condensate and the three-dimensional (3D) regime in which the "quasi-condensate" coexists with the standard one-mode BEC. On the other hand, in the case of squared beam and "cigar" geometries, the critical density ρm separates the quasi-1D and 3D regimes. We calculate the difference between ρc and ρm (and between the corresponding critical temperatures Tm and Tc) to show that the observed spatial anisotropy of the condensate coherence can be described by a critical exponent γ(T) related to the anisotropic ODLRO. We compare our calculations with physical results for extremely elongated traps that manifest a "quasi-condensate".
Generalised partition functions: inferences on phase space distributions
Directory of Open Access Journals (Sweden)
R. A. Treumann
2016-06-01
It is demonstrated that the statistical mechanical partition function can be used to construct various different forms of phase space distributions. This indicates that its structure is not restricted to the Gibbs–Boltzmann factor prescription, which is based on counting statistics. With the widely used replacement of the Boltzmann factor by a generalised Lorentzian (also known as the q-deformed exponential function), where κ = 1/|q − 1| with κ, q ∈ ℝ, both the kappa-Bose and kappa-Fermi partition functions are obtained in quite a straightforward way, from which the conventional Bose and Fermi distributions follow for κ → ∞. For κ ≠ ∞ these are subject to the restriction that they can be used only at temperatures far from zero. They thus, as shown earlier, have little value for quantum physics. This is reasonable, because physical κ systems imply strong correlations which are absent at zero temperature, where apart from stochastics all dynamical interactions are frozen. In the classical large-temperature limit one obtains physically reasonable κ distributions which depend on energy and momentum as well as on chemical potential. Looking for other functional dependencies, we examine whether Bessel functions can be used to obtain valid distributions. Again, and for the same reason, no Fermi and Bose distributions exist in the low-temperature limit. However, a classical Bessel–Boltzmann distribution can be constructed, which is a Bessel-modified Lorentzian distribution. Whether it makes any physical sense remains an open question; this is not investigated here. The choice of Bessel functions is motivated solely by their convergence properties and not by reference to any physical demands. This result suggests that the Gibbs–Boltzmann partition function is fundamental not only to Gibbs–Boltzmann statistics but also to a large class of generalised Lorentzian distributions as well as to the …
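The κ → ∞ limit mentioned in this record can be checked numerically. A small sketch, assuming the common convention e_κ(−x) = (1 + x/κ)^(−κ) for the q-deformed exponential with κ = 1/|q − 1| (the paper's exact convention may differ):

```python
import math

def q_exponential(x, kappa):
    """Generalised Lorentzian (q-deformed exponential); kappa = 1/|q - 1|."""
    return (1.0 + x / kappa) ** (-kappa)

x = 2.0
for kappa in (5.0, 50.0, 500.0, 5000.0):
    # The deviation from the Boltzmann factor exp(-x) shrinks as kappa grows.
    print(kappa, abs(q_exponential(x, kappa) - math.exp(-x)))
```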
An anisotropic elastoplastic constitutive formulation generalised for orthotropic materials
Mohd Nor, M. K.; Ma'at, N.; Ho, C. S.
2018-03-01
This paper presents a finite strain constitutive model to predict a complex elastoplastic deformation behaviour that involves very high pressures and shockwaves in orthotropic materials, using an anisotropic Hill's yield criterion by means of evolving structural tensors. The yield surface of this hyperelastic-plastic constitutive model is aligned uniquely within the principal stress space due to the combination of the Mandel stress tensor and a new generalised orthotropic pressure. The formulation is developed in the isoclinic configuration and allows for a unique treatment of elastic and plastic orthotropy. An isotropic hardening is adopted to define the evolution of plastic orthotropy. An important feature of the proposed hyperelastic-plastic constitutive model is the introduction of an anisotropic effect in the Mie–Grüneisen equation of state (EOS). The formulation is further combined with the Grady spall failure model to predict spall failure in the materials. The proposed constitutive model is implemented as a new material model in the Lawrence Livermore National Laboratory (LLNL) DYNA3D code of UTHM's version, named Material Type 92 (Mat92). The combination of the proposed stress tensor decomposition and the Mie–Grüneisen EOS requires some modifications in the code to reflect the formulation of the generalised orthotropic pressure. A validation approach is also presented in this paper for guidance purposes. The ψ tensor used to define the alignment of the adopted yield surface is validated first. This is continued with an internal validation related to elastic isotropic, elastic orthotropic and elastic-plastic orthotropic behaviour of the proposed formulation, before a comparison against a range of plate impact test data at 234, 450 and 895 m/s impact velocities is performed. A good agreement is obtained in each test.
Managing uncertainty for sustainability of complex projects
DEFF Research Database (Denmark)
Brink, Tove
2017-01-01
Purpose – The purpose of this paper is to reveal how management of uncertainty can enable sustainability of complex projects. Design/methodology/approach – The research was conducted from June 2014 to May 2015 using a qualitative deductive approach among operation and maintenance actors in offshore wind farms. The research contains a focus group interview with 11 companies, 20 individual interviews and a seminar presenting preliminary findings with 60 participants. Findings – The findings reveal the need for management of uncertainty through two different paths. First, project management needs … to join efforts. Research limitations/implications – Further research is needed to reveal the generalisability of the findings in other complex project contexts containing "unknown unknowns". Practical implications – The research leads to the development of a tool for uncertainty management …
DEFF Research Database (Denmark)
Damgård, Ivan Bjerre; Jurik, Mads Johan
2001-01-01
We propose a generalisation of Paillier's probabilistic public key system, in which the expansion factor is reduced and which allows one to adjust the block length of the scheme even after the public key has been fixed, without losing the homomorphic property. We show that the generalisation is as s...
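For context, the scheme being generalised (textbook Paillier) is additively homomorphic: multiplying ciphertexts adds plaintexts. A toy stdlib-Python sketch with deliberately tiny, insecure primes illustrates the property (this is plain Paillier, not the Damgård–Jurik generalisation itself):

```python
import math
import random

# Textbook Paillier with tiny primes: insecure, for illustration only.
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
# mu = (L(g^lam mod n^2))^-1 mod n, where L(u) = (u - 1) // n
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m, rng):
    r = rng.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = rng.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

rng = random.Random(7)
c1, c2 = encrypt(17, rng), encrypt(25, rng)
# Homomorphic property: multiplying ciphertexts adds plaintexts mod n.
print(decrypt((c1 * c2) % n2))  # 42
```

The Damgård–Jurik construction works modulo n^(s+1) rather than n², which is what lets the block length grow after the key is fixed.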
Generalisability in economic evaluation studies in healthcare: a review and case studies.
Sculpher, M J; Pang, F S; Manca, A; Drummond, M F; Golder, S; Urdahl, H; Davies, L M; Eastwood, A
2004-12-01
… country), MLM can facilitate correct estimates of the uncertainty in cost-effectiveness results, and also a means of estimating location-specific cost-effectiveness. The review of applied economic studies based on decision analytic models showed that few studies were explicit about their target decision-maker(s)/jurisdictions. The studies in the review generally made more effort to ensure that their cost inputs were specific to their target jurisdiction than their effectiveness parameters. Standard sensitivity analysis was the main way of dealing with uncertainty in the models, although few studies looked explicitly at variability between locations. The modelling case study illustrated how effectiveness and cost data can be made location-specific. In particular, on the effectiveness side, the example showed the separation of location-specific baseline events and pooled estimates of relative treatment effect, where the latter are assumed exchangeable across locations. A large number of factors are mentioned in the literature that might be expected to generate variation in the cost-effectiveness of healthcare interventions across locations. Several papers have demonstrated differences in the volume and cost of resource use between locations, but few studies have looked at variability in outcomes. In applied trial-based cost-effectiveness studies, few studies provide sufficient evidence for decision-makers to establish the relevance or to adjust the results of the study to their location of interest. Very few studies utilised statistical methods formally to assess the variability in results between locations. In applied economic studies based on decision models, most studies either stated their target decision-maker/jurisdiction or provided sufficient information from which this could be inferred. There was a greater tendency to ensure that cost inputs were specific to the target jurisdiction than clinical parameters.
Methods to assess generalisability and variability in
DEFF Research Database (Denmark)
Heydorn, Kaj; Anglov, Thomas
2002-01-01
Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...
An all-but-one entropic uncertainty relation, and application to password-based identification
Bouman, N.J.; Fehr, S.; González-Guillén, C.; Schaffner, C.
2013-01-01
Entropic uncertainty relations are quantitative characterizations of Heisenberg’s uncertainty principle, which make use of an entropy measure to quantify uncertainty. We propose a new entropic uncertainty relation. It is the first such uncertainty relation that lower bounds the uncertainty in the
Superstatistical generalised Langevin equation: non-Gaussian viscoelastic anomalous diffusion
Ślęzak, Jakub; Metzler, Ralf; Magdziarz, Marcin
2018-02-01
Recent advances in single particle tracking and supercomputing techniques demonstrate the emergence of normal or anomalous, viscoelastic diffusion in conjunction with non-Gaussian distributions in soft, biological, and active matter systems. We here formulate a stochastic model based on a generalised Langevin equation in which non-Gaussian shapes of the probability density function and normal or anomalous diffusion have a common origin, namely a random parametrisation of the stochastic force. We perform a detailed analysis demonstrating how various types of parameter distributions for the memory kernel result in exponential, power law, or power-log law tails of the memory functions. The studied system is also shown to exhibit a further unusual property: the velocity has a Gaussian one point probability density but non-Gaussian joint distributions. This behaviour is reflected in the relaxation from a Gaussian to a non-Gaussian distribution observed for the position variable. We show that our theoretical results are in excellent agreement with stochastic simulations.
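A simplified stand-in for the superstatistical construction in this record: an Ornstein–Uhlenbeck velocity with memoryless (white) noise whose amplitude D is itself random, drawn once per trajectory. The actual model uses a generalised Langevin equation with a memory kernel; this stdlib-Python sketch (all names illustrative) only reproduces the headline effect, non-Gaussian statistics arising from a random parametrisation of the stochastic force:

```python
import math
import random

def ou_velocity(D, gamma, dt, steps, rng):
    """Euler-Maruyama Ornstein-Uhlenbeck velocity with diffusivity D."""
    v = 0.0
    amp = math.sqrt(2.0 * D * dt)
    for _ in range(steps):
        v += -gamma * v * dt + amp * rng.gauss(0.0, 1.0)
    return v

rng = random.Random(3)
# Superstatistics: the noise amplitude D is random (exponential here),
# drawn once per trajectory, so the ensemble mixes Gaussians of many widths.
samples = [ou_velocity(rng.expovariate(1.0), 1.0, 0.01, 500, rng)
           for _ in range(4000)]
m2 = sum(v * v for v in samples) / len(samples)
m4 = sum(v ** 4 for v in samples) / len(samples)
# For a single Gaussian m4 = 3*m2^2; a ratio above 1 signals the mixture.
print(m4 / (3.0 * m2 * m2))
```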
Interpretation of human pointing by African elephants: generalisation and rationality.
Smet, Anna F; Byrne, Richard W
2014-11-01
Factors influencing the abilities of different animals to use cooperative social cues from humans are still unclear, in spite of long-standing interest in the topic. One of the few species that have been found successful at using human pointing is the African elephant (Loxodonta africana); despite few opportunities for learning about pointing, elephants follow a pointing gesture in an object-choice task, even when the pointing signal and experimenter's body position are in conflict, and when the gesture itself is visually subtle. Here, we show that the success of captive African elephants at using human pointing is not restricted to situations where the pointing signal is sustained until the time of choice: elephants followed human pointing even when the pointing gesture was withdrawn before they had responded to it. Furthermore, elephants rapidly generalised their response to a type of social cue they were unlikely to have seen before: pointing with the foot. However, unlike young children, they showed no sign of evaluating the 'rationality' of this novel pointing gesture according to its visual context: that is, whether the experimenter's hands were occupied or not.
Effects of Community African Drumming on Generalised Anxiety in Teenagers
Directory of Open Access Journals (Sweden)
David Akombo
2013-07-01
The purpose of this study was to test the effects of community music projects (CMPs), such as after-school African drumming circles, on academic performance and generalised anxiety in adolescents. Adolescents from a junior high school (7th, 8th and 9th graders, ages 12–14) in the State of Utah (USA) participated in the study. A one-sample t-test found a significant difference in reading scores (df = 4, p = .004). A paired-samples t-test found a significant relationship between the maths trait anxiety score pre-intervention and the total state anxiety score pre-test (df = 4, p = .033). A paired-samples t-test found a significant relationship between the reading trait anxiety score post-intervention and the total state anxiety score post-test (df = 4, p = .030). This research demonstrates the effectiveness of community music such as drumming for reducing anxiety and also for improving academic performance in adolescents. CMPs are recommended as a non-invasive intervention modality for adolescents.
Generalised block bootstrap and its use in meteorology
Directory of Open Access Journals (Sweden)
L. Varga
2017-06-01
In an earlier paper, Rakonczai et al. (2014) emphasised the importance of investigating the effective sample size in the case of autocorrelated data. Their simulations were based on the block bootstrap methodology. However, the discreteness of the usual block size did not allow for exact calculations. In this paper we propose a new generalisation of the block bootstrap methodology which allows for any positive real number as the expected block size. We relate it to the existing optimisation procedures and apply it to a temperature data set. Our other focus is on statistical tests, where quite often the actual sample size plays an important role, even in the case of relatively large samples. This is especially the case for copulas, which are used for investigating the dependencies among data sets. As in quite a few real applications the time dependence cannot be neglected, we investigated the effect of this phenomenon on the test statistic used. The critical value can be computed by the proposed new block bootstrap simulation, where the block size is determined by fitting a VAR model to the observations. The results are illustrated for models of the temperature data used.
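One standard way to realise a real-valued expected block size is to draw geometric block lengths, as in the Politis–Romano stationary bootstrap; the stdlib-Python sketch below (names illustrative) follows that idea, though the generalisation proposed in the paper differs in detail:

```python
import random

def stationary_bootstrap(data, expected_block, rng):
    """Resample a series preserving short-range dependence; block lengths
    are geometric with mean expected_block (any positive real number)."""
    n = len(data)
    p = 1.0 / expected_block        # chance of starting a fresh block
    out = []
    i = rng.randrange(n)
    while len(out) < n:
        out.append(data[i])
        if rng.random() < p:
            i = rng.randrange(n)    # jump: new block at a random start
        else:
            i = (i + 1) % n         # continue the current block, wrapping
    return out

rng = random.Random(1)
series = [((i * 7) % 13) / 13.0 for i in range(100)]  # toy dependent series
sample = stationary_bootstrap(series, expected_block=6.5, rng=rng)
print(len(sample))  # 100
```

Because the block length is random, any positive real expected block size (here 6.5) is meaningful, which is exactly the flexibility the discrete block bootstrap lacks.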
Rare case of generalised aggressive periodontitis in the primary dentition.
Spoerri, A; Signorelli, C; Erb, J; van Waes, H; Schmidlin, P R
2014-12-01
Generalised aggressive periodontitis (AP) in the prepubescent age is an exceptionally rare disease in the primary dentition of otherwise healthy children. Characteristics of AP are gingival inflammation, deep periodontal pockets, bone loss, tooth mobility and even tooth loss. The most common way of treating this disease is the extraction of all the involved primary teeth. A 4-year-old girl presented with signs of severe gingival inflammation. Clinical examination revealed deep pockets, increased tooth mobility and bone loss. Microbiological testing revealed the presence of a typical periopathogenic flora consisting of Aggregatibacter actinomycetemcomitans and the typical members of the red complex (Porphyromonas gingivalis, Prevotella intermedia and Treponema denticola). The patient underwent tooth extraction of all primary teeth except the primary canines, followed by thorough root debridement and treatment with systemic antibiotics (amoxicillin plus metronidazole). Regular clinical and microbiological examinations over 4 years showed no signs of recurrence of a periodontitis, even in the erupted permanent teeth. Early diagnosis and consequent early treatment of aggressive periodontitis can stop the disease and therefore avoid the development of a periodontal disease in the permanent dentition. A close collaboration between specialists of different disciplines is required for a favourable outcome.
Sketching the pion's valence-quark generalised parton distribution
Directory of Open Access Journals (Sweden)
C. Mezrag
2015-02-01
In order to learn effectively from measurements of generalised parton distributions (GPDs), it is desirable to compute them using a framework that can potentially connect empirical information with basic features of the Standard Model. We sketch an approach to such computations, based upon a rainbow-ladder (RL) truncation of QCD's Dyson–Schwinger equations and exemplified via the pion's valence dressed-quark GPD, Hπv(x,ξ,t). Our analysis focuses primarily on ξ=0, although we also capitalise on the symmetry-preserving nature of the RL truncation by connecting Hπv(x,ξ=±1,t) with the pion's valence-quark parton distribution amplitude. We explain that the impulse approximation used hitherto to define the pion's valence dressed-quark GPD is generally invalid owing to the omission of contributions from the gluons which bind dressed quarks into the pion. A simple correction enables us to identify a practicable improvement to the approximation for Hπv(x,0,t), expressed as the Radon transform of a single amplitude. Therewith we obtain results for Hπv(x,0,t) and the associated impact-parameter-dependent distribution, qπv(x,|b⊥|), which provide a qualitatively sound picture of the pion's dressed-quark structure at a hadronic scale. We evolve the distributions to a scale ζ = 2 GeV, so as to facilitate comparisons in future with results from experiment or other nonperturbative methods.
Modelling of Transport Projects Uncertainties
DEFF Research Database (Denmark)
Salling, Kim Bang; Leleur, Steen
2009-01-01
This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport-related benefits and underestimating … -based graphs which function as risk-related decision support for the appraised transport infrastructure project.
DEFF Research Database (Denmark)
Nguyen, Daniel Xuyen
This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models … This retooling addresses several shortcomings. First, the imperfect correlation of demands reconciles the sales variation observed in and across destinations. Second, since demands for the firm's output are correlated across destinations, a firm can use previously realized demands to forecast unknown demands in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles …
Fearing shades of grey: individual differences in fear responding towards generalisation stimuli.
Arnaudova, Inna; Krypotos, Angelos-Miltiadis; Effting, Marieke; Kindt, Merel; Beckers, Tom
2017-09-01
Individual differences in fear generalisation have been proposed to play a role in the aetiology and/or maintenance of anxiety disorders, but few data are available to directly support that claim. The research that is available has focused mostly on generalisation of peripheral and central physiological fear responses. Far less is known about the generalisation of avoidance, the behavioural component of fear. In two experiments, we evaluated how neuroticism, a known vulnerability factor for anxiety, modulates an array of fear responses, including avoidance tendencies, towards generalisation stimuli (GS). Participants underwent differential fear conditioning, in which one conditioned stimulus (CS+) was repeatedly paired with an aversive outcome (shock; unconditioned stimulus, US), whereas another was not (CS-). Fear generalisation was observed across measures in Experiment 1 (US expectancy and evaluative ratings) and Experiment 2 (US expectancy, evaluative ratings, skin conductance, startle responses, safety behaviours), with overall highest responding to the CS+, lowest to the CS- and intermediate responding to the GSs. Neuroticism had very little impact on fear generalisation (but did affect GS recognition rates in Experiment 1), in line with the idea that fear generalisation is largely an adaptive process.
Classical r-matrices for the generalised Chern–Simons formulation of 3d gravity
Osei, Prince K.; Schroers, Bernd J.
2018-04-01
We study the conditions for classical r-matrices to be compatible with the generalised Chern–Simons action for 3d gravity. Compatibility means solving the classical Yang–Baxter equations with a prescribed symmetric part for each of the real Lie algebras and bilinear pairings arising in the generalised Chern–Simons action. We give a new construction of r-matrices via a generalised complexification and derive a non-linear set of matrix equations determining the most general compatible r-matrix. We exhibit new families of solutions and show that they contain some known r-matrices for special parameter values.
The Generalised Ecosystem Modelling Approach in Radiological Assessment
Energy Technology Data Exchange (ETDEWEB)
Klos, Richard
2008-03-15
An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment
Generalised ballooning theory of two-dimensional tokamak modes
Abdoul, P. A.; Dickinson, D.; Roach, C. M.; Wilson, H. R.
2018-02-01
In this work, using solutions from a local gyrokinetic flux-tube code combined with higher order ballooning theory, a new analytical approach is developed to reconstruct the global linear mode structure with associated global mode frequency. In addition to the isolated mode (IM), which usually peaks on the outboard mid-plane, the higher order ballooning theory has also captured other types of less unstable global modes: (a) the weakly asymmetric ballooning theory (WABT) predicts a mixed mode (MM) that undergoes a small poloidal shift away from the outboard mid-plane, (b) a relatively more stable general mode (GM) balloons on the top (or bottom) of the tokamak plasma. In this paper, an analytic approach is developed to combine these disconnected analytical limits into a single generalised ballooning theory. This is used to investigate how an IM behaves under the effect of sheared toroidal flow. For small values of flow an IM initially converts into a MM where the results of WABT are recaptured, and eventually, as the flow increases, the mode asymptotically becomes a GM on the top (or bottom) of the plasma. This may be an ingredient in models for understanding why in some experimental scenarios, instead of large edge localised modes (ELMs), small ELMs are observed. Finally, our theory can have other important consequences, especially for calculations involving Reynolds stress driven intrinsic rotation through the radial asymmetry in the global mode structures. Understanding the intrinsic rotation is significant because external torque in a plasma the size of ITER is expected to be relatively low.
Fundamental principles of quantum theory
International Nuclear Information System (INIS)
Bugajski, S.
1980-01-01
After introducing general versions of three fundamental quantum postulates - the superposition principle, the uncertainty principle and the complementarity principle - the question of whether the three principles are sufficiently strong to restrict the general Mackey description of quantum systems to the standard Hilbert-space quantum theory is discussed. An example which shows that the answer must be negative is constructed. An abstract version of the projection postulate is introduced and it is demonstrated that it could serve as the missing physical link between the general Mackey description and the standard quantum theory. (author)
Zou, Xiao-Duan; Li, Jian-Yang; Clark, Beth Ellen; Golish, Dathon
2018-01-01
The OSIRIS-REx spacecraft, launched in September 2016, will study the asteroid Bennu and return a sample from its surface to Earth in 2023. Bennu is a near-Earth carbonaceous asteroid which will provide insight into the formation and evolution of the solar system. OSIRIS-REx will first approach Bennu in August 2018 and will study the asteroid for approximately two years before sampling. OSIRIS-REx will develop its photometric model (including Lommel-Seeliger, ROLO, McEwen, Minnaert and Akimov) of Bennu with OCAM and OVIRS during the Detailed Survey mission phase. The model developed during this phase will be used to photometrically correct the OCAM and OVIRS data. Here we present the analysis of the error for the photometric corrections. Based on our testing data sets, we find: 1. The model uncertainties are correct only when computed using the covariance matrix, because the parameters are highly correlated. 2. There is no evidence that any single parameter dominates in any model. 3. Model error and data error contribute comparably to the final correction error. 4. We tested the uncertainty module on synthetic and real data sets and found that model performance depends on data coverage and data quality; these tests gave us a better understanding of how the different models behave in different cases. 5. The L-S model is more reliable than the others, perhaps because the simulated data are based on the L-S model; however, the test on real data (SPDIF) also shows a slight advantage for L-S. ROLO is not reliable for calculating Bond albedo. The uncertainty of the McEwen model is large in most cases. Akimov behaves unphysically on the SOPIE 1 data. 6. L-S is the better default choice; this conclusion is based mainly on our tests on the SOPIE and IPDIF data.
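The disk-function part of such corrections can be sketched in a few lines. The snippet below is a hedged illustration, not the OSIRIS-REx pipeline: the function names, the 30°/0° reference geometry and the pure Lommel-Seeliger behaviour are all assumptions made here for the example.

```python
import math

def lommel_seeliger(mu0: float, mu: float) -> float:
    """Lommel-Seeliger disk function D = 2*mu0 / (mu0 + mu),
    with mu0 = cos(incidence angle) and mu = cos(emission angle)."""
    return 2.0 * mu0 / (mu0 + mu)

def photometric_correction(reflectance: float, inc: float, emi: float,
                           inc_ref: float = 30.0, emi_ref: float = 0.0) -> float:
    """Scale a measured reflectance from the observed geometry to a
    reference geometry (angles in degrees), assuming pure L-S behaviour."""
    mu0, mu = math.cos(math.radians(inc)), math.cos(math.radians(emi))
    mu0r, mur = math.cos(math.radians(inc_ref)), math.cos(math.radians(emi_ref))
    return reflectance * lommel_seeliger(mu0r, mur) / lommel_seeliger(mu0, mu)
```

A correction of this kind leaves data taken at the reference geometry unchanged, which is a convenient sanity check for any implementation.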
Modelling of extreme minimum rainfall using generalised extreme value distribution for Zimbabwe
Directory of Open Access Journals (Sweden)
Delson Chikobvu
2015-09-01
We modelled the mean annual rainfall for data recorded in Zimbabwe from 1901 to 2009. Extreme value theory was used to estimate the probabilities of meteorological droughts. Droughts can be viewed as extreme events which go beyond and/or below normal rainfall occurrences, such as exceptionally low mean annual rainfall. The duality between the distribution of the minima and maxima was exploited and used to fit the generalised extreme value distribution (GEVD) to the data and hence find probabilities of extreme low levels of mean annual rainfall. The augmented Dickey-Fuller test confirmed that the rainfall data were stationary, while the normal quantile-quantile plot indicated that the rainfall data deviated from the normality assumption at both tails of the distribution. The maximum likelihood estimation method and the Bayesian approach were used to find the parameters of the GEVD. The Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests showed that the Weibull class of distributions was a good fit to the minima of mean annual rainfall under the maximum likelihood estimation method. The mean return period estimate of a meteorological drought, using a threshold mean annual rainfall of 473 mm, was 8 years. This implies that if a meteorological drought occurs in a given year, another drought of the same intensity or greater is expected after 8 years. The use of Bayesian inference is expected to better quantify the level of uncertainty associated with the GEVD parameter estimates than maximum likelihood estimation. A Markov chain Monte Carlo algorithm for the GEVD was applied to construct the model parameter estimates under the Bayesian approach. These findings are significant because results based on non-informative priors (the Bayesian method) and the maximum likelihood approach are expected to be similar.
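The duality argument can be made concrete. The sketch below performs no fitting; it only shows how a GEVD fitted to the negated sample yields a drought return period for a low-rainfall threshold. The parameter values used in the usage note are purely illustrative, not the paper's estimates.

```python
import math

def gev_cdf_max(x: float, mu: float, sigma: float, xi: float) -> float:
    """CDF of the generalised extreme value distribution for block maxima."""
    if abs(xi) < 1e-12:                      # Gumbel limit as xi -> 0
        return math.exp(-math.exp(-(x - mu) / sigma))
    t = 1.0 + xi * (x - mu) / sigma
    if t <= 0.0:                             # outside the support
        return 0.0 if xi > 0 else 1.0
    return math.exp(-t ** (-1.0 / xi))

def drought_return_period(threshold: float, mu: float, sigma: float,
                          xi: float) -> float:
    """Duality: minima of X are maxima of -X, so with (mu, sigma, xi)
    fitted to the negated sample,
    P(min rainfall <= threshold) = 1 - F_max(-threshold).
    The return period is the reciprocal of that probability."""
    p = 1.0 - gev_cdf_max(-threshold, mu, sigma, xi)
    return 1.0 / p
```

For example, with hypothetical fitted parameters (mu, sigma, xi) = (-650, 150, -0.2) for the negated series (xi < 0 places it in the Weibull class, as the paper reports), `drought_return_period(473.0, -650.0, 150.0, -0.2)` gives a return period of a few years; the paper's 8-year figure follows from its own fitted parameters.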
International Nuclear Information System (INIS)
Thomas, R.E.
1982-03-01
An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
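For the linear case Ax = b the adjoint recipe is short enough to spell out. The sketch below shows the standard adjoint algebra, not the report's statistical-sampling variant, and the matrices are made up for illustration: the response y = cᵀx needs one solve with A and one with Aᵀ, after which every sensitivity is a product of already-known quantities.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]       # augmented matrix
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def transpose(A):
    return [list(col) for col in zip(*A)]

# Response y = c^T x with A x = b.  The adjoint lam solves A^T lam = c;
# then dy/db_k = lam_k and dy/dA_ij = -lam_i * x_j -- every sensitivity
# from just two linear solves, however many parameters there are.
A = [[4.0, 1.0], [2.0, 3.0]]
b = [1.0, 2.0]
c = [1.0, 1.0]
x = solve(A, b)
lam = solve(transpose(A), c)
dy_db = lam                                     # sensitivities w.r.t. b
dy_dA = [[-li * xj for xj in x] for li in lam]  # sensitivities w.r.t. A entries
```

A finite-difference perturbation of any single input reproduces these coefficients, which is the usual way to validate an adjoint implementation.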
Energy Technology Data Exchange (ETDEWEB)
Thomas, R.E.
1982-03-01
An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
International Nuclear Information System (INIS)
Toedt, C.; Hoetzinger, H.; Salbeck, R.; Beyer, H.K.
1989-01-01
Whereas generalised neurofibromatosis is a relatively frequent disease, its combined occurrence in conjunction with agenesia of the corpus callosum is extremely rare and probably a chance coincidence. (orig.) [de
Dosimetric quantities and basic data for the evaluation of generalised derived limits
International Nuclear Information System (INIS)
Harrison, N.T.; Simmonds, J.R.
1980-12-01
The procedures, dosimetric quantities and basic data to be used for the evaluation of Generalised Derived Limits (GDLs) in environmental materials and of Generalised Derived Limits for discharges to atmosphere are described. The dosimetric considerations and the appropriate intake rates for both children and adults are discussed. In most situations in the nuclear industry and in those institutions, hospitals and laboratories which use relatively small quantities of radioactive material, the Generalised Derived Limits provide convenient reference levels against which the results of environmental monitoring can be compared, and atmospheric discharges can be assessed. They are intended for application when the environmental contamination or discharge to atmosphere is less than about 5% of the Generalised Derived Limit; above this level, it will usually be necessary to undertake a more detailed site-specific assessment. (author)
International Nuclear Information System (INIS)
Toedt, C.; Hoetzinger, H.; Salbeck, R.; Beyer, H.K.
1989-01-01
Abuse of ergotamine can trigger generalised brain edema and brain infarctions. These can be visualized by CT, MR and angiography. The cause, however, can only be found in the patient's history. (orig.) [de
Dalili, Michael N; Schofield-Toloza, Lawrence; Munafò, Marcus R; Penton-Voak, Ian S
2017-08-01
Many cognitive bias modification (CBM) tasks use facial expressions of emotion as stimuli. Some tasks use unique facial stimuli, while others use composite stimuli, given evidence that emotion is encoded prototypically. However, CBM using composite stimuli may be identity- or emotion-specific, and may not generalise to other stimuli. We investigated the generalisability of effects using composite faces in two experiments. Healthy adults in each study were randomised to one of four training conditions: two stimulus-congruent conditions, where same faces were used during all phases of the task, and two stimulus-incongruent conditions, where faces of the opposite sex (Experiment 1) or faces depicting another emotion (Experiment 2) were used after the modification phase. Our results suggested that training effects generalised across identities. However, our results indicated only partial generalisation across emotions. These findings suggest effects obtained using composite stimuli may extend beyond the stimuli used in the task but remain emotion-specific.
Digital Repository Service at National Institute of Oceanography (India)
Rao, M.M.M.; Murty, T.V.R.; Murthy, K.S.R.; Vasudeva, R.Y.
has been performed to build Generalised Inverse Operator (GIO) and it is operated on the observed anomaly with reference to the calculated anomaly to update model parameters. Data and model resolution matrices are computed to check the correctness...
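The resolution-matrix check described above can be illustrated with a toy linear problem; the 3×2 kernel below is hypothetical, not the operator of this paper. For a full-column-rank G the generalised inverse is G⁺ = (GᵀG)⁻¹Gᵀ, and a model resolution matrix R = G⁺G equal to the identity signals perfectly resolved model parameters.

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def inv2(M):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# Overdetermined problem d = G m: 3 data, 2 model parameters.
G = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]
Gt = transpose(G)
G_plus = matmul(inv2(matmul(Gt, G)), Gt)   # generalised inverse operator

R_model = matmul(G_plus, G)   # model resolution: identity => fully resolved
N_data = matmul(G, G_plus)    # data resolution ("hat") matrix; trace = rank
```

Departures of R_model from the identity (for rank-deficient kernels) quantify exactly which parameter combinations the data cannot distinguish, which is the correctness check the abstract refers to.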
libmpdata++ 0.1: a library of parallel MPDATA solvers for systems of generalised transport equations
Jaruga, A.; Arabas, S.; Jarecka, D.; Pawlowska, H.; Smolarkiewicz, P. K.; Waruszewski, M.
2014-11-01
This paper accompanies first release of libmpdata++, a C++ library implementing the Multidimensional Positive-Definite Advection Transport Algorithm (MPDATA). The library offers basic numerical solvers for systems of generalised transport equations. The solvers are forward-in-time, conservative and non-linearly stable. The libmpdata++ library covers the basic second-order-accurate formulation of MPDATA, its third-order variant, the infinite-gauge option for variable-sign fields and a flux-corrected transport extension to guarantee non-oscillatory solutions. The library is equipped with a non-symmetric variational elliptic solver for implicit evaluation of pressure gradient terms. All solvers offer parallelisation through domain decomposition using shared-memory parallelisation. The paper describes the library programming interface, and serves as a user guide. Supported options are illustrated with benchmarks discussed in the MPDATA literature. Benchmark descriptions include code snippets as well as quantitative representations of simulation results. Examples of applications include: homogeneous transport in one, two and three dimensions in Cartesian and spherical domains; shallow-water system compared with analytical solution (originally derived for a 2-D case); and a buoyant convection problem in an incompressible Boussinesq fluid with interfacial instability. All the examples are implemented out of the library tree. Regardless of the differences in the problem dimensionality, right-hand-side terms, boundary conditions and parallelisation approach, all the examples use the same unmodified library, which is a key goal of libmpdata++ design. The design, based on the principle of separation of concerns, prioritises the user and developer productivity. The libmpdata++ library is implemented in C++, making use of the Blitz++ multi-dimensional array containers, and is released as free/libre and open-source software.
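The algorithm at the library's core can be sketched independently of its C++ interface. Below is a minimal 1-D, constant-velocity, periodic-domain MPDATA step in Python, illustrating only the basic two-pass scheme; it omits the third-order, infinite-gauge and flux-corrected options the library provides, and assumes a non-negative advected field.

```python
def upwind_flux(psi_l, psi_r, c):
    """Donor-cell flux through an interface with Courant number c."""
    return max(c, 0.0) * psi_l + min(c, 0.0) * psi_r

def mpdata_step(psi, c, eps=1e-15):
    """One MPDATA time step on a periodic 1-D grid with constant Courant
    number c (|c| < 1): a first-order donor-cell pass followed by a single
    antidiffusive corrective pass."""
    n = len(psi)
    # pass 1: donor-cell upwind; f[i] is the flux through interface i+1/2
    f = [upwind_flux(psi[i], psi[(i + 1) % n], c) for i in range(n)]
    psi1 = [psi[i] - (f[i] - f[i - 1]) for i in range(n)]
    # pass 2: donor-cell again, using the antidiffusive pseudo-velocity
    cd = [(abs(c) - c * c) * (psi1[(i + 1) % n] - psi1[i])
          / (psi1[(i + 1) % n] + psi1[i] + eps) for i in range(n)]
    g = [upwind_flux(psi1[i], psi1[(i + 1) % n], cd[i]) for i in range(n)]
    return [psi1[i] - (g[i] - g[i - 1]) for i in range(n)]
```

The two properties the abstract advertises, conservation and sign preservation, can be checked directly: the total of the field is unchanged by each step (the fluxes telescope on a periodic domain) and a positive field stays positive.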
libmpdata++ 1.0: a library of parallel MPDATA solvers for systems of generalised transport equations
Jaruga, A.; Arabas, S.; Jarecka, D.; Pawlowska, H.; Smolarkiewicz, P. K.; Waruszewski, M.
2015-04-01
This paper accompanies the first release of libmpdata++, a C++ library implementing the multi-dimensional positive-definite advection transport algorithm (MPDATA) on regular structured grid. The library offers basic numerical solvers for systems of generalised transport equations. The solvers are forward-in-time, conservative and non-linearly stable. The libmpdata++ library covers the basic second-order-accurate formulation of MPDATA, its third-order variant, the infinite-gauge option for variable-sign fields and a flux-corrected transport extension to guarantee non-oscillatory solutions. The library is equipped with a non-symmetric variational elliptic solver for implicit evaluation of pressure gradient terms. All solvers offer parallelisation through domain decomposition using shared-memory parallelisation. The paper describes the library programming interface, and serves as a user guide. Supported options are illustrated with benchmarks discussed in the MPDATA literature. Benchmark descriptions include code snippets as well as quantitative representations of simulation results. Examples of applications include homogeneous transport in one, two and three dimensions in Cartesian and spherical domains; a shallow-water system compared with analytical solution (originally derived for a 2-D case); and a buoyant convection problem in an incompressible Boussinesq fluid with interfacial instability. All the examples are implemented out of the library tree. Regardless of the differences in the problem dimensionality, right-hand-side terms, boundary conditions and parallelisation approach, all the examples use the same unmodified library, which is a key goal of libmpdata++ design. The design, based on the principle of separation of concerns, prioritises the user and developer productivity. The libmpdata++ library is implemented in C++, making use of the Blitz++ multi-dimensional array containers, and is released as free/libre and open-source software.
libmpdata++ 1.0: a library of parallel MPDATA solvers for systems of generalised transport equations
Directory of Open Access Journals (Sweden)
A. Jaruga
2015-04-01
This paper accompanies the first release of libmpdata++, a C++ library implementing the multi-dimensional positive-definite advection transport algorithm (MPDATA) on regular structured grid. The library offers basic numerical solvers for systems of generalised transport equations. The solvers are forward-in-time, conservative and non-linearly stable. The libmpdata++ library covers the basic second-order-accurate formulation of MPDATA, its third-order variant, the infinite-gauge option for variable-sign fields and a flux-corrected transport extension to guarantee non-oscillatory solutions. The library is equipped with a non-symmetric variational elliptic solver for implicit evaluation of pressure gradient terms. All solvers offer parallelisation through domain decomposition using shared-memory parallelisation. The paper describes the library programming interface, and serves as a user guide. Supported options are illustrated with benchmarks discussed in the MPDATA literature. Benchmark descriptions include code snippets as well as quantitative representations of simulation results. Examples of applications include homogeneous transport in one, two and three dimensions in Cartesian and spherical domains; a shallow-water system compared with analytical solution (originally derived for a 2-D case); and a buoyant convection problem in an incompressible Boussinesq fluid with interfacial instability. All the examples are implemented out of the library tree. Regardless of the differences in the problem dimensionality, right-hand-side terms, boundary conditions and parallelisation approach, all the examples use the same unmodified library, which is a key goal of libmpdata++ design. The design, based on the principle of separation of concerns, prioritises the user and developer productivity. The libmpdata++ library is implemented in C++, making use of the Blitz++ multi-dimensional array containers, and is released as free/libre and open-source software.
Modelling of Transport Projects Uncertainties
DEFF Research Database (Denmark)
Salling, Kim Bang; Leleur, Steen
2012-01-01
This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport-related benefits and underestimating ... -based graphs, which function as risk-related decision support for the appraised transport infrastructure project. The presentation of RSF is demonstrated using an appraisal case concerning a new airfield in the capital of Greenland, Nuuk.
Decommissioning Funding: Ethics, Implementation, Uncertainties
International Nuclear Information System (INIS)
2007-01-01
This status report on decommissioning funding: ethics, implementation, uncertainties is based on a review of recent literature and materials presented at NEA meetings in 2003 and 2004, and particularly at a topical session organised in November 2004 on funding issues associated with the decommissioning of nuclear power facilities. The report also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). This report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems
The exceptional generalised geometry of supersymmetric AdS flux backgrounds
Energy Technology Data Exchange (ETDEWEB)
Ashmore, Anthony [Merton College, University of Oxford,Merton Street, Oxford, OX1 4JD (United Kingdom); Mathematical Institute, University of Oxford, Andrew Wiles Building,Woodstock Road, Oxford, OX2 6GG (United Kingdom); Petrini, Michela [Sorbonne Université, UPMC Paris 06, UMR 7589,LPTHE, 75005 Paris (France); Waldram, Daniel [Department of Physics, Imperial College London,Prince Consort Road, London, SW7 2AZ (United Kingdom)
2016-12-29
We analyse generic AdS flux backgrounds preserving eight supercharges in D=4 and D=5 dimensions using exceptional generalised geometry. We show that they are described by a pair of globally defined, generalised structures, identical to those that appear for flat flux backgrounds but with different integrability conditions. We give a number of explicit examples of such “exceptional Sasaki-Einstein” backgrounds in type IIB supergravity and M-theory. In particular, we give the complete analysis of the generic AdS₅ M-theory backgrounds. We also briefly discuss the structure of the moduli space of solutions. In all cases, one structure defines a “generalised Reeb vector” that generates a Killing symmetry of the background corresponding to the R-symmetry of the dual field theory, and in addition encodes the generic contact structures that appear in the D=4 M-theory and D=5 type IIB cases. Finally, we investigate the relation between generalised structures and quantities in the dual field theory, showing that the central charge and R-charge of BPS wrapped-brane states are both encoded by the generalised Reeb vector, as well as discussing how volume minimisation (the dual of a- and F-maximisation) is encoded.
The generalised anxiety stigma scale (GASS): psychometric properties in a community sample
2011-01-01
Background Although there is substantial concern about negative attitudes to mental illness, little is known about the stigma associated with Generalised Anxiety Disorder (GAD) or its measurement. The aim of this study was to develop a multi-item measure of Generalised Anxiety Disorder stigma (the GASS). Methods Stigma items were developed from a thematic analysis of web-based text about the stigma associated with GAD. Six hundred and seventeen members of the public completed a survey comprising the resulting 20 stigma items and measures designed to evaluate construct validity. Follow-up data were collected for a subset of the participants (n = 212). Results The factor structure comprised two components: Personal Stigma (views about Generalised Anxiety Disorder); and Perceived Stigma (views about the beliefs of most others in the community). There was evidence of good construct validity and reliability for each of the Generalised Anxiety Stigma Scale (GASS) subscales. Conclusions The GASS is a promising brief measure of the stigma associated with Generalised Anxiety Disorder. PMID:22108099
The generalised anxiety stigma scale (GASS): psychometric properties in a community sample
Directory of Open Access Journals (Sweden)
Griffiths Kathleen M
2011-11-01
Background Although there is substantial concern about negative attitudes to mental illness, little is known about the stigma associated with Generalised Anxiety Disorder (GAD) or its measurement. The aim of this study was to develop a multi-item measure of Generalised Anxiety Disorder stigma (the GASS). Methods Stigma items were developed from a thematic analysis of web-based text about the stigma associated with GAD. Six hundred and seventeen members of the public completed a survey comprising the resulting 20 stigma items and measures designed to evaluate construct validity. Follow-up data were collected for a subset of the participants (n = 212). Results The factor structure comprised two components: Personal Stigma (views about Generalised Anxiety Disorder); and Perceived Stigma (views about the beliefs of most others in the community). There was evidence of good construct validity and reliability for each of the Generalised Anxiety Stigma Scale (GASS) subscales. Conclusions The GASS is a promising brief measure of the stigma associated with Generalised Anxiety Disorder.
Successful short-term re-learning and generalisation of concepts in semantic dementia.
Suárez-González, Aida; Savage, Sharon A; Caine, Diana
2016-09-28
Patients with semantic dementia (SD) can rapidly and successfully re-learn word labels during cognitive intervention. This new learning, however, usually remains rigid and context-dependent. Conceptual enrichment (COEN) training is a therapy approach aimed at producing more flexible and generalisable learning in SD. In this study we compare generalisation and maintenance of learning after COEN with performance achieved using a classical naming therapy (NT). The study recruited a 62-year-old woman with SD. An AB₁ACAB₂ experimental design was implemented, with naming performance assessed at baseline, post-intervention, and 3 and 6 weeks after the end of each treatment phase. Three generalisation tasks were also assessed pre- and post-intervention. Naming post-intervention improved significantly following both therapies; however, words trained using COEN therapy showed a significantly greater degree of generalisation than those trained under NT. In addition, only words trained with COEN continued to show significant improvements compared with baseline performance when assessed 6 weeks after practice ceased. It was concluded that therapies based on conceptual enrichment of the semantic network facilitate relearning of words and enhance generalisation in patients with SD.
Investment, regulation, and uncertainty
Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose
2014-01-01
As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence, which ones might be expected to see an investment decline. PMID:24499745
Abolishing the maximum tension principle
Directory of Open Access Journals (Sweden)
Mariusz P. Da̧browski
2015-09-01
We find a series of example theories for which the relativistic limit of maximum tension, Fmax = c⁴/(4G), represented by the entropic force, can be abolished. Among them are the varying-constants theories, some generalized entropy models applied to both cosmological and black hole horizons, as well as some generalized uncertainty principle models.
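For orientation, the quoted limit is a definite number; a two-line check with standard values of the constants gives roughly 3 × 10⁴³ N.

```python
# Numerical value of the conjectured maximum force/tension limit
# Fmax = c^4 / (4 G), using CODATA values of the constants.
c = 2.99792458e8    # speed of light [m/s]
G = 6.67430e-11     # Newtonian gravitational constant [m^3 kg^-1 s^-2]

F_max = c**4 / (4.0 * G)   # maximum tension in newtons, ~3e43 N
```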
International Nuclear Information System (INIS)
Mueller, E.Z.
1991-01-01
An equivalent diffusion theory PWR reflector model is presented, which has as its basis Smith's generalisation of Koebke's Equivalence Theory. This method is an adaptation, in one-dimensional slab geometry, of the Generalised Equivalence Theory (GET). Since the method involves the renormalisation of the GET discontinuity factors at nodal interfaces, it is called the Normalised Generalised Equivalence Theory (NGET) method. The advantages of the NGET method for modelling the ex-core nodes of a PWR are summarized. 23 refs
Preston, Jonathan L; Maas, Edwin; Whittle, Jessica; Leece, Megan C; McCabe, Patricia
2016-01-01
Ultrasound visual feedback of the tongue is one treatment option for individuals with persisting speech sound errors. This study evaluated children's performance during acquisition and generalisation of American English rhotics using ultrasound feedback. Three children aged 10-13 with persisting speech sound errors associated with childhood apraxia of speech (CAS) were treated for 14 one-hour sessions. Two of the participants increased the accuracy of their rhotic production during practice trials within treatment sessions, but none demonstrated generalisation to untreated words. Lack of generalisation may be due to a failure to acquire the target with sufficient accuracy during treatment, or to co-existing linguistic weaknesses that are not addressed in a motor-based treatment. Results suggest a need to refine the intervention procedures for CAS and/or a need to identify appropriate candidates for intervention to optimise learning.
Energy Technology Data Exchange (ETDEWEB)
Suescun-Diaz, Daniel [Surcolombiana Univ., Neiva (Colombia). Groupo de Fisica Teorica; Narvaez-Paredes, Mauricio [Javeriana Univ., Cali (Colombia). Groupo de Matematica y Estadistica Aplicada Pontificia; Lozano-Parada, Jamie H. [Univ. del Valle, Cali (Colombia). Dept. de Ingenieria
2016-03-15
In this paper, the generalisation of the 4th-order Adams-Bashforth-Moulton predictor-corrector method is proposed to numerically solve the point kinetic equations of the nuclear reactivity calculations without using the nuclear power history. Due to the nature of the point kinetic equations, different predictor modifiers are used in order to improve the precision of the approximations obtained. The results obtained with the prediction formulas and generalised corrections improve the precision when compared with previous methods and are valid for various forms of nuclear power and different time steps.
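For orientation, the classic 4th-order Adams-Bashforth-Moulton predictor-corrector that the paper generalises can be sketched on a generic scalar ODE. This is a minimal illustration of the standard scheme only; the paper's reactivity-specific predictor modifiers are not reproduced here:

```python
import numpy as np

def abm4(f, t0, y0, h, n_steps):
    """Classic 4th-order Adams-Bashforth-Moulton predictor-corrector.

    Bootstraps the first three steps with RK4, then for each step applies
      predictor (Adams-Bashforth):
        y* = y_k + h/24 (55 f_k - 59 f_{k-1} + 37 f_{k-2} - 9 f_{k-3})
      corrector (Adams-Moulton):
        y_{k+1} = y_k + h/24 (9 f(t_{k+1}, y*) + 19 f_k - 5 f_{k-1} + f_{k-2})
    """
    t = t0 + h * np.arange(n_steps + 1)
    y = np.empty(n_steps + 1)
    y[0] = y0
    # RK4 start-up for the first three points
    for k in range(3):
        k1 = f(t[k], y[k])
        k2 = f(t[k] + h / 2, y[k] + h * k1 / 2)
        k3 = f(t[k] + h / 2, y[k] + h * k2 / 2)
        k4 = f(t[k] + h, y[k] + h * k3)
        y[k + 1] = y[k] + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
    fv = [f(t[k], y[k]) for k in range(4)]  # last four derivative values
    for k in range(3, n_steps):
        yp = y[k] + h / 24 * (55 * fv[3] - 59 * fv[2] + 37 * fv[1] - 9 * fv[0])
        fp = f(t[k + 1], yp)
        y[k + 1] = y[k] + h / 24 * (9 * fp + 19 * fv[3] - 5 * fv[2] + fv[1])
        fv = fv[1:] + [f(t[k + 1], y[k + 1])]
    return t, y

# Test problem: dy/dt = -y with y(0) = 1, exact solution exp(-t)
t, y = abm4(lambda t, y: -y, 0.0, 1.0, 0.01, 100)
```

With h = 0.01 the endpoint agrees with exp(-1) to roughly the scheme's O(h^4) global accuracy.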
International Nuclear Information System (INIS)
Landsberg, P.T.
1990-01-01
This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)
Directory of Open Access Journals (Sweden)
Rukiye Gökce
2017-01-01
Number patterns play an important role in the formation of mathematical concepts, since mathematics is treated as a science of patterns and relations and since generalization is central to learning mathematics. With the reform of the middle school mathematics curriculum, the entry of the pattern concept into the curriculum has brought some learning difficulties in the context of generalization. In this study, the potential of tasks, developed by considering student difficulties reported in the literature, the algebraic generalization process and task design principles, to shape generalization skills is examined. The study was conducted with thirteen students over five weeks (16 hours). Data were collected through notes, video and audio recordings, and observations held during the implementation process, and were analyzed qualitatively. As a result of the research, it was determined that tasks can play an important role in strategy and notation use, algebraic generalization, and the effective use of visual models in finding a rule.
Donovan, Caroline L; Holmes, Monique C; Farrell, Lara J
2016-03-01
Intolerance of uncertainty (IU), negative beliefs about worry (NBW), positive beliefs about worry (PBW), negative problem orientation (NPO) and cognitive avoidance (CA) have been found to be integral in the conceptualisation of Generalised Anxiety Disorder (GAD) in adults, yet they have rarely been investigated in children with GAD. This study sought to determine (a) whether IU, NBW, PBW, NPO and CA differ between children diagnosed with GAD and non-anxious children and (b) whether IU, NBW, PBW, NPO and CA differ between parents of children diagnosed with GAD and parents of children without an anxiety disorder. Participants were 50 children (aged 7-12 years), plus one of their parents. The 25 GAD children and 25 non-anxious children were matched on age and gender. Parents and children completed clinical diagnostic interviews, as well as a battery of questionnaires measuring worry, IU, NBW, PBW, NPO and CA. Children with GAD endorsed significantly higher levels of worry, IU, NBW, NPO and CA, but not PBW, compared to non-anxious children. Parents of children with GAD did not differ from parents of non-anxious children on any of the variables. The study was limited by its use of modified adult measures for some variables and a lack of heterogeneity in the sample. The cognitive variables of IU, NBW, NPO and CA may also be important in the conceptualisation and treatment of GAD in children as they are in adults. Copyright © 2015 Elsevier B.V. All rights reserved.
Moiseiwitsch, B L
2004-01-01
This graduate-level text's primary objective is to demonstrate the expression of the equations of the various branches of mathematical physics in the succinct and elegant form of variational principles (and thereby illuminate their interrelationship). Its related intentions are to show how variational principles may be employed to determine the discrete eigenvalues for stationary state problems and to illustrate how to find the values of quantities (such as the phase shifts) that arise in the theory of scattering. Chapter-by-chapter treatment consists of analytical dynamics; optics, wave mecha
Uncertainty, causality and decision: The case of social risks and nuclear risk in particular
International Nuclear Information System (INIS)
Lahidji, R.
2012-01-01
Probability and causality are two indispensable tools for addressing situations of social risk. Causal relations are the foundation for building risk assessment models and identifying risk prevention, mitigation and compensation measures. Probability enables us to quantify risk assessments and to calibrate intervention measures. It therefore seems not only natural, but also necessary to make the role of causality and probability explicit in the definition of decision problems in situations of social risk. Such is the aim of this thesis. By reviewing the terminology of risk and the logic of public interventions in various fields of social risk, we gain a better understanding of the notion and of the issues that one faces when trying to model it. We further elaborate our analysis in the case of nuclear safety, examining in detail how methods and policies have been developed in this field and how they have evolved through time. This leads to a number of observations concerning risk and safety assessments. Generalising the concept of intervention in a Bayesian network allows us to develop a variety of causal Bayesian networks adapted to our needs. In this framework, we propose a definition of risk which seems to be relevant for a broad range of issues. We then offer simple applications of our model to specific aspects of the Fukushima accident and other nuclear safety problems. In addition to specific lessons, the analysis leads to the conclusion that a systematic approach for identifying uncertainties is needed in this area. When applied to decision theory, our tool evolves into a dynamic decision model in which acts cause consequences and are causally interconnected. The model provides a causal interpretation of Savage's conceptual framework, solves some of its paradoxes and clarifies certain aspects. It leads us to considering uncertainty with regard to a problem's causal structure as the source of ambiguity in decision-making, an interpretation which corresponds to a
An attempt to introduce dynamics into generalised exergy considerations
International Nuclear Information System (INIS)
Grubbstroem, Robert W.
2007-01-01
In previous research, the author developed a general abstract framework for the exergy content of a system of finite objects [Grubbstroem RW. Towards a generalized exergy concept. In: van Gool W, Bruggink JJC, editors. Energy and time in the economic and physical sciences. Amsterdam: North-Holland; 1985. p. 41-56]. Each such object is characterised by its initial extensive properties and has an inner energy written as a function of these properties. It was shown that if these objects were allowed to interact, there is a maximum amount of work that can be extracted from the system as a whole, and a general formula for this potential was provided. It was also shown that if one of the objects was allowed to be of infinite magnitude initially, taking on the role as an environment having constant intensive properties, then the formula provided took on the same form as the classical expression for exergy. As a side result, the theoretical considerations demonstrated that the second law of thermodynamics could be interpreted as the inner energy function being a (weakly) convex function of its arguments, when these are chosen as the extensive properties. Since exergy considerations are based on the principle that total entropy is conserved when extracting work, these processes would take an infinite time to complete. In the current paper, instead, a differential-equation approach is introduced to describe the interaction in finite time between given finite objects of a system. Differences in intensive properties between the objects provide a force enabling an exchange of energy and matter. An example of such an interaction is heat conduction. The resulting considerations explain how the power extracted from the system will be limited by the processes being required to perform within finite-time constraints. Applying finite-time processes, in which entropy necessarily is generated, leads to formulating a theory for a maximal power output from the system. It is shown that
Indian Academy of Sciences (India)
that allows one to write down the laws of motion and arrive at the concept of inertia is somehow intimately related to the background of distant parts of the universe. This argument is known as `Mach's principle' and we will analyse its implications further. When expressed in the framework of the absolute space, Newton's ...
Directory of Open Access Journals (Sweden)
V. A. Grinenko
2011-06-01
The material in this article is arranged so that the reader can form a complete picture of the concept of “safety”, its intrinsic characteristics and the possibilities for its formalisation. Principles and possible safety strategies are considered. The article is intended for experts who are taking up the problems of safety.
Indian Academy of Sciences (India)
popularize science. The underlying idea in Mach's principle is that the origin of inertia or mass of a particle is a dynamical quantity determined by the environ- ... Knowing the latitude of the location of the pendulum it is possible to calculate the Earth's spin period. The two methods give the same answer. At first sight this does ...
Kindt, M.; Bögels, S.M.; Morren, M.
2003-01-01
The present study examined processing bias in children suffering from anxiety disorders. Processing bias was assessed using the emotional Stroop task in clinically referred children with separation anxiety disorder (SAD), social phobia (SP), and/or generalised anxiety disorder (GAD) and normal
Bogels, S.M.; Snieder, N.; Kindt, M.
2003-01-01
The present study investigated whether children with high symptom levels of either social phobia (SP), separation anxiety disorder (SAD), or generalised anxiety disorder (GAD) are characterised by a specific set of dysfunctional interpretations that are consistent with the cognitive model of their
[Epileptic seizures during childbirth in a patient with idiopathic generalised epilepsy]
Voermans, N.C.; Zwarts, M.J.; Renier, W.O.; Bloem, B.R.
2005-01-01
During her first pregnancy, a 37-year-old woman with idiopathic generalised epilepsy that was adequately controlled with lamotrigine experienced a series of epileptic seizures following an elective caesarean section. The attacks were terminated with diazepam. The following day, she developed
DEFF Research Database (Denmark)
Dlugosz, Stephan; Mammen, Enno; Wilke, Ralf
We consider the semiparametric generalised linear regression model which has mainstream empirical models such as the (partially) linear mean regression, logistic and multinomial regression as special cases. As an extension to related literature we allow a misclassified covariate to be interacted...
Modelling Problem-Solving Situations into Number Theory Tasks: The Route towards Generalisation
Papadopoulos, Ioannis; Iatridou, Maria
2010-01-01
This paper examines the way two 10th graders cope with a non-standard generalisation problem that involves elementary concepts of number theory (more specifically linear Diophantine equations) in the geometrical context of a rectangle's area. Emphasis is given on how the students' past experience of problem solving (expressed through interplay…
Multi-Trial Guruswami–Sudan Decoding for Generalised Reed–Solomon Codes
DEFF Research Database (Denmark)
Nielsen, Johan Sebastian Rosenkilde; Zeh, Alexander
2013-01-01
An iterated refinement procedure for the Guruswami–Sudan list decoding algorithm for Generalised Reed–Solomon codes based on Alekhnovich’s module minimisation is proposed. The method is parametrisable and allows variants of the usual list decoding approach. In particular, finding the list...
Anand, Pradeep S; Sagar, Deepak Kumar; Mishra, Supriya; Narang, Sumit; Kamath, Kavitha P; Anil, Sukumaran
To compare the total and differential leukocyte counts in the peripheral blood of generalised aggressive periodontitis patients with that of periodontally healthy subjects in a central Indian population. Seventy-five patients with generalised aggressive periodontitis and 63 periodontally healthy subjects were enrolled for the purpose of the study. All participants received a full-mouth periodontal examination in which probing depth and clinical attachment level were recorded. The haematological variables analysed included total leukocyte count, neutrophil count, lymphocyte count, monocyte count, neutrophil percentage, lymphocyte percentage, monocyte percentage and platelet count. The patient group showed a significantly higher total leukocyte count (7.62 ± 1.70 × 10^9 cells/l, p = 0.008), neutrophil count (5.06 ± 1.47 × 10^9 cells/l, p aggressive periodontitis and elevated total leukocyte (p = 0.012) and neutrophil counts (p = 0.001). The findings of the present study suggest that patients with generalised aggressive periodontitis might also demonstrate a systemic inflammatory response, as evidenced by increased leukocyte counts. This systemic inflammatory response observed in patients with generalised aggressive periodontitis may be associated with an increased risk for cardiovascular diseases.
Vas, A; Laws, P; Marsland, Am; McQuillan, O
2013-09-01
We describe the case of an HIV-1-infected patient presenting to hospital with a severe cutaneous adverse drug reaction shortly after commencing dapsone therapy as Pneumocystis jirovecii pneumonia prophylaxis. To the best of our knowledge, acute generalised exanthematous pustulosis has not been reported as a reaction to dapsone in the setting of HIV.
Vicsek, Lilla
2010-01-01
In this paper I discuss some concerns related to the analysis of focus groups: (a) the issue of generalisation; (b) the problems of using numbers and quantifying in the analysis; (c) how the concrete situation of the focus groups could be included in the analysis, and (d) what formats can be used when quoting from focus groups. Problems with…
Generalised Multi-sequence Shift-Register Synthesis using Module Minimisation
DEFF Research Database (Denmark)
Nielsen, Johan Sebastian Rosenkilde
2013-01-01
We show how to solve a generalised version of the Multi-sequence Linear Feedback Shift-Register (MLFSR) problem using minimisation of free modules over F[x]. We show how two existing algorithms for minimising such modules run particularly fast on these instances. Furthermore, we show how one...
A retrospective study of carbamazepine therapy in the treatment of idiopathic generalised epilepsy
LENUS (Irish Health Repository)
O'Connor, G
2011-05-01
Objective: The exacerbation of idiopathic generalised epilepsy (IGE) by some anti-epileptic drugs (AEDs) such as carbamazepine (CBZ) has been well documented. However, it is unclear whether IGE is always worsened by the use of CBZ, or whether some patients with IGE benefit from its use.
Ferrara, Francesca; Sinclair, Nathalie
2016-01-01
This paper focuses on pattern generalisation as a way to introduce young students to early algebra. We build on research on patterning activities that feature, in their work with algebraic thinking, both looking for sameness recursively in a pattern (especially figural patterns, but also numerical ones) and conjecturing about function-based…
The physical origin of the uncertainty theorem
Giese, Albrecht
2011-09-01
The uncertainty principle is an important element of quantum mechanics. It deals with certain pairs of physical parameters which cannot be determined to an arbitrary level of precision at the same time. According to the so-called Copenhagen interpretation of quantum mechanics, this uncertainty is an intrinsic property of the physical world. - This paper intends to show that there are good reasons for adopting a different view. According to the author, the uncertainty is not a property of the physical world but rather a limitation of our knowledge about the actual state of a physical process. This view conforms to the quantum theory of Louis de Broglie and to Einstein's interpretation.
The physical origins of the uncertainty theorem
Giese, Albrecht
2013-10-01
The uncertainty principle is an important element of quantum mechanics. It deals with certain pairs of physical parameters which cannot be determined to an arbitrary level of precision at the same time. According to the so-called Copenhagen interpretation of quantum mechanics, this uncertainty is an intrinsic property of the physical world. - This paper intends to show that there are good reasons for adopting a different view. According to the author, the uncertainty is not a property of the physical world but rather a limitation of our knowledge about the actual state of a physical process. This view conforms to the quantum theory of Louis de Broglie and to Albert Einstein's interpretation.
Large-uncertainty intelligent states for angular momentum and angle
International Nuclear Information System (INIS)
Goette, Joerg B; Zambrini, Roberta; Franke-Arnold, Sonja; Barnett, Stephen M
2005-01-01
The equality in the uncertainty principle for linear momentum and position is obtained for states which also minimize the uncertainty product. However, in the uncertainty relation for angular momentum and angular position both sides of the inequality are state dependent and therefore the intelligent states, which satisfy the equality, do not necessarily give a minimum for the uncertainty product. In this paper, we highlight the difference between intelligent states and minimum uncertainty states by investigating a class of intelligent states which obey the equality in the angular uncertainty relation while having an arbitrarily large uncertainty product. To develop an understanding of the uncertainties of angle and angular momentum for the large-uncertainty intelligent states, we compare exact solutions with analytical approximations in two limiting cases.
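The contrast the abstract draws can be written out as background. The angular relation below is the Barnett-Pegg form; the notation, with P(θ_0) the angular probability density at the boundary of the chosen 2π window, is an assumption of this sketch rather than a transcription of the paper's conventions:

```latex
% Linear momentum and position: a state-independent bound
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
% Angle and angular momentum: the bound itself depends on the state
\Delta \phi \, \Delta L_z \;\ge\; \frac{\hbar}{2} \left| 1 - 2\pi P(\theta_0) \right|
```

Intelligent states saturate the second inequality, but because its right-hand side varies with the state, saturation need not minimise the uncertainty product; this is the gap the large-uncertainty intelligent states exploit.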
Uncertainty for Part Density Determination: An Update
Energy Technology Data Exchange (ETDEWEB)
Valdez, Mario Orlando [Los Alamos National Laboratory
2016-12-14
Accurate and precise density measurement by hydrostatic weighing requires the use of an analytical balance, configured with a suspension system, to measure the weight of a part both in water and in air. Additionally, the densities of these media (water and air) must be precisely known for the part density determination. To validate the accuracy and precision of these measurements, uncertainty statements are required. The work in this report is a revision of an original report written more than a decade ago, specifically applying principles and guidelines suggested by the Guide to the Expression of Uncertainty in Measurement (GUM) for determining the part density uncertainty through sensitivity analysis. In this work, updated derivations are provided; an original example is revised with the updated derivations; and an appendix is provided solely for uncertainty evaluations using Monte Carlo techniques, specifically the NIST Uncertainty Machine, as a viable alternative method.
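The underlying measurement equation is the Archimedes relation, and the GUM sensitivity analysis propagates the weighing uncertainties through it. A simplified sketch follows; air buoyancy is neglected and all numbers are illustrative assumptions, not values from the report:

```python
import numpy as np

def density_hydrostatic(m_air, m_water, rho_water):
    # Archimedes: part density from apparent masses in air and in water,
    # rho = m_air * rho_water / (m_air - m_water)  (air buoyancy neglected)
    return m_air * rho_water / (m_air - m_water)

def density_uncertainty(m_air, m_water, rho_water, u_ma, u_mw, u_rw):
    # GUM-style combined standard uncertainty via sensitivity coefficients
    d = m_air - m_water
    c_ma = -m_water * rho_water / d**2   # d(rho)/d(m_air)
    c_mw = m_air * rho_water / d**2      # d(rho)/d(m_water)
    c_rw = m_air / d                     # d(rho)/d(rho_water)
    return np.sqrt((c_ma * u_ma)**2 + (c_mw * u_mw)**2 + (c_rw * u_rw)**2)

# Illustrative inputs: masses in grams, water density in g/cm^3
rho = density_hydrostatic(100.0, 90.0, 0.9982)
u = density_uncertainty(100.0, 90.0, 0.9982, 0.001, 0.001, 0.0001)
```

The Monte Carlo alternative mentioned in the abstract would instead sample the inputs from their assumed distributions and take the standard deviation of the resulting density values.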
International Nuclear Information System (INIS)
Brain, P; Strimenopoulou, F; Ivarsson, M; Wilson, F J; Diukova, A; Wise, R G; Berry, E; Jolly, A; Hall, J E
2014-01-01
Conventional analysis of clinical resting electroencephalography (EEG) recordings typically involves assessment of spectral power in pre-defined frequency bands at specific electrodes. EEG is a potentially useful technique in drug development for measuring the pharmacodynamic (PD) effects of a centrally acting compound and hence to assess the likelihood of success of a novel drug based on pharmacokinetic–pharmacodynamic (PK–PD) principles. However, the need to define the electrodes and spectral bands to be analysed a priori is limiting where the nature of the drug-induced EEG effects is initially not known. We describe the extension to human EEG data of a generalised semi-linear canonical correlation analysis (GSLCCA), developed for small animal data. GSLCCA uses data from the whole spectrum, the entire recording duration and multiple electrodes. It provides interpretable information on the mechanism of drug action and a PD measure suitable for use in PK–PD modelling. Data from a study with low (analgesic) doses of the μ-opioid agonist, remifentanil, in 12 healthy subjects were analysed using conventional spectral edge analysis and GSLCCA. At this low dose, the conventional analysis was unsuccessful but plausible results consistent with previous observations were obtained using GSLCCA, confirming that GSLCCA can be successfully applied to clinical EEG data. (paper)
Uncertainty evaluation of data and information fusion within the context of the decision loop
CSIR Research Space (South Africa)
De Villiers, J Pieter
2016-07-01
In this paper, the principal taxonomy of the fusion process, the decision loop, is unified with uncertainty quantification and representation. A typical flow of information in the decision loop takes the form of raw information, uncertainty...
General principles of quantum mechanics
International Nuclear Information System (INIS)
Pauli, W.
1980-01-01
This book is a textbook for a course in quantum mechanics. Starting from complementarity and the uncertainty principle, Schroedinger's equation is introduced together with the operator calculus. Then stationary states are treated as eigenvalue problems. Furthermore, matrix mechanics is briefly discussed. Thereafter the theory of measurements is considered. As approximation methods, perturbation theory and the WKB approximation are introduced. Then identical particles, spin, and the exclusion principle are discussed. Thereafter the semiclassical theory of radiation and the relativistic one-particle problem are discussed. Finally an introduction is given into quantum electrodynamics. (HSI)
Uncertainty vs. Information (Invited)
Nearing, Grey
2017-04-01
Information theory is the branch of logic that describes how rational epistemic states evolve in the presence of empirical data (Knuth, 2005), and any logic of science is incomplete without such a theory. Developing a formal philosophy of science that recognizes this fact results in essentially trivial solutions to several longstanding problems that are generally considered intractable, including: • Alleviating the need for any likelihood function or error model. • Derivation of purely logical falsification criteria for hypothesis testing. • Specification of a general quantitative method for process-level model diagnostics. More generally, I make the following arguments: 1. Model evaluation should not proceed by quantifying and/or reducing error or uncertainty, and instead should be approached as a problem of ensuring that our models contain as much information as our experimental data. I propose that the latter is the only question a scientist actually has the ability to ask. 2. Instead of building geophysical models as solutions to differential equations that represent conservation laws, we should build models as maximum entropy distributions constrained by conservation symmetries. This will allow us to derive predictive probabilities directly from first principles. Knuth, K. H. (2005) 'Lattice duality: The origin of probability and entropy', Neurocomputing, 67, pp. 245-274.
Quantum Uncertainty and Fundamental Interactions
Directory of Open Access Journals (Sweden)
Tosto S.
2013-04-01
The paper proposes a simplified theoretical approach to infer some essential concepts on the fundamental interactions between charged particles and their relative strengths at comparable energies by exploiting quantum uncertainty only. The worth of the present approach relies on the way of obtaining the results, rather than on the results themselves: concepts today acknowledged as fingerprints of the electroweak and strong interactions appear indeed rooted in the same theoretical frame, which also includes the basic principles of special and general relativity along with the gravity force.
Uncertainty inequalities for the Heisenberg group
Indian Academy of Sciences (India)
Wilkesman, Jeff; Kurz, Liliana
2017-01-01
Zymography, the detection, identification and even quantification of enzyme activity fractionated by gel electrophoresis, has received increasing attention in recent years, as revealed by the number of articles published. A number of enzymes, especially those of clinical interest, are routinely detected by zymography. This introductory chapter reviews the major principles behind zymography. Recent advances in the method, chiefly two-dimensional zymography and transfer zymography, are explained in the remaining chapters. Some general considerations for performing the experiments are outlined, as well as the major troubleshooting and safety issues necessary for the correct conduct of the electrophoresis.
International Nuclear Information System (INIS)
Wilson, P.D.
1996-01-01
Some basic explanations are given of the principles underlying the nuclear fuel cycle, starting with the physics of atomic and nuclear structure and continuing with nuclear energy and reactors, fuel and waste management and finally a discussion of economics and the future. An important aspect of the fuel cycle concerns the possibility of ''closing the back end'', i.e. reprocessing the waste or unused fuel in order to re-use it in reactors of various kinds. The alternative, the ''once-through'' cycle, discards the discharged fuel completely. An interim measure involves the prolonged storage of highly radioactive waste fuel. (UK)
Role of information theoretic uncertainty relations in quantum theory
International Nuclear Information System (INIS)
Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo
2015-01-01
Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and the Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and the Shannon ITUR is demonstrated in these cases. Further salient issues, such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations, are also discussed.
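The kind of entropic uncertainty relation discussed here can be illustrated numerically. A minimal sketch, assuming a single qubit measured in two mutually unbiased bases, where the Maassen-Uffink bound H(A) + H(B) >= -2 ln c applies with maximum basis overlap c = 1/sqrt(2); the state and numbers below are illustrative, not from the paper:

```python
import numpy as np

def shannon(p):
    # Shannon entropy in nats, ignoring zero-probability outcomes
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi(p, alpha):
    # Rényi entropy H_a(p) = ln(sum p_i^a) / (1 - a); tends to Shannon as a -> 1
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# A qubit state and two mutually unbiased measurement bases
psi = np.array([np.cos(0.3), np.sin(0.3)])
p_z = np.abs(psi) ** 2                          # computational basis
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
p_x = np.abs(H @ psi) ** 2                      # Hadamard (conjugate) basis

# Maassen-Uffink: H(Z) + H(X) >= -2 ln(1/sqrt(2)) = ln 2
bound = np.log(2.0)
assert shannon(p_z) + shannon(p_x) >= bound - 1e-12
```

Since Rényi entropy is non-increasing in its order, `renyi(p_z, 2)` never exceeds `shannon(p_z)`, which is the kind of ordering the generalised ITURs rest on.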
Generalisation benefits of output gating in a model of prefrontal cortex
Kriete, Trent; Noelle, David C.
2011-06-01
The prefrontal cortex (PFC) plays a central role in flexible cognitive control, including the suppression of habitual responding in favour of situation-appropriate behaviours that can be quite novel. PFC provides a kind of working memory, maintaining the rules, goals, and/or actions that are to control behaviour in the current context. For flexible control, these PFC representations must be sufficiently componential to support systematic generalisation to novel situations. The anatomical structure of PFC can be seen as implementing a componential 'slot-filler' structure, with different components encoded over isolated pools of neurons. Previous PFC models have highlighted the importance of a dynamic gating mechanism to selectively update individual 'slot' contents. In this article, we present simulation results that suggest that systematic generalisation also requires an 'output gating' mechanism that limits the influence of PFC on more posterior brain areas to reflect a small number of representational components at any one time.
The use of oral fluralaner for the treatment of feline generalised demodicosis: a case report.
Matricoti, I; Maina, E
2017-08-01
There is little agreement on the most effective and safest treatment for feline demodicosis. Protocols generally consist of long-lasting therapy courses based on rinses, subcutaneous injections, oral drug administration or repeated spot-on formulation and the efficacy of most of these is poorly documented. Many of these products have also been associated with adverse effects and may be difficult to administer in cats, leading to poor owner compliance and treatment failure. This case report describes the successful use of fluralaner in treating a generalised form of demodicosis caused by Demodex cati in an adult cat that was probably triggered by chronic glucocorticoid administration. After a single oral dose of 28 mg/kg fluralaner, negative skin scrapings were obtained within one month and clinical cure within two months. No side effects were observed. Larger studies are needed to evaluate the efficacy of fluralaner in treating feline generalised demodicosis. © 2017 British Small Animal Veterinary Association.
A Note on the Properties of Generalised Separable Spatial Autoregressive Process
Directory of Open Access Journals (Sweden)
Mahendran Shitan
2009-01-01
Spatial modelling has its applications in many fields like geology, agriculture, meteorology, geography, and so forth. In time series, a class of models known as Generalised Autoregressive (GAR) has been introduced by Peiris (2003) that includes an index parameter δ. It has been shown that the inclusion of this additional parameter aids in modelling and forecasting many real data sets. This paper studies the properties of a new class of spatial autoregressive processes of order 1 with an index. We will call this a Generalised Separable Spatial Autoregressive (GENSSAR) Model. The spectral density function (SDF), the autocovariance function (ACVF), and the autocorrelation function (ACF) are derived. The theoretical ACF and SDF plots are presented as three-dimensional figures.
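The separable structure can be made concrete with a small numerical sketch. The product form of the SDF below is an assumed reading of the GENSSAR model, two GAR(1) factors (one per spatial dimension) each raised to its index, not a transcription of the paper's exact conventions:

```python
import numpy as np

def genssar_sdf(w1, w2, phi=(0.5, 0.5), delta=(1.0, 1.0), sigma2=1.0):
    """Assumed separable spectral density of a GAR(1) x GAR(1) field:

        f(w1, w2) = sigma2/(4 pi^2) * |1 - phi1 e^{-i w1}|^{-2 d1}
                                    * |1 - phi2 e^{-i w2}|^{-2 d2}

    With delta = (1, 1) this reduces to the usual separable AR(1) SDF.
    """
    a1 = np.abs(1 - phi[0] * np.exp(-1j * w1)) ** (-2 * delta[0])
    a2 = np.abs(1 - phi[1] * np.exp(-1j * w2)) ** (-2 * delta[1])
    return sigma2 / (4 * np.pi**2) * a1 * a2

# For positive phi the density peaks at the origin of the frequency plane
f0 = genssar_sdf(0.0, 0.0)
```

Evaluating this on a grid of (w1, w2) values reproduces the kind of three-dimensional SDF surface the paper presents.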
Knee function in 10-year-old children and adults with Generalised Joint Hypermobility
DEFF Research Database (Denmark)
Juul-Kristensen, Birgit; Hansen, Henrik; Simonsen, Erik B
2012-01-01
PURPOSE: Knee function is reduced in patients with Benign Joint Hypermobility Syndrome. The aim was to study knee function in children and adults with Generalised Joint Hypermobility (GJH) and Non-GJH (NGJH). MATERIALS AND METHODS: In a matched comparative study, 39 children and 36 adults (mean age: children 10.2 years; adults 40.3 years) were included, comprising 19 children and 18 adults with GJH (Beighton ≥5/9 for children; ≥4/9 for adults), minimum one hypermobile knee, no knee pain (children), and 20 children and 18 adults with NGJH (Beighton
Alegre, Federico; Amehraye, Asmaa; Evans, Nicholas
2013-01-01
The vulnerability of automatic speaker verification systems to spoofing is now well accepted. While recent work has shown the potential to develop countermeasures capable of detecting spoofed speech signals, existing solutions typically function well only for specific attacks on which they are optimised. Since the exact nature of spoofing attacks can never be known in practice, there is thus a need for generalised countermeasures which can detect previously unseen spoo...
Vakhnenko, V O; Morrison, A J
2003-01-01
A Baecklund transformation both in bilinear and in ordinary form for the transformed generalised Vakhnenko equation (GVE) is derived. It is shown that the equation has an infinite sequence of conservation laws. An inverse scattering problem is formulated; it has a third-order eigenvalue problem. A procedure for finding the exact N-soliton solution to the GVE via the inverse scattering method is described. The procedure is illustrated by considering the cases N=1 and 2.
Generalised zeta-function regularization for scalar one-loop effective action
Cognola, Guido; Zerbini, Sergio
2004-01-01
The one-loop effective action for a scalar field defined in an ultrastatic space-time, where non-standard logarithmic terms are present in the asymptotic heat-kernel expansion, is investigated by a generalisation of zeta-function regularisation. It is shown that additional divergences may appear at one-loop level. The one-loop renormalisability of the model is discussed and the one-loop renormalisation group equations are derived.
Aspects of string theory compactifications. D-brane statistics and generalised geometry
International Nuclear Information System (INIS)
Gmeiner, F.
2006-01-01
In this thesis we investigate two different aspects of string theory compactifications. The first part deals with the issue of the huge amount of possible string vacua, known as the landscape. Concretely we investigate a specific well defined subset of type II orientifold compactifications. We develop the necessary tools to construct a very large set of consistent models and investigate their gauge sector on a statistical basis. In particular we analyse the frequency distributions of gauge groups and the possible amount of chiral matter for compactifications to six and four dimensions. In the phenomenologically relevant case of four-dimensional compactifications, special attention is paid to solutions with gauge groups that include those of the standard model, as well as Pati-Salam, SU(5) and flipped SU(5) models. Additionally we investigate the frequency distribution of coupling constants and correlations between the observables in the gauge sector. These results are compared with a recent study of Gepner models. Moreover, we elaborate on questions concerning the finiteness of the number of solutions and the computational complexity of the algorithm. In the second part of this thesis we consider a new mathematical framework, called generalised geometry, to describe the six-manifolds used in string theory compactifications. In particular, the formulation of T-duality and mirror symmetry for nonlinear topological sigma models is investigated. Therefore we provide a reformulation and extension of the known topological A- and B-models to the generalised framework. The action of mirror symmetry on topological D-branes in this setup is presented and the transformation of the boundary conditions is analysed. To extend the considerations to D-branes in type II string theory, we introduce the notion of generalised calibrations. We show that the known calibration conditions of supersymmetric branes in type IIA and IIB can be obtained as special cases. Finally we investigate
Aspects of string theory compactifications. D-brane statistics and generalised geometry
Energy Technology Data Exchange (ETDEWEB)
Gmeiner, F.
2006-05-26
In this thesis we investigate two different aspects of string theory compactifications. The first part deals with the issue of the huge amount of possible string vacua, known as the landscape. Concretely we investigate a specific well defined subset of type II orientifold compactifications. We develop the necessary tools to construct a very large set of consistent models and investigate their gauge sector on a statistical basis. In particular we analyse the frequency distributions of gauge groups and the possible amount of chiral matter for compactifications to six and four dimensions. In the phenomenologically relevant case of four-dimensional compactifications, special attention is paid to solutions with gauge groups that include those of the standard model, as well as Pati-Salam, SU(5) and flipped SU(5) models. Additionally we investigate the frequency distribution of coupling constants and correlations between the observables in the gauge sector. These results are compared with a recent study of Gepner models. Moreover, we elaborate on questions concerning the finiteness of the number of solutions and the computational complexity of the algorithm. In the second part of this thesis we consider a new mathematical framework, called generalised geometry, to describe the six-manifolds used in string theory compactifications. In particular, the formulation of T-duality and mirror symmetry for nonlinear topological sigma models is investigated. Therefore we provide a reformulation and extension of the known topological A- and B-models to the generalised framework. The action of mirror symmetry on topological D-branes in this setup is presented and the transformation of the boundary conditions is analysed. To extend the considerations to D-branes in type II string theory, we introduce the notion of generalised calibrations. We show that the known calibration conditions of supersymmetric branes in type IIA and IIB can be obtained as special cases. Finally we investigate
A study of the one dimensional total generalised variation regularisation problem
Papafitsoros, Konstantinos
2015-03-01
© 2015 American Institute of Mathematical Sciences. In this paper we study the one dimensional second order total generalised variation regularisation (TGV) problem with L2 data fitting term. We examine the properties of this model and we calculate exact solutions using simple piecewise affine functions as data terms. We investigate how these solutions behave with respect to the TGV parameters and we verify our results using numerical experiments.
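The second-order TGV problem with L2 data fitting described above, minimising over pairs (u, w) the energy 0.5*||u - f||^2 + alpha1*sum|u' - w| + alpha0*sum|w'|, can be explored with a rough numerical sketch. The smoothed absolute value, the L-BFGS-B solver, and all parameter values are convenience assumptions of ours; the paper computes exact solutions analytically, which this sketch does not reproduce:

```python
import numpy as np
from scipy.optimize import minimize

def tgv2_denoise_1d(f, alpha1, alpha0, eps=1e-8):
    """Minimise 0.5*||u - f||^2 + alpha1*sum|Du - w| + alpha0*sum|Dw|
    over the pair (u, w); |.| is smoothed as sqrt(x^2 + eps) for the solver."""
    n = len(f)

    def energy(z):
        u, w = z[:n], z[n:]
        du = np.diff(u)  # forward differences Du
        dw = np.diff(w)  # Dw
        return (0.5*np.sum((u - f)**2)
                + alpha1*np.sum(np.sqrt((du - w[:-1])**2 + eps))
                + alpha0*np.sum(np.sqrt(dw**2 + eps)))

    z0 = np.concatenate([f, np.zeros(n)])  # start from the data itself
    res = minimize(energy, z0, method="L-BFGS-B")
    return res.x[:n], res.fun

# piecewise affine signal plus noise, matching the paper's class of data terms
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 80)
clean = np.where(t < 0.5, 2*t, 1.5 - t)
noisy = clean + 0.05*rng.standard_normal(t.size)
u, fv = tgv2_denoise_1d(noisy, alpha1=0.05, alpha0=0.05)
```

Varying alpha1 and alpha0 here gives a hands-on feel for the parameter dependence the paper analyses exactly.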
Johnson, Thomas
2018-01-01
In a recent seminal paper [D-H-R] of Dafermos, Holzegel and Rodnianski, the linear stability of the Schwarzschild family of black hole solutions to the Einstein vacuum equations was established by imposing a double null gauge. In this paper we shall prove that the Schwarzschild family is linearly stable as solutions to the Einstein vacuum equations by imposing instead a generalised wave gauge: all sufficiently regular solutions to the system of equations that result from linearising the...
Effect of lamotrigine on cerebral blood flow in patients with idiopathic generalised epilepsy
Energy Technology Data Exchange (ETDEWEB)
Joo, Eun Yeon [Ewha Womans University, Department of Neurology, College of Medicine, Seoul (Korea); Hong, Seung Bong; Tae, Woo Suk; Han, Sun Jung; Seo, Dae Won [Sungkyunkwan University School of Medicine, Department of Neurology, Samsung Medical Center and Center for Clinical Medicine, SBRI, Gangnam-Gu, Seoul (Korea); Lee, Kyung-Han [Sungkyunkwan University School of Medicine, Department of Nuclear Medicine, Samsung Medical Center and Center for Clinical Medicine, SBRI, Gangnam-Gu, Seoul (Korea); Lee, Mann Hyung [Catholic University of Daegu, College of Pharmacy, Gyeongbuk (Korea)
2006-06-15
The purpose of this study was to investigate the effects of the new anti-epileptic drug, lamotrigine, on cerebral blood flow by performing 99mTc-ethylcysteinate dimer (ECD) single-photon emission computed tomography (SPECT) before and after medication in patients with drug-naive idiopathic generalised epilepsy. Interictal 99mTc-ECD brain SPECT was performed before drug treatment started and then repeated after lamotrigine medication for 4-5 months in 30 patients with generalised epilepsy (M/F = 14/16, 19.3 ± 3.4 years). Seizure types were generalised tonic-clonic seizure in 23 patients and myoclonic seizures in seven. The mean lamotrigine dose used was 214.1 ± 29.1 mg/day. For SPM analysis, all SPECT images were spatially normalised to the standard SPECT template and then smoothed using a 12-mm full-width at half-maximum Gaussian kernel. The paired t test was used to compare pre- and post-lamotrigine SPECT images. SPM analysis of pre- and post-lamotrigine brain SPECT images showed decreased perfusion in bilateral dorsomedial nuclei of thalami, bilateral uncus, right amygdala, left subcallosal gyrus, right superior and inferior frontal gyri, right precentral gyrus, bilateral superior and inferior temporal gyri and brainstem (pons, medulla) after lamotrigine medication at a false discovery rate-corrected p<0.05. No brain region showed increased perfusion after lamotrigine administration. (orig.)
Directory of Open Access Journals (Sweden)
Abbas Mardani
2017-01-01
Rough set theory has been used extensively in fields of complexity, cognitive sciences, and artificial intelligence, especially in numerous areas such as expert systems, knowledge discovery, information systems, inductive reasoning, intelligent systems, data mining, pattern recognition, decision-making, and machine learning. Recently proposed rough set models have been developed by applying different fuzzy generalisations. Currently, there is no systematic literature review and classification of these new generalisations of rough set models. Therefore, in this review study, the attempt is made to provide a comprehensive systematic review of methodologies and applications of recent generalisations discussed in the area of fuzzy-rough set theory. On this subject, the Web of Science database has been chosen to select the relevant papers. Accordingly, the systematic review and meta-analysis approach called “PRISMA” has been applied, and the selected articles were classified based on the author and year of publication, author nationalities, application field, type of study, study category, study contribution, and journal in which the articles have appeared. Based on the results of this review, we found that there are many challenging issues related to the different application areas of fuzzy-rough set theory which can motivate future research studies.
Robustness to strategic uncertainty
Andersson, O.; Argenton, C.; Weibull, J.W.
We introduce a criterion for robustness to strategic uncertainty in games with continuum strategy sets. We model a player's uncertainty about another player's strategy as an atomless probability distribution over that player's strategy set. We call a strategy profile robust to strategic uncertainty
Fission Spectrum Related Uncertainties
Energy Technology Data Exchange (ETDEWEB)
G. Aliberti; I. Kodeli; G. Palmiotti; M. Salvatores
2007-10-01
The paper presents a preliminary uncertainty analysis related to potential uncertainties on the fission spectrum data. Consistent results are shown for a reference fast reactor design configuration and for experimental thermal configurations. However the results obtained indicate the need for further analysis, in particular in terms of fission spectrum uncertainty data assessment.
DEFF Research Database (Denmark)
Nobaew, Banphot; Ryberg, Thomas
2011-01-01
This paper proposes a new theoretical framework or visual grammar for analysing visual aspects of digital 3D games, and for understanding more deeply the notion of Visual Digital Game Literacy. The framework focuses on the development of a visual grammar by drawing on the digital literacy framework developed by Buckingham. It supplements and extends this framework by offering a more detailed account of how visual principles and elements in games can be analysed. In developing this visual grammar we draw theoretically on existing approaches within: the arts, history, film study, semiotics, multimodal analysis, and game studies. We illustrate the theoretical and analytical framework by analysing samples of screenshots and video clips collected from the online game “World of Warcraft” (WoW) where we have conducted our online research. The research data is supplemented by ethnographic data (observation...
Quantum principles and particles
Wilcox, Walter
2012-01-01
QUANTUM PRINCIPLES: Perspective and Principles; Prelude to Quantum Mechanics; Stern-Gerlach Experiment; Idealized Stern-Gerlach Results; Classical Model Attempts; Wave Functions for Two Physical-Outcome Case; Process Diagrams, Operators, and Completeness; Further Properties of Operators/Modulation; Operator Reformulation; Operator Rotation; Bra-Ket Notation/Basis States; Transition Amplitudes; Three-Magnet Setup Example (Coherence); Hermitian Conjugation; Unitary Operators; A Very Special Operator; Matrix Representations; Matrix Wave Function Recovery; Expectation Values; Wrap Up; Problems. Free Particles in One Dimension: Photoelectric Effect; Compton Effect; Uncertainty Relation for Photons; Stability of Ground States; Bohr Model; Fourier Transform and Uncertainty Relations; Schrödinger Equation; Schrödinger Equation Example; Dirac Delta Functions; Wave Functions and Probability; Probability Current; Time Separable Solutions; Completeness for Particle States; Particle Operator Properties; Operator Rules; Time Evolution and Expectation Values; Wrap-Up; Problems. Some One-Dimensional So...
Uncertainty in prediction and in inference
International Nuclear Information System (INIS)
Hilgevoord, J.; Uffink, J.
1991-01-01
The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in inference can be obtained by means of the so-called statistical distance between probability distributions. When applied to quantum mechanics, this distance leads to a measure of the distinguishability of quantum states, which essentially is the absolute value of the matrix element between the states. The importance of this result to the quantum mechanical uncertainty principle is noted. The second part of the paper provides a derivation of the statistical distance on the basis of the so-called method of support
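For discrete probability distributions, the statistical distance referred to above can be taken as the arccosine of the Bhattacharyya coefficient, which mirrors the quantum-state overlap (the absolute value of the matrix element between states) mentioned in the abstract. A minimal sketch; the function name is ours:

```python
import numpy as np

def statistical_distance(p, q):
    """Statistical distance between two discrete probability distributions:
    d(p, q) = arccos( sum_i sqrt(p_i * q_i) ), the arccosine of the
    Bhattacharyya coefficient."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    bc = np.sum(np.sqrt(p * q))  # Bhattacharyya coefficient, in [0, 1]
    return float(np.arccos(np.clip(bc, 0.0, 1.0)))

# identical distributions are at distance 0; distributions with disjoint
# support are at the maximal distance pi/2, i.e. perfectly distinguishable
d_same = statistical_distance([0.5, 0.5], [0.5, 0.5])
d_disjoint = statistical_distance([1.0, 0.0], [0.0, 1.0])
```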
International Nuclear Information System (INIS)
Andres, T.H.
2002-05-01
This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
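The Monte Carlo propagation of random input uncertainties described here can be sketched as follows; the independent Gaussian inputs and the toy product model are illustrative assumptions of ours, not part of the guide:

```python
import numpy as np

def propagate_mc(model, means, sds, n=100_000, seed=0):
    """Propagate independent Gaussian input uncertainties through a
    vectorised model by plain Monte Carlo sampling."""
    rng = np.random.default_rng(seed)
    # one row per sample, one column per input quantity
    samples = rng.normal(means, sds, size=(n, len(means)))
    out = model(*samples.T)
    return out.mean(), out.std(ddof=1)

# toy model y = a*b with a = 2.0 +/- 0.1 and b = 3.0 +/- 0.2;
# first-order error propagation predicts sd(y) ~ sqrt((3*0.1)^2 + (2*0.2)^2) = 0.5
mean_y, sd_y = propagate_mc(lambda a, b: a*b, [2.0, 3.0], [0.1, 0.2])
```

The sampled standard deviation agrees with the series-approximation estimate here because the input uncertainties are small; for strongly nonlinear models the two approaches the guide compares can diverge.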
Uncertainty and Cognitive Control
Directory of Open Access Journals (Sweden)
Faisal eMushtaq
2011-10-01
A growing trend of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.
Regulating fisheries under uncertainty
DEFF Research Database (Denmark)
Hansen, Lars Gårn; Jensen, Frank
2017-01-01
Regulator uncertainty is decisive for whether price or quantity regulation maximizes welfare in fisheries. In this paper, we develop a model of fisheries regulation that includes ecological uncertainty, variable economic uncertainty as well as structural economic uncertainty. We aggregate the effects of these uncertainties into a single welfare measure for comparing tax and quota regulation. It is shown that quotas are always preferred to fees when structural economic uncertainty dominates. Since most regulators are subject to this kind of uncertainty, this result is a potentially important qualification of the pro-price regulation message dominating the fisheries economics literature. We also believe that the model of a fishery developed in this paper could be applied to the regulation of other renewable resources where regulators are subject to uncertainty either directly or with some...
International Nuclear Information System (INIS)
Edgar, S Brian; Ramos, M P Machado
2007-01-01
We demonstrate an integration procedure for the generalised invariant formalism by obtaining a subclass of conformally flat pure radiation spacetimes with a negative cosmological constant. The method used is a development of the methods used earlier for pure radiation spacetimes of Petrov types O and N respectively. This subclass of spacetimes turns out to have one degree of isotropy freedom, so in this paper we have extended the integration procedure for the generalised invariant formalism to spacetimes with isotropy freedom.
Energy Technology Data Exchange (ETDEWEB)
Edgar, S Brian [Department of Mathematics, Linkoepings Universitet Linkoeping, S-581 83 (Sweden); Ramos, M P Machado [Departamento de Matematica para a Ciencia e Tecnologia, Azurem 4800-058 Guimaraes, Universidade do Minho (Portugal)
2007-05-15
We demonstrate an integration procedure for the generalised invariant formalism by obtaining a subclass of conformally flat pure radiation spacetimes with a negative cosmological constant. The method used is a development of the methods used earlier for pure radiation spacetimes of Petrov types O and N respectively. This subclass of spacetimes turns out to have one degree of isotropy freedom, so in this paper we have extended the integration procedure for the generalised invariant formalism to spacetimes with isotropy freedom.
Resolving uncertainty in chemical speciation determinations
Smith, D. Scott; Adams, Nicholas W. H.; Kramer, James R.
1999-10-01
Speciation determinations involve uncertainty in system definition and experimentation. Identification of appropriate metals and ligands from basic chemical principles, analytical window considerations, types of species and checking for consistency in equilibrium calculations are considered in system definition uncertainty. A systematic approach to system definition limits uncertainty in speciation investigations. Experimental uncertainty is discussed with an example of proton interactions with Suwannee River fulvic acid (SRFA). A Monte Carlo approach was used to estimate uncertainty in experimental data, resulting from the propagation of uncertainties in electrode calibration parameters and experimental data points. Monte Carlo simulations revealed large uncertainties present at high (>9-10) and low (monoprotic ligands. Least-squares fit the data with 21 sites, whereas linear programming fit the data equally well with 9 sites. Multiresponse fitting, involving simultaneous fluorescence and pH measurements, improved model discrimination. Deconvolution of the excitation versus emission fluorescence surface for SRFA establishes a minimum of five sites. Diprotic sites are also required for the five fluorescent sites, and one non-fluorescent monoprotic site was added to accommodate the pH data. Consistent with greater complexity, the multiresponse method had broader confidence limits than the uniresponse methods, but corresponded better with the accepted total carboxylic content for SRFA. Overall there was a 40% standard deviation in total carboxylic content for the multiresponse fitting, versus 10% and 1% for least-squares and linear programming, respectively.
Dziedzic, J; Hill, Q; Skylaris, C-K
2013-12-07
We present a method for the calculation of four-centre two-electron repulsion integrals in terms of localised non-orthogonal generalised Wannier functions (NGWFs). Our method has been implemented in the ONETEP program and is used to compute the Hartree-Fock exchange energy component of Hartree-Fock and Density Functional Theory (DFT) calculations with hybrid exchange-correlation functionals. As the NGWFs are optimised in situ in terms of a systematically improvable basis set which is equivalent to plane waves, it is possible to achieve large basis set accuracy in routine calculations. The spatial localisation of the NGWFs allows us to exploit the exponential decay of the density matrix in systems with a band gap in order to compute the exchange energy with a computational effort that increases linearly with the number of atoms. We describe the implementation of this approach in the ONETEP program for linear-scaling first principles quantum mechanical calculations. We present extensive numerical validation of all the steps in our method. Furthermore, we find excellent agreement in energies and structures for a wide variety of molecules when comparing with other codes. We use our method to perform calculations with the B3LYP exchange-correlation functional for models of myoglobin systems bound with O2 and CO ligands and confirm that the same qualitative behaviour is obtained as when the same myoglobin models are studied with the DFT+U approach which is also available in ONETEP. Finally, we confirm the linear-scaling capability of our method by performing calculations on polyethylene and polyacetylene chains of increasing length.
International Nuclear Information System (INIS)
Dziedzic, J.; Hill, Q.; Skylaris, C.-K.
2013-01-01
We present a method for the calculation of four-centre two-electron repulsion integrals in terms of localised non-orthogonal generalised Wannier functions (NGWFs). Our method has been implemented in the ONETEP program and is used to compute the Hartree-Fock exchange energy component of Hartree-Fock and Density Functional Theory (DFT) calculations with hybrid exchange-correlation functionals. As the NGWFs are optimised in situ in terms of a systematically improvable basis set which is equivalent to plane waves, it is possible to achieve large basis set accuracy in routine calculations. The spatial localisation of the NGWFs allows us to exploit the exponential decay of the density matrix in systems with a band gap in order to compute the exchange energy with a computational effort that increases linearly with the number of atoms. We describe the implementation of this approach in the ONETEP program for linear-scaling first principles quantum mechanical calculations. We present extensive numerical validation of all the steps in our method. Furthermore, we find excellent agreement in energies and structures for a wide variety of molecules when comparing with other codes. We use our method to perform calculations with the B3LYP exchange-correlation functional for models of myoglobin systems bound with O2 and CO ligands and confirm that the same qualitative behaviour is obtained as when the same myoglobin models are studied with the DFT+U approach which is also available in ONETEP. Finally, we confirm the linear-scaling capability of our method by performing calculations on polyethylene and polyacetylene chains of increasing length.
Verification of uncertainty budgets
DEFF Research Database (Denmark)
Heydorn, Kaj; Madsen, B.S.
2005-01-01
The quality of analytical results is expressed by their uncertainty, as it is estimated on the basis of an uncertainty budget; little effort is, however, often spent on ascertaining the quality of the uncertainty budget. The uncertainty budget is based on circumstantial or historical data. Observed and expected variability is tested by means of the T-test, which follows a chi-square distribution with a number of degrees of freedom determined by the number of replicates. Significant deviations between predicted and observed variability may be caused by a variety of effects, and examples will be presented; both underestimation and overestimation may occur, each leading to correcting the influence of uncertainty components according to their influence on the variability of experimental results. Some uncertainty components can be verified only with a very small number of degrees of freedom, because...
International Nuclear Information System (INIS)
Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.
2005-01-01
In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)
Heisenberg, Matrix Mechanics, and the Uncertainty Principle 4-6 ...
Indian Academy of Sciences (India)
...ion of prejudices one forms up to the age of eighteen. What he meant was that our so-called physical intuition is little more than a rough feel for the way the physical world around us behaves on everyday scales. At very small or very large scales...
NFκB in Neurons? The Uncertainty Principle in Neurobiology
Massa, Paul; Aleyasin, Hossein; Park, David S.; Mao, Xianrong; Barger, Steven W.
2007-01-01
Nuclear factor κB (NFκB) is a dynamically modulated transcription factor with an extensive literature pertaining to widespread actions across species, cell types, and developmental stages. Analysis of NFκB in a complex environment such as neural tissue suffers from a difficulty in simultaneously establishing both activity and location. But much of the available data indicate a profound recalcitrance of NFκB activation in neurons, as compared to most other cell types. Few studies to date have sought to distinguish between the various combinatorial dimers of NFκB family members. Recent research has illustrated the importance of these problems, as well as opportunities to move past them to the nuances manifest through variable activation pathways, subunit complexity, and target sequence preferences. PMID:16573643
Lacunary Fourier series and a qualitative uncertainty principle for ...
Indian Academy of Sciences (India)
Since Ff vanishes on an open set, by Lemma 2.1, Ff vanishes identically. It follows that Trace(π(f)) is zero for π ∈ Ĝ. Next, notice that any translate of f has a lacunary Fourier series. If g varies in a small enough neighborhood of the identity in G, then applying the above argument to the translated function gf(x) = f(gx) we ...
The Heisenberg Uncertainty Principle Demonstrated with An Electron Diffraction Experiment
Matteucci, Giorgio; Ferrari, Loris; Migliori, Andrea
2010-01-01
An experiment analogous to the classical diffraction of light from a circular aperture has been realized with electrons. The results are used to introduce undergraduate students to the wave behaviour of electrons. The diffraction fringes produced by the circular aperture are compared to those predicted by quantum mechanics and are exploited to…
Lacunary Fourier series and a qualitative uncertainty principle for ...
Indian Academy of Sciences (India)
−1) dg. Since one can choose a metric on G which is biinvariant, it is clear that Ff vanishes in a small enough neighborhood of the identity in G. Also, Ff is a central function and so is determined by its restriction (still denoted by Ff) to the maximal torus T. A simple computation using Schur's orthogonality relations shows that ...
Model uncertainty and probability
International Nuclear Information System (INIS)
Parry, G.W.
1994-01-01
This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses represent different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example.
Development of a generalised equivalent estimation approach for multi-axle vehicle handling dynamics
Ding, Jinquan; Guo, Konghui
2016-01-01
This paper devotes analytical effort to developing the 2M equivalent approach to analyse both the effect of vehicle body roll and n-axle handling on vehicle dynamics. The 1M equivalent vehicle 2DOF equation including an equivalent roll effect was derived from the conventional two-axle 3DOF vehicle model. The 1M equivalent dynamics concepts were then calculated to evaluate the steady-state steering, frequency characteristics, and root locus of the two-axle vehicle with only the effect of body roll. This 1M equivalent approach is extended to a three-axle 3DOF model to derive similar 1M equivalent mathematical identities including an equivalent roll effect. The 1M equivalent wheelbases and stability factor with the effect of the third axle or body roll, and the 2M equivalent wheelbase and stability factor including both the effect of body roll and the third-axle handling, were derived to evaluate the steady-state steering, frequency characteristics, and root locus of the three-axle vehicle. By using the recursive method, the generalised 1M equivalent wheelbase and stability factor with the effect of n-axle handling, and the 2M equivalent generalised wheelbase and stability factor including both the effect of body roll and n-axle handling, were derived to evaluate the steady-state steering, frequency characteristics, and root locus of the n-axle vehicle. The 2M equivalent approach and the developed generalised mathematical handling concepts were validated to be useful and could serve as an important tool for estimating both the effect of vehicle body roll and n-axle handling on multi-axle vehicle dynamics.
Uncertainty in artificial intelligence
Kanal, LN
1986-01-01
How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy. Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.
Uncertainties in hydrogen combustion
International Nuclear Information System (INIS)
Stamps, D.W.; Wong, C.C.; Nelson, L.S.
1988-01-01
Three important areas of hydrogen combustion with uncertainties are identified: high-temperature combustion, flame acceleration and deflagration-to-detonation transition, and aerosol resuspension during hydrogen combustion. The uncertainties associated with high-temperature combustion may affect at least three different accident scenarios: the in-cavity oxidation of combustible gases produced by core-concrete interactions, the direct containment heating hydrogen problem, and the possibility of local detonations. How these uncertainties may affect the sequence of various accident scenarios is discussed and recommendations are made to reduce these uncertainties. 40 references
Generalised morphoea with lichen sclerosus et atrophicus and unusual bone changes
Directory of Open Access Journals (Sweden)
Prasad P
1995-01-01
A 26-year-old male patient presented with multiple plaques on the limbs and trunk suggestive of morphoea. He also exhibited multiple, small, atrophic, hypopigmented macules on the left side of the trunk, the histopathology of which was consistent with lichen sclerosus et atrophicus (LSA). The patient developed large ulcers on the left leg and foot, and contractures with flexion deformity of the left ring and little fingers. This combination of generalised morphoea with LSA and unusual osteolytic bone changes is uncommon.
Formulation of a generalised switching CFAR with application to X-band maritime surveillance radar.
Weinberg, Graham V
2015-01-01
A generalisation of a switching based detector is examined, allowing the construction of such detectors for target detection in any clutter model of interest. Such detectors are important in radar signal processing because they are robust solutions to the management of interference. Although formulated in general terms, the theory is applied to the design of a switching constant false alarm rate detector for X-band maritime surveillance radar. It is shown that such a detector manages the problem of interference better than standard detection processes.
Yangian and SUSY symmetry of high spin parton splitting amplitudes in generalised Yang-Mills theory
Kirschner, Roland; Savvidy, George
2017-07-01
We have calculated the high spin parton splitting amplitudes postulating the Yangian symmetry of the scattering amplitudes for tensor gluons. The resulting splitting amplitudes coincide with the earlier calculations, which were based on the BCFW recursion relations. The resulting formula unifies all known splitting probabilities found earlier in gauge field theories. It describes splitting probabilities for integer and half-integer spin particles. We also checked that the splitting probabilities fulfil the generalised Kounnas-Ross 𝒩 = 1 supersymmetry relations, hinting at the fact that the underlying theory can be formulated in an explicit supersymmetric manner.
FURTHER GENERALISATIONS OF THE KUMMER-SCHWARZ EQUATION: ALGEBRAIC AND SINGULARITY PROPERTIES
Directory of Open Access Journals (Sweden)
R Sinuvasan
2017-12-01
The Kummer–Schwarz equation, 2y'y''' − 3(y'')^2 = 0, has a generalisation, (n − 1)y^(n−2)y^(n) − n(y^(n−1))^2 = 0, which shares many properties with the parent form in terms of symmetry and singularity. All equations of the class are integrable in closed form. Here we introduce a new class, (n + q − 2)y^(n−2)y^(n) − (n + q − 1)(y^(n−1))^2 = 0, which has different integrability and singularity properties.
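As a quick independent check of the parent equation above (not part of the abstract; the test function y = 1/x is our own choice), a computer algebra system confirms that y = 1/x satisfies 2y'y''' − 3(y'')^2 = 0:

```python
import sympy as sp

x = sp.symbols('x')
y = 1 / x  # candidate solution of the parent Kummer-Schwarz equation

# 2 y' y''' - 3 (y'')^2 should vanish identically for a solution
expr = 2 * sp.diff(y, x) * sp.diff(y, x, 3) - 3 * sp.diff(y, x, 2)**2
print(sp.simplify(expr))  # -> 0
```

Setting n = 3 (and q = 1 in the new class) recovers exactly this parent form, which is a useful consistency check on the reconstructed equations.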
Generalised universality of gauge thresholds in heterotic vacua with and without supersymmetry
Angelantonj, Carlo; Tsulaia, Mirian
2015-01-01
We study one-loop quantum corrections to gauge couplings in heterotic vacua with spontaneous supersymmetry breaking. Although in non-supersymmetric constructions these corrections are not protected and are typically model dependent, we show how a universal behaviour of threshold differences, typical of supersymmetric vacua, may still persist. We formulate specific conditions on the way supersymmetry should be broken for this to occur. Our analysis implies a generalised notion of threshold universality even in the case of unbroken supersymmetry, whenever extra charged massless states appear at enhancement points in the bulk of moduli space. Several examples with universality, including non-supersymmetric chiral models in four dimensions, are presented.
Uncertainty modeling process for semantic technology
Directory of Open Access Journals (Sweden)
Rommel N. Carvalho
2016-08-01
The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address the gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST), a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.
Sciacchitano, A.; Wieneke, Bernhard
2016-01-01
This paper discusses the propagation of the instantaneous uncertainty of PIV measurements to statistical and instantaneous quantities of interest derived from the velocity field. The expression of the uncertainty of vorticity, velocity divergence, mean value and Reynolds stresses is derived. It
Van Nooyen, R.R.P.; Hrachowitz, M.; Kolechkina, A.G.
2014-01-01
Even without uncertainty about the model structure or parameters, the output of a hydrological model run still contains several sources of uncertainty. These are: measurement errors affecting the input, the transition from continuous time and space to discrete time and space, which causes loss of
International Nuclear Information System (INIS)
Depres, B.; Dossantos-Uzarralde, P.
2009-01-01
More than 150 researchers and engineers from universities and the industrial world met to discuss the new methodologies developed around assessing uncertainty. About 20 papers were presented and the main topics were: methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances, and multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers
Physical Uncertainty Bounds (PUB)
Energy Technology Data Exchange (ETDEWEB)
Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-03-19
This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.
Efficacy of oral afoxolaner for the treatment of canine generalised demodicosis.
Beugnet, Frédéric; Halos, Lénaïg; Larsen, Diane; de Vos, Christa
2016-01-01
The efficacy of oral treatment with a chewable tablet containing afoxolaner 2.27% w/w (NexGard®, Merial) administered orally was assessed in eight dogs diagnosed with generalised demodicosis and compared with efficacy in eight dogs under treatment with a topical combination of imidacloprid/moxidectin (Advocate®, Bayer). Afoxolaner was administered at the recommended dose (at least 2.5 mg/kg) on Days 0, 14, 28 and 56. The topical combination of imidacloprid/moxidectin was given at the same intervals at the recommended concentration. Clinical examinations and deep skin scrapings were performed every month in order to evaluate the effect on mite numbers and the resolution of clinical signs. The percentage reductions of mite counts were 99.2%, 99.9% and 100% on Days 28, 56 and 84, respectively, in the afoxolaner-treated group, compared to 89.8%, 85.2% and 86.6% on Days 28, 56 and 84 in the imidacloprid/moxidectin-treated group. Skin condition of the dogs also improved significantly from Day 28 to Day 84 in the afoxolaner-treated group. Mite reductions were significantly higher on Days 28, 56 and 84 in the afoxolaner-treated group compared to the imidacloprid/moxidectin-treated group. The results of this study demonstrated that afoxolaner, given orally, was effective in treating dogs with generalised demodicosis within a two-month period. © F. Beugnet et al., published by EDP Sciences, 2016.
Hybrid Generalised Additive Type-2 Fuzzy-Wavelet-Neural Network in Dynamic Data Mining
Directory of Open Access Journals (Sweden)
Bodyanskiy Yevgeniy
2015-12-01
In the paper, a new hybrid system of computational intelligence is proposed. This system combines the advantages of neuro-fuzzy system of Takagi-Sugeno-Kang, type-2 fuzzy logic, wavelet neural networks and generalised additive models of Hastie-Tibshirani. The proposed system has universal approximation properties and learning capability based on the experimental data sets which pertain to the neural networks and neuro-fuzzy systems; interpretability and transparency of the obtained results due to the soft computing systems and, first of all, due to type-2 fuzzy systems; possibility of effective description of local signal and process features due to the application of systems based on wavelet transform; simplicity and speed of learning process due to generalised additive models. The proposed system can be used for solving a wide class of dynamic data mining tasks, which are connected with non-stationary, nonlinear stochastic and chaotic signals. Such a system is sufficiently simple in numerical implementation and is characterised by a high speed of learning and information processing.
Navigation towards a goal position: from reactive to generalised learned control
Energy Technology Data Exchange (ETDEWEB)
Freire da Silva, Valdinei [Laboratorio de Tecnicas Inteligentes - LTI, Escola Politecnica da Universidade de Sao Paulo, Av. Prof. Luciano Gualberto, trav.3, n.158, Cidade Universitaria Sao Paulo (Brazil); Selvatici, Antonio Henrique [Universidade Nove de Julho, Rua Vergueiro, 235, Sao Paulo (Brazil); Reali Costa, Anna Helena, E-mail: valdinei.freire@gmail.com, E-mail: antoniohps@uninove.br, E-mail: anna.reali@poli.usp.br [Laboratorio de Tecnicas Inteligentes - LTI, Escola Politecnica da Universidade de Sao Paulo, Av. Prof. Luciano Gualberto, trav.3, n.158, Cidade Universitaria Sao Paulo (Brazil)
2011-03-01
The task of navigating to a target position in space is a fairly common task for a mobile robot. It is desirable that this task is performed even in previously unknown environments. One reactive architecture explored before addresses this challenge by defining a hand-coded coordination of primitive behaviours, encoded by the Potential Fields method. Our first approach to improve the performance of this architecture adds a learning step to autonomously find the best way to coordinate primitive behaviours with respect to an arbitrary performance criterion. Because of the limitations presented by the Potential Fields method, especially in relation to non-convex obstacles, we are investigating the use of Relational Reinforcement Learning as a method to not only learn to act in the current environment, but also to generalise prior knowledge to the current environment in order to achieve the goal more quickly in a non-convex structured environment. We show the results of our previous efforts in reaching goal positions along with our current research on generalised approaches.
Directory of Open Access Journals (Sweden)
D.P. van der Nest
2017-11-01
This article explores the purpose of the use of generalised audit software (GAS) as a data analytics tool by internal audit functions in the locally controlled banking industry of South Africa. The evolution of the traditional internal audit methodology of collecting audit evidence through the conduct of interviews, the completion of questionnaires, and by testing controls on a sample basis is long overdue, and such practice in the present technological, data-driven era will soon render such an internal audit function obsolete. The research results indicate that respondents are utilising GAS for a variety of purposes but that its frequency of use is not yet optimal and that there is still much room for improvement for tests of controls purposes. The top five purposes for which the respondents make use of GAS often to always during separate internal audit engagements are: (1) to identify transactions with specific characteristics or control criteria for tests of control purposes; (2) for conducting full population analysis; (3) to identify account balances over a certain amount; (4) to identify and report on the frequency of occurrence of risks or frequency of occurrence of specific events; and (5) to obtain audit evidence about control effectiveness.
Image restoration, uncertainty, and information.
Yu, F T
1969-01-01
Some of the physical interpretations of image restoration are discussed. From the theory of information, the unrealizability of an inverse filter can be explained by degradation of information, which is due to distortion on the recorded image. Image restoration is a time and space problem, which can be recognized from the theory of relativity (the problem of image restoration is related to Heisenberg's uncertainty principle in quantum mechanics). A detailed discussion of the relationship between information and energy is given. Two general results may be stated: (1) the restoration of the image from the distorted signal is possible only if it satisfies the detectability condition; however, the restored image, at best, can only approach the maximum allowable time criterion. (2) The restoration of an image by superimposing the distorted signal (due to smearing) is a physically unrealizable method; this restoration procedure may be achieved only by the expenditure of an infinite amount of energy.
The gauge principle vs. the equivalence principle
International Nuclear Information System (INIS)
Gates, S.J. Jr.
1984-01-01
Within the context of field theory, it is argued that the role of the equivalence principle may be replaced by the principle of gauge invariance to provide a logical framework for theories of gravitation
Leximin rules for bankruptcy problems under uncertainty
Hinojosa, M. A.; Mármol, A. M.; Sánchez, F. J.
2014-01-01
We model bankruptcy problems under uncertainty under the assumption that there are several possible states of nature, each of which is identified with a different bankruptcy problem. For this multi-dimensional extension of classic bankruptcy problems, we consider situations in which agents exhibit at the same time additive preferences and leximin preferences on their possible results. We propose division rules which combine different rationality principles and guarantee efficiency with respect to leximin preferences.
Equivalence principles and electromagnetism
Ni, W.-T.
1977-01-01
The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.
Principle of accelerator mass spectrometry
International Nuclear Information System (INIS)
Matsuzaki, Hiroyuki
2007-01-01
The principle of accelerator mass spectrometry (AMS) is described mainly from technical aspects: hardware construction of AMS, measurement of isotope ratios, sensitivity of measurement (measuring limit), measuring accuracy, and application of data. The content may be summarized as follows: a rare isotope (often a long-lived radioactive isotope) can be detected through various uses of the ion energy obtained by the acceleration of ions; the measurable isotope ratio is that of a rare isotope to an abundant isotope; and a measured value of the isotope ratio carries uncertainty relative to the true one. These facts must be kept in mind when applying AMS data in research. (M.H.)
Statistical Extremes of Turbulence and a Cascade Generalisation of Euler's Gyroscope Equation
Tchiguirinskaia, Ioulia; Scherzer, Daniel
2016-04-01
Turbulence refers to a rather well defined hydrodynamical phenomenon uncovered by Reynolds. Nowadays, the word turbulence is used to designate the loss of order in many different geophysical fields and the related fundamental extreme variability of environmental data over a wide range of scales. Classical statistical techniques for estimating the extremes, being largely limited to statistical distributions, do not take into account the mechanisms generating such extreme variability. Alternative approaches to nonlinear variability are based on a fundamental property of the non-linear equations: scale invariance, which means that these equations are formally invariant under given scale transforms. Its specific framework is that of multifractals. In this framework extreme variability builds up scale by scale, leading to non-classical statistics. Although multifractals are increasingly understood as a basic framework for handling such variability, there is still a gap between their potential and their actual use. In this presentation we discuss how to deal with highly theoretical problems of mathematical physics together with a wide range of geophysical applications. We use Euler's gyroscope equation as a basic element in constructing a complex deterministic system that preserves not only the scale symmetry of the Navier-Stokes equations, but some more of their symmetries. Euler's equation has not only been the object of many theoretical investigations of the gyroscope device, but has also been generalised enough to become the basic equation of fluid mechanics. Therefore, it is no surprise that a cascade generalisation of this equation can be used to characterise the intermittency of turbulence, and to better understand the links between the multifractal exponents and the structure of a simplified, but not simplistic, version of the Navier-Stokes equations. In this way, the approach is similar to that of Lorenz, who studied how the flap of a butterfly wing could generate
Comments on 'On a proposed new test of Heisenberg's principle'
International Nuclear Information System (INIS)
Home, D.; Sengupta, S.
1981-01-01
A logical fallacy is pointed out in Robinson's analysis (J. Phys. A.; 13:877 (1980)) of a thought experiment purporting to show violation of Heisenberg's uncertainty principle. The real problem concerning the interpretation of Heisenberg's principle is precisely stated. (author)
International Nuclear Information System (INIS)
Limperopoulos, G.J.
1995-01-01
This report presents an oil project valuation under uncertainty by means of two well-known financial techniques: the Capital Asset Pricing Model (CAPM) and the Black-Scholes option pricing formula. CAPM gives a linear positive relationship between expected rate of return and risk, but does not take into consideration the aspect of flexibility, which is crucial for an irreversible investment such as an oil project. Introducing investment decision flexibility by using real options can increase the oil project value substantially. Some simple tests of the importance of stock market uncertainty for oil investments are performed. Uncertainty in stock returns is correlated with aggregate product market uncertainty according to Pindyck (1991). The results of the tests are not satisfactory due to the short data series, but introducing two other explanatory variables, the interest rate and Gross Domestic Product, improves the situation. 36 refs., 18 figs., 6 tabs
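The Black-Scholes formula referenced above has a compact closed form. A minimal sketch, with illustrative parameter values that are not taken from the report; note that the option value grows with volatility, which is the sense in which flexibility adds value under uncertainty:

```python
from math import log, sqrt, exp
from statistics import NormalDist

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option.

    S: spot price, K: strike, T: time to expiry in years,
    r: risk-free rate, sigma: volatility (both annualised).
    """
    N = NormalDist().cdf  # standard normal CDF
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Illustrative at-the-money option: higher volatility -> higher value
print(black_scholes_call(S=100, K=100, T=1.0, r=0.05, sigma=0.2))
print(black_scholes_call(S=100, K=100, T=1.0, r=0.05, sigma=0.4))
```

The same monotonicity in volatility underlies real-options valuation: greater uncertainty raises the value of the option to delay or stage an irreversible investment.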
Evaluating prediction uncertainty
International Nuclear Information System (INIS)
McKay, M.D.
1995-03-01
The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
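Latin hypercube sampling, the building block of the replicated scheme mentioned in the abstract above, stratifies every input dimension so each stratum is sampled exactly once. A minimal sketch in plain NumPy (our own illustration, not McKay's code):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """One Latin hypercube sample on [0, 1]^n_dims: each dimension is
    split into n_samples equal strata and each stratum is hit exactly once."""
    # draw a random point inside each stratum k: (k + u) / n_samples
    u = rng.random((n_samples, n_dims))
    pts = (np.arange(n_samples)[:, None] + u) / n_samples
    # independently shuffle the strata in every dimension
    for j in range(n_dims):
        pts[:, j] = rng.permutation(pts[:, j])
    return pts

rng = np.random.default_rng(0)
X = latin_hypercube(10, 3, rng)
# every column contains exactly one point per stratum [k/10, (k+1)/10)
print(np.sort((X * 10).astype(int), axis=0))
```

Replication (drawing several independent hypercubes) then allows the variance-ratio importance indicators described in the abstract to be estimated from repeated model runs.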
Introduction to uncertainty quantification
Sullivan, T J
2015-01-01
Uncertainty quantification is a topic of increasing practical importance at the intersection of applied mathematics, statistics, computation, and numerous application areas in science and engineering. This text provides a framework in which the main objectives of the field of uncertainty quantification are defined, and an overview of the range of mathematical methods by which they can be achieved. Complete with exercises throughout, the book will equip readers with both theoretical understanding and practical experience of the key mathematical and algorithmic tools underlying the treatment of uncertainty in modern applied mathematics. Students and readers alike are encouraged to apply the mathematical methods discussed in this book to their own favourite problems to understand their strengths and weaknesses, also making the text suitable as a self-study. This text is designed as an introduction to uncertainty quantification for senior undergraduate and graduate students with a mathematical or statistical back...
Uncertainty calculations made easier
International Nuclear Information System (INIS)
Hogenbirk, A.
1994-07-01
The results are presented of a neutron cross section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy integrated flux at the beginning of the cryostat in the no-gap-geometry, compared to an uncertainty of only 5% in the gap-geometry. Therefore, it is essential to take into account the exact geometry in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)
Uncertainty: lotteries and risk
Ávalos, Eloy
2011-01-01
In this paper we develop the theory of uncertainty in a context where the risks assumed by the individual are measurable and manageable. We primarily use the definition of a lottery to formulate the axioms of the individual's preferences, and its representation through the von Neumann-Morgenstern utility function. We study the expected utility theorem and its properties, the paradoxes of choice under uncertainty and, finally, measures of risk aversion with monetary lotteries.
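The von Neumann-Morgenstern representation evaluates a lottery by the expectation of a utility function. A toy illustration (the lottery and the square-root utility are our own hypothetical choices) of risk aversion under a concave utility:

```python
from math import sqrt

def expected_utility(lottery, u):
    """lottery: list of (probability, payoff) pairs; u: utility function."""
    return sum(p * u(x) for p, x in lottery)

u = sqrt  # concave utility -> risk-averse agent
lottery = [(0.5, 0.0), (0.5, 100.0)]  # fair coin: win 0 or 100

eu = expected_utility(lottery, u)  # 0.5*sqrt(0) + 0.5*sqrt(100) = 5.0
ce = eu**2                         # certainty equivalent: u(ce) = eu, so ce = 25
print(eu, ce)
```

The certainty equivalent (25) is below the lottery's expected value (50): a risk-averse agent would accept a sure payment well under the mean, which is exactly the behaviour the risk-aversion measures discussed in the abstract quantify.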
Sources of Judgmental Uncertainty
1977-09-01
To avoid primacy or recency effects, which were not part of this first study, the order of information items was varied for half of the subjects. In summary, 72 subjects were randomly assigned to two conditions of control and exposed to three conditions of orderliness. Keywords: judgmental uncertainty; primacy/recency; environmental uncertainty.
Decision making under uncertainty
International Nuclear Information System (INIS)
Wu, J.S.; Apostolakis, G.E.; Okrent, D.
1989-01-01
The theory of evidence and the theory of possibility are considered by some analysts as potential models for uncertainty. This paper discusses two issues: how formal probability theory has been relaxed to develop these uncertainty models; and the degree to which these models can be applied to risk assessment. The scope of the second issue is limited to an investigation of their compatibility for combining various pieces of evidence, which is an important problem in PRA
Relational uncertainty in service dyads
DEFF Research Database (Denmark)
Kreye, Melanie
2017-01-01
Purpose: Relational uncertainty determines how relationships develop because it enables the building of trust and commitment. However, relational uncertainty has not been explored in an inter-organisational setting. This paper investigates how organisations experience relational uncertainty in se...
Study and development of a generalised input-output system for data base management systems
International Nuclear Information System (INIS)
Zidi, Noureddine
1975-01-01
This thesis reports a study which aimed at designing and developing a software for the management and execution of all input-output actions of data base management systems. This software is also an interface between data base management systems and the various operating systems. After a recall of general characteristics of database management systems, the author presents the previously developed GRISBI system (rational management of information stored in an integrated database), and describes difficulties faced in adapting this system to the new access method (VSAM, virtual sequential access method). This led to the search for a more general solution, the development of which is presented in the second part of this thesis: environment of the generalised input-output system, architecture, internal specifications. The last part presents flowcharts and statements of the various routines [fr]
Geometric Generalisation of Surrogate Model-Based Optimisation to Combinatorial and Program Spaces
Directory of Open Access Journals (Sweden)
Yong-Hyuk Kim
2014-01-01
Full Text Available Surrogate models (SMs can profitably be employed, often in conjunction with evolutionary algorithms, in optimisation in which it is expensive to test candidate solutions. The spatial intuition behind SMs makes them naturally suited to continuous problems, and the only combinatorial problems that have been previously addressed are those with solutions that can be encoded as integer vectors. We show how radial basis functions can provide a generalised SM for combinatorial problems which have a geometric solution representation, through the conversion of that representation to a different metric space. This approach allows an SM to be cast in a natural way for the problem at hand, without ad hoc adaptation to a specific representation. We test this adaptation process on problems involving binary strings, permutations, and tree-based genetic programs.
Travelling wave solutions of (2+1)-dimensional generalised time-fractional Hirota equation
Zhang, Youwei
2018-03-01
In this article, we have developed new exact analytical solutions of a nonlinear evolution equation that appears in mathematical physics, the (2+1)-dimensional generalised time-fractional Hirota equation, which describes wave propagation in an erbium-doped nonlinear fibre with higher-order dispersion. By virtue of the tanh-expansion and complete discrimination system by means of fractional complex transform, travelling wave solutions are derived. Wave interaction for the wave propagation strength and angle of field quantity under the long wave limit is analysed: bell-shaped solitons are found, and the complex transform coefficient in the system is shown to affect the direction of the wave propagation, the patterns of the soliton interaction, and their distance and direction.
DEFF Research Database (Denmark)
Curth, Nadja Kehler; Brinck-Claussen, Ursula Ødum; Davidsen, Annette Sofie
2017-01-01
Background: People with anxiety disorders represent a significant part of a general practitioner's patient population. However, there are organisational obstacles to optimal treatment, such as a lack of coordination of illness management and limited access to evidence-based treatment such as cognitive behavioral therapy. A limited number of studies suggest that collaborative care has a positive effect on symptoms for people with anxiety disorders. However, most studies are carried out in the USA and none have reported results for social phobia or generalised anxiety disorder separately. Thus, there is a need for studies carried out in different settings for specific anxiety populations. A Danish model for collaborative care (the Collabri model) has been developed for people diagnosed with depression or anxiety disorders. The model is evaluated through four trials, of which three will be outlined...
Generalised tetanus in a 2-week-old foal: use of physiotherapy to aid recovery.
Mykkänen, A K; Hyytiäinen, H K; McGowan, C M
2011-11-01
A 2-week-old Estonian Draft foal presented with signs of severe generalised tetanus, recumbency and inability to drink. The suspected source of infection was the umbilicus. Medical treatment was administered, including tetanus antitoxin, antimicrobial therapy and phenobarbital to control tetanic spasms. In addition, an intensive physiotherapy program was carried out during the recovery period. Techniques designed for syndromes involving upper motor neuron spasticity in humans were applied. Exercises aimed at weight-bearing and mobility were executed with the help of a walking-frame. The foal made a complete recovery. To our knowledge, this is the first report of the use of physiotherapy in the treatment of tetanus in horses. © 2011 The Authors. Australian Veterinary Journal © 2011 Australian Veterinary Association.
Generalisations of Hamilton's Rule Applied to Non-Additive Public Goods Games with Random Group Size
Directory of Open Access Journals (Sweden)
James A R Marshall
2014-07-01
Full Text Available Inclusive fitness theory has been described as being limited to certain special cases of social evolution. In particular some authors argue that the theory can only be applied to social interactions having additive fitness effects, and involving only pairs of individuals. This article takes an elegant formulation of non-additive public goods games from the literature, and shows how the two main generalisations of Hamilton's rule can be applied to such games when group sizes are random. In doing so inclusive fitness theory is thus applied to a very general class of social dilemmas, thereby providing further evidence for its generality. Interestingly, one of the two predominant versions of Hamilton's rule is found to be mathematically easier to apply to the scenario considered, despite both necessarily giving equivalent predictions.
QCD amplitudes with 2 initial spacelike legs via generalised BCFW recursion
Energy Technology Data Exchange (ETDEWEB)
Kutak, Krzysztof; Hameren, Andreas van; Serino, Mirko [The H. Niewodniczański Institute of Nuclear Physics, Polish Academy of Sciences, ul. Radzikowskiego 152, 31-342, Cracow (Poland)
2017-02-02
We complete the generalisation of the BCFW recursion relation to the off-shell case, allowing for the computation of tree level scattering amplitudes for full High Energy Factorisation (HEF), i.e. with both incoming partons having a non-vanishing transverse momentum. We provide explicit results for color-ordered amplitudes with two off-shell legs in massless QCD up to 4 point, continuing the program begun in two previous papers. For the 4-fermion amplitudes, which are not BCFW-recursible, we perform a diagrammatic computation, so as to offer a complete set of expressions. We explicitly show and discuss some plots of the squared 2→2 matrix elements as functions of the differences in rapidity and azimuthal angle of the final state particles.
Generalised Adaptive Harmony Search: A Comparative Analysis of Modern Harmony Search
Directory of Open Access Journals (Sweden)
Jaco Fourie
2013-01-01
Full Text Available Harmony search (HS) was introduced in 2001 as a heuristic population-based optimisation algorithm. Since then HS has become a popular alternative to other heuristic algorithms like simulated annealing and particle swarm optimisation. However, some flaws, like the need for parameter tuning, were identified and have been the topic of much research over the last 10 years. Many variants of HS were developed to address some of these flaws, and most of them have made substantial improvements. In this paper we compare the performance of three recent HS variants: exploratory harmony search, self-adaptive harmony search, and dynamic local-best harmony search. We compare the accuracy of these algorithms using a set of well-known optimisation benchmark functions that include both unimodal and multimodal problems. Observations from this comparison led us to design a novel hybrid that combines the best attributes of these modern variants into a single optimiser called generalised adaptive harmony search.
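The basic HS loop that these variants build on can be sketched as follows. The parameter names (HMS, HMCR, PAR, bandwidth) follow the common formulation in the literature; the benchmark function and settings here are illustrative only, not taken from the paper:

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.05,
                   iters=2000, seed=0):
    """Classic Harmony Search minimisation over a box.

    hms  : harmony memory size
    hmcr : probability of drawing a value from memory
    par  : probability of pitch-adjusting a memorised value
    bw   : pitch-adjustment bandwidth (fraction of the variable range)
    """
    rng = random.Random(seed)
    # Initialise harmony memory with random solutions.
    hm = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [f(x) for x in hm]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                  # memory consideration
                v = hm[rng.randrange(hms)][d]
                if rng.random() < par:               # pitch adjustment
                    v += bw * (hi - lo) * rng.uniform(-1.0, 1.0)
            else:                                    # random consideration
                v = rng.uniform(lo, hi)
            new.append(min(max(v, lo), hi))          # clamp to the box
        s = f(new)
        worst = max(range(hms), key=lambda i: scores[i])
        if s < scores[worst]:                        # replace worst harmony
            hm[worst], scores[worst] = new, s
    best = min(range(hms), key=lambda i: scores[i])
    return hm[best], scores[best]

# Usage: minimise the sphere function on [-5, 5]^3.
x, fx = harmony_search(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 3)
```

The variants compared in the paper mostly differ in how HMCR, PAR and the bandwidth are adapted during the run rather than in this core loop.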
DEFF Research Database (Denmark)
Lal, Dennis; Ruppert, Ann-Kathrin; Trucks, Holger
2015-01-01
Genetic generalised epilepsy (GGE) is the most common form of genetic epilepsy, accounting for 20% of all epilepsies. Genomic copy number variations (CNVs) constitute important genetic risk factors of common GGE syndromes. In our present genome-wide burden analysis, large (≥ 400 kb) and rare (...%) autosomal microdeletions with high calling confidence (≥ 200 markers) were assessed by the Affymetrix SNP 6.0 array in European case-control cohorts of 1,366 GGE patients and 5,234 ancestry-matched controls. We aimed to: 1) assess the microdeletion burden in common GGE syndromes, 2) estimate the relative...... a strong impact of fundamental neurodevelopmental processes in the pathogenesis of common GGE syndromes....
Snook, Ian
2007-01-01
The Langevin and Generalised Langevin Approach To The Dynamics Of Atomic, Polymeric And Colloidal Systems is concerned with the description of aspects of the theory and use of so-called random processes to describe the properties of atomic, polymeric and colloidal systems in terms of the dynamics of the particles in the system. It provides derivations of the basic equations, the development of numerical schemes to solve them on computers and gives illustrations of application to typical systems.Extensive appendices are given to enable the reader to carry out computations to illustrate many of the points made in the main body of the book.* Starts from fundamental equations* Gives up-to-date illustration of the application of these techniques to typical systems of interest* Contains extensive appendices including derivations, equations to be used in practice and elementary computer codes
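As a flavour of the computations the book's appendices support, here is a minimal Euler-Maruyama integration of a free-particle Langevin equation; the parameter values and the equipartition check are our illustrative choices, not code from the book:

```python
import math
import random

def langevin_trajectory(steps, dt=1e-3, gamma=1.0, kT=1.0, mass=1.0, seed=0):
    """Euler-Maruyama integration of the free-particle Langevin equation
        m dv = -gamma * v dt + sqrt(2 * gamma * kT) dW.
    A minimal sketch of the dynamics the book treats; the free-particle
    choice and parameter values are illustrative only.
    """
    rng = random.Random(seed)
    sigma = math.sqrt(2.0 * gamma * kT * dt) / mass   # noise amplitude per step
    v, out = 0.0, []
    for _ in range(steps):
        v += -(gamma / mass) * v * dt + sigma * rng.gauss(0.0, 1.0)
        out.append(v)
    return out

# Usage: after equilibration, <v^2> should approach kT/m (equipartition).
vs = langevin_trajectory(200000)
v2 = sum(v * v for v in vs[100000:]) / 100000.0
```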
Efficient uncertainty minimization for fuzzy spectral clustering.
White, Brian S; Shalloway, David
2009-11-01
Spectral clustering uses the global information embedded in eigenvectors of an inter-item similarity matrix to correctly identify clusters of irregular shape, an ability lacking in commonly used approaches such as k-means and agglomerative clustering. However, traditional spectral clustering partitions items into hard clusters, and the ability to instead generate fuzzy item assignments would be advantageous for the growing class of domains in which cluster overlap and uncertainty are important. Korenblum and Shalloway [Phys. Rev. E 67, 056704 (2003)] extended spectral clustering to fuzzy clustering by introducing the principle of uncertainty minimization. However, this posed a challenging nonconvex global optimization problem that they solved by a brute-force technique unlikely to scale to data sets having more than O(10^2) items. Here we develop a method for solving the minimization problem, which can handle data sets at least two orders of magnitude larger. In doing so, we elucidate the underlying structure of uncertainty minimization using multiple geometric representations. This enables us to show how fuzzy spectral clustering using uncertainty minimization is related to and generalizes clustering motivated by perturbative analysis of almost-block-diagonal matrices. Uncertainty minimization can be applied to a wide variety of existing hard spectral clustering approaches, thus transforming them into fuzzy methods.
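To illustrate the general pipeline the abstract refers to (spectral embedding followed by fuzzy memberships), here is a sketch using a symmetric normalised Laplacian and a softmax membership step. This is our stand-in illustration, not the authors' uncertainty-minimisation algorithm; the softmax temperature and the tiny Lloyd loop are our simplifications:

```python
import numpy as np

def fuzzy_spectral_assignments(similarity, k, temperature=0.1):
    """Spectral embedding followed by soft cluster memberships."""
    w = np.asarray(similarity, dtype=float)
    d = w.sum(axis=1)
    dinv = 1.0 / np.sqrt(d)
    # Symmetric normalised Laplacian: L = I - D^{-1/2} W D^{-1/2}
    lap = np.eye(len(w)) - dinv[:, None] * w * dinv[None, :]
    _, vecs = np.linalg.eigh(lap)            # eigenvalues ascending
    emb = vecs[:, :k]                        # k smallest eigenvectors
    emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    # Deterministic farthest-point seeding, then a short Lloyd loop.
    idx = [0]
    for _ in range(1, k):
        d2 = ((emb[:, None, :] - emb[idx][None, :, :]) ** 2).sum(-1).min(1)
        idx.append(int(d2.argmax()))
    cent = emb[idx]
    for _ in range(20):
        dist = ((emb[:, None, :] - cent[None, :, :]) ** 2).sum(-1)
        hard = dist.argmin(1)
        cent = np.array([emb[hard == j].mean(0) if (hard == j).any() else cent[j]
                         for j in range(k)])
    # Fuzzy memberships: softmax of negative squared distances.
    dist = ((emb[:, None, :] - cent[None, :, :]) ** 2).sum(-1)
    logits = -dist / temperature
    logits -= logits.max(axis=1, keepdims=True)
    memb = np.exp(logits)
    return memb / memb.sum(axis=1, keepdims=True)

# Usage: two tightly connected groups of three items, weak cross links.
w = np.full((6, 6), 0.01)
w[:3, :3] = 1.0
w[3:, 3:] = 1.0
np.fill_diagonal(w, 0.0)
m = fuzzy_spectral_assignments(w, 2)
```

With overlapping clusters the membership rows move away from 0/1, which is exactly the regime where fuzzy assignments are more informative than hard ones.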
Issues of validity and generalisability in the Grade 12 English Home Language examination
Directory of Open Access Journals (Sweden)
du Plessis, Colleen Lynne
2014-12-01
Full Text Available Very little research has been devoted to evaluating the national English Home Language (HL) curriculum and assessment system. Not only is there a lack of clarity on whether the language subject is being offered at an adequately high level to meet the declared objectives of the curriculum, but the reliability of the results obtained by Grade 12 learners in the exit-level examination has been placed under suspicion. To shed some light on the issue, this study takes a close look at the language component of the school-leaving examination covering the period 2008-2012, to see whether evidence of high language ability can be generated through the current selection of task types and whether the inferred ability can be generalised to non-examination contexts. Of primary interest here are the validity of the construct on which the examination is built and the sub-abilities that are being measured, as well as the validity of the scoring. One of the key findings of the study is that the language papers cannot be considered indicators of advanced and differential language ability, only of basic and general proficiency. The lack of specifications in the design of the examination items and construction of the marking memoranda undermines the validity and reliability of the assessment. As a consequence, the inferences made on the basis of the scores obtained by examinees are highly subjective and cannot be generalised to other domains of language use. The study hopes to draw attention to the importance of the format and design of the examination papers in maintaining educational standards.
Estimating incidence from prevalence in generalised HIV epidemics: methods and validation.
Directory of Open Access Journals (Sweden)
Timothy B Hallett
2008-04-01
Full Text Available HIV surveillance of generalised epidemics in Africa primarily relies on prevalence at antenatal clinics, but estimates of incidence in the general population would be more useful. Repeated cross-sectional measures of HIV prevalence are now becoming available for general populations in many countries, and we aim to develop and validate methods that use these data to estimate HIV incidence. Two methods were developed that decompose observed changes in prevalence between two serosurveys into the contributions of new infections and mortality. Method 1 uses cohort mortality rates, and method 2 uses information on survival after infection. The performance of these two methods was assessed using simulated data from a mathematical model and actual data from three community-based cohort studies in Africa. Comparison with simulated data indicated that these methods can accurately estimate incidence rates and changes in incidence in a variety of epidemic conditions. Method 1 is simple to implement but relies on locally appropriate mortality data, whilst method 2 can make use of the same survival distribution in a wide range of scenarios. The estimates from both methods are within the 95% confidence intervals of almost all actual measurements of HIV incidence in adults and young people, and the patterns of incidence over age are correctly captured. It is possible to estimate incidence from cross-sectional prevalence data with sufficient accuracy to monitor the HIV epidemic. Although these methods will theoretically work in any context, we have been able to test them only in southern and eastern Africa, where HIV epidemics are mature and generalised. The choice of method will depend on the local availability of HIV mortality data.
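A toy version of the decomposition idea might look as follows. The closed-cohort assumption, the survival of new infections over the interval, the person-year approximation, and all variable names are our simplifications for illustration, not the authors' validated method:

```python
import math

def incidence_from_prevalence(p1, p2, dt, m_neg, m_pos):
    """Toy decomposition of an observed prevalence change into new
    infections and differential mortality, in the spirit of 'method 1'
    in the abstract.

    p1, p2       : HIV prevalence at the two serosurveys
    dt           : years between the surveys
    m_neg, m_pos : annual mortality rates among HIV- and HIV+ adults
    returns      : approximate incidence per susceptible person-year
    """
    s1, i1 = 1.0 - p1, p1                        # per initial person
    s_surv = s1 * math.exp(-m_neg * dt)          # susceptible survivors
    i_surv = i1 * math.exp(-m_pos * dt)          # infected survivors
    alive = s_surv + i_surv
    new_inf = p2 * alive - i_surv                # infections needed to reach p2
    person_years = 0.5 * (s1 + s_surv) * dt      # trapezoidal exposure
    return new_inf / person_years

# Usage: prevalence rising from 20% to 22% over 3 years, despite excess
# mortality in the infected group, implies incidence of roughly 2% per year.
rate = incidence_from_prevalence(p1=0.20, p2=0.22, dt=3, m_neg=0.01, m_pos=0.10)
```

The point of the decomposition is visible in the arithmetic: flat prevalence does not mean zero incidence, because excess mortality among the infected must be offset by new infections.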
Roatta, S; Micieli, G; Bosone, D; Losano, G; Bini, R; Cavallini, A; Passatore, M
1998-07-15
There is no general agreement regarding several aspects of the role of the sympathetic system on cerebral haemodynamics such as extent of effectiveness, operational range and site of action. This study was planned to identify the effect of a generalised sympathetic activation on the cerebral haemodynamics in healthy humans before it is masked by secondary corrections, metabolic or myogenic in nature. A total of 35 healthy volunteers aged 20-35 underwent a 5 min lasting cold pressor test (CPT) performed on their left hand. The cerebral blood flow (CBF) velocity in the middle cerebral arteries and arterial blood pressure were recorded with transcranial Doppler sonography and with a non-invasive finger-cuff method, respectively. The ratio of arterial blood pressure to mean blood velocity (ABP/Vm) and Pulsatility Index (PI) were calculated throughout each trial. CPT induced an increase in mean ABP (range 2-54 mmHg depending on the subject) and only a slight, though significant, increase in blood velocity in the middle cerebral artery (+2.4 and +4.4% on ipsi- and contralateral side, respectively). During CPT, the ratio ABP/Vm increased and PI decreased in all subjects on both sides. These changes began simultaneously with the increase in blood pressure. The increase in ABP/Vm ratio is attributed to an increase in the cerebrovascular resistance, while the concomitant reduction in PI is interpreted as due to the reduction in the compliance of the middle cerebral artery. The results suggest that generalised increases in the sympathetic discharge, causing increases in ABP, can prevent concomitant increases in CBF by acting on both small resistance and large compliant vessels. This effect is also present when a slight increase in blood pressure occurs, which suggests a moderate increase in the sympathetic discharge, i.e. when ABP remains far below the upper limit of CBF autoregulation.
Uncertainty information in climate data records from Earth observation
Merchant, C. J.
2017-12-01
How to derive and present uncertainty in climate data records (CDRs) has been debated within the European Space Agency Climate Change Initiative, in search of common principles applicable across a range of essential climate variables. Various points of consensus have been reached, including the importance of improving provision of uncertainty information and the benefit of adopting international norms of metrology for language around the distinct concepts of uncertainty and error. Providing an estimate of standard uncertainty per datum (or the means to readily calculate it) emerged as baseline good practice, and should be highly relevant to users of CDRs when the uncertainty in data is variable (the usual case). Given this baseline, the role of quality flags is clarified as being complementary to and not repetitive of uncertainty information. Data with high uncertainty are not poor quality if a valid estimate of the uncertainty is available. For CDRs and their applications, the error correlation properties across spatio-temporal scales present important challenges that are not fully solved. Error effects that are negligible in the uncertainty of a single pixel may dominate uncertainty in the large-scale and long-term. A further principle is that uncertainty estimates should themselves be validated. The concepts of estimating and propagating uncertainty are generally acknowledged in geophysical sciences, but less widely practised in Earth observation and development of CDRs. Uncertainty in a CDR depends in part (and usually significantly) on the error covariance of the radiances and auxiliary data used in the retrieval. Typically, error covariance information is not available in the fundamental CDR (FCDR) (i.e., with the level-1 radiances), since provision of adequate level-1 uncertainty information is not yet standard practice. Those deriving CDRs thus cannot propagate the radiance uncertainty to their geophysical products. The FIDUCEO project (www.fiduceo.eu) is
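The abstract's point that error effects negligible per pixel can dominate a large-scale average can be illustrated with the standard equal-correlation model for the uncertainty of a mean; this model is our simplification for illustration, not a CCI formula:

```python
import math

def uncertainty_of_mean(sigma, n, rho):
    """Standard uncertainty of the mean of n data with per-datum standard
    uncertainty sigma and a common pairwise error correlation rho.

    Var(mean) = (sigma^2 / n) * (1 + (n - 1) * rho)
    With rho > 0 the uncertainty of a large-scale average floors near
    sigma * sqrt(rho) instead of shrinking like 1 / sqrt(n).
    """
    variance = (sigma ** 2 / n) * (1.0 + (n - 1) * rho)
    return math.sqrt(variance)

# Usage: averaging 10,000 pixels with per-pixel uncertainty 0.5.
u_indep = uncertainty_of_mean(0.5, 10000, 0.0)   # independent errors
u_corr = uncertainty_of_mean(0.5, 10000, 0.1)    # 10% common correlation
```

This is why the error covariance of the level-1 radiances matters: without it, the large-scale uncertainty of a derived CDR cannot be propagated correctly.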
The Bohr--Einstein ''weighing-of-energy'' debate and the principle of equivalence
International Nuclear Information System (INIS)
Hughes, R.J.
1990-01-01
The Bohr--Einstein debate over the ''weighing of energy'' and the validity of the time--energy uncertainty relation is reexamined in the context of gravitation theories that do not respect the equivalence principle. Bohr's use of the equivalence principle is shown to be sufficient, but not necessary, to establish the validity of this uncertainty relation in Einstein's ''weighing-of-energy'' gedanken experiment. The uncertainty relation is shown to hold in any energy-conserving theory of gravity, and so a failure of the equivalence principle does not engender a failure of quantum mechanics. The relationship between the gravitational redshift and the equivalence principle is reviewed
Network planning under uncertainties
Ho, Kwok Shing; Cheung, Kwok Wai
2008-11-01
One of the main focuses of network planning is the optimization of the network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can cause uncertainties in the decision models. The failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, a network planning problem with varying traffic requirements over time had been studied. Up to now, this kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty that has been studied actively in the past decade addresses fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network under uncertainties brought by fluctuations in topology, to meet the requirement that the network remains intact up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework under more general uncertainty conditions that allows a more systematic way to solve the problems. With a unified framework, the seemingly diverse models and algorithms can be intimately related and possibly more insights and improvements can be brought out for solving the problem. This motivates us to seek a
Jones, P. W.; Strelitz, R. A.
2012-12-01
The output of a simulation is best comprehended through the agency and methods of visualization, but a vital component of good science is knowledge of uncertainty. While great strides have been made in the quantification of uncertainty, especially in simulation, there is still a notable gap: there is no widely accepted means of simultaneously viewing the data and the associated uncertainty in one pane. Visualization saturates the screen, using the full range of color, shadow, opacity and tricks of perspective to display even a single variable. There is no room left in the visualization expert's repertoire for uncertainty. We present a method of visualizing uncertainty without sacrificing the clarity and power of the underlying visualization, one that works as well in 3-D and time-varying visualizations as it does in 2-D. At its heart, it relies on a principal tenet of continuum mechanics, replacing the notion of value at a point with a more diffuse notion of density as a measure of content in a region. First, the uncertainties calculated or tabulated at each point are transformed into a piecewise continuous field of uncertainty density . We next compute a weighted Voronoi tessellation of a user-specified N convex polygonal/polyhedral cells such that each cell contains the same amount of uncertainty as defined by . The problem thus devolves into minimizing . Computation of such a spatial decomposition is O(N*N), and it can be computed iteratively, making it easy to update over time. The polygonal mesh does not interfere with the visualization of the data and can be easily toggled on or off. In this representation, a small cell implies a great concentration of uncertainty, and conversely. The content-weighted polygons are identical to the cartogram familiar to the information visualization community in the depiction of things like voting results per state. Furthermore, one can dispense with the mesh or edges entirely, replacing them with symbols or glyphs
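The core idea (cells of equal uncertainty content, so small cells mark concentrated uncertainty) can be sketched in one dimension, where the equal-content partition reduces to quantile cuts of the cumulative density. This is our one-dimensional stand-in, not the paper's weighted Voronoi construction in 2-D/3-D:

```python
def equal_content_cells(density, n_cells):
    """Split a 1-D grid so each cell holds (up to grid resolution) the same
    total uncertainty 'content'.

    density : non-negative uncertainty values on a regular grid
    returns : list of (start, end) index ranges, one per cell
    """
    total = sum(density)
    cuts, acc, k = [], 0.0, 1
    for i, u in enumerate(density):
        acc += u
        # Close cells each time the running content crosses a share boundary.
        while k < n_cells and acc >= k * total / n_cells:
            cuts.append(i + 1)
            k += 1
    bounds = [0] + cuts + [len(density)]
    return list(zip(bounds[:-1], bounds[1:]))

# Usage: low uncertainty on the left, high on the right; the high-
# uncertainty region is covered by small cells, the low one by a wide cell.
cells = equal_content_cells([1, 1, 1, 1, 4, 4, 4, 4], 4)
```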
Vassilopoulos, Stephanos P.; Brouzos, Andreas; Moberly, Nicholas J.; Tsorbatzoudis, Haralambos; Tziouma, Olga
2017-01-01
Research has shown that social anxiety generalises to sporting and athletic situations. The present study explored the applicability of the Clark and Wells model of social anxiety--and its metacognitive extension--to sport anxiety. Participants were 290 students aged 11-13 years, who completed measures of sport anxiety, social anxiety, depression…
DEFF Research Database (Denmark)
Djarmati, Ana; Schneider, Susanne A; Lohmann, Katja
2009-01-01
-onset generalised dystonia with spasmodic dysphonia. This combination of symptoms might be a characteristic feature of DYT6 dystonia and could be useful in the differential diagnosis of DYT1, DYT4, DYT12, and DYT17 dystonia. In addition to the identified mutations, a rare non-coding substitution in THAP1 might...
Directory of Open Access Journals (Sweden)
Mihai OLAN
2016-12-01
Full Text Available The present paper describes the usage of the generalised object model for the analysis of the production processes of biofuels from wooden biomass. The “imposed decision” technique is employed to analyse several alternative solutions in order to find out the optimum one, to be further developed and put into practice.
DEFF Research Database (Denmark)
Kamel, Salah; Jurado, Francisco; Chen, Zhe
2016-01-01
This study proposes the generalised unified power flow controller (GUPFC) model in the hybrid current power mismatch Newton-Raphson formulation (HPCIM). In this model, active power, real and imaginary current components are injected at the terminals of series impedances of GUPFC. These injected...
Directory of Open Access Journals (Sweden)
D. A. Hughes
2011-03-01
Full Text Available This paper assesses the hydrological response to scenarios of climate change in the Okavango River catchment in Southern Africa. Climate scenarios are constructed representing different changes in global mean temperature from an ensemble of 7 climate models assessed in the IPCC AR4. The results show a substantial change in mean flow associated with a global warming of 2 °C. However, there is considerable uncertainty in the sign and magnitude of the projected changes between different climate models, implying that the ensemble mean is not an appropriate generalised indicator of impact. The uncertainty in response between different climate model patterns is considerably greater than the range due to uncertainty in hydrological model parameterisation. There is also a clear need to evaluate the physical mechanisms associated with the model projected changes in this region. The implications for water resource management policy are considered.
Dealing with exploration uncertainties
International Nuclear Information System (INIS)
Capen, E.
1992-01-01
Exploration for oil and gas should fulfill the most adventurous in their quest for excitement and surprise. This paper tries to cover that tall order. The authors will touch on the magnitude of the uncertainty (which is far greater than in most other businesses), the effects of not knowing target sizes very well, how to build uncertainty into analyses naturally, how to tie reserves and chance estimates to economics, and how to look at the portfolio effect of an exploration program. With no apologies, the authors will be using a different language for some readers - the language of uncertainty, which means probability and statistics. These tools allow one to combine largely subjective exploration information with the more analytical data from the engineering and economic side
Commonplaces and social uncertainty
DEFF Research Database (Denmark)
Lassen, Inger
2008-01-01
This article explores the concept of uncertainty in four focus group discussions about genetically modified food. In the discussions, members of the general public interact with food biotechnology scientists while negotiating their attitudes towards genetic engineering. Their discussions offer an example of risk discourse in which the use of commonplaces seems to be a central feature (Myers 2004: 81). My analyses support earlier findings that commonplaces serve important interactional purposes (Barton 1999) and that they are used for mitigating disagreement, for closing topics and for facilitating risk discourse (Myers 2005; 2007). In addition, however, I argue that commonplaces are used to mitigate feelings of insecurity caused by uncertainty and to negotiate new codes of moral conduct. Keywords: uncertainty, commonplaces, risk discourse, focus groups, appraisal
Uncertainty information in climate data records from Earth observation
Directory of Open Access Journals (Sweden)
C. J. Merchant
2017-07-01
shape of the error distribution, and the propagation of the uncertainty to the geophysical variable in the CDR accounting for its error correlation properties. Uncertainty estimates can and should be validated as part of CDR validation when possible. These principles are quite general, but the approach to providing uncertainty information appropriate to different ECVs is varied, as confirmed by a brief review across different ECVs in the CCI. User requirements for uncertainty information can conflict with each other, and a variety of solutions and compromises are possible. The concept of an ensemble CDR as a simple means of communicating rigorous uncertainty information to users is discussed. Our review concludes by providing eight concrete recommendations for good practice in providing and communicating uncertainty in EO-based climate data records.
Uncertainty information in climate data records from Earth observation
Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang
2017-07-01
error distribution, and the propagation of the uncertainty to the geophysical variable in the CDR accounting for its error correlation properties. Uncertainty estimates can and should be validated as part of CDR validation when possible. These principles are quite general, but the approach to providing uncertainty information appropriate to different ECVs is varied, as confirmed by a brief review across different ECVs in the CCI. User requirements for uncertainty information can conflict with each other, and a variety of solutions and compromises are possible. The concept of an ensemble CDR as a simple means of communicating rigorous uncertainty information to users is discussed. Our review concludes by providing eight concrete recommendations for good practice in providing and communicating uncertainty in EO-based climate data records.
Borgonovi, Francesca; Pokropek, Artur
2017-01-01
The paper examines between-country differences in the mechanisms through which education could promote generalised trust using data from 29 countries participating in the OECD's Survey of Adult Skills (PIAAC). Results indicate that education is strongly associated with generalised trust and that a large part of this association is mediated by…
Energy and Uncertainty in General Relativity
Cooperstock, F. I.; Dupre, M. J.
2018-03-01
The issue of energy and its potential localizability in general relativity has challenged physicists for more than a century. Many non-invariant measures were proposed over the years but an invariant measure was never found. We discovered the invariant localized energy measure by expanding the domain of investigation from space to spacetime. We note from relativity that the finiteness of the velocity of propagation of interactions necessarily induces indefiniteness in measurements. This is because the elements of actual physical systems being measured as well as their detectors are characterized by entire four-velocity fields, which necessarily leads to information from a measured system being processed by the detector in a spread of time. General relativity adds additional indefiniteness because of the variation in proper time between elements. The uncertainty is encapsulated in a generalized uncertainty principle, in parallel with that of Heisenberg, which incorporates the localized contribution of gravity to energy. This naturally leads to a generalized uncertainty principle for momentum as well. These generalized forms and the gravitational contribution to localized energy would be expected to be of particular importance in the regimes of ultra-strong gravitational fields. We contrast our invariant spacetime energy measure with the standard 3-space energy measure which is familiar from special relativity, appreciating why general relativity demands a measure in spacetime as opposed to 3-space. We illustrate certain authors' misconceptions of our approach.
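For orientation, the generic shape of a generalised uncertainty principle discussed in this literature can be written as follows. This is the standard quadratic GUP of the broader literature, with deformation parameter β; it is not necessarily the specific invariant spacetime measure constructed by these authors:

```latex
% Heisenberg uncertainty relation
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
% A common generalised (GUP) form with deformation parameter \beta > 0
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\bigl(1 + \beta\,(\Delta p)^2\bigr)
% Minimising the right-hand side over \Delta p yields a minimal length
\Delta x_{\min} \;=\; \hbar\sqrt{\beta}
```

The minimal length arises because the bound $\Delta x \ge \tfrac{\hbar}{2}\bigl(\tfrac{1}{\Delta p} + \beta\,\Delta p\bigr)$ is minimised at $\Delta p = 1/\sqrt{\beta}$.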
Sensitivity and uncertainty analysis
Cacuci, Dan G; Navon, Ionel Michael
2005-01-01
As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c
Uncertainty in artificial intelligence
Levitt, TS; Lemmer, JF; Shachter, RD
1990-01-01
Clearly illustrated in this volume is the current relationship between Uncertainty and AI. It has been said that research in AI revolves around five basic questions asked relative to some particular domain: What knowledge is required? How can this knowledge be acquired? How can it be represented in a system? How should this knowledge be manipulated in order to provide intelligent behavior? How can the behavior be explained? In this volume, all of these questions are addressed. From the perspective of the relationship of uncertainty to the basic questions of AI, the book divides naturally i
Uncertainty Analyses and Strategy
International Nuclear Information System (INIS)
Kevin Coppersmith
2001-01-01
The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called ''conservative'' assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the ''reasonable assurance'' approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository
Kassab, Salah E; Fida, Mariam; Radwan, Ahmed; Hassan, Adla B; Abu-Hijleh, Marwan; O'Connor, Brian P
2016-07-01
In problem-based learning (PBL), students construct concept maps that integrate different concepts related to the PBL case and are guided by the learning needs generated in small-group tutorials. Although an instrument to measure students' concept maps in PBL programmes has been developed, the psychometric properties of this instrument have not yet been assessed. This study evaluated the generalisability of and sources of variance in medical students' concept map assessment scores in a PBL context. Medical students (Year 4, n = 116) were asked to construct three integrated concept maps in which the content domain of each map was to be focused on a PBL clinical case. Concept maps were independently evaluated by four raters based on five criteria: valid selection of concepts; hierarchical arrangement of concepts; degree of integration; relationship to the context of the problem, and degree of student creativity. Generalisability theory was used to compute the reliability of the concept map scores. The dependability coefficient, which indicates the reliability of scores across the measured facets for making absolute decisions, was 0.814. Students' concept map scores (universe scores) accounted for the largest proportion of total variance (47%) across all score comparisons. Rater differences accounted for 10% of total variance, and the student × rater interaction accounted for 25% of total variance. The variance attributable to differences in the content domain of the maps was negligible (2%). The remaining 16% of the variance reflected unexplained sources of error. Results from the D study suggested that a dependability level of 0.80 can be achieved by using three raters who each score two concept map domains, or by using five raters who each score only one concept map domain. This study demonstrated that concept mapping assessment scores of medical students in PBL have high reliability. Results suggested that greater improvements in dependability might be made
Uncertainties in repository modeling
Energy Technology Data Exchange (ETDEWEB)
Wilson, J.R.
1996-12-31
The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.
Uncertainties in repository modeling
International Nuclear Information System (INIS)
Wilson, J.R.
1996-01-01
The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.
Risk, Uncertainty, and Entrepreneurship
DEFF Research Database (Denmark)
Koudstaal, Martin; Sloof, Randolph; Van Praag, Mirjam
2016-01-01
Theory predicts that entrepreneurs have distinct attitudes toward risk and uncertainty, but empirical evidence is mixed. To better understand these mixed results, we perform a large “lab-in-the-field” experiment comparing entrepreneurs to managers (a suitable comparison group) and employees (n = ...
International Nuclear Information System (INIS)
Haefele, W.; Renn, O.; Erdmann, G.
1990-01-01
The notion of 'risk' is discussed in its social and technological contexts, leading to an investigation of the terms factuality, hypotheticality, uncertainty, and vagueness, and to the problems of acceptance and acceptability especially in the context of political decision finding. (DG) [de
Courtney, H; Kirkland, J; Viguerie, P
1997-01-01
At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.
Simple Resonance Hierarchy for Surmounting Quantum Uncertainty
International Nuclear Information System (INIS)
Amoroso, Richard L.
2010-01-01
For a hundred years, violating or surmounting the Quantum Uncertainty Principle has remained a Holy Grail of both theoretical and empirical physics. Utilizing an operationally completed form of Quantum Theory, cast in a string-theoretic Higher Dimensional (HD) form of Dirac covariant polarized vacuum with a complex Einstein energy-dependent spacetime metric, M^4 ± C^4, with sufficient degrees of freedom to be causally free of the local quantum state, we present a simple empirical model for ontologically surmounting the phenomenology of uncertainty through a Sagnac Effect RF-pulsed Laser Oscillated Vacuum Energy Resonance hierarchy cast within an extended form of a Wheeler-Feynman-Cramer Transactional Calabi-Yau mirror-symmetric spacetime backcloth.
Principles of project management
1982-01-01
The basic principles of project management as practiced by NASA management personnel are presented. These principles are given as ground rules and guidelines to be used in the performance of research, development, construction or operational assignments.
Extending the Generalised Pareto Distribution for Novelty Detection in High-Dimensional Spaces.
Clifton, David A; Clifton, Lei; Hugueny, Samuel; Tarassenko, Lionel
2014-01-01
Novelty detection involves the construction of a "model of normality", and then classifies test data as being either "normal" or "abnormal" with respect to that model. For this reason, it is often termed one-class classification. The approach is suitable for cases in which examples of "normal" behaviour are commonly available, but in which cases of "abnormal" data are comparatively rare. When performing novelty detection, we are typically most interested in the tails of the normal model, because it is in these tails that a decision boundary between "normal" and "abnormal" areas of data space usually lies. Extreme value statistics provides an appropriate theoretical framework for modelling the tails of univariate (or low-dimensional) distributions, using the generalised Pareto distribution (GPD), which can be demonstrated to be the limiting distribution for data occurring within the tails of most practically-encountered probability distributions. This paper provides an extension of the GPD, allowing the modelling of probability distributions of arbitrarily high dimension, such as occurs when using complex, multimodal, multivariate distributions for performing novelty detection in most real-life cases. We demonstrate our extension to the GPD using examples from patient physiological monitoring, in which we have acquired data from hospital patients in large clinical studies of high-acuity wards, and in which we wish to determine "abnormal" patient data, such that early warning of patient physiological deterioration may be provided.
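The classical univariate peaks-over-threshold idea that the paper extends can be sketched as follows. This is an illustrative sketch, not the authors' high-dimensional method: the moment estimator, tail fraction, and synthetic Gaussian data are all assumptions made for the example.

```python
import math
import random
import statistics

def fit_gpd_moments(excesses):
    """Method-of-moments estimates of GPD shape (xi) and scale (sigma)."""
    m = statistics.mean(excesses)
    v = statistics.variance(excesses)
    xi = 0.5 * (1.0 - m * m / v)
    sigma = 0.5 * m * (m * m / v + 1.0)
    return xi, sigma

def novelty_threshold(data, tail_frac=0.1, p_abnormal=0.001):
    """Decision boundary: fit a GPD to excesses over a high threshold u,
    then invert its survival function at exceedance probability p_abnormal."""
    data = sorted(data)
    u = data[int((1.0 - tail_frac) * len(data))]
    excesses = [x - u for x in data if x > u]
    xi, sigma = fit_gpd_moments(excesses)
    p_u = len(excesses) / len(data)        # empirical P(X > u)
    # Solve p_u * (1 + xi*y/sigma)^(-1/xi) = p_abnormal for y
    if abs(xi) < 1e-9:                     # exponential-tail limit
        y = sigma * math.log(p_u / p_abnormal)
    else:
        y = (sigma / xi) * ((p_abnormal / p_u) ** (-xi) - 1.0)
    return u + y

random.seed(0)
normal_data = [random.gauss(0.0, 1.0) for _ in range(5000)]
thr = novelty_threshold(normal_data)
print(f"novelty threshold: {thr:.2f}")  # for N(0,1) data this lands near 3
```

Points above `thr` would be flagged "abnormal"; the GPD lets the boundary sit further out than the observed data, which is exactly why it suits rare-event detection.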
Generalised joint hypermobility and neurodevelopmental traits in a non-clinical adult population.
Glans, Martin; Bejerot, Susanne; Humble, Mats B
2017-09-01
Generalised joint hypermobility (GJH) is reportedly overrepresented among clinical cases of attention deficit/hyperactivity disorder (ADHD), autism spectrum disorder (ASD) and developmental coordination disorder (DCD). It is unknown if these associations are dimensional and, therefore, also relevant among non-clinical populations. To investigate if GJH correlates with sub-syndromal neurodevelopmental symptoms in a normal population. Hakim-Grahame's 5-part questionnaire (5PQ) on GJH, neuropsychiatric screening scales measuring ADHD and ASD traits, and a DCD-related question concerning clumsiness were distributed to a non-clinical, adult, Swedish population ( n =1039). In total, 887 individuals met our entry criteria. We found no associations between GJH and sub-syndromal symptoms of ADHD, ASD or DCD. Although GJH is overrepresented in clinical cases with neurodevelopmental disorders, such an association seems absent in a normal population. Thus, if GJH serves as a biomarker cutting across diagnostic boundaries, this association is presumably limited to clinical populations. None. © The Royal College of Psychiatrists 2017. This is an open access article distributed under the terms of the Creative Commons Non-Commercial, No Derivatives (CC BY-NC-ND) license.
Sleep onset uncovers thalamic abnormalities in patients with idiopathic generalised epilepsy
Directory of Open Access Journals (Sweden)
Andrew P. Bagshaw
2017-01-01
The thalamus is crucial for sleep regulation and the pathophysiology of idiopathic generalised epilepsy (IGE), and may serve as the underlying basis for the links between the two. We investigated this using EEG-fMRI and a specific emphasis on the role and functional connectivity (FC) of the thalamus. We defined three types of thalamic FC: thalamocortical, inter-hemispheric thalamic, and intra-hemispheric thalamic. Patients and controls differed in all three measures, both during wakefulness and sleep, indicating disorder-dependent and state-dependent modification of thalamic FC. Inter-hemispheric thalamic FC differed between patients and controls in somatosensory regions during wakefulness, and occipital regions during sleep. Intra-hemispheric thalamic FC was significantly higher in patients than controls following sleep onset, and disorder-dependent alterations to FC were seen in several thalamic regions always involving somatomotor and occipital regions. As interactions between thalamic sub-regions are indirect and mediated by the inhibitory thalamic reticular nucleus (TRN), the results suggest abnormal TRN function in patients with IGE, with a regional distribution which could suggest a link with the thalamocortical networks involved in the generation of alpha rhythms. Intra-thalamic FC could be a more widely applicable marker beyond patients with IGE.
Spud and FLML: generalising and automating the user interfaces of scientific computer models
Ham, D. A.; Farrell, P. E.; Maddison, J. R.; Gorman, G. J.; Wilson, C. R.; Kramer, S. C.; Shipton, J.; Collins, G. S.; Cotter, C. J.; Piggott, M. D.
2009-04-01
The interfaces by which users specify the scenarios to be simulated by scientific computer models are frequently primitive, under-documented and ad-hoc text files which make using the model in question difficult and error-prone and significantly increase the development cost of the model. We present a model-independent system, Spud[1], which formalises the specification of model input formats in terms of formal grammars. This is combined with an automatically generated graphical user interface which guides users to create valid model inputs based on the grammar provided, and a generic options reading module which minimises the development cost of adding model options. We further present FLML, the Fluidity Markup Language. FLML applies Spud to the Imperial College Ocean Model (ICOM) resulting in a graphically driven system which radically improves the usability of ICOM. As well as a step forward for ICOM, FLML illustrates how the Spud system can be applied to an existing complex ocean model highlighting the potential of Spud as a user interface for other codes in the ocean modelling community. [1] Ham, D. A. et.al, Spud 1.0: generalising and automating the user interfaces of scientific computer models, Geosci. Model Dev. Discuss., 1, 125-146, 2008.
Generalised Sandpile Dynamics on Artificial and Real-World Directed Networks.
Zachariou, Nicky; Expert, Paul; Takayasu, Misako; Christensen, Kim
2015-01-01
The main finding of this paper is a novel avalanche-size exponent τ ≈ 1.87 when the generalised sandpile dynamics evolves on the real-world Japanese inter-firm network. The topology of this network is non-layered and directed, displaying the typical bow tie structure found in real-world directed networks, with cycles and triangles. We show that one can move from a strictly layered regular lattice to a more fluid structure of the inter-firm network in a few simple steps. Relaxing the regular lattice structure by introducing an interlayer distribution for the interactions forces the scaling exponent of the avalanche-size probability density function τ out of the two-dimensional directed sandpile universality class τ = 4/3, into the mean field universality class τ = 3/2. Numerical investigation shows that these two classes are the only ones that exist on the directed sandpile, regardless of the underlying topology, as long as it is strictly layered. Randomly adding a small proportion of links connecting non-adjacent layers in an otherwise layered network takes the system out of the mean field regime and produces non-trivial avalanche-size probability density functions. Although these do not display proper scaling, they closely reproduce the behaviour observed on the Japanese inter-firm network.
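The strictly layered directed sandpile that serves as the τ = 4/3 baseline (before the relaxations described above) can be sketched in a few lines. The lattice size, drive, and periodic side boundary are illustrative assumptions, not the paper's setup:

```python
import random

def directed_avalanche(grid, layers, width, site):
    """Drop one grain at (0, site) and topple forward until stable.
    Each toppling removes 2 grains and sends one to each of two
    neighbours in the next layer; grains leaving the last layer
    dissipate. Returns the avalanche size (number of topplings)."""
    grid[0][site] += 1
    size = 0
    for layer in range(layers):          # grains only move forward,
        for j in range(width):           # so one layer-ordered sweep suffices
            while grid[layer][j] >= 2:   # toppling threshold
                grid[layer][j] -= 2
                size += 1
                if layer + 1 < layers:
                    grid[layer + 1][j] += 1
                    grid[layer + 1][(j + 1) % width] += 1  # periodic sideways
    return size

random.seed(1)
layers, width = 32, 32
grid = [[0] * width for _ in range(layers)]
sizes = [directed_avalanche(grid, layers, width, random.randrange(width))
         for _ in range(20000)]
print(f"non-empty avalanches: {sum(s > 0 for s in sizes)}, largest: {max(sizes)}")
```

Histogramming `sizes` in the steady state is how the avalanche-size density, and hence an estimate of τ, would be obtained; replacing the fixed next-layer targets with an interlayer distribution is the relaxation step the paper describes.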
Assigning Tie Points to a Generalised Building Model for Uas Image Orientation
Unger, J.; Rottensteiner, F.; Heipke, C.
2017-08-01
This paper addresses the integration of a building model into the pose estimation of image sequences. Images are captured by an Unmanned Aerial System (UAS) equipped with a camera flying in between buildings. Two approaches to assign tie points to a generalised building model in object space are presented. A direct approach is based on the distances between the object coordinates of tie points and planes of the building model. An indirect approach first finds planes within the tie point cloud that are subsequently matched to model planes; finally based on these matches, tie points are assigned to model planes. For both cases, the assignments are used in a hybrid bundle adjustment to refine the poses (image orientations). Experimental results for an image sequence demonstrate improvements in comparison to an adjustment without the building model. Differences and limitations of the two approaches for point-plane assignment are discussed - in the experiments they perform similar with respect to estimated standard deviations of tie points.
Generalised Sandpile Dynamics on Artificial and Real-World Directed Networks.
Directory of Open Access Journals (Sweden)
Nicky Zachariou
Full Text Available The main finding of this paper is a novel avalanche-size exponent τ ≈ 1.87 when the generalised sandpile dynamics evolves on the real-world Japanese inter-firm network. The topology of this network is non-layered and directed, displaying the typical bow tie structure found in real-world directed networks, with cycles and triangles. We show that one can move from a strictly layered regular lattice to a more fluid structure of the inter-firm network in a few simple steps. Relaxing the regular lattice structure by introducing an interlayer distribution for the interactions, forces the scaling exponent of the avalanche-size probability density function τ out of the two-dimensional directed sandpile universality class τ = 4/3, into the mean field universality class τ = 3/2. Numerical investigation shows that these two classes are the only that exist on the directed sandpile, regardless of the underlying topology, as long as it is strictly layered. Randomly adding a small proportion of links connecting non adjacent layers in an otherwise layered network takes the system out of the mean field regime to produce non-trivial avalanche-size probability density function. Although these do not display proper scaling, they closely reproduce the behaviour observed on the Japanese inter-firm network.
A Lagrange-based generalised formulation for the equations of motion of simple walking models.
McGrath, Michael; Howard, David; Baker, Richard
2017-04-11
Simple 2D models of walking often approximate the human body to multi-link dynamic systems, where body segments are represented by rigid links connected by frictionless hinge joints. Performing forward dynamics on the equations of motion (EOM) of these systems can be used to simulate their movement. However, deriving these equations can be time consuming. Using Lagrangian mechanics, a generalised formulation for the EOM of n-link open-loop chains is derived. This can be used for single support walking models. This has an advantage over Newton-Euler mechanics in that it is independent of coordinate system and prior knowledge of the ground reaction force (GRF) is not required. Alternative strategies, such as optimisation algorithms, can be used to estimate joint activation and simulate motion. The application of Lagrange multipliers, to enforce motion constraints, is used to adapt this general formulation for application to closed-loop chains. This can be used for double support walking models. Finally, inverse dynamics are used to calculate the GRF for these general n-link chains. The necessary constraint forces to maintain a closed-loop chain, calculated from the Lagrange multipliers, are one solution to the indeterminate problem of GRF distribution in double support models. An example of this method's application is given, whereby an optimiser estimates the joint moments by tracking kinematic data. Copyright © 2017 Elsevier Ltd. All rights reserved.
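The constrained Lagrangian machinery the abstract describes takes the following standard form (the notation here is assumed, not taken from the paper): for generalised coordinates $q_i$, Lagrangian $L = T - V$, applied joint moments $Q_i$, and holonomic loop-closure constraints $\phi_k(q) = 0$,

```latex
\frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot{q}_i}\right)
  - \frac{\partial L}{\partial q_i}
  = Q_i + \sum_{k} \lambda_k \,\frac{\partial \phi_k}{\partial q_i},
\qquad \phi_k(q_1,\dots,q_n) = 0 .
```

For an open-loop (single support) chain the multiplier terms are absent; in a double-support model the values of the $\lambda_k$ recovered from the constraint equations are the constraint forces, which is why they furnish one admissible solution to the otherwise indeterminate distribution of the ground reaction force between the two feet.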
Interference effects of neutral MSSM Higgs bosons with a generalised narrow-width approximation
International Nuclear Information System (INIS)
Fuchs, Elina
2014-11-01
Mixing effects in the MSSM Higgs sector can give rise to a sizeable interference between the neutral Higgs bosons. On the other hand, factorising a more complicated process into production and decay parts by means of the narrow-width approximation (NWA) simplifies the calculation. The standard NWA, however, does not account for interference terms. Therefore, we introduce a generalisation of the NWA (gNWA) which allows for a consistent treatment of interference effects between nearly mass-degenerate particles. Furthermore, we apply the gNWA at the tree and 1-loop level to an example process where the neutral Higgs bosons h and H are produced in the decay of a heavy neutralino and subsequently decay into a fermion pair. The h-H propagator mixing is found to agree well with the approximation of Breit-Wigner propagators times finite wave-function normalisation factors, both leading to a significant interference contribution. The factorisation of the interference term based on on-shell matrix elements reproduces the full interference result within a precision of better than 1% for the considered process. The gNWA also enables the inclusion of contributions beyond the 1-loop order into the most precise prediction.
Smit, Jacoba E; Hanekom, Tania; Hanekom, Johan J
2009-08-01
The objective of this study was to determine if a recently developed human Ranvier node model, which is based on a modified version of the Hodgkin-Huxley model, could predict the excitability behaviour in human peripheral sensory nerve fibres with diameters ranging from 5.0 to 15.0 μm. The Ranvier node model was extended to include a persistent sodium current and was incorporated into a generalised single cable nerve fibre model. Parameter temperature dependence was included. All calculations were performed in Matlab. Sensory nerve fibre excitability behaviour characteristics predicted by the new nerve fibre model at different temperatures and fibre diameters compared well with measured data. Absolute refractory periods deviated from measured data, while relative refractory periods were similar to measured data. Conduction velocities showed both fibre diameter and temperature dependence and were underestimated in fibres thinner than 12.5 μm. Calculated strength-duration time constants ranged from 128.5 to 183.0 μs at 37 °C over the studied nerve fibre diameter range, with chronaxie times about 30% shorter than strength-duration time constants. Chronaxie times exhibited temperature dependence, with values overestimated by a factor 5 at temperatures lower than body temperature. Possible explanations include the deviated absolute refractory period trend and inclusion of a nodal strangulation relationship.
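The "chronaxie about 30% shorter than the strength-duration time constant" relation is what the classical Lapicque exponential strength-duration model predicts. A quick check (an illustrative sketch with an assumed time constant, not the authors' cable model):

```python
import math

def lapicque_threshold(t, rheobase, tau):
    """Threshold current for a rectangular pulse of duration t
    under the Lapicque exponential strength-duration model."""
    return rheobase / (1.0 - math.exp(-t / tau))

tau = 150e-6        # strength-duration time constant in seconds (illustrative)
rheobase = 1.0      # normalised rheobase current
# Chronaxie: pulse duration at which threshold equals twice the rheobase
chronaxie = tau * math.log(2.0)
print(f"chronaxie = {chronaxie * 1e6:.1f} us, "
      f"{100 * (1 - chronaxie / tau):.0f}% shorter than tau")
```

Since ln 2 ≈ 0.693, chronaxie ≈ 0.69 τ regardless of the chosen τ, i.e. roughly 30% shorter, consistent with the abstract's figure.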
Hirani, V
2011-06-01
To look at the trends in prevalence of generalised (body mass index (BMI) ≥ 25 kg/m2) and abdominal obesity (waist circumference (WC) > 102 cm, men; > 88 cm, women) among older people from 1993 to 2008, the prevalence of chronic disease by overweight/obesity and WC categories in England in 2005, and to evaluate the association of these measures with chronic diseases. Analyses of nationally representative cross-sectional population surveys, the Health Survey for England (HSE). Non-institutionalised men and women aged ≥ 65 years (in HSE 2005, 1512 men and 1747 women). Height, weight, waist circumference and blood pressure measurements were taken according to standardised HSE protocols. Information was collected on socio-demographic, health behaviour and doctor-diagnosed health conditions. Generalised obesity and abdominal obesity increased among men and women from 1993 to 2008. The HSE 2005 focussed on older people: 72% of men and 68% of women aged over 65 were either overweight or obese. Prevalence of raised WC was higher in women (58%) than in men (46%). The prevalence of diabetes and arthritis was higher in people with generalised obesity in both sexes. Men were more likely to have had a joint replacement and had a higher prevalence of stroke if they were overweight only, but women were more likely to have had a joint replacement only if they were obese (13%) and had a higher risk of falls with generalised obesity. The pattern was similar for the prevalence of chronic diseases by raised WC. Multivariate analysis showed that generalised and abdominal obesity were independently associated with risk of hypertension, diabetes and arthritis in both men and women. In women only, there was an association between generalised obesity and having a fall in the last year (OR: 1.5), and between abdominal obesity and having a joint replacement (OR: 1.9, p=0.01). Complications of obesity such as diabetes, hypertension and arthritis are more common in men and women aged over 65 who are
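The cut-offs stated in the abstract can be encoded directly (a sketch of the definitions only, not of the HSE analysis; the BMI ≥ 30 "obese" subdivision is the standard WHO category, assumed here rather than stated in the abstract):

```python
def generalised_obesity(bmi):
    """Generalised category from BMI in kg/m^2; overweight/obese is BMI >= 25."""
    if bmi >= 30:
        return "obese"       # WHO convention, assumed
    if bmi >= 25:
        return "overweight"
    return "not overweight"

def abdominal_obesity(waist_cm, sex):
    """Raised waist circumference: > 102 cm for men, > 88 cm for women."""
    limit = 102 if sex == "male" else 88
    return waist_cm > limit

print(generalised_obesity(27.4))        # overweight
print(abdominal_obesity(95, "female"))  # True
```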
Handbook of management under uncertainty
2001-01-01
A mere few years ago it would have seemed odd to propose a Handbook on the treatment of management problems within a sphere of uncertainty. Even today, on the threshold of the third millennium, this statement may provoke a certain wariness. In fact, to resort to exact or random data, that is probable data, is quite normal and convenient, as we then know best where we are going, where we are proposing to go if all occurs as it is conceived and hoped for. To treat uncertain information, to accept a new principle and from there determined criteria, without being sure of oneself and confiding only in the will to better understand objects and phenomena, constitutes a compromise with a new form of understanding the behaviour of current beings that goes even further than simple rationality. Economic Science, and particularly the use of its elements of configuration in the world of management, has imbued several generations with an analytical spirit that has given rise to the elaboration of theories widely accept...
Arnaud, Patrick; Cantet, Philippe; Odry, Jean
2017-11-01
the use of a statistical law with two parameters (here generalised extreme value Type I distribution) and clearly lower than those associated with the use of a three-parameter law (here generalised extreme value Type II distribution). For extreme flood quantiles, the uncertainties are mostly due to the rainfall generator because of the progressive saturation of the hydrological model.
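The two-parameter law mentioned above (GEV Type I, i.e. the Gumbel distribution) gives flood quantiles in closed form, which is part of why its quantile uncertainties are lower than those of a three-parameter fit. A sketch with illustrative parameter values (not the study's data):

```python
import math

def gumbel_quantile(T, loc, scale):
    """Flood quantile for return period T years under a Gumbel (GEV Type I) law:
    x_T = loc - scale * ln(-ln(1 - 1/T))."""
    return loc - scale * math.log(-math.log(1.0 - 1.0 / T))

loc, scale = 100.0, 30.0   # illustrative location/scale for annual peak flow
for T in (10, 100, 1000):
    print(f"{T:>5}-year flood: {gumbel_quantile(T, loc, scale):.1f}")
```

A GEV Type II (Fréchet) law adds a shape parameter and a heavier tail, so its extreme quantiles grow faster with T and carry a wider uncertainty band, consistent with the comparison reported in the abstract.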
Knight, Claire; Munro, Malcolm
2001-07-01
Distributed component based systems seem to be the immediate future for software development. The use of such techniques, object oriented languages, and the combination with ever more powerful higher-level frameworks has led to the rapid creation and deployment of such systems to cater for the demand of internet and service driven business systems. This diversity of solutions, through both the components utilised and their physical/virtual locations, can provide powerful resolutions to the new demand. The problem lies in the comprehension and maintenance of such systems, because they have inherent uncertainty: the components combined at any given time for a solution may differ, the messages generated, sent, and/or received may differ, and the physical/virtual locations cannot be guaranteed. Trying to account for this uncertainty and to build it into analysis and comprehension tools is important for both development and maintenance activities.
Risk, uncertainty and regulation.
Krebs, John R
2011-12-13
This paper reviews the relationship between scientific evidence, uncertainty, risk and regulation. Risk has many different meanings. Furthermore, if risk is defined as the likelihood of an event happening multiplied by its impact, subjective perceptions of risk often diverge from the objective assessment. Scientific evidence may be ambiguous. Scientific experts are called upon to assess risks, but there is often uncertainty in their assessment, or disagreement about the magnitude of the risk. The translation of risk assessments into policy is a political judgement that includes consideration of the acceptability of the risk and the costs and benefits of legislation to reduce the risk. These general points are illustrated with reference to three examples: regulation of risk from pesticides, control of bovine tuberculosis and pricing of alcohol as a means to discourage excessive drinking.
Vámos, Tibor
The gist of the paper is the fundamentally uncertain nature of all kinds of uncertainty and, consequently, a critical epistemic review of historical and recent approaches, computational methods and algorithms. The review follows the development of the notion from the beginnings of thinking, via the Aristotelian and Skeptic views, medieval nominalism and the influential pioneering metaphors of ancient India and Persia, to the birth of modern mathematical disciplinary reasoning. Discussing the models of uncertainty, e.g. the statistical and other physical and psychological backgrounds, we reach a pragmatic, model-related estimation perspective: a balanced application orientation for different problem areas. Data mining, game theories and recent advances in approximation algorithms are discussed in this spirit of modest reasoning.
2012-03-01
Organisations certify to ISO 9001 (QMS), ISO 14001 (EMS), TS 16949 (US Automotive), etc.; testing laboratories follow the DoD QSM 4.2 standard and ISO/IEC 17025:2005, each of which carries uncertainty requirements. The estimation approach follows the ISO GUM and "Analytical Measurement Uncertainty Estimation" (Defense Technical Information Center # ADA 396946, William S. Ingersoll, 2001). Approved for public release; distribution unlimited.
International Nuclear Information System (INIS)
Laval, Katia; Laval, Guy
2013-01-01
Like meteorology, climatology is not an exact science: climate change forecasts necessarily include a share of uncertainty. It is precisely this uncertainty which is brandished and exploited by the opponents of the global warming theory to call into question the estimations of its future consequences. Is it legitimate to predict the future using past climate data (well documented up to 100,000 years BP) or the climates of other planets, taking into account the imprecision of the measurements and the intrinsic complexity of the Earth's machinery? How is it possible to model such a huge and interwoven system, for which any exact description has become impossible? Why do water and precipitation play such an important role in local and global forecasts, and how should they be treated? This book, written by two physicists, answers these delicate questions simply, in order to give anyone the possibility to build his own opinion about global warming and the need to act rapidly.
Electricity restructuring : acting on principles
International Nuclear Information System (INIS)
Down, E.; Hoover, G.; Howatson, A.; Rheaume, G.
2003-01-01
In the second briefing of this series, the authors explored public policy decisions and political intervention, and their effect on electricity restructuring. Continuous and vigilant regulatory oversight of the electricity industry in Canada is required. The need for improved public policy to reduce uncertainty for private investors who wish to enter the market was made clear using case studies from the United Kingdom, California, Alberta, and Ontario. Clarity and consistency must be the two guiding principles for public policy decisions and political intervention in the sector. By clarity, the authors meant that rules, objectives, and timelines of the restructuring process are clear to all market participants. Market rules, implementation, and consumer expectations must be consistent. refs., 3 figs
Managing Measurement Uncertainty in Building Acoustics
Directory of Open Access Journals (Sweden)
Chiara Scrosati
2015-12-01
Full Text Available In general, uncertainties should preferably be determined following the principles laid down in ISO/IEC Guide 98-3, the Guide to the expression of uncertainty in measurement (GUM:1995). According to current knowledge, it seems impossible to formulate such measurement models for the different quantities in building acoustics. Therefore, the concepts of repeatability and reproducibility are necessary to determine the uncertainty of building acoustics measurements. This study shows the uncertainty of field measurements of a lightweight wall, a heavyweight floor, a façade with a single glazing window and a façade with a double glazing window that were analysed by a Round Robin Test (RRT), conducted in a full-scale experimental building at ITC-CNR (Construction Technologies Institute of the National Research Council of Italy). The single number quantities and their uncertainties were evaluated in both the narrow and the enlarged range, and it was shown that including or excluding the low frequencies leads to very significant differences, except in the case of the sound insulation of façades with a single glazing window. The results obtained in these RRTs were compared with other results from the literature, which confirm the increase of the uncertainty of single number quantities due to the low-frequency extension. Having stated the measurement uncertainty for a single measurement, in building acoustics it is also very important to deal with sampling for the purposes of classification of buildings or building units. Therefore, this study also shows an application of the sampling included in the Italian Standard on the acoustic classification of building units on a serial-type building consisting of 47 building units. It was found that the greatest variability is observed in the façade, and it depends both on the great variability of window typologies and on workmanship. Finally, it is suggested how to manage the uncertainty in building acoustics, both for one single
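The repeatability/reproducibility decomposition that underlies such round robin evaluations can be sketched as follows. This is a minimal ISO 5725-style illustration with invented measurement values, not the ITC-CNR data; the function name and numbers are hypothetical:

```python
from statistics import mean, variance

def repeatability_reproducibility(labs):
    """ISO 5725-style estimates from a round robin test.
    labs: list of lists, each inner list holding one laboratory's
    replicate measurements of the same test object (equal replicate counts).
    Returns (s_r, s_R): repeatability and reproducibility standard deviations."""
    n = len(labs[0])                                   # replicates per lab
    s_r2 = mean(variance(lab) for lab in labs)         # pooled within-lab variance
    lab_means = [mean(lab) for lab in labs]
    s_L2 = max(variance(lab_means) - s_r2 / n, 0.0)    # between-lab component
    s_R2 = s_L2 + s_r2                                 # reproducibility variance
    return s_r2 ** 0.5, s_R2 ** 0.5

# hypothetical sound reduction index values (dB) from three labs
labs = [[52.1, 52.4, 52.2], [53.0, 53.3, 53.1], [51.8, 52.0, 51.9]]
s_r, s_R = repeatability_reproducibility(labs)
print(s_r, s_R)
```

As expected, the reproducibility standard deviation exceeds the repeatability one, because it additionally carries the between-laboratory spread.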
Dimensional cosmological principles
International Nuclear Information System (INIS)
Chi, L.K.
1985-01-01
The dimensional cosmological principles proposed by Wesson require that the density, pressure, and mass of cosmological models be functions of the dimensionless variables which are themselves combinations of the gravitational constant, the speed of light, and the spacetime coordinates. The space coordinate is not the comoving coordinate. In this paper, the dimensional cosmological principle and the dimensional perfect cosmological principle are reformulated by using the comoving coordinate. The dimensional perfect cosmological principle is further modified to allow the possibility that mass creation may occur. Self-similar spacetimes are found to be models obeying the new dimensional cosmological principle
Uncertainty and Decision Making
1979-09-01
included as independent variables orderliness, the status of the source of information, the primacy versus recency of positive information items, and... low uncertainty and high satisfaction. The primacy/recency and sequential/final variables produced no significant differences. In summary, we have... to which the different independent variables (credibility, probability, and content) had an effect on the favorability judgments. The results were
Growth uncertainty and risksharing
Stefano Athanasoulis; Eric Van Wincoop
1997-01-01
How large are potential benefits from global risksharing? In order to answer this question we propose a new methodology that is closely connected with the empirical growth literature. We obtain estimates of residual risk (growth uncertainty) at various horizons from regressions of country-specific growth in deviation from world growth on a wide set of variables in the information set. Since this residual risk can be entirely hedged through risksharing, we use it to obtain a measure of the pot...
Citizen Candidates Under Uncertainty
Eguia, Jon X.
2005-01-01
In this paper we make two contributions to the growing literature on "citizen-candidate" models of representative democracy. First, we add uncertainty about the total vote count. We show that in a society with a large electorate, where the outcome of the election is uncertain and where winning candidates receive a large reward from holding office, there will be a two-candidate equilibrium and no equilibria with a single candidate. Second, we introduce a new concept of equilibrium, which we te...
Uncertainty in artificial intelligence
Shachter, RD; Henrion, M; Lemmer, JF
1990-01-01
This volume, like its predecessors, reflects the cutting edge of research on the automation of reasoning under uncertainty. A more pragmatic emphasis is evident, for although some papers address fundamental issues, the majority address practical issues. Topics include the relations between alternative formalisms (including possibilistic reasoning), Dempster-Shafer belief functions, non-monotonic reasoning, Bayesian and decision theoretic schemes, and new inference techniques for belief nets. New techniques are applied to important problems in medicine, vision, robotics, and natural language understanding.
Calibration Under Uncertainty.
Energy Technology Data Exchange (ETDEWEB)
Swiler, Laura Painton; Trucano, Timothy Guy
2005-03-01
This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem, in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
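The contrast drawn above, between deterministic least-squares calibration and a Bayesian treatment that propagates data error into parameter uncertainty, can be sketched on a toy model. All numbers (model, observations, error level, grid) are hypothetical illustrations, not the report's examples:

```python
import math

# toy model: y = a * x, with unknown parameter a
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]           # hypothetical noisy observations
sigma = 0.2                          # assumed measurement error (std dev)

def sse(a):
    """Squared misfit between model predictions and observations."""
    return sum((a * x - y) ** 2 for x, y in zip(xs, ys))

# 1) classical calibration: minimise the squared misfit on a grid
grid = [1.5 + 0.001 * i for i in range(1001)]        # a in [1.5, 2.5]
a_lsq = min(grid, key=sse)

# 2) calibration under uncertainty: Gaussian likelihood -> posterior over a
log_like = [-sse(a) / (2 * sigma ** 2) for a in grid]
m = max(log_like)
w = [math.exp(l - m) for l in log_like]              # stabilised weights
Z = sum(w)
post = [wi / Z for wi in w]
a_mean = sum(a * p for a, p in zip(grid, post))
a_std = sum((a - a_mean) ** 2 * p for a, p in zip(grid, post)) ** 0.5

print(a_lsq, a_mean, a_std)
```

The point estimate and the posterior mean coincide here, but the Bayesian route additionally yields a_std, an explicit parameter uncertainty driven by the assumed data error.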
Participation under Uncertainty
International Nuclear Information System (INIS)
Boudourides, Moses A.
2003-01-01
This essay reviews a number of theoretical perspectives about uncertainty and participation in the present-day knowledge-based society. After discussing the on-going reconfigurations of science, technology and society, we examine how appropriate for policy studies are various theories of social complexity. Post-normal science is such an example of a complexity-motivated approach, which justifies civic participation as a policy response to an increasing uncertainty. But there are different categories and models of uncertainties implying a variety of configurations of policy processes. A particular role in all of them is played by expertise whose democratization is an often-claimed imperative nowadays. Moreover, we discuss how different participatory arrangements are shaped into instruments of policy-making and framing regulatory processes. As participation necessitates and triggers deliberation, we proceed to examine the role and the barriers of deliberativeness. Finally, we conclude by referring to some critical views about the ultimate assumptions of recent European policy frameworks and the conceptions of civic participation and politicization that they invoke
The neurobiology of uncertainty: implications for statistical learning.
Hasson, Uri
2017-01-05
The capacity for assessing the degree of uncertainty in the environment relies on estimating statistics of temporally unfolding inputs. This, in turn, allows calibration of predictive and bottom-up processing, and signalling changes in temporally unfolding environmental features. In the last decade, several studies have examined how the brain codes for and responds to input uncertainty. Initial neurobiological experiments implicated frontoparietal and hippocampal systems, based largely on paradigms that manipulated distributional features of visual stimuli. However, later work in the auditory domain pointed to different systems, whose activation profiles have interesting implications for computational and neurobiological models of statistical learning (SL). This review begins by briefly recapping the historical development of ideas pertaining to the sensitivity to uncertainty in temporally unfolding inputs. It then discusses several issues at the interface of studies of uncertainty and SL. Following, it presents several current treatments of the neurobiology of uncertainty and reviews recent findings that point to principles that serve as important constraints on future neurobiological theories of uncertainty, and relatedly, SL. This review suggests it may be useful to establish closer links between neurobiological research on uncertainty and SL, considering particularly mechanisms sensitive to local and global structure in inputs, the degree of input uncertainty, the complexity of the system generating the input, learning mechanisms that operate on different temporal scales and the use of learnt information for online prediction.This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).
Freni, Gabriele; Mannina, Giorgio; Viviani, Gapare
2009-12-15
In recent years, attention on the integrated analysis of sewer networks, wastewater treatment plants and receiving waters has been growing. However, the common lack of data in the urban water-quality field and the incomplete knowledge regarding the interpretation of the main phenomena taking place in integrated urban water systems draw attention to the necessity of evaluating the reliability of model results. Uncertainty analysis can provide useful hints and information regarding the best model approach to be used by assessing its degree of significance and reliability. Few studies deal with uncertainty assessment in the integrated urban-drainage field. In order to fill this gap, there has been a general trend towards transferring knowledge and methodologies from other fields. In this respect, the Generalised Likelihood Uncertainty Estimation (GLUE) methodology, which is widely applied in the field of hydrology, is a possible candidate for providing a solution to the above problem. However, the methodology relies on several user-defined hypotheses in the selection of a specific formulation of the likelihood measure. This paper presents a survey aimed at evaluating the influence of the likelihood measure formulation on the assessment of uncertainty in integrated urban-drainage modelling. To accomplish this objective, a home-made integrated urban-drainage model was applied to the Savena case study (Bologna, IT). In particular, the integrated urban-drainage model uncertainty was evaluated employing different likelihood measures. The results demonstrate that the subjective selection of the likelihood measure greatly affects the GLUE uncertainty analysis.
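The GLUE procedure referred to above can be sketched in a few lines: sample parameters, score each simulation with an informal likelihood measure, keep the "behavioural" sets above a threshold, and form likelihood-weighted prediction bounds. The surrogate model, parameter range, threshold and data below are invented illustrations, not the Savena model:

```python
import random

random.seed(1)

# hypothetical rainfall-runoff surrogate: q = k * rain, true k unknown
rain = [5.0, 12.0, 8.0, 20.0, 3.0]
obs = [4.1, 9.5, 6.3, 16.4, 2.5]      # synthetic observations

def simulate(k):
    return [k * r for r in rain]

def nash_sutcliffe(sim, obs):
    """A common informal likelihood measure in GLUE applications."""
    mo = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mo) ** 2 for o in obs)
    return 1.0 - num / den

# Monte Carlo sampling + behavioural threshold on the likelihood measure
samples = [random.uniform(0.5, 1.2) for _ in range(5000)]
behavioural = [(k, nash_sutcliffe(simulate(k), obs))
               for k in samples if nash_sutcliffe(simulate(k), obs) > 0.5]

# likelihood-weighted 5-95% prediction bounds for one rainfall event
total = sum(l for _, l in behavioural)
pairs = sorted((k * rain[0], l / total) for k, l in behavioural)
cum, lo, hi = 0.0, None, None
for q, wgt in pairs:
    cum += wgt
    if lo is None and cum >= 0.05:
        lo = q
    if cum >= 0.95:
        hi = q
        break
print(len(behavioural), lo, hi)
```

Changing the likelihood measure (e.g. swapping Nash-Sutcliffe for an exponential measure, or moving the threshold) changes both the behavioural set and the bounds, which is exactly the subjectivity the paper surveys.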
The Irrelevance of the Risk-Uncertainty Distinction.
Roser, Dominic
2017-10-01
Precautionary Principles are often said to be appropriate for decision-making in contexts of uncertainty such as climate policy. Contexts of uncertainty are contrasted to contexts of risk depending on whether we have probabilities or not. Against this view, I argue that the risk-uncertainty distinction is practically irrelevant. I start by noting that the history of the distinction between risk and uncertainty is more varied than is sometimes assumed. In order to examine the distinction, I unpack the idea of having probabilities, in particular by distinguishing three interpretations of probability: objective, epistemic, and subjective probability. I then claim that if we are concerned with whether we have probabilities at all-regardless of how low their epistemic credentials are-then we almost always have probabilities for policy-making. The reason is that subjective and epistemic probability are the relevant interpretations of probability and we almost always have subjective and epistemic probabilities. In contrast, if we are only concerned with probabilities that have sufficiently high epistemic credentials, then we obviously do not always have probabilities. Climate policy, for example, would then be a case of decision-making under uncertainty. But, so I argue, we should not dismiss probabilities with low epistemic credentials. Rather, when they are the best available probabilities our decision principles should make use of them. And, since they are almost always available, the risk-uncertainty distinction remains irrelevant.
Vasant Hirani
2010-01-01
Objective: To look at the trends in prevalence of generalised obesity (body mass index (BMI) ≥ 25 kg/m2) and abdominal obesity (waist circumference (WC) > 102 cm, men; > 88 cm, women) among older people from 1993 to 2008, the prevalence of chronic disease by overweight/obesity and waist circumference categories in England in 2005, and to evaluate the association of these measures with chronic diseases.
A Bayesian Framework of Uncertainties Integration in 3D Geological Model
Liang, D.; Liu, X.
2017-12-01
3D geological models can describe complicated geological phenomena in an intuitive way, but their application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyse each source separately, ignoring the comprehensive impact of multi-source uncertainties. To evaluate the synthetical uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework of uncertainty integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the synthetical uncertainty of the geological model. Uncertainties propagate and accumulate in the modelling process, so the gradual integration of multi-source uncertainty is a kind of simulation of uncertainty propagation. Bayesian inference accomplishes the uncertainty updating in the modelling process. The maximum entropy principle works well for estimating the prior probability distribution, ensuring that the prior is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtain a posterior distribution that evaluates the synthetical uncertainty of the geological model; this posterior represents the synthetical impact of all the uncertain factors on the spatial structure of the model. The framework provides a solution for evaluating the synthetical impact of multi-source uncertainties on a geological model and an approach to studying the uncertainty propagation mechanism in geological modelling.
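The core loop described above, a maximum-entropy prior sequentially updated by Bayes' rule as observations are integrated, can be sketched on a one-dimensional toy problem. The discretised depth hypotheses, measurement values and error levels are hypothetical, not the paper's model:

```python
import math

# discrete hypotheses for a geological horizon depth (m)
depths = [10 + 0.5 * i for i in range(21)]           # 10.0 .. 20.0

# maximum-entropy prior with no extra constraints: uniform
prior = [1.0 / len(depths)] * len(depths)

def gaussian_like(z_obs, z, sigma):
    """Unnormalised Gaussian likelihood of observing z_obs if truth is z."""
    return math.exp(-((z_obs - z) ** 2) / (2 * sigma ** 2))

def update(dist, z_obs, sigma):
    """One Bayesian update step: posterior ∝ prior × likelihood."""
    w = [p * gaussian_like(z_obs, z, sigma) for p, z in zip(dist, depths)]
    Z = sum(w)
    return [wi / Z for wi in w]

# sequential integration of two hypothetical borehole observations,
# mimicking how uncertainty sources are folded in one after another
post = update(prior, 14.2, 1.0)       # first measurement, error 1.0 m
post = update(post, 14.8, 0.5)        # second, more precise measurement

best = depths[post.index(max(post))]
print(best, sum(post))
```

The posterior concentrates between the two observations, pulled towards the more precise one; each update step plays the role of integrating one more uncertainty source into the synthetical posterior.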
Hybrid variational principles and synthesis method for finite element neutron transport calculations
International Nuclear Information System (INIS)
Ackroyd, R.T.; Nanneh, M.M.
1990-01-01
A family of hybrid variational principles is derived using a generalised least squares method. Neutron conservation is automatically satisfied for the hybrid principles employing two trial functions. No interface or reflection conditions need to be imposed on the independent even-parity trial function. For some hybrid principles a single trial function can be employed by relating one parity trial function to the other, using one of the parity transport equations in relaxed form. For other hybrid principles the trial functions can be employed sequentially. Synthesis of transport solutions, starting with the diffusion theory approximation, has been used as a way of reducing the scale of the computation that arises with established finite element methods for neutron transport. (author)
Dieleman, G C; Ferdinand, R F
2008-01-01
Anxiety disorders are among the most prevalent psychiatric disorders during childhood. They are often persistent and are associated with a number of negative outcomes. Therefore, effective treatment is required. To present an overview of placebo-controlled studies of pharmacotherapy for social phobia, generalised anxiety disorder and separation anxiety disorder in children and adolescents, and to determine which medication is the most effective. The literature was reviewed using Pubmed. Nine randomised double-blind studies on the efficacy of pharmacotherapy for generalised anxiety disorder, separation anxiety disorder and social phobia were found. Tricyclic antidepressants were not more effective than placebo. Studies on benzodiazepines showed that the effect of these drugs was not superior to that of placebo either. Studies of the efficacy of SSRIs, however, proved that they were superior to placebo. SSRIs are the drugs of first choice for the treatment of social phobia, separation anxiety disorder and generalised anxiety disorder in children and adolescents. There is strong evidence that SSRIs are effective for the treatment of these anxiety disorders; the standardised effect size varies between medium and large.
Pérez-Palma, Eduardo; Helbig, Ingo; Klein, Karl Martin; Anttila, Verneri; Horn, Heiko; Reinthaler, Eva Maria; Gormley, Padhraig; Ganna, Andrea; Byrnes, Andrea; Pernhorst, Katharina; Toliat, Mohammad R; Saarentaus, Elmo; Howrigan, Daniel P; Hoffman, Per; Miquel, Juan Francisco; De Ferrari, Giancarlo V; Nürnberg, Peter; Lerche, Holger; Zimprich, Fritz; Neubauer, Bern A; Becker, Albert J; Rosenow, Felix; Perucca, Emilio; Zara, Federico; Weber, Yvonne G; Lal, Dennis
2017-01-01
Background Microdeletions are known to confer risk to epilepsy, particularly at genomic rearrangement 'hotspot' loci. However, microdeletion burden not overlapping these regions or within different epilepsy subtypes has not been ascertained. Objective To decipher the role of microdeletions outside hotspot loci and risk assessment by epilepsy subtype. Methods We assessed the burden, frequency and genomic content of rare, large microdeletions found in a previously published cohort of 1366 patients with genetic generalised epilepsy (GGE), in addition to two sets of additional unpublished genome-wide microdeletions found in 281 patients with rolandic epilepsy (RE) and 807 patients with adult focal epilepsy (AFE), totalling 2454 cases. Microdeletions were assessed in combined and subtype-specific approaches against 6746 controls. Results When hotspots are considered, we detected an enrichment of microdeletions in the combined epilepsy analysis (adjusted p=1.06×10−6, OR 1.89, 95% CI 1.51 to 2.35). Epilepsy subtype-specific analyses showed that hotspot microdeletions in the GGE subgroup contribute most of the overall signal (adjusted p=9.79×10−12, OR 7.45, 95% CI 4.20–13.5). Outside hotspots, microdeletions were enriched in the GGE cohort for neurodevelopmental genes (adjusted p=9.13×10−3, OR 2.85, 95% CI 1.62–4.94). No additional signal was observed for RE and AFE. Still, gene-content analysis identified known (NRXN1, RBFOX1 and PCDH7) and novel (LOC102723362) candidate genes across epilepsy subtypes that were not deleted in controls. Conclusions Our results show a heterogeneous effect of recurrent and non-recurrent microdeletions as part of the genetic architecture of GGE and a minor contribution in the aetiology of RE and AFE. PMID:28756411
Farrer, Louise M; Gulliver, Amelia; Bennett, Kylie; Fassnacht, Daniel B; Griffiths, Kathleen M
2016-07-15
Few studies have examined modifiable psychosocial risk factors for mental disorders among university students, and of these, none have employed measures that correspond to clinical diagnostic criteria. The aim of this study was to examine psychosocial and demographic risk factors for major depression and generalised anxiety disorder (GAD) in a sample of Australian university students. An anonymous web-based survey was distributed to undergraduate and postgraduate students at a mid-sized Australian university. A range of psychosocial and demographic risk factors were measured, and logistic regression models were used to examine significant predictors of major depression and GAD. A total of 611 students completed the survey. The prevalence of major depression and GAD in the sample was 7.9% and 17.5%, respectively. In terms of demographic factors, the risk of depression was higher for students in their first year of undergraduate study, and the risk of GAD was higher for female students, those who moved to attend university, and students experiencing financial stress. In terms of psychosocial factors, students with experience of body image issues and lack of confidence were at significantly greater risk of major depression, and feeling too much pressure to succeed, lack of confidence, and difficulty coping with study were significantly associated with risk of GAD. University students experience a range of unique psychosocial stressors that increase their risk of major depression and GAD, in addition to sociodemographic risk factors. It is important to examine psychosocial factors, as these are potentially modifiable and could be the focus of university-specific mental health interventions.
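The logistic regression modelling named above can be sketched as follows, fitting a single binary predictor and reporting the odds ratio exp(b1) that such studies quote. The data are invented for illustration (30% vs 10% outcome rates by exposure group), not the survey's records:

```python
import math

# hypothetical student records: (financial_stress 0/1, gad 0/1)
data = [(1, 1)] * 30 + [(1, 0)] * 70 + [(0, 1)] * 10 + [(0, 0)] * 90

def fit_logistic(data, lr=0.1, steps=5000):
    """Fit P(gad=1) = sigmoid(b0 + b1*x) by gradient ascent
    on the mean log-likelihood."""
    b0 = b1 = 0.0
    n = len(data)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (y - p)           # gradient w.r.t. intercept
            g1 += (y - p) * x       # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

b0, b1 = fit_logistic(data)
odds_ratio = math.exp(b1)           # exact value here is (30*90)/(70*10) ≈ 3.86
print(odds_ratio)
```

With a single binary predictor, the fitted odds ratio reproduces the cross-tabulated one; the value of the regression framework is that it extends unchanged to the multiple psychosocial and demographic predictors examined in the study.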
Korvigo, Ilia; Afanasyev, Andrey; Romashchenko, Nikolay; Skoblov, Mikhail
2018-01-01
Many automatic classifiers were introduced to aid inference of phenotypical effects of uncategorised nsSNVs (nonsynonymous Single Nucleotide Variations) in theoretical and medical applications. Lately, several meta-estimators have been proposed that combine different predictors, such as PolyPhen and SIFT, to integrate more information in a single score. Although many advances have been made in feature design and machine learning algorithms used, the shortage of high-quality reference data along with the bias towards intensively studied in vitro models call for improved generalisation ability in order to further increase classification accuracy and handle records with insufficient data. Since a meta-estimator basically combines different scoring systems with highly complicated nonlinear relationships, we investigated how deep learning (supervised and unsupervised), which is particularly efficient at discovering hierarchies of features, can improve classification performance. While it is believed that one should only use deep learning for high-dimensional input spaces and other models (logistic regression, support vector machines, Bayesian classifiers, etc) for simpler inputs, we still believe that the ability of neural networks to discover intricate structure in highly heterogenous datasets can aid a meta-estimator. We compare the performance with various popular predictors, many of which are recommended by the American College of Medical Genetics and Genomics (ACMG), as well as available deep learning-based predictors. Thanks to hardware acceleration we were able to use a computationally expensive genetic algorithm to stochastically optimise hyper-parameters over many generations. Overfitting was hindered by noise injection and dropout, limiting coadaptation of hidden units. Although we stress that this work was not conceived as a tool comparison, but rather an exploration of the possibilities of deep learning application in ensemble scores, our results show that
Wagner, J.
2017-05-01
We extend our model-independent approach for characterising strong gravitational lenses to its most general form to leading order and use the orientation angles of a set of multiple images with respect to their connection line(s) in addition to the relative distances between the images, their ellipticities, and time-delays. For two symmetric images that straddle the critical curve, the orientation angle additionally allows us to determine the slope of the critical curve and a second (reduced) flexion coefficient at the critical point on the connection line between the images. It also allows us to drop the symmetry assumption that the axis of largest image extension is orthogonal to the critical curve. For three images almost forming a giant arc, the degree of assumed image symmetry is also reduced to the most general case, describing image configurations for which the source need not be placed on the symmetry axis of the two folds that unite at the cusp. For a given set of multiple images, we set limits on the applicability of our approach, show which information can be obtained in cases of merging images, and analyse the accuracy achievable due to the Taylor expansion of the lensing potential for the fold case on a galaxy cluster scale Navarro-Frenk-White-profile, a fold and cusp case on a galaxy cluster scale singular isothermal ellipse, and compare the generalised approach with our previously published one. The position of the critical points is reconstructed with less than 5'' deviation for multiple images closer to the critical points than 30% of the (effective) Einstein radius. The slope of the critical curve at a fold and its shape in the vicinity of a cusp deviate less than 20% from the true values for distances of the images to the critical points less than 15% of the (effective) Einstein radius.
Pérez-Palma, Eduardo; Helbig, Ingo; Klein, Karl Martin; Anttila, Verneri; Horn, Heiko; Reinthaler, Eva Maria; Gormley, Padhraig; Ganna, Andrea; Byrnes, Andrea; Pernhorst, Katharina; Toliat, Mohammad R; Saarentaus, Elmo; Howrigan, Daniel P; Hoffman, Per; Miquel, Juan Francisco; De Ferrari, Giancarlo V; Nürnberg, Peter; Lerche, Holger; Zimprich, Fritz; Neubauer, Bern A; Becker, Albert J; Rosenow, Felix; Perucca, Emilio; Zara, Federico; Weber, Yvonne G; Lal, Dennis
2017-09-01
Microdeletions are known to confer risk to epilepsy, particularly at genomic rearrangement 'hotspot' loci. However, microdeletion burden not overlapping these regions or within different epilepsy subtypes has not been ascertained. To decipher the role of microdeletions outside hotspot loci and risk assessment by epilepsy subtype, we assessed the burden, frequency and genomic content of rare, large microdeletions found in a previously published cohort of 1366 patients with genetic generalised epilepsy (GGE), in addition to two sets of additional unpublished genome-wide microdeletions found in 281 patients with rolandic epilepsy (RE) and 807 patients with adult focal epilepsy (AFE), totalling 2454 cases. Microdeletions were assessed in combined and subtype-specific approaches against 6746 controls. When hotspots are considered, we detected an enrichment of microdeletions in the combined epilepsy analysis (adjusted p=1.06×10−6, OR 1.89, 95% CI 1.51 to 2.35). Epilepsy subtype-specific analyses showed that hotspot microdeletions in the GGE subgroup contribute most of the overall signal (adjusted p=9.79×10−12, OR 7.45, 95% CI 4.20–13.5). Outside hotspots, microdeletions were enriched in the GGE cohort for neurodevelopmental genes (adjusted p=9.13×10−3, OR 2.85, 95% CI 1.62–4.94). No additional signal was observed for RE and AFE. Still, gene-content analysis identified known (NRXN1, RBFOX1 and PCDH7) and novel (LOC102723362) candidate genes across epilepsy subtypes that were not deleted in controls. Our results show a heterogeneous effect of recurrent and non-recurrent microdeletions as part of the genetic architecture of GGE and a minor contribution in the aetiology of RE and AFE. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Generalised Einstein mass-variation formulae: II Superluminal relative frame velocities
Directory of Open Access Journals (Sweden)
James M. Hill
Full Text Available In part I of this paper we deduced generalised Einstein mass variation formulae assuming relative frame velocities v < c. We again use the notion of the residual mass m0(v), which for v > c is defined by the equation m(v) = m0(v)[(v/c)^2 − 1]^(−1/2) for the actual mass m(v). The residual mass is essentially the actual mass with the Einstein factor removed, and we emphasise that we make no restrictions on m0(v). Using this formal device we deduce corresponding new mass variation formulae applicable to superluminal relative frame velocities, assuming only the extended Lorentz transformations and their consequences, and two invariants that are known to apply in special relativity. The present authors have previously speculated a dual framework such that both the rest mass m0∗ and the residual mass at infinite velocity m∞∗ (by which we mean p∞∗/c, assuming finite momentum at infinity) are equally important parameters in the specification of mass as a function of its velocity, and the two arbitrary constants can be so determined. The new formulae involving two arbitrary constants may also be exploited so that the mass remains finite at the speed of light, and two distinct mass profiles are determined as functions of their velocity with the rest mass assumed to be alternatively prescribed at the origin of either frame. The two profiles so obtained, (M(U), m(u)) and (M∗(U), m∗(u)), although distinct, have a common ratio M(U)/M∗(U) = m(u)/m∗(u) that is a function of v > c, indicating that observable mass depends upon the frame in which the rest mass is prescribed. Keywords: Special relativity, Einstein mass variation, New formulae
Better Rulesets by Removing Redundant Specialisations and Generalisations in Association Rule Mining
Directory of Open Access Journals (Sweden)
Henry Petersen
2017-11-01
Full Text Available Association rule mining is a fundamental task in many data mining and analysis applications, both for knowledge extraction and as part of other processes (for example, building associative classifiers). It is well known that the number of associations identified by many association rule mining algorithms can be so large as to present a barrier to their interpretability and practical use. A typical solution to this problem involves removing redundant rules. This paper proposes a novel definition of redundancy, which is used to identify only the most interesting associations. Compared to existing redundancy-based approaches, our method is both more robust to noise and produces fewer overall rules for given data (improving clarity). A rule can be considered redundant if the knowledge it describes is already contained in other rules. Given an association rule, most existing approaches consider rules to be redundant if they add additional variables without increasing quality according to some measure of interestingness. We claim that complex interactions between variables can confound many interestingness measures. This can lead to existing approaches being overly aggressive in removing redundant associations. Most existing approaches also fail to take into account situations where more general rules (those with fewer attributes) can be considered redundant with respect to their specialisations. We examine this problem and provide concrete examples of such errors using artificial data. An alternate definition of redundancy that addresses these issues is proposed. Our approach is shown to identify interesting associations missed by comparable methods on multiple real and synthetic datasets. When combined with the removal of redundant generalisations, our approach is often able to generate smaller overall rule sets, while leaving average rule quality unaffected or slightly improved.
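The baseline notion of specialisation redundancy that the paper builds on, a specialised rule is redundant when a more general rule with the same consequent already matches its confidence, can be sketched as follows. The rules, items and confidences are invented examples, and this implements the conventional definition the paper critiques, not the authors' novel one:

```python
# hypothetical mined rules: (antecedent frozenset, consequent, confidence)
rules = [
    (frozenset({"bread"}), "butter", 0.80),
    (frozenset({"bread", "milk"}), "butter", 0.78),   # specialisation, no gain
    (frozenset({"bread", "jam"}), "butter", 0.95),    # specialisation, real gain
    (frozenset({"tea"}), "sugar", 0.60),
]

def non_redundant(rules, min_gain=0.0):
    """Drop a specialisation X' -> y when some more general rule X -> y,
    with X a proper subset of X', already reaches its confidence (+ min_gain)."""
    kept = []
    for ant, cons, conf in rules:
        dominated = any(
            a < ant and c == cons and cf + min_gain >= conf
            for a, c, cf in rules
        )
        if not dominated:
            kept.append((ant, cons, conf))
    return kept

kept = non_redundant(rules)
for ant, cons, conf in kept:
    print(sorted(ant), "->", cons, conf)
```

Here {bread, milk} -> butter is pruned because {bread} -> butter already achieves a higher confidence, while {bread, jam} -> butter survives on its genuine confidence gain; the paper's contribution is a redundancy definition that also handles the converse case, redundant generalisations, and is less easily confounded by interacting variables.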
Rosvall, Per-Åke; Nilsson, Stefan
2016-08-30
There has been an increase in reports describing mental health problems in adolescents, especially girls. School nurses play an important role in supporting young people with health problems. Few studies have considered how the nurses' gender norms may influence their discussions. To investigate this issue, semi-structured interviews focusing on school nurses' work with students who have mental health problems were conducted. Transcripts of interviews with Swedish school nurses (n = 15) from the Help overcoming pain early project (HOPE) were analysed using theories on gender as a theoretical framework and then organised into themes related to the school nurses' provision of contact and intervention. The interviewees were all women, aged between 42 and 63 years, who had worked as nurses for 13-45 years, and as school nurses for 2-28 years. Five worked in upper secondary schools (for students aged 16-19) and 10 in secondary schools (for students aged 12-16). The results show that school nurses more commonly associated mental health problems with girls. When the school nurses discussed students that were difficult to reach, boys in particular were mentioned. However, very few nurses mentioned specific interventions to address students' mental health problems, and all of the mentioned interventions were focused on girls. Some of the school nurses reported that it was more difficult to initiate a health dialogue with boys, yet none of the nurses had organised interventions for the boys. We conclude that generalisations can sometimes be analytically helpful, facilitating, for instance, the identification of problems in school nurses' work methods and interventions. However, the most important conclusion from our research, which applied a design that is not commonly used, is that more varied approaches, as well as a greater awareness of potential gender stereotype pitfalls, are necessary to meet the needs of diverse student groups.
Do Orthopaedic Surgeons Acknowledge Uncertainty?
Teunis, Teun; Janssen, Stein; Guitton, Thierry G.; Ring, David; Parisien, Robert
2016-01-01
Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if
Harron, Katie; Mok, Quen; Hughes, Dyfrig; Muller-Pebody, Berit; Parslow, Roger; Ramnarayan, Padmanabhan; Gilbert, Ruth
2016-01-01
We determined the generalisability and cost-impact of adopting antibiotic-impregnated CVCs in all paediatric intensive care units (PICUs) in England, based on results from a large randomised controlled trial (the CATCH trial; ISRCTN34884569). BSI rates using standard CVCs were estimated through linkage of national PICU audit data (PICANet) with laboratory surveillance data. We estimated the number of BSI averted if PICUs switched from standard to antibiotic-impregnated CVCs by applying the CATCH trial rate-ratio (0.40; 95% CI 0.17,0.97) to the BSI rate using standard CVCs. The value of healthcare resources made available by averting one BSI as estimated from the trial economic analysis was £10,975; 95% CI -£2,801,£24,751. The BSI rate using standard CVCs was 4.58 (95% CI 4.42,4.74) per 1000 CVC-days in 2012. Applying the rate-ratio gave 232 BSI averted using antibiotic CVCs. The additional cost of purchasing antibiotic-impregnated compared with standard CVCs was £36 for each child, corresponding to additional costs of £317,916 for an estimated 8831 CVCs required in PICUs in 2012. Based on 2012 BSI rates, management of BSI in PICUs cost £2.5 million annually (95% uncertainty interval: -£160,986, £5,603,005). The additional cost of antibiotic CVCs would be less than the value of resources associated with managing BSI in PICUs with standard BSI rates >1.2 per 1000 CVC-days. The cost of introducing antibiotic-impregnated CVCs is less than the cost associated with managing BSIs occurring with standard CVCs. The long-term benefits of preventing BSI could mean that antibiotic CVCs are cost-effective even in PICUs with extremely low BSI rates.
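The headline figures quoted above can be cross-checked with a few lines of arithmetic (a sketch of the published point estimates only, not the trial's full economic model with its uncertainty intervals):

```python
# Point estimates taken from the abstract above.
n_cvcs = 8831            # antibiotic-impregnated CVCs needed in PICUs in 2012
extra_cost_per_cvc = 36  # additional purchase cost per child, GBP
bsi_averted = 232        # BSIs averted by switching CVC type
value_per_bsi = 10975    # value of resources freed per BSI averted, GBP

extra_cost = n_cvcs * extra_cost_per_cvc        # total additional purchase cost
print(extra_cost)                               # matches the quoted GBP 317,916
print(round(extra_cost / bsi_averted))          # implied cost per BSI averted
print(extra_cost < bsi_averted * value_per_bsi) # cheaper than managing the BSIs?
```

The last comparison reproduces the abstract's conclusion: the roughly £2.5 million value of BSI management resources dwarfs the extra purchase cost.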
Position-momentum uncertainty relations based on moments of arbitrary order
International Nuclear Information System (INIS)
Zozor, Steeve; Portesi, Mariela; Sanchez-Moreno, Pablo; Dehesa, Jesus S.
2011-01-01
The position-momentum uncertainty-like inequality based on moments of arbitrary order for d-dimensional quantum systems, which is a generalization of the celebrated Heisenberg formulation of the uncertainty principle, is improved here by use of the Renyi-entropy-based uncertainty relation. The accuracy of the resulting lower bound is physico-computationally analyzed for the two main prototypes in d-dimensional physics: the hydrogenic and oscillator-like systems.
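For context, a minimal LaTeX restatement of the family of relations involved (standard results only; the constant C is left schematic, and the paper's improved Rényi-based bound is not reproduced here):

```latex
% Heisenberg's formulation (d = 1):
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
% Generalisation to moments of arbitrary orders a, b in d dimensions,
% with a constant C(a,b;d) whose precise form is not reproduced here:
\left\langle |\mathbf{r}|^{a} \right\rangle^{1/a}
\left\langle |\mathbf{p}|^{b} \right\rangle^{1/b} \;\ge\; C(a,b;d)\,\hbar
% Shannon-entropic relation of Bialynicki-Birula and Mycielski (\hbar = 1),
% the family that Renyi-entropy-based bounds generalise:
S_{x} + S_{p} \;\ge\; d\,(1+\ln\pi)
```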
Zhang, Jun; Zhang, Yang; Yu, Chang-shui
2015-06-29
The Heisenberg uncertainty principle shows that no one can specify the values of non-commuting, canonically conjugate variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki's bound entangled state are investigated in detail.
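A compact statement of the type of relation being tightened, in the notation of the Berta et al. (2010) two-measurement result, which I take as the baseline here (an assumption about the paper's starting point):

```latex
% Entropic uncertainty relation with quantum memory B, for measurements Q, R
% with eigenbases {|\psi_i\rangle}, {|\phi_j\rangle} on system A:
S(Q|B) + S(R|B) \;\ge\; \log_2\frac{1}{c} + S(A|B),
\qquad
c \,=\, \max_{i,j}\,\bigl|\langle\psi_i|\phi_j\rangle\bigr|^{2}
```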
Optimizing production under uncertainty
DEFF Research Database (Denmark)
Rasmussen, Svend
This Working Paper derives criteria for optimal production under uncertainty based on the state-contingent approach (Chambers and Quiggin, 2000), and discusses potential problems involved in applying the state-contingent approach in a normative context. The analytical approach uses the concept of state-contingent production functions and a definition of inputs that includes sort of input, activity and allocation technology. It also analyses production decisions where production is combined with trading in state-contingent claims such as insurance contracts. The final part discusses …
Optimization under Uncertainty
Lopez, Rafael H.
2016-01-06
The goal of this poster is to present the main approaches to optimization of engineering systems in the presence of uncertainties. We begin by giving an insight into robust optimization. Next, we detail how to deal with probabilistic constraints in optimization, the so-called reliability-based design. Subsequently, we present the risk optimization approach, which includes the expected costs of failure in the objective function. After the basic description of each approach is given, the projects developed by CORE are presented. Finally, the main current topic of research of CORE is described.
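A toy sketch of the risk optimization idea described above: the objective adds the expected cost of failure to the design cost, and the optimum balances the two. The cost model and all numbers below are made up for illustration:

```python
import math

def total_cost(design):
    """Design cost plus expected failure cost (hypothetical model)."""
    build_cost = 10.0 * design       # a stronger design costs more...
    p_fail = math.exp(-design)       # ...but fails less often
    failure_cost = 1000.0
    return build_cost + p_fail * failure_cost

# Crude grid search over the scalar design variable on [0.01, 10].
best = min((total_cost(d), d) for d in [x / 100 for x in range(1, 1001)])
print(best)  # (minimum expected total cost, optimal design)
```

Analytically the optimum of this toy model sits at design = ln(100) ≈ 4.61, where the marginal build cost equals the marginal reduction in expected failure cost.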
Uncertainty of spatial straightness in 3D measurement
International Nuclear Information System (INIS)
Wang Jinxing; Jiang Xiangqian; Ma Limin; Xu Zhengao; Li Zhu
2005-01-01
The least-squares method is commonly employed to verify spatial straightness in the actual three-dimensional measurement process, but the uncertainty of the verification result is usually not given by coordinate measuring machines. According to the basic principle of spatial straightness least-squares verification and the uncertainty propagation formula given by ISO/TS 14253-2, a calculation method for the uncertainty of spatial straightness least-squares verification is proposed in this paper. In this method, the coefficients of the line equation are regarded as a statistical vector, so that the line equation, the result of the spatial straightness verification and the uncertainty of the result can be obtained once the expected value and covariance matrix of the vector are determined. The method not only assures the integrity of the verification result, but also accords with the requirements of the new generation of GPS standards, which can improve the accuracy of verification.
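The least-squares verification step itself is easy to sketch numerically (this illustrates only the line fit and straightness value, not the ISO/TS 14253-2 uncertainty budget): fit the 3D line via SVD of the centred point cloud, then take the largest perpendicular residual:

```python
import numpy as np

def straightness(points):
    """Least-squares spatial straightness: max perpendicular distance
    of the measured points from their best-fit 3D line."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # Principal direction of the centred cloud = least-squares line direction.
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]
    # Perpendicular distance of each point from the fitted line.
    rel = pts - centroid
    dist = np.linalg.norm(rel - np.outer(rel @ direction, direction), axis=1)
    return dist.max()

# Points near the z-axis with one 0.1 mm lateral deviation.
pts = [(0, 0, 0), (0, 0, 10), (0.1, 0, 20), (0, 0, 30)]
print(straightness(pts))
```

The uncertainty method of the paper would then propagate the covariance of the fitted line coefficients through this evaluation.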
Studying generalised dark matter interactions with extended halo-independent methods
International Nuclear Information System (INIS)
Kahlhoefer, Felix; Wild, Sebastian
2016-07-01
The interpretation of dark matter direct detection experiments is complicated by the fact that neither the astrophysical distribution of dark matter nor the properties of its particle physics interactions with nuclei are known in detail. To address both of these issues in a very general way we develop a new framework that combines the full formalism of non-relativistic effective interactions with state-of-the-art halo-independent methods. This approach makes it possible to analyse direct detection experiments for arbitrary dark matter interactions and quantify the goodness-of-fit independent of astrophysical uncertainties. We employ this method in order to demonstrate that the degeneracy between astrophysical uncertainties and particle physics unknowns is not complete. Certain models can be distinguished in a halo-independent way using a single ton-scale experiment based on liquid xenon, while other models are indistinguishable with a single experiment but can be separated using combined information from several target elements.
Biomechanics principles and practices
Peterson, Donald R
2014-01-01
Presents Current Principles and Applications. Biomedical engineering is considered to be the most expansive of all the engineering sciences. Its function involves the direct combination of core engineering sciences as well as knowledge of nonengineering disciplines such as biology and medicine. Drawing on material from the biomechanics section of The Biomedical Engineering Handbook, Fourth Edition and utilizing the expert knowledge of respected published scientists in the application and research of biomechanics, Biomechanics: Principles and Practices discusses the latest principles and applications.
Dolan, Thomas James
2013-01-01
Fusion Research, Volume I: Principles provides a general description of the methods and problems of fusion research. The book contains three main parts: Principles, Experiments, and Technology. The Principles part describes the conditions necessary for a fusion reaction, as well as the fundamentals of plasma confinement, heating, and diagnostics. The Experiments part details about forty plasma confinement schemes and experiments. The last part explores various engineering problems associated with reactor design, vacuum and magnet systems, materials, plasma purity, fueling, blankets, neutronics
Database principles programming performance
O'Neil, Patrick
2014-01-01
Database: Principles Programming Performance provides an introduction to the fundamental principles of database systems. This book focuses on database programming and the relationships between principles, programming, and performance.Organized into 10 chapters, this book begins with an overview of database design principles and presents a comprehensive introduction to the concepts used by a DBA. This text then provides grounding in many abstract concepts of the relational model. Other chapters introduce SQL, describing its capabilities and covering the statements and functions of the programmi
National Research Council Canada - National Science Library
Walker, C. H
2012-01-01
"Now in its fourth edition, this exceptionally accessible text provides students with a multidisciplinary perspective and a grounding in the fundamental principles required for research in toxicology today...
Free fall and the equivalence principle revisited
Pendrill, Ann-Marie
2017-11-01
Free fall is commonly discussed as an example of the equivalence principle, in the context of a homogeneous gravitational field, which is a reasonable approximation for small test masses falling moderate distances. Newton’s law of gravity provides a generalisation to larger distances, and also brings in an inhomogeneity in the gravitational field. In addition, Newton’s third law of action and reaction causes the Earth to accelerate towards the falling object, bringing in a mass dependence in the time required for an object to reach ground—in spite of the equivalence between inertial and gravitational mass. These aspects are rarely discussed in textbooks when the motion of everyday objects is discussed. Although these effects are extremely small, it may still be important for teachers to make assumptions and approximations explicit, to be aware of small corrections, and also to be prepared to estimate their size. Even if the corrections are not part of regular teaching, some students may reflect on them, and their questions deserve to be taken seriously.
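The mass dependence mentioned above follows from the two-body problem, where the relative acceleration is G(M + m)/r². The sketch below uses the homogeneous-field approximation and, to make the effect visible at double precision, an absurdly massive "falling object" (Moon mass); for everyday objects the fall-time difference is many orders of magnitude below anything measurable:

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24    # Earth mass, kg
R = 6.371e6     # Earth radius, m

def fall_time(h, m):
    """Time for the separation to shrink by h in the homogeneous-field
    approximation. The Earth accelerates toward the object too, so the
    relative acceleration is G(M + m)/R^2, not G M/R^2."""
    g_rel = G * (M + m) / R**2
    return math.sqrt(2 * h / g_rel)

t_kg = fall_time(10.0, 1.0)        # ordinary 1 kg object
t_moon = fall_time(10.0, 7.35e22)  # Moon-mass "object" falls noticeably faster
print(t_kg - t_moon)               # of order milliseconds over a 10 m drop
```

The scaling is t ∝ √(M/(M + m)), so a heavier object reaches the ground sooner, despite inertial and gravitational mass being equivalent.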
Uncertainties in risk assessment at USDOE facilities
Energy Technology Data Exchange (ETDEWEB)
Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.
1994-01-01
The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms "risk assessment" and "risk management" are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties ..." in an assessment. Significant data and uncertainties are "... those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.
Improvement of Statistical Decisions under Parametric Uncertainty
Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Rozevskis, Uldis
2011-10-01
A large number of problems in production planning and scheduling, location, transportation, finance, and engineering design require that decisions be made in the presence of uncertainty. Decision-making under uncertainty is a central problem in statistical inference, and has been formally studied in virtually all approaches to inference. The aim of the present paper is to show how the invariant embedding technique, the idea of which belongs to the authors, may be employed in the particular case of finding the improved statistical decisions under parametric uncertainty. This technique represents a simple and computationally attractive statistical method based on the constructive use of the invariance principle in mathematical statistics. Unlike the Bayesian approach, an invariant embedding technique is independent of the choice of priors. It allows one to eliminate unknown parameters from the problem and to find the best invariant decision rule, which has smaller risk than any of the well-known decision rules. To illustrate the proposed technique, application examples are given.
Probabilistic Mass Growth Uncertainties
Plumer, Eric; Elliott, Darren
2013-01-01
Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both earth orbiting and deep space missions at various stages of a project's lifecycle. This paper also discusses the long-term strategy of NASA Headquarters in publishing similar results, using a variety of cost-driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. The analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.
Oil price uncertainty in Canada
International Nuclear Information System (INIS)
Elder, John; Serletis, Apostolos
2009-01-01
Bernanke [Bernanke, Ben S. Irreversibility, uncertainty, and cyclical investment. Quarterly Journal of Economics 98 (1983), 85-106.] shows how uncertainty about energy prices may induce optimizing firms to postpone investment decisions, thereby leading to a decline in aggregate output. Elder and Serletis [Elder, John and Serletis, Apostolos. Oil price uncertainty.] find empirical evidence that uncertainty about oil prices has tended to depress investment in the United States. In this paper we assess the robustness of these results by investigating the effects of oil price uncertainty in Canada. Our results are remarkably similar to existing results for the United States, providing additional evidence that uncertainty about oil prices may provide another explanation for why the sharp oil price declines of 1985 failed to produce rapid output growth. Impulse-response analysis suggests that uncertainty about oil prices may tend to reinforce the negative response of output to positive oil shocks. (author)
Directory of Open Access Journals (Sweden)
Héctor Torres-Silva
2008-11-01
This work deals with the problem of the construction of the Lagrange functional for an electromagnetic field. The generalised Maxwell equations for an electromagnetic field in free space are introduced. The main idea relies on the change of the Lagrange function under the integral action. Usually, the Lagrange functional which describes the electromagnetic field is built with the quadrate of the electromagnetic field tensor. Such a quadrate term is the reason, from a mathematical point of view, for the linear form of the Maxwell equations in free space. The author does not make this assumption, and nonlinear Maxwell equations are obtained. New material parameters of free space are established. The equations obtained are quite similar to the well-known Maxwell equations. The energy tensor of the electromagnetic field from a chiral approach to the Born-Infeld Lagrangian is discussed in connection with the cosmological constant.
O'Malley, Karina R; Waters, Allison M
2018-05-01
differences were observed in between-phase CS evaluations and subjective anxiety ratings. Avoidance of threat conditioned stimuli may impair extinction learning and increase physiological arousal generalisation to safe stimuli. Copyright © 2018 Elsevier Ltd. All rights reserved.
A generalised random encounter model for estimating animal density with remote sensor data.
Lucas, Tim C D; Moorcroft, Elizabeth A; Freeman, Robin; Rowcliffe, J Marcus; Jones, Kate E
2015-05-01
Wildlife monitoring technology is advancing rapidly and the use of remote sensors such as camera traps and acoustic detectors is becoming common in both the terrestrial and marine environments. Current methods to estimate abundance or density require individual recognition of animals or knowing the distance of the animal from the sensor, which is often difficult. A method without these requirements, the random encounter model (REM), has been successfully applied to estimate animal densities from count data generated from camera traps. However, count data from acoustic detectors do not fit the assumptions of the REM due to the directionality of animal signals. We developed a generalised REM (gREM), to estimate absolute animal density from count data from both camera traps and acoustic detectors. We derived the gREM for different combinations of sensor detection widths and animal signal widths (a measure of directionality). We tested the accuracy and precision of this model using simulations of different combinations of sensor detection widths and animal signal widths, number of captures and models of animal movement. We find that the gREM produces accurate estimates of absolute animal density for all combinations of sensor detection widths and animal signal widths. However, larger sensor detection and animal signal widths were found to be more precise. While the model is accurate for all capture efforts tested, the precision of the estimate increases with the number of captures. We found no effect of different animal movement models on the accuracy and precision of the gREM. We conclude that the gREM provides an effective method to estimate absolute animal densities from remote sensor count data over a range of sensor and animal signal widths. The gREM is applicable for count data obtained in both marine and terrestrial environments, visually or acoustically (e.g. big cats, sharks, birds, echolocating bats and cetaceans). As sensors such as camera traps and acoustic …
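For orientation, the original camera-trap REM that the gREM generalises turns an encounter rate into a density using a detection-zone profile; the commonly quoted form (Rowcliffe et al. 2008, stated here as an assumption since the abstract gives no formula) is D = (y/t)·π/(v·r·(2+θ)). The gREM replaces the (2+θ) profile term with one that also depends on the animal's signal width, which is not reproduced here. Illustrative numbers only:

```python
import math

def rem_density(y, t, v, r, theta):
    """Original camera-trap REM (assumed form, Rowcliffe et al. 2008).
    y: number of captures, t: survey effort (camera-days),
    v: animal speed (km/day), r: detection radius (km),
    theta: sensor detection angle (radians)."""
    return (y / t) * math.pi / (v * r * (2 + theta))

# e.g. 100 captures over 1000 camera-days, speed 2 km/day,
# detection radius 10 m, detection angle 40 degrees:
print(rem_density(100, 1000, 2.0, 0.01, math.radians(40)))  # animals per km^2
```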
Earthquake Loss Estimation Uncertainties
Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander
2013-04-01
The paper addresses the reliability of strong-earthquake loss assessment when worldwide systems are applied in emergency mode. Timely and correct action just after an event can result in significant benefits in saving lives. In this case, information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and offers of humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the scope of a disaster. Uncertainties on the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties on the parameters used to describe them; the global adequacy of modeling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge about the source of shaking, etc. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws that are poorly known (if not beyond the reach of engineers for a large portion of the building stock); while a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. The way the population inside the buildings at the time of shaking is affected by the physical damage caused to the buildings is far from precisely known. The paper analyzes the influence of uncertainties in the determination of strong event parameters by Alert Seismological Surveys, and of the simulation models used at all stages, from estimating shaking intensity …
Assessment principles and tools.
Golnik, Karl C
2014-01-01
The goal of ophthalmology residency training is to produce competent ophthalmologists. Competence can only be determined by appropriately assessing resident performance. There are accepted guiding principles that should be applied to competence assessment methods. These principles are enumerated herein and ophthalmology-specific assessment tools that are available are described.
International Nuclear Information System (INIS)
Unnikrishnan, C.S.
1994-01-01
Principle of equivalence was the fundamental guiding principle in the formulation of the general theory of relativity. What are its key elements? What are the empirical observations which establish it? What is its relevance to some new experiments? These questions are discussed in this article. (author). 11 refs., 5 figs
Principles of Critical Dialogue.
Lankford, E. Louis
1986-01-01
Proposes four principles of critical dialog designed to suggest a consistent pattern of preparation for criticism. The principles concern the characteristics of the intended audience, establishing the goals of art criticism, making a commitment to a context of relevant dialogue, and clarifying one's concept of art in qualifying an object for…
International Nuclear Information System (INIS)
Carr, B.J.
1982-01-01
The anthropic principle (the conjecture that certain features of the world are determined by the existence of Man) is discussed, objections to it are listed, and it is argued that nearly all the constants of nature may be determined by the anthropic principle, which does not give exact values for the constants but only their orders of magnitude. (J.T.)
Denning, Peter J.
2008-01-01
The Great Principles of Computing is a framework for understanding computing as a field of science. (April 2008; Rev. 8/31/08)
Quantifying uncertainty in nuclear analytical measurements
International Nuclear Information System (INIS)
2004-07-01
The lack of international consensus on the expression of uncertainty in measurements was recognised by the late 1970s and led, after the issuance of a series of rather generic recommendations, to the publication of a general guide, known as GUM, the Guide to the Expression of Uncertainty in Measurement. This publication, issued in 1993, was based on co-operation over several years by the Bureau International des Poids et Mesures, the International Electrotechnical Commission, the International Federation of Clinical Chemistry, the International Organization for Standardization (ISO), the International Union of Pure and Applied Chemistry, the International Union of Pure and Applied Physics and the Organisation internationale de metrologie legale. The purpose was to promote full information on how uncertainty statements are arrived at and to provide a basis for harmonized reporting and the international comparison of measurement results. The need to provide more specific guidance to different measurement disciplines was soon recognized, and the field of analytical chemistry was addressed by EURACHEM in 1995 in the first edition of a guidance report on Quantifying Uncertainty in Analytical Measurements, produced by a group of experts from the field. That publication translated the general concepts of the GUM into specific applications for analytical laboratories and illustrated the principles with a series of selected examples as a didactic tool. Based on feedback from actual practice, the EURACHEM publication was extensively reviewed in 1997-1999 under the auspices of the Co-operation on International Traceability in Analytical Chemistry (CITAC), and a second edition was published in 2000. Still, except for a single example on the measurement of radioactivity in GUM, the field of nuclear and radiochemical measurements was not covered. The explicit requirement of ISO standard 17025:1999, General Requirements for the Competence of Testing and Calibration Laboratories …
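The central computational recipe of the GUM is the law of propagation of uncertainty: for y = f(x₁, …, xₙ) with uncorrelated inputs, u_c(y)² = Σᵢ (∂f/∂xᵢ)² u(xᵢ)². A minimal numerical sketch follows; the activity model A = C/(ε·m) and all numbers are hypothetical, chosen only to exercise the formula:

```python
import math

def combined_uncertainty(f, x, u, eps=1e-6):
    """GUM law of propagation of uncertainty for uncorrelated inputs,
    with partial derivatives estimated by central differences.
    x: list of input estimates, u: their standard uncertainties."""
    total = 0.0
    for i in range(len(x)):
        xp = list(x); xp[i] += eps
        xm = list(x); xm[i] -= eps
        dfdxi = (f(*xp) - f(*xm)) / (2 * eps)
        total += (dfdxi * u[i]) ** 2
    return math.sqrt(total)

# Hypothetical activity measurement A = counts / (efficiency * mass):
f = lambda c, e, m: c / (e * m)
u_c = combined_uncertainty(f, [10000.0, 0.25, 2.0], [100.0, 0.01, 0.02])
print(u_c)  # dominated by the 4 % relative uncertainty of the efficiency
```

For this multiplicative model the relative uncertainties (1 %, 4 %, 1 %) combine in quadrature to about 4.24 %, as the numerical result confirms.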
Schroeder, Doris; Ladikas, Miltos
2015-01-01
Responsible Research and Innovation (RRI) has emerged as a science policy framework that attempts to import broad social values into technological innovation processes whilst supporting institutional decision-making under conditions of uncertainty and ambiguity. When looking at RRI from a ‘principled’ perspective, we consider responsibility and justice to be important cornerstones of the framework. The main aim of this article is to suggest a method of realising these principles through the a...
International Nuclear Information System (INIS)
Khoury, Justin; Parikh, Maulik
2009-01-01
Mach's principle is the proposition that inertial frames are determined by matter. We put forth and implement a precise correspondence between matter and geometry that realizes Mach's principle. Einstein's equations are not modified and no selection principle is applied to their solutions; Mach's principle is realized wholly within Einstein's general theory of relativity. The key insight is the observation that, in addition to bulk matter, one can also add boundary matter. Given a space-time, and thus the inertial frames, we can read off both boundary and bulk stress tensors, thereby relating matter and geometry. We consider some global conditions that are necessary for the space-time to be reconstructible, in principle, from bulk and boundary matter. Our framework is similar to that of the black hole membrane paradigm and, in asymptotically anti-de Sitter space-times, is consistent with holographic duality.
Variational principles in physics
Basdevant, Jean-Louis
2007-01-01
Optimization under constraints is an essential part of everyday life. Indeed, we routinely solve problems by striking a balance between contradictory interests, individual desires and material contingencies. This notion of equilibrium was dear to thinkers of the enlightenment, as illustrated by Montesquieu’s famous formulation: "In all magistracies, the greatness of the power must be compensated by the brevity of the duration." Astonishingly, natural laws are guided by a similar principle. Variational principles have proven to be surprisingly fertile. For example, Fermat used variational methods to demonstrate that light follows the fastest route from one point to another, an idea which came to be known as Fermat’s principle, a cornerstone of geometrical optics. Variational Principles in Physics explains variational principles and charts their use throughout modern physics. The heart of the book is devoted to the analytical mechanics of Lagrange and Hamilton, the basic tools of any physicist. Prof. Basdev...
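The book blurb above cites Fermat's principle as the seed of variational methods; the textbook derivation it alludes to (not quoted from the book itself) takes one line: minimising the travel time of a ray crossing a plane interface yields Snell's law.

```latex
% Ray from (0,a) in medium 1 to (d,-b) in medium 2, crossing y = 0 at (x,0).
% Travel time as a function of the crossing point x:
T(x) = \frac{n_1}{c}\sqrt{x^2 + a^2} + \frac{n_2}{c}\sqrt{(d-x)^2 + b^2}
% Stationarity, dT/dx = 0, gives
\frac{n_1\, x}{\sqrt{x^2 + a^2}} = \frac{n_2\,(d-x)}{\sqrt{(d-x)^2 + b^2}},
% i.e. Snell's law:
n_1 \sin\theta_1 = n_2 \sin\theta_2 .
```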
Direct tests of measurement uncertainty relations: what it takes.
Busch, Paul; Stevens, Neil
2015-02-20
The uncertainty principle being a cornerstone of quantum mechanics, it is surprising that, in nearly 90 years, there have been no direct tests of measurement uncertainty relations. This lacuna was due to the absence of two essential ingredients: appropriate measures of measurement error (and disturbance) and precise formulations of such relations that are universally valid and directly testable. We formulate two distinct forms of direct tests, based on different measures of error. We present a prototype protocol for a direct test of measurement uncertainty relations in terms of value deviation errors (hitherto considered nonfeasible), highlighting the lack of universality of these relations. This shows that the formulation of universal, directly testable measurement uncertainty relations for state-dependent error measures remains an important open problem. Recent experiments that were claimed to constitute invalidations of Heisenberg's error-disturbance relation are shown to conform with the spirit of Heisenberg's principle if interpreted as direct tests of measurement uncertainty relations for error measures that quantify distances between observables.
The factualization of uncertainty:
DEFF Research Database (Denmark)
Meyer, G.; Folker, A.P.; Jørgensen, R.B.
2005-01-01
on risk assessment does nothing of the sort and is not likely to present an escape from the international deadlock on the use of genetic modification in agriculture and food production. The new legislation is likely to stimulate the kind of emotive reactions it was intended to prevent. In risk assessment...... exercises, scientific uncertainty is turned into risk, expressed in facts and figures. Paradoxically, this conveys an impression of certainty, while value-disagreement and conflicts of interest remain hidden below the surface of factuality. Public dialogue and negotiation along these lines are rendered...... would be to take care of itself – rethinking the role and the limitations of science in a social context, and, thereby gaining the strength to fulfill this role and to enter into dialogue with the rest of society. Scientific communities appear to be obvious candidates for prompting reflection...
Petzinger, Tom
I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia, spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.
Traceability and Measurement Uncertainty
DEFF Research Database (Denmark)
Tosello, Guido; De Chiffre, Leonardo
2004-01-01
This report is made as a part of the project 'Metro-E-Learn: European e-Learning in Manufacturing Metrology', an EU project under the program SOCRATES MINERVA (ODL and ICT in Education), Contract No: 101434-CP-1-2002-1-DE-MINERVA, coordinated by Friedrich-Alexander-University Erlangen-Nürnberg, Chair for Quality Management and Manufacturing-Oriented Metrology (Germany). The 'Metro-E-Learn' project proposes to develop and implement a coherent learning and competence chain that leads from introductory and foundation e-courses in initial manufacturing engineering studies towards higher … Machine tool testing 9. The role of manufacturing metrology for QM 10. Inspection planning 11. Quality management of measurements incl. documentation 12. Advanced manufacturing measurement technology. The present report (which represents section 2 – Traceability and Measurement Uncertainty – of the e-learning …
An uncertainty inventory demonstration - a primary step in uncertainty quantification
Energy Technology Data Exchange (ETDEWEB)
Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM
2009-01-01
Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.
The precautionary principle and pharmaceutical risk management.
Callréus, Torbjörn
2005-01-01
Although it is often vigorously contested and has several different formulations, the precautionary principle has in recent decades guided environmental policy making in the face of scientific uncertainty. Originating from a criticism of traditional risk assessment, the key element of the precautionary principle is the justification for acting in the face of uncertain knowledge about risks. In the light of its growing invocation in various areas that are related to public health and recently in relation to drug safety issues, this article presents an introductory review of the main elements of the precautionary principle and some arguments conveyed by its advocates and opponents. A comparison of the characteristics of pharmaceutical risk management and environmental policy making (i.e. the setting within which the precautionary principle evolved), indicates that several important differences exist. If believed to be of relevance, in order to avoid arbitrary and unpredictable decision making, both the interpretation and possible application of the precautionary principle need to be adapted to the conditions of pharmaceutical risk management.
Entropy-power uncertainty relations: towards a tight inequality for all Gaussian pure states
International Nuclear Information System (INIS)
Hertz, Anaelle; Jabbour, Michael G; Cerf, Nicolas J
2017-01-01
We show that a proper expression of the uncertainty relation for a pair of canonically-conjugate continuous variables relies on entropy power, a standard notion in Shannon information theory for real-valued signals. The resulting entropy-power uncertainty relation is equivalent to the entropic formulation of the uncertainty relation due to Bialynicki-Birula and Mycielski, but can be further extended to rotated variables. Hence, based on a reasonable assumption, we give a partial proof of a tighter form of the entropy-power uncertainty relation taking correlations into account and provide extensive numerical evidence of its validity. Interestingly, it implies the generalized (rotation-invariant) Schrödinger–Robertson uncertainty relation exactly as the original entropy-power uncertainty relation implies Heisenberg relation. It is saturated for all Gaussian pure states, in contrast with hitherto known entropic formulations of the uncertainty principle. (paper)
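For orientation (these are the standard definitions from Shannon theory and the Bialynicki-Birula and Mycielski inequality, not the paper's tighter conjectured bound): the entropy power of a continuous variable is the variance of the Gaussian having the same differential entropy, and in these terms the entropic uncertainty relation takes a Heisenberg-like product form.

```latex
N_x = \frac{1}{2\pi e}\, e^{2 h(x)}, \qquad
N_p = \frac{1}{2\pi e}\, e^{2 h(p)}
% The Bialynicki-Birula--Mycielski entropic uncertainty relation,
h(x) + h(p) \ge \ln(\pi e \hbar),
% exponentiates into the entropy-power form, saturated by Gaussian pure states:
N_x\, N_p \ge \frac{\hbar^2}{4}.
```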
Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2
International Nuclear Information System (INIS)
Wickett, A.J.; Yadigaroglu, G.
1994-08-01
The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives for the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and similar problems, to examine the underlying issues from first principles, in preference to comparing and contrasting the currently proposed methods, to reach consensus on the issues identified as far as possible while not avoiding the controversial aspects, to identify as clearly as possible unreconciled differences, and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed - the need for uncertainty analysis, identification and ranking of uncertainties, characterisation, quantification and combination of uncertainties and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop.
Environmental governance systems are under greater pressure to adapt and to cope with increased social and ecological uncertainty from stressors like climate change. We review principles of social cognition and decision making that shape and constrain how environmental governance...
Uncertainty quantification in lattice QCD calculations for nuclear physics
Energy Technology Data Exchange (ETDEWEB)
Beane, Silas R. [Univ. of Washington, Seattle, WA (United States); Detmold, William [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Orginos, Kostas [College of William and Mary, Williamsburg, VA (United States); Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Savage, Martin J. [Institute for Nuclear Theory, Seattle, WA (United States)
2015-02-05
The numerical technique of Lattice QCD holds the promise of connecting the nuclear forces, nuclei, the spectrum and structure of hadrons, and the properties of matter under extreme conditions with the underlying theory of the strong interactions, quantum chromodynamics. A distinguishing, and thus far unique, feature of this formulation is that all of the associated uncertainties, both statistical and systematic can, in principle, be systematically reduced to any desired precision with sufficient computational and human resources. As a result, we review the sources of uncertainty inherent in Lattice QCD calculations for nuclear physics, and discuss how each is quantified in current efforts.
Additivity of entropic uncertainty relations
Directory of Open Access Journals (Sweden)
René Schwonnek
2018-03-01
We consider the uncertainty between two pairs of local projective measurements performed on a multipartite system. We show that the optimal bound in any linear uncertainty relation, formulated in terms of the Shannon entropy, is additive. This directly implies, against naive intuition, that the minimal entropic uncertainty can always be realized by fully separable states. Hence, in contradiction to proposals by other authors, no entanglement witness can be constructed solely by comparing the attainable uncertainties of entangled and separable states. However, our result gives rise to a huge simplification for computing global uncertainty bounds as they now can be deduced from local ones. Furthermore, we provide the natural generalization of the Maassen and Uffink inequality for linear uncertainty relations with arbitrary positive coefficients.
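For context, the Maassen and Uffink inequality that the abstract refers to reads, for two observables with eigenbases {|a_j⟩} and {|b_k⟩} (stated here in its original form; the paper's generalisation admits arbitrary positive coefficients on the two entropies):

```latex
H(A) + H(B) \;\ge\; -2 \ln c, \qquad
c \;=\; \max_{j,k}\, \bigl|\langle a_j | b_k \rangle\bigr| ,
```

where H denotes the Shannon entropy of the measurement outcome distribution and c is the maximal overlap between the two bases.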
Uncertainty Management and Sensitivity Analysis
DEFF Research Database (Denmark)
Georgiadis, Stylianos; Fantke, Peter
2017-01-01
Uncertainty is always there and LCA is no exception to that. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them allows to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes...... suffers from an imbalanced perception of uncertainties, justifying modelling choices and omissions. Identifying prevalent misconceptions around uncertainties in LCA is a central goal of this chapter, aiming to establish a positive approach focusing on the advantages of uncertainty management. The main...... objectives of this chapter are to learn how to deal with uncertainty in the context of LCA, how to quantify it, interpret and use it, and how to communicate it. The subject is approached more holistically than just focusing on relevant statistical methods or purely mathematical aspects. This chapter...
Impact of discharge data uncertainty on nutrient load uncertainty
Westerberg, Ida; Gustavsson, Hanna; Sonesten, Lars
2016-04-01
Uncertainty in the rating-curve model of the stage-discharge relationship leads to uncertainty in discharge time series. These uncertainties in turn affect many other analyses based on discharge data, such as nutrient load estimations. It is important to understand how large the impact of discharge data uncertainty is on such analyses, since they are often used as the basis to take important environmental management decisions. In the Baltic Sea basin, nutrient load estimates from river mouths are a central information basis for managing and reducing eutrophication in the Baltic Sea. In this study we investigated rating curve uncertainty and its propagation to discharge data uncertainty and thereafter to uncertainty in the load of phosphorus and nitrogen for twelve Swedish river mouths. We estimated rating curve uncertainty using the Voting Point method, which accounts for random and epistemic errors in the stage-discharge relation and allows drawing multiple rating-curve realisations consistent with the total uncertainty. We sampled 40,000 rating curves, and for each sampled curve we calculated a discharge time series from 15-minute water level data for the period 2005-2014. Each discharge time series was then aggregated to daily scale and used to calculate the load of phosphorus and nitrogen from linearly interpolated monthly water samples, following the currently used methodology for load estimation. Finally the yearly load estimates were calculated and we thus obtained distributions with 40,000 load realisations per year - one for each rating curve. We analysed how the rating curve uncertainty propagated to the discharge time series at different temporal resolutions, and its impact on the yearly load estimates. Two shorter periods of daily water quality sampling around the spring flood peak allowed a comparison of load uncertainty magnitudes resulting from discharge data with those resulting from the monthly water quality sampling.
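The propagation chain described above (sample rating curves, compute a discharge series per curve, sum to a yearly load) can be sketched in a few lines. Everything below is synthetic and heavily simplified: a single power-law rating curve with invented parameters, daily rather than 15-minute data, and 1,000 instead of 40,000 realisations, so it illustrates the mechanics only, not the Voting Point method itself.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical rating curve Q = a * h^b; the parameter values and their
# spreads are illustrative, not taken from the study.
N_CURVES = 1000                       # the study used 40,000
stage = rng.uniform(0.5, 2.0, 365)    # synthetic daily water levels (m)
conc = rng.uniform(0.02, 0.08, 365)   # synthetic nutrient concentration (g/m^3)

# One (a, b) pair per sampled rating curve, consistent with its uncertainty
a = rng.normal(10.0, 1.0, N_CURVES)
b = rng.normal(2.0, 0.1, N_CURVES)

loads = np.empty(N_CURVES)
for i in range(N_CURVES):
    q = a[i] * stage ** b[i]          # one discharge realisation (m^3/s)
    # daily load = Q * C * 86400 s/day (grams), summed and converted to tonnes
    loads[i] = np.sum(q * conc * 86400.0) / 1e6

# The spread across realisations is the propagated rating-curve uncertainty
print(f"median load: {np.median(loads):.1f} t/yr, 90% interval: "
      f"[{np.percentile(loads, 5):.1f}, {np.percentile(loads, 95):.1f}]")
```

One load distribution like this per year, instead of a single number, is what lets the study compare rating-curve uncertainty against water-quality sampling uncertainty.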
Uncertainty analysis of environmental models
International Nuclear Information System (INIS)
Monte, L.
1990-01-01
In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of 137Cs and 131I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition.
Chemical model reduction under uncertainty
Najm, Habib
2016-01-05
We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.
On the Generalisation of Kepler's 3rd Law for the Vacuum Field of the Point-Mass
Directory of Open Access Journals (Sweden)
Crothers S. J.
2005-07-01
I derive herein a general form of Kepler’s 3rd Law for the general solution to Einstein’s vacuum field. I also obtain stable orbits for photons in all the configurations of the point-mass. Contrary to the accepted theory, Kepler’s 3rd Law is modified by General Relativity and leads to a finite angular velocity as the proper radius of the orbit goes down to zero, without the formation of a black hole. Finally, I generalise the expression for the potential function of the general solution for the point-mass in the weak field.
Zhang, L; Kokkinakis, M; Chong, BVP
2014-01-01
The paper presents the application of a UPFC to a case study of a 12-bus high power network. The UPFC shunt converter employs eight 3-level Neutral Point Clamped (NPC) voltage source converters (VSC) and 12 single-phase three-winding phase shifting transformers (PST), generating a 48-pulse output voltage. The 3-phase H-bridge series converter shares the same dc-link with the shunt one. The novel feature of this work lies in the use of a model-based generalised predictive current control law to th...
Lai, Qiang; Zhao, Xiao-Wen; Rajagopal, Karthikeyan; Xu, Guanghui; Akgul, Akif; Guleryuz, Emre
2018-01-01
This paper considers the generation of multi-butterfly chaotic attractors from a generalised Sprott C system with multiple non-hyperbolic equilibria. The system is constructed by introducing an additional variable whose derivative has a switching function to the Sprott C system. It is numerically found that the system creates two-, three-, four-, five-butterfly attractors and any other multi-butterfly attractors. First, the dynamic analyses of multi-butterfly chaotic attractors are presented. Secondly, the field programmable gate array implementation, electronic circuit realisation and random number generator are done with the multi-butterfly chaotic attractors.
Reliability analysis under epistemic uncertainty
International Nuclear Information System (INIS)
Nannapaneni, Saideep; Mahadevan, Sankaran
2016-01-01
This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
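The aleatory/epistemic distinction the abstract builds on can be illustrated with a minimal two-loop Monte Carlo sketch. The limit state and all numbers below are hypothetical; note that this toy version keeps the expensive nested loops, whereas the paper's auxiliary-variable approach is precisely a way to collapse them into a single loop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy limit state g(X) = capacity - demand; failure when demand > capacity.
# Aleatory: demand ~ Normal(mu, 1). Epistemic: mu is itself uncertain
# because it was estimated from sparse data (values are illustrative).
CAPACITY = 10.0
N_EPIST, N_ALEAT = 200, 5000

mu_samples = rng.normal(7.0, 0.5, N_EPIST)    # outer, epistemic loop
pf = np.empty(N_EPIST)
for i, mu in enumerate(mu_samples):
    demand = rng.normal(mu, 1.0, N_ALEAT)     # inner, aleatory loop
    pf[i] = np.mean(demand > CAPACITY)        # failure probability estimate

# Epistemic uncertainty appears as a *distribution* of failure probabilities,
# not a single number.
print(f"pf median: {np.median(pf):.4f}, 90% interval: "
      f"[{np.percentile(pf, 5):.4f}, {np.percentile(pf, 95):.4f}]")
```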
Money and Growth under Uncertainty.
(*ECONOMICS, UNCERTAINTY), (*MONEY, DECISION MAKING), (*BEHAVIOR, MATHEMATICAL MODELS), PRODUCTION, CONSUMPTION, EQUILIBRIUM(PHYSIOLOGY), GROWTH(PHYSIOLOGY), MANAGEMENT ENGINEERING, PROBABILITY, INTEGRAL EQUATIONS, THESES
Simplified propagation of standard uncertainties
International Nuclear Information System (INIS)
Shull, A.H.
1997-01-01
An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine if the standard is adequate for its intended use, or calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined, or is determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. A system of simplifying the calculations by dividing into subgroups of absolute and relative uncertainties is utilized. These methods also incorporate the International Standards Organization (ISO) concepts for combining systematic and random uncertainties as published in their Guide to the Expression of Measurement Uncertainty. Details of the simplified methods and examples of their use are included in the paper.
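The shortcut the abstract describes, splitting contributions into relative and absolute subgroups so that no partial derivatives are needed, can be sketched as follows. The prepared-standard model and all numbers are invented for illustration; the two helper functions are just the usual GUM quadrature rules for uncorrelated inputs.

```python
import math

def combine_relative(*rel_u):
    """Purely multiplicative model y = x1 * x2 / x3 ...: relative standard
    uncertainties combine in quadrature (uncorrelated inputs)."""
    return math.sqrt(sum(u * u for u in rel_u))

def combine_absolute(*abs_u):
    """Purely additive model y = x1 + x2 + ...: absolute standard
    uncertainties combine in quadrature (uncorrelated inputs)."""
    return math.sqrt(sum(u * u for u in abs_u))

# Hypothetical prepared standard: concentration c = m * P / V
# (mass of material, purity, final volume), a multiplicative model.
m, u_m = 0.100, 0.00005    # mass (g) and its standard uncertainty
P, u_P = 0.999, 0.001      # purity (mass fraction)
V, u_V = 1.000, 0.0008     # volume (l)

c = m * P / V
u_c_rel = combine_relative(u_m / m, u_P / P, u_V / V)
print(f"c = {c:.6f} g/l, relative standard uncertainty = {u_c_rel:.2e}")
```

The same spreadsheet-level arithmetic handles mixed models by combining the additive subgroup first, converting the result to a relative uncertainty, and folding it into the multiplicative subgroup.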
Hill, Rodney
2013-01-01
Principles of Dynamics presents classical dynamics primarily as an exemplar of scientific theory and method. This book is divided into three major parts concerned with gravitational theory of planetary systems; general principles of the foundations of mechanics; and general motion of a rigid body. Some of the specific topics covered are Keplerian Laws of Planetary Motion; gravitational potential and potential energy; and fields of axisymmetric bodies. The principles of work and energy, fictitious body-forces, and inertial mass are also looked into. Other specific topics examined are kinematics
Modern electronic maintenance principles
Garland, DJ
2013-01-01
Modern Electronic Maintenance Principles reviews the principles of maintaining modern, complex electronic equipment, with emphasis on preventive and corrective maintenance. Unfamiliar subjects such as the half-split method of fault location, functional diagrams, and fault finding guides are explained. This book consists of 12 chapters and begins by stressing the need for maintenance principles and discussing the problem of complexity as well as the requirements for a maintenance technician. The next chapter deals with the connection between reliability and maintenance and defines the terms fai
Pérez-Soba Díez del Corral, Juan José
2008-01-01
Bioethics emerges from the technological problems of acting on human life. There also emerges the problem of determining moral limits, because they seem exterior to this practice. The Bioethics of Principles takes its rationality from teleological thinking and from autonomism. These divergences manifest the epistemological fragility and the great difficulty of 'moral' thinking. This is evident in the determination of the principle of autonomy, which lacks the ethical content of Kant's proposal. We need a new ethical rationality, with a new reflection on Principles that emerge from basic ethical experiences.
Hamilton's principle for beginners
International Nuclear Information System (INIS)
Brun, J L
2007-01-01
I find that students have difficulty with Hamilton's principle, at least the first time they come into contact with it, and therefore it is worth designing some examples to help students grasp its complex meaning. This paper supplies the simplest example to consolidate the learning of the quoted principle: that of a free particle moving along a line. Next, students are challenged to add gravity to reinforce the argument and, finally, a two-dimensional motion in a vertical plane is considered. Furthermore, these examples force us to be very clear about such an abstract principle.
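The free-particle example the abstract refers to is a one-line calculus-of-variations computation (standard textbook material, not quoted from the paper): for the Lagrangian of a free particle, the Euler-Lagrange equation reproduces uniform motion.

```latex
S[x] = \int_{t_1}^{t_2} \tfrac{1}{2}\, m\, \dot{x}^2 \, dt, \qquad
\frac{d}{dt}\frac{\partial L}{\partial \dot{x}} - \frac{\partial L}{\partial x}
  = m\,\ddot{x} = 0
\;\Rightarrow\; x(t) = x_0 + v_0\, t ,
```

so the path that makes the action stationary between fixed endpoints is the straight worldline, exactly Newton's first law.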