WorldWideScience

Sample records for fundamental measure theory

  1. Fundamental measure theory for hard-sphere mixtures: a review

    International Nuclear Information System (INIS)

    Roth, Roland

    2010-01-01

    Hard-sphere systems are one of the fundamental model systems of statistical physics and represent an important reference system for molecular or colloidal systems with soft repulsive or attractive interactions in addition to hard-core repulsion at short distances. Density functional theory for classical systems, as one of the core theoretical approaches of the statistical physics of fluids and solids, has to be able to treat such an important system successfully and accurately. Fundamental measure theory is to date the most successful and most accurate density functional theory for hard-sphere mixtures. Since its introduction, fundamental measure theory has been applied to many problems, tested against computer simulations, and further developed in many respects. The literature on fundamental measure theory is already large and growing fast. This review aims to provide a starting point for readers new to fundamental measure theory and an overview of important developments. (topical review)
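
    For orientation, the FMT functionals reviewed there share a common structure: the excess free energy is a local function of weighted densities built from the geometric (fundamental) measures of the spheres,

        \[
          \beta F_{\mathrm{exc}}[\{\rho_i\}] \;=\; \int \Phi\bigl(\{n_\alpha(\mathbf{r})\}\bigr)\,d^3r,
          \qquad
          n_\alpha(\mathbf{r}) \;=\; \sum_i \int \rho_i(\mathbf{r}')\,\omega_i^{(\alpha)}(\mathbf{r}-\mathbf{r}')\,d^3r',
        \]

    where the weight functions $\omega_i^{(\alpha)}$ encode the radius, surface area, and volume of species i; the FMT versions discussed in the review differ mainly in the choice of $\Phi$.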

  2. Proposed experiment to test fundamentally binary theories

    Science.gov (United States)

    Kleinmann, Matthias; Vértesi, Tamás; Cabello, Adán

    2017-09-01

    Fundamentally binary theories are nonsignaling theories in which measurements of many outcomes are constructed by selecting from binary measurements. They constitute a sensible alternative to quantum theory and have never been directly falsified by any experiment. Here we show that fundamentally binary theories are experimentally testable with current technology. For that, we identify a feasible Bell-type experiment on pairs of entangled qutrits. In addition, we prove that, for any n, quantum n-ary correlations are not fundamentally (n-1)-ary. For that, we introduce a family of inequalities that hold for fundamentally (n-1)-ary theories but are violated by quantum n-ary correlations.

  3. Fundamental number theory with applications

    CERN Document Server

    Mollin, Richard A

    2008-01-01

    An update of the most accessible introductory number theory text available, Fundamental Number Theory with Applications, Second Edition presents a mathematically rigorous yet easy-to-follow treatment of the fundamentals and applications of the subject. The substantial amount of reorganizing makes this edition clearer and more elementary in its coverage. New to the Second Edition: removal of all advanced material to be even more accessible in scope; new fundamental material, including partition theory, generating functions, and combinatorial number theory; expa…

  4. Fundamental aspects of quantum theory

    International Nuclear Information System (INIS)

    Gorini, V.; Frigerio, A.

    1986-01-01

    This book presents information on the following topics: general problems and crucial experiments; the classical behavior of measuring instruments; quantum interference effect for two atoms radiating a single photon; quantization and stochastic processes; quantum Markov processes driven by Bose noise; chaotic behavior in quantum mechanics; quantum ergodicity and chaos; microscopic and macroscopic levels of description; fundamental properties of the ground state of atoms and molecules; n-level systems interacting with Bosons - semiclassical limits; general aspects of gauge theories; adiabatic phase shifts for neutrons and photons; the spins of cyons and dyons; a round-table discussion on the Aharonov-Bohm effect; gravity in quantum mechanics; the gravitational phase transition; anomalies and their cancellation; a new gauge without any ghost for Yang-Mills theory; and energy density and roughening in the 3-D Ising ferromagnet.

  5. Twenty five years of fundamental theory

    International Nuclear Information System (INIS)

    Bell, J.S.

    1980-01-01

    In reviewing the last twenty five years in fundamental physics theory it is stated that there has been no revolution in this field. In the absence of gravitation, Lorentz invariance remains a requirement on fundamental laws. Einstein's theory of gravitation inspires increasing conviction on the astronomical scale. Quantum theory remains the framework for all serious effort in microphysics, and quantum electrodynamics remains the model of a fully articulated microphysical theory, completely successful in its domain. However, a number of ideas have appeared, of great theoretical interest and some phenomenological success, which may well contribute to the next decisive step. Recent work on the following topics is mentioned: gravitational radiation, singularities, black-body radiation from black holes, gauge and hidden symmetry in quantum electrodynamics, the renormalization of electromagnetic and weak interaction theory, non-Abelian gauge theories, magnetic monopoles as the most striking example of solitons, and supersymmetry. (UK)

  6. M(atrix) theory: matrix quantum mechanics as a fundamental theory

    International Nuclear Information System (INIS)

    Taylor, Washington

    2001-01-01

    This article reviews the matrix model of M theory. M theory is an 11-dimensional quantum theory of gravity that is believed to underlie all superstring theories. M theory is currently the most plausible candidate for a theory of fundamental physics which reconciles gravity and quantum field theory in a realistic fashion. Evidence for M theory is still only circumstantial -- no complete background-independent formulation of the theory exists as yet. Matrix theory was first developed as a regularized theory of a supersymmetric quantum membrane. More recently, it has appeared in a different guise as the discrete light-cone quantization of M theory in flat space. These two approaches to matrix theory are described in detail and compared. It is shown that matrix theory is a well-defined quantum theory that reduces to a supersymmetric theory of gravity at low energies. Although its fundamental degrees of freedom are essentially pointlike, higher-dimensional fluctuating objects (branes) arise through the non-Abelian structure of the matrix degrees of freedom. The problem of formulating matrix theory in a general space-time background is discussed, and the connections between matrix theory and other related models are reviewed

  7. Theory of fundamental interactions

    International Nuclear Information System (INIS)

    Pestov, A.B.

    1992-01-01

    In the present article the theory of fundamental interactions is derived in a systematic way from first principles. In the developed theory there is no separation between space-time and the internal gauge space. The main equations for the basic fields are derived. It is shown that the theory satisfies the correspondence principle and gives rise to new notions in the region considered. In particular, the conclusion is drawn that there exist particles which are characterized not only by mass, spin, and charge but also by a moment of inertia. These are rotating particles, which realize the notion of a rigid body at the microscopic level and provide a key to understanding strong interactions. The main concepts and dynamical laws for these particles are formulated. The basic principles of the theory may be tested experimentally in the not-too-distant future. 29 refs.

  8. The implications of fundamental cause theory for priority setting.

    Science.gov (United States)

    Goldberg, Daniel S

    2014-10-01

    Application of fundamental cause theory to Powers and Faden's model of social justice highlights the ethical superiority of upstream public health interventions. In this article, I assess the ramifications of fundamental cause theory specifically in the context of public health priority setting. Ethically optimal public health policy simultaneously maximizes overall population health and compresses health inequalities. Fundamental cause theory is an important framework for identifying which categories of public health interventions are most likely to advance these twin goals.

  9. SU(2) Gauge Theory with Two Fundamental Flavours

    DEFF Research Database (Denmark)

    Arthur, Rudy; Drach, Vincent; Hansen, Martin

    2016-01-01

    We investigate the continuum spectrum of the SU(2) gauge theory with $N_f=2$ flavours of fermions in the fundamental representation. This model provides a minimal template which is ideal for a wide class of Standard Model extensions featuring novel strong dynamics that range from composite (Goldstone) Higgs theories to several intriguing types of dark matter candidates, such as the SIMPs. We improve our previous lattice analysis [1] by adding more data at light quark masses, at two additional lattice spacings, by determining the lattice cutoff via a Wilson flow measure of the $w_0$ parameter…

  10. Fundamental principles of quantum theory

    International Nuclear Information System (INIS)

    Bugajski, S.

    1980-01-01

    After introducing general versions of three fundamental quantum postulates - the superposition principle, the uncertainty principle and the complementarity principle - the question of whether the three principles are sufficiently strong to restrict the general Mackey description of quantum systems to the standard Hilbert-space quantum theory is discussed. An example which shows that the answer must be negative is constructed. An abstract version of the projection postulate is introduced and it is demonstrated that it could serve as the missing physical link between the general Mackey description and the standard quantum theory. (author)

  11. Fundamental tests and measures of the structure of matter at short distances

    International Nuclear Information System (INIS)

    Brodsky, S.J.

    1981-07-01

    Recent progress in gauge field theories has led to a new perspective on the structure of matter and basic interactions at short distances. It is clear that at very high energies quantum electrodynamics, together with the weak and strong interactions, is part of a unified theory with new fundamental constants, new symmetries, and new conservation laws. A non-technical introduction to these topics is given, with emphasis on fundamental tests and measurements. 21 references

  12. Fundamentals of queueing theory

    CERN Document Server

    Gross, Donald; Thompson, James M; Harris, Carl M

    2013-01-01

    Praise for the Third Edition: "This is one of the best books available. Its excellent organizational structure allows quick reference to specific models and its clear presentation . . . solidifies the understanding of the concepts being presented." (IIE Transactions on Operations Engineering). Thoroughly revised and expanded to reflect the latest developments in the field, Fundamentals of Queueing Theory, Fourth Edition continues to present the basic statistical principles that are necessary to analyze the probabilistic nature of queues. Rather than pre…
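
    As a reminder of the kind of result such a text builds up to, the simplest single-server model (M/M/1), with Poisson arrivals at rate λ and exponential service at rate μ and utilization ρ = λ/μ < 1, has the standard steady-state measures

        \[
          L \;=\; \frac{\rho}{1-\rho}, \qquad W \;=\; \frac{1}{\mu-\lambda}, \qquad L \;=\; \lambda W \ \ \text{(Little's law)}.
        \]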

  13. Measurement theory in quantum mechanics

    International Nuclear Information System (INIS)

    Klein, G.

    1980-01-01

    It is assumed that consciousness, memory and liberty (within the limits of the indeterminism of quantum mechanics) are fundamental properties of elementary particles. Using this assumption, it is shown how measurements and observers may be introduced in a natural way into quantum mechanics. There are no longer fundamental differences between macroscopic and microscopic objects, between classical and quantum objects, or between observer and object. Thus, discrepancies and paradoxes have disappeared from the conventional quantum mechanics theory. One consequence of the cumulative memory of the particles is that the sum of negentropy plus information is a constant. Using this theory it is also possible to explain 'paranormal' phenomena and how they differ from 'normal' ones. [fr]

  14. Long-range weight functions in fundamental measure theory of the non-uniform hard-sphere fluid

    International Nuclear Information System (INIS)

    Hansen-Goos, Hendrik

    2016-01-01

    We introduce long-range weight functions to the framework of fundamental measure theory (FMT) of the non-uniform, single-component hard-sphere fluid. While the range of the usual weight functions is equal to the hard-sphere radius R, the modified weight functions have range 3R. Based on the augmented FMT, we calculate the radial distribution function g(r) up to second order in the density within Percus' test particle theory. Consistency of the compressibility and virial routes on this level allows us to determine the free parameter γ of the theory. As a side result, we obtain a value for the fourth virial coefficient B_4 which deviates by only 0.01% from the exact result. The augmented FMT is tested for the dense fluid by comparing results for g(r) calculated via the test particle route to existing results from molecular dynamics simulations. The agreement at large distances (r > 6R) is significantly improved when the FMT with long-range weight functions is used. In order to improve agreement close to contact (r = 2R) we construct a free energy which is based on the accurate Carnahan–Starling equation of state, rather than the Percus–Yevick compressibility equation underlying standard FMT. (paper)
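
    For reference, the Carnahan–Starling equation of state invoked at the end of the abstract reads, in terms of the packing fraction η = (4π/3)R³ρ,

        \[
          Z_{\mathrm{CS}} \;=\; \frac{p}{\rho k_{\mathrm B}T} \;=\; \frac{1+\eta+\eta^{2}-\eta^{3}}{(1-\eta)^{3}},
          \qquad
          \frac{\beta F^{\mathrm{exc}}_{\mathrm{CS}}}{N} \;=\; \frac{\eta\,(4-3\eta)}{(1-\eta)^{2}},
        \]

    whereas standard FMT reduces, for the uniform fluid, to the slightly less accurate Percus–Yevick compressibility (scaled-particle) equation of state.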

  15. Fundamental constants and tests of theory in Rydberg states of hydrogenlike ions.

    Science.gov (United States)

    Jentschura, Ulrich D; Mohr, Peter J; Tan, Joseph N; Wundt, Benedikt J

    2008-04-25

    A comparison of precision frequency measurements to quantum electrodynamics (QED) predictions for Rydberg states of hydrogenlike ions can yield information on values of fundamental constants and test theory. With the results of a calculation of a key QED contribution reported here, the uncertainty in the theory of the energy levels is reduced to a level where such a comparison can yield an improved value of the Rydberg constant.

  16. Fundamental Constants and Tests of Theory in Rydberg States of Hydrogenlike Ions

    International Nuclear Information System (INIS)

    Jentschura, Ulrich D.; Mohr, Peter J.; Tan, Joseph N.; Wundt, Benedikt J.

    2008-01-01

    A comparison of precision frequency measurements to quantum electrodynamics (QED) predictions for Rydberg states of hydrogenlike ions can yield information on values of fundamental constants and test theory. With the results of a calculation of a key QED contribution reported here, the uncertainty in the theory of the energy levels is reduced to a level where such a comparison can yield an improved value of the Rydberg constant

  17. Random measures, theory and applications

    CERN Document Server

    Kallenberg, Olav

    2017-01-01

    Offering the first comprehensive treatment of the theory of random measures, this book has a very broad scope, ranging from basic properties of Poisson and related processes to the modern theories of convergence, stationarity, Palm measures, conditioning, and compensation. The three large final chapters focus on applications within the areas of stochastic geometry, excursion theory, and branching processes. Although this theory plays a fundamental role in most areas of modern probability, much of it, including the most basic material, has previously been available only in scores of journal articles. The book is primarily directed towards researchers and advanced graduate students in stochastic processes and related areas.

  18. On time variation of fundamental constants in superstring theories

    International Nuclear Information System (INIS)

    Maeda, K.I.

    1988-01-01

    Assuming the action from string theory and taking into account the dynamical freedom of a dilaton and its coupling to the matter fluid, the author shows that fundamental 'constants' in string theories are independent of the 'radius' of the internal space. Since the scalar related to the 'constants' is coupled to the 4-dimensional gravity and matter fluid in the same way as in the Jordan-Brans-Dicke theory with ω = -1, it must be massive and can get a mass easily through some symmetry breaking mechanism (e.g. the SUSY breaking due to a gluino condensation). Consequently, the time variation of fundamental constants is too small to be observed

  19. Fundamental measure theory for the electric double layer: implications for blue-energy harvesting and water desalination

    International Nuclear Information System (INIS)

    Härtel, Andreas; Janssen, Mathijs; Samin, Sela; Roij, René van

    2015-01-01

    Capacitive mixing (CAPMIX) and capacitive deionization (CDI) are promising candidates for harvesting clean, renewable energy and for the energy-efficient production of potable water, respectively. Both CAPMIX and CDI involve water-immersed porous carbon electrodes (supercapacitors) at voltages of the order of hundreds of millivolts, such that counter-ionic packing is important for the electric double layer (EDL) which forms near the surfaces of these porous materials. Thus, we propose a density functional theory (DFT) to model the EDL, in which the White Bear mark II fundamental measure theory functional is combined with a mean-field Coulombic and a mean spherical approximation-type correction to describe the interplay between dense packing and electrostatics, in good agreement with molecular dynamics simulations. We discuss the concentration-dependent potential rise due to changes in the chemical potential in capacitors in the context of an over-ideal theoretical description and its impact on energy harvesting and water desalination. Compared to less elaborate mean-field models, our DFT calculations reveal a higher work output for blue-energy cycles and a higher energy demand for desalination cycles. (paper)
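
    Schematically, and only as a sketch of the type of functional described (not the authors' exact expression), the excess free energy of such a DFT splits into a hard-sphere FMT part, a mean-field Coulomb part, and an MSA-type correlation correction:

        \[
          F_{\mathrm{exc}}[\{\rho_i\}] \;\approx\; F^{\mathrm{FMT}}_{\mathrm{HS}}[\{\rho_i\}]
          \;+\; \frac{1}{2}\sum_{i,j} z_i z_j e^{2}\iint \frac{\rho_i(\mathbf{r})\,\rho_j(\mathbf{r}')}{4\pi\varepsilon\,|\mathbf{r}-\mathbf{r}'|}\,d^3r\,d^3r'
          \;+\; F_{\mathrm{MSA}}[\{\rho_i\}],
        \]

    with $z_i$ the ionic valencies and ε the solvent permittivity.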

  20. DOE fundamentals handbook: Nuclear physics and reactor theory

    International Nuclear Information System (INIS)

    1993-01-01

    The Nuclear Physics and Reactor Theory Handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of nuclear physics and reactor theory. The handbook includes information on atomic and nuclear physics; neutron characteristics; reactor theory and nuclear parameters; and the theory of reactor operation. This information will provide personnel with a foundation for understanding the scientific principles that are associated with various DOE nuclear facility operations and maintenance

  1. Fundamentals of time-dependent density functional theory

    International Nuclear Information System (INIS)

    Marques, Miguel A.L.; Rubio, Angel

    2012-01-01

    There have been many significant advances in time-dependent density functional theory over recent years, both in elucidating the fundamental theoretical basis of the theory and in computational algorithms and applications. This book, as successor to the highly successful volume Time-Dependent Density Functional Theory (Lect. Notes Phys. 706, 2006), brings together for the first time all recent developments in a systematic and coherent way. First, a thorough pedagogical presentation of the fundamental theory is given, clarifying aspects of the original proofs and theorems, as well as presenting fresh developments that extend the theory into new realms, such as alternative proofs of the original Runge-Gross theorem, open quantum systems, and dispersion forces, to name but a few. Next, all of the basic concepts are introduced sequentially, building in complexity, eventually reaching the level of open problems of interest. Contemporary applications of the theory are discussed, from real-time coupled electron-ion dynamics to excited-state dynamics and molecular transport. Last but not least, the authors introduce and review recent advances in computational implementation, including massively parallel architectures and graphical processing units. Special care has been taken in editing this volume as a multi-author textbook, following a coherent line of thought and making all the relevant connections between chapters and concepts consistent throughout. As such it will prove to be the text of reference in this field, both for beginners and for expert researchers and lecturers teaching advanced quantum mechanical methods to model complex physical systems, from molecules to nanostructures, from biocomplexes to surfaces, solids and liquids. (orig.)
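
    For context, the central working equations of the theory presented in the book are the time-dependent Kohn-Sham equations (here in atomic units), whose density reproduces that of the interacting system by virtue of the Runge-Gross theorem:

        \[
          i\,\frac{\partial}{\partial t}\varphi_j(\mathbf{r},t)
          \;=\;\Bigl[-\tfrac{1}{2}\nabla^{2} + v_{\mathrm{ext}}(\mathbf{r},t)
          + \int \frac{n(\mathbf{r}',t)}{|\mathbf{r}-\mathbf{r}'|}\,d^3r'
          + v_{\mathrm{xc}}[n](\mathbf{r},t)\Bigr]\varphi_j(\mathbf{r},t),
          \qquad
          n(\mathbf{r},t) \;=\; \sum_j |\varphi_j(\mathbf{r},t)|^{2}.
        \]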

  2. Infusing fundamental cause theory with features of Pierre Bourdieu's theory of symbolic power.

    Science.gov (United States)

    Veenstra, Gerry

    2018-02-01

    The theory of fundamental causes is one of the more influential attempts to provide a theoretical infrastructure for the strong associations between indicators of socioeconomic status (education, income, occupation) and health. It maintains that people of higher socioeconomic status have greater access to flexible resources such as money, knowledge, prestige, power, and beneficial social connections that they can use to reduce their risks of morbidity and mortality and minimize the consequences of disease once it occurs. However, several key aspects of the theory remain underspecified, compromising its ability to provide truly compelling explanations for socioeconomic health inequalities. In particular, socioeconomic status is an assembly of indicators that do not necessarily cohere in a straightforward way, the flexible resources that disproportionately accrue to higher status people are not clearly defined, and the distinction between socioeconomic status and resources is ambiguous. I attempt to address these definitional issues by infusing fundamental cause theory with features of a well-known theory of socioeconomic stratification in the sociological literature: Pierre Bourdieu's theory of symbolic power.

  3. Measurement theory for engineers

    CERN Document Server

    Gertsbakh, Ilya

    2003-01-01

    The emphasis of this textbook is on industrial applications of Statistical Measurement Theory. It deals with the principal issues of measurement theory, is concise and intelligibly written, and is to a large extent self-contained. Difficult theoretical issues are separated from the mainstream presentation. Each topic starts with an informal introduction followed by an example, the rigorous problem formulation, the solution method, and a detailed numerical solution. Each chapter concludes with a set of exercises of increasing difficulty, mostly with solutions. The book is meant as a text for graduate students and a reference for researchers and industrial experts specializing in measurement and measurement data analysis for quality control, quality engineering and industrial process improvement using statistical methods. Knowledge of calculus and fundamental probability and statistics is required for the understanding of its contents.

  4. Measurement Errors and Uncertainties Theory and Practice

    CERN Document Server

    Rabinovich, Semyon G

    2006-01-01

    Measurement Errors and Uncertainties addresses the most important problems that physicists and engineers encounter when estimating errors and uncertainty. Building from the fundamentals of measurement theory, the author develops the theory of accuracy of measurements and offers a wealth of practical recommendations and examples of applications. This new edition covers a wide range of subjects, including: basic concepts of metrology; characterization, standardization and calibration of measuring instruments; estimation of errors and uncertainty of single and multiple measurements; modern probability-based methods of estimating measurement uncertainty. With this new edition, the author completes the development of the new theory of indirect measurements. This theory provides more accurate and efficient methods for processing indirect measurement data. It eliminates the need to calculate the correlation coefficient - a stumbling block in measurement data processing - and offers for the first time a way to obtain...
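
    The blurb above refers to indirect measurements, i.e. quantities computed from several directly measured ones. As a minimal sketch of the classical approach (standard first-order propagation in quadrature, not the author's new theory of indirect measurements), the hypothetical Python example below estimates a power from a measured voltage and resistance; the measurement function, values, and uncertainties are made up for illustration.

        # Hypothetical example: indirect measurement of a power P = V**2 / R from
        # direct measurements of V and R, combining standard uncertainties in
        # quadrature (first-order, GUM-style propagation). Values are illustrative.
        import math

        def propagate(f, values, uncertainties, rel_step=1e-6):
            """Return y = f(*values) and its combined standard uncertainty,
            estimating sensitivity coefficients by central finite differences."""
            y = f(*values)
            variance = 0.0
            for i, (x, u) in enumerate(zip(values, uncertainties)):
                step = rel_step * max(abs(x), 1.0)
                hi = list(values); hi[i] = x + step
                lo = list(values); lo[i] = x - step
                dfdx = (f(*hi) - f(*lo)) / (2.0 * step)
                variance += (dfdx * u) ** 2
            return y, math.sqrt(variance)

        power = lambda v, r: v * v / r              # measurement equation (assumed)
        p, u_p = propagate(power, [10.0, 50.0], [0.05, 0.2])
        print(f"P = {p:.4f} W +/- {u_p:.4f} W")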

  5. Fundamental U-Theory of Time. Part 1

    Directory of Open Access Journals (Sweden)

    Yuvraj J. Gopaul

    2016-02-01

    The Fundamental U-Theory of Time (Part 1) is an original theory that aims to unravel the mystery of what exactly 'time' is. To date, very few explanations from the branches of physics or cosmology have succeeded in providing an accurate and comprehensive depiction of time. Most explanations have only managed to provide partial understanding or, at best, glimpses of its true nature. The U-Theory uses 'thought experiments' to uncover the determining characteristics of time. In Part 1 of this theory, the focus is not so much on the mathematics as on the accuracy of the depiction of time. Moreover, it challenges current views in theoretical physics, particularly on the idea of 'time travel'. Notably, it is a theory seeking to present a fresh approach for reviewing Einstein's Theory of Relativity, while unlocking new pathways for upcoming research in the field of physics and cosmology.

  6. In search for the unified theory of fundamental interactions

    International Nuclear Information System (INIS)

    Ansel'm, A.A.

    1980-01-01

    The problem of developing a unified theory of fundamental interactions is considered in a popular form. The fundamental interactions include interactions between really elementary particles (quarks and leptons), which are performed by strong, weak, electromagnetic and gravitational forces. The unified theory is based on the requirement of 'local symmetry'. The problem of the invariance of strong interaction theory under local isotopic transformations was posed for the first time by Yang and Mills, who introduced so-called compensating fields (they compensate the additional terms appearing in the theory's equations during local transformations). Quanta of these fields (gauge bosons) are massless particles with spin equal to one. The bosons should have non-zero mass in order to be the carriers of real strong and weak interactions. At present there exist two mechanisms by which this contradiction can be overcome: one is spontaneous symmetry breaking, the other is 'non-escape', or 'captivity', of the particles. The main ideas of building a realistic model of strong interactions are briefly presented

  7. Towards the Fundamental Quantum Limit of Linear Measurements of Classical Signals.

    Science.gov (United States)

    Miao, Haixing; Adhikari, Rana X; Ma, Yiqiu; Pang, Belinda; Chen, Yanbei

    2017-08-04

    The quantum Cramér-Rao bound (QCRB) sets a fundamental limit for the measurement of classical signals with detectors operating in the quantum regime. Using linear-response theory and the Heisenberg uncertainty relation, we derive a general condition for achieving such a fundamental limit. When applied to classical displacement measurements with a test mass, this condition leads to an explicit connection between the QCRB and the standard quantum limit that arises from a tradeoff between the measurement imprecision and quantum backaction; the QCRB can be viewed as an outcome of a quantum nondemolition measurement with the backaction evaded. Additionally, we show that the test mass is more a resource for improving measurement sensitivity than a victim of the quantum backaction, which suggests a new approach to enhancing the sensitivity of a broad class of sensors. We illustrate these points with laser interferometric gravitational-wave detectors.
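
    For reference, the standard quantum limit referred to in the abstract is, for continuous displacement measurement of a free test mass m at angular frequency Ω, conventionally written as the displacement noise power spectral density

        \[
          S_x^{\mathrm{SQL}}(\Omega) \;=\; \frac{2\hbar}{m\,\Omega^{2}},
        \]

    the level at which measurement imprecision and quantum backaction contribute equally; the point made in the abstract is that the quantum Cramér-Rao bound can lie below this value when the backaction is evaded.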

  8. Non-nucleon degrees of freedom in nuclei and ABC plan for developing fundamental nuclear theories

    International Nuclear Information System (INIS)

    Zhang Qiren

    1996-01-01

    We emphasize that to develop a fundamental nuclear theory one has to consider various non-nucleon degrees of freedom in nuclei and to make the theory relativistic. A three-step ABC plan for this purpose is proposed. The A plan is to reform relativistic hadron field theory by taking the finite baryon size into account. We call finite-size baryons 'atoms', in contrast with point particles. The fundamental nuclear theory in this form is therefore a quantum atom dynamics (QAD). The B plan is to reform the bag model of hadrons by turning it into a quantum bag dynamics (QBD). This is a model fundamental nuclear theory on the quark level. The fundamental nuclear theory should eventually be developed on the basis of quantum chromodynamics (QCD). This is the C plan.

  9. Fundamental Principle for Quantum Theory

    OpenAIRE

    Khrennikov, Andrei

    2002-01-01

    We propose the principle, the law of statistical balance for basic physical observables, which specifies quantum statistical theory among all other statistical theories of measurements. It seems that this principle might play in quantum theory the role that is similar to the role of Einstein's relativity principle.

  10. Fundamental Elements and Interactions of Nature: A Classical Unification Theory

    Directory of Open Access Journals (Sweden)

    Tianxi Zhang

    2010-04-01

    A classical unification theory that completely unifies all the fundamental interactions of nature is developed. First, nature is suggested to be composed of the following four fundamental elements: mass, radiation, electric charge, and color charge. All known types of matter or particles are a combination of one or more of the four fundamental elements. Photons are radiation; neutrons have only mass; protons have both mass and electric charge; and quarks contain mass, electric charge, and color charge. The fundamental interactions of nature are interactions among these fundamental elements. Mass and radiation are two forms of real energy. Electric and color charges are considered as two forms of imaginary energy. All the fundamental interactions of nature are therefore unified as a single interaction between complex energies. The interaction between real energies is the gravitational force, which has three types: mass-mass, mass-radiation, and radiation-radiation interactions. Calculating the work done by the mass-radiation interaction on a photon derives the Einsteinian gravitational redshift. Calculating the work done on a photon by the radiation-radiation interaction derives a radiation redshift, which is much smaller than the gravitational redshift. The interaction between imaginary energies is the electromagnetic (between electric charges), weak (between electric and color charges), and strong (between color charges) interactions. In addition, we have four imaginary forces between real and imaginary energies, which are the mass-electric charge, radiation-electric charge, mass-color charge, and radiation-color charge interactions. Among the four fundamental elements, there are ten (six real and four imaginary) fundamental interactions. This classical unification theory deepens our understanding of the fundamental elements and interactions of nature, develops a new concept of imaginary energy for electric and color charges, and provides a...

  11. Fundamental Elements and Interactions of Nature: A Classical Unification Theory

    Directory of Open Access Journals (Sweden)

    Zhang T. X.

    2010-04-01

    A classical unification theory that completely unifies all the fundamental interactions of nature is developed. First, nature is suggested to be composed of the following four fundamental elements: mass, radiation, electric charge, and color charge. All known types of matter or particles are a combination of one or more of the four fundamental elements. Photons are radiation; neutrons have only mass; protons have both mass and electric charge; and quarks contain mass, electric charge, and color charge. The fundamental interactions of nature are interactions among these fundamental elements. Mass and radiation are two forms of real energy. Electric and color charges are considered as two forms of imaginary energy. All the fundamental interactions of nature are therefore unified as a single interaction between complex energies. The interaction between real energies is the gravitational force, which has three types: mass-mass, mass-radiation, and radiation-radiation interactions. Calculating the work done by the mass-radiation interaction on a photon derives the Einsteinian gravitational redshift. Calculating the work done on a photon by the radiation-radiation interaction derives a radiation redshift, which is much smaller than the gravitational redshift. The interaction between imaginary energies is the electromagnetic (between electric charges), weak (between electric and color charges), and strong (between color charges) interactions. In addition, we have four imaginary forces between real and imaginary energies, which are the mass-electric charge, radiation-electric charge, mass-color charge, and radiation-color charge interactions. Among the four fundamental elements, there are ten (six real and four imaginary) fundamental interactions. This classical unification theory deepens our understanding of the fundamental elements and interactions of nature, develops a new concept of imaginary energy for electric and color charges, and provides a...

  12. Perspective: Fundamental aspects of time-dependent density functional theory

    Energy Technology Data Exchange (ETDEWEB)

    Maitra, Neepa T. [Department of Physics and Astronomy, Hunter College and the Physics Program at the Graduate Center of the City University of New York, 695 Park Avenue, New York, New York 10065 (United States)

    2016-06-14

    In the thirty-two years since the birth of the foundational theorems, time-dependent density functional theory has had a tremendous impact on calculations of electronic spectra and dynamics in chemistry, biology, solid-state physics, and materials science. Alongside the wide-ranging applications, there has been much progress in understanding fundamental aspects of the functionals and the theory itself. This Perspective looks back to some of these developments, reports on some recent progress and current challenges for functionals, and speculates on future directions to improve the accuracy of approximations used in this relatively young theory.

  13. Modern measurements fundamentals and applications

    CERN Document Server

    Petri, D; Carbone, P; Catelani, M

    2015-01-01

    This book explores the modern role of measurement science both in the technically most advanced applications and in everyday life, and will help readers gain the necessary skills to specialize their knowledge for a specific field in measurement. Modern Measurements is divided into two parts. Part I (Fundamentals) presents a model of the modern measurement activity and the already recalled fundamental bricks. It starts with a general description that introduces these bricks and the uncertainty concept. The next chapters provide an overview of these bricks and finish (Chapter 7) with a more general and complex model that encompasses both traditional (hard) measurements and (soft) measurements, aimed at quantifying non-physical concepts such as quality, satisfaction, comfort, etc. Part II (Applications) is aimed at showing how the concepts presented in Part I can be usefully applied to design and implement measurements in some very important and broad fields. The editors cover System Identification (Chapter 8...

  14. Fundamental link between system theory and statistical mechanics

    International Nuclear Information System (INIS)

    Atmanspacher, H.; Scheingraber, H.

    1987-01-01

    A fundamental link between system theory and statistical mechanics has been found to be established by the Kolmogorov entropy K. By this quantity the temporal evolution of dynamical systems can be classified into regular, chaotic, and stochastic processes. Since K represents a measure for the internal information creation rate of dynamical systems, it provides an approach to irreversibility. The formal relationship to statistical mechanics is derived by means of an operator formalism originally introduced by Prigogine. For a Liouville operator L and an information operator M̃ acting on a distribution in phase space, it is shown that i[L, M̃] = KI (I = identity operator). As a first consequence of this equivalence, a relation is obtained between the chaotic correlation time of a system and Prigogine's concept of a finite duration of presence. Finally, the existence of chaos in quantum systems is discussed with respect to the existence of a quantum mechanical time operator

  15. Measurement of attenuation coefficients of the fundamental and second harmonic waves in water

    Science.gov (United States)

    Zhang, Shuzeng; Jeong, Hyunjo; Cho, Sungjong; Li, Xiongbing

    2016-02-01

    Attenuation corrections in nonlinear acoustics play an important role in the study of nonlinear fluids, biomedical imaging, and solid material characterization. The measurement of attenuation coefficients in a nonlinear regime is not easy because they depend on the source pressure and require accurate diffraction corrections. In this work, the attenuation coefficients of the fundamental and second harmonic waves, which arise from absorption in water, are measured in nonlinear ultrasonic experiments. Based on the quasilinear theory of the KZK equation, the nonlinear sound field equations are derived and the diffraction correction terms are extracted. The measured sound pressure amplitudes are first adjusted for diffraction in order to reduce the impact of diffraction on the measurement of the attenuation coefficients. The attenuation coefficients of the fundamental and second harmonics are then obtained from a nonlinear least-squares curve fit to the experimental data. The results show that attenuation coefficients under nonlinear conditions depend on both frequency and source pressure, quite unlike the linear regime. At relatively low drive pressures, the attenuation coefficients increase linearly with frequency; at high drive pressures, however, they grow nonlinearly. As the diffraction corrections are obtained from quasilinear theory, it is important to use an appropriate source pressure for accurate attenuation measurements.
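
    As a sketch of the final fitting step described above, the hypothetical Python snippet below fits diffraction-corrected axial amplitudes of one harmonic to a simple exponential absorption law to extract an attenuation coefficient; the model, synthetic data, and variable names are illustrative assumptions, not the authors' exact KZK-based procedure.

        # Illustrative only: fit diffraction-corrected harmonic amplitudes p(z)
        # to p0 * exp(-alpha * z) to estimate an attenuation coefficient alpha.
        # The synthetic data below stand in for corrected experimental amplitudes.
        import numpy as np
        from scipy.optimize import curve_fit

        def decay(z, p0, alpha):
            return p0 * np.exp(-alpha * z)          # simple absorption model (assumed)

        z = np.linspace(0.05, 0.5, 20)              # propagation distances [m]
        rng = np.random.default_rng(0)
        p_corrected = decay(z, 1.0e5, 2.3) * (1 + 0.01 * rng.standard_normal(z.size))

        popt, pcov = curve_fit(decay, z, p_corrected, p0=[1.0e5, 1.0])
        alpha, alpha_err = popt[1], np.sqrt(pcov[1, 1])
        print(f"alpha = {alpha:.3f} +/- {alpha_err:.3f} Np/m")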

  16. Geometric theory of fundamental interactions. Foundations of unified physics

    International Nuclear Information System (INIS)

    Pestov, A.B.

    2012-01-01

    We put forward the idea that the regularities of unified physics stand in a simple relation: everything in the concept of space and the concept of space in everything. With this hypothesis as a foundation, a conceptual structure of a unified geometrical theory of fundamental interactions is created and a deductive derivation of its main equations is produced. The formulated theory gives solutions to the actual problems and provides an opportunity to understand the origin and nature of physical fields, local internal symmetry, time, energy, spin, charge, confinement, dark energy and dark matter, thus confirming the existence of new physics in its unity

  17. On the fundamental principles of the relativistic theory of gravitation

    International Nuclear Information System (INIS)

    Logunov, A.A.; Mestvirishvili, M.A.

    1990-01-01

    This paper expounds, consistently within the framework of special relativity theory, the fundamental postulates of the Relativistic Theory of Gravitation (RTG), which make it possible to obtain the unique complete system of equations for the gravitational field. Major attention is paid to the analysis of the gauge group and of the causality principle. Some results related to the evolution of the Friedmann Universe, to gravitational collapse, etc., being consequences of the RTG equations, are also presented. 7 refs

  18. Fundamentals of information theory and coding design

    CERN Document Server

    Togneri, Roberto

    2003-01-01

    In a clear, concise, and modular format, this book introduces the fundamental concepts and mathematics of information and coding theory. The authors emphasize how a code is designed and discuss the main properties and characteristics of different coding algorithms along with strategies for selecting the appropriate codes to meet specific requirements. They provide comprehensive coverage of source and channel coding, address arithmetic, BCH, and Reed-Solomon codes and explore some more advanced topics such as PPM compression and turbo codes. Worked examples and sets of basic and advanced exercises in each chapter reinforce the text's clear explanations of all concepts and methodologies.
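
    As a small, self-contained illustration of the source-coding fundamentals such a text starts from, the snippet below computes the Shannon entropy of a discrete source, which lower-bounds the average length (in bits per symbol) of any uniquely decodable binary code; the distribution is an arbitrary example.

        # Shannon entropy H(X) = -sum p*log2(p) of a discrete source distribution.
        # Any uniquely decodable binary code needs at least H(X) bits per symbol
        # on average. The probabilities below are an arbitrary example.
        import math

        def shannon_entropy(probabilities):
            return -sum(p * math.log2(p) for p in probabilities if p > 0)

        source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
        print(f"H = {shannon_entropy(source.values()):.3f} bits/symbol")  # 1.750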

  19. Scattering lengths in SU(2) gauge theory with two fundamental fermions

    DEFF Research Database (Denmark)

    Arthur, R.; Drach, V.; Hansen, Martin Rasmus Lundquist

    2014-01-01

    We investigate non-perturbatively the scattering properties of Goldstone bosons in an SU(2) gauge theory with two Wilson fermions in the fundamental representation. Such a theory can be used to build extensions of the Standard Model that unify Technicolor and pseudo-Goldstone composite Higgs models... the expected chiral symmetry breaking pattern. We then discuss how to compute them on the lattice and give preliminary results using finite size methods...

  20. Fundamental theories of waves and particles formulated without classical mass

    Science.gov (United States)

    Fry, J. L.; Musielak, Z. E.

    2010-12-01

    Quantum and classical mechanics are two conceptually and mathematically different theories of physics, and yet they use the same concept of classical mass that was originally introduced by Newton in his formulation of the laws of dynamics. In this paper, the physical consequences of using the classical mass in both theories are explored, and a novel approach that allows fundamental (Galilean-invariant) theories of waves and particles to be formulated without formally introducing the classical mass is presented. In this new formulation, the theories depend only on one common parameter called the 'wave mass', which is deduced from experiments for selected elementary particles and for the classical mass of one kilogram. It is shown that quantum theory with the wave mass is independent of the Planck constant and that higher accuracy in calculations can be attained with such a theory. Natural units in connection with the presented approach are also discussed, and justification beyond dimensional analysis is given for the particular choice of such units.

  1. Interpreting doubly special relativity as a modified theory of measurement

    International Nuclear Information System (INIS)

    Liberati, Stefano; Sonego, Sebastiano; Visser, Matt

    2005-01-01

    In this article we develop a physical interpretation for the deformed (doubly) special relativity theories (DSRs), based on a modification of the theory of measurement in special relativity. We suggest that it is useful to regard the DSRs as reflecting the manner in which quantum gravity effects induce Planck-suppressed distortions in the measurement of the 'true' energy and momentum. This interpretation provides a framework for the DSRs that is manifestly consistent, nontrivial, and in principle falsifiable. However, it does so at the cost of demoting such theories from the level of fundamental physics to the level of phenomenological models - models that should in principle be derivable from whatever theory of quantum gravity one ultimately chooses to adopt

  2. Open and closed string worldsheets from free large N gauge theories with adjoint and fundamental matter

    International Nuclear Information System (INIS)

    Yaakov, Itamar

    2006-01-01

    We extend Gopakumar's prescription for constructing closed string worldsheets from free field theory diagrams with adjoint matter to open and closed string worldsheets arising from free field theories with fundamental matter. We describe the extension of the gluing mechanism and the electrical circuit analogy to fundamental matter. We discuss the generalization of the existence and uniqueness theorem for Strebel differentials to open Riemann surfaces. Two examples of correlators containing fundamental matter are computed, and the resulting worldsheet OPEs are computed. Generic properties of Gopakumar's construction are discussed

  3. To the field theory with a fundamental mass

    International Nuclear Information System (INIS)

    Ibadov, R.M.; Kadyshevskij, V.G.

    1986-01-01

    This paper is a continuation of the investigations along the lines of constructing a consistent field theory with a fundamental mass M - a hypothetical universal scale in the ultrahigh energy region. Earlier, in the developed approach the key role was played by the de Sitter momentum space of radius M. In this paper a quantum version of this idea is worked out: p-space is assumed to be a de Sitter one as before; however, the four-momentum $p_\mu$ is treated as a quantum mechanical operator in $\partial/\partial x_\mu$ only

  4. $SU(2)$ gauge theory with two fundamental flavours: scalar and pseudoscalar spectrum

    CERN Document Server

    Arthur, Rudy; Hietanen, Ari; Pica, Claudio; Sannino, Francesco

    2016-01-01

    We investigate the scalar and pseudoscalar spectrum of the $SU(2)$ gauge theory with $N_f=2$ flavours of fermions in the fundamental representation using non perturbative lattice simulations. We provide first benchmark estimates of the mass of the lightest $0(0^{+})$ ($\sigma$), $0(0^{-})$ ($\eta'$) and $1(0^+)$ ($a_0$) states, including estimates of the relevant disconnected contributions. We find $m_{a_0}/F_{\rm{PS}}= 16.7(4.9)$, $m_\sigma/F_{\rm{PS}}=19.2(10.8)$ and $m_{\eta'}/F_{\rm{PS}} = 12.8(4.7)$. These values for the masses of light scalar states provide crucial information for composite extensions of the Standard Model, from the unified Fundamental Composite Higgs-Technicolor theory \cite{Cacciapaglia:2014uja} to models of composite dark matter.

  5. An Ultraviolet Chiral Theory of the Top for the Fundamental Composite (Goldstone) Higgs

    DEFF Research Database (Denmark)

    Cacciapaglia, Giacomo; Sannino, Francesco

    2016-01-01

    We introduce a scalar-less anomaly free chiral gauge theory that serves as natural ultraviolet completion of models of fundamental composite (Goldstone) Higgs dynamics. The new theory is able to generate the top mass and furthermore features a built-in protection mechanism that naturally suppresses the bottom mass. At low energies the theory predicts new fractionally charged fermions, and a number of four-fermion operators that, besides being relevant for the generation of the top mass, also lead to an intriguing phenomenology for the new states predicted by the theory.

  6. An ultraviolet chiral theory of the top for the fundamental composite (Goldstone) Higgs

    Energy Technology Data Exchange (ETDEWEB)

    Cacciapaglia, Giacomo, E-mail: g.cacciapaglia@ipnl.in2p3.fr [Univ Lyon, Université Lyon 1, CNRS/IN2P3, IPNL, 4 rue Enrico Fermi, F-69622 Villeurbanne Cedex (France); Sannino, Francesco, E-mail: sannino@cp3.dias.sdu.dk [CP3-Origins and the Danish IAS, University of Southern Denmark, Campusvej 55, DK-5230 Odense M (Denmark)

    2016-04-10

    We introduce a scalar-less anomaly free chiral gauge theory that serves as natural ultraviolet completion of models of fundamental composite (Goldstone) Higgs dynamics. The new theory is able to generate the top mass and furthermore features a built-in protection mechanism that naturally suppresses the bottom mass. At low energies the theory predicts new fractionally charged fermions, and a number of four-fermion operators that, besides being relevant for the generation of the top mass, also lead to an intriguing phenomenology for the new states predicted by the theory.

  7. Non-additive measures theory and applications

    CERN Document Server

    Narukawa, Yasuo; Sugeno, Michio; 9th International Conference on Modeling Decisions for Artificial Intelligence (MDAI 2012)

    2014-01-01

    This book provides a comprehensive and timely report in the area of non-additive measures and integrals. It is based on a panel session on fuzzy measures, fuzzy integrals and aggregation operators held during the 9th International Conference on Modeling Decisions for Artificial Intelligence (MDAI 2012) in Girona, Spain, November 21-23, 2012. The book complements the MDAI 2012 proceedings book, published in Lecture Notes in Computer Science (LNCS) in 2012. The individual chapters, written by key researchers in the field, cover fundamental concepts and important definitions (e.g. the Sugeno integral, definition of entropy for non-additive measures) as well as some important applications (e.g. to economics and game theory) of non-additive measures and integrals. The book addresses students, researchers and practitioners working at the forefront of their field.
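
    One of the fundamental definitions mentioned above is the Sugeno integral: for a function f : X → [0,1] and a non-additive (fuzzy) measure μ on X, it is defined as

        \[
          (S)\!\int f \, d\mu \;=\; \sup_{\alpha \in [0,1]} \min\Bigl(\alpha,\ \mu\bigl(\{x \in X : f(x) \ge \alpha\}\bigr)\Bigr),
        \]

    i.e. a max-min analogue of the Lebesgue integral in which sums and products are replaced by suprema and minima.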

  8. Two-colour QCD at finite fundamental quark-number density and related theories

    International Nuclear Information System (INIS)

    Hands, S.J.; Kogut, J.B.; Morrison, S.E.; Sinclair, D.K.

    2001-01-01

    We are simulating SU(2) Yang-Mills theory with four flavours of dynamical quarks in the fundamental representation of SU(2) 'colour' at finite chemical potential, μ for quark number, as a model for QCD at finite baryon number density. In particular we observe that for μ large enough this theory undergoes a phase transition to a state with a diquark condensate which breaks quark-number symmetry. In this phase we examine the spectrum of light scalar and pseudoscalar bosons and see evidence for the Goldstone boson associated with this spontaneous symmetry breaking. This theory is closely related to QCD at finite chemical potential for isospin, a theory which we are now studying for SU(3) colour

  9. Two-colour QCD at finite fundamental quark-number density and related theories

    International Nuclear Information System (INIS)

    Hands, S. J.; Kogut, J. B.; Morrison, S. E.; Sinclair, D. K.

    2000-01-01

    We are simulating SU(2) Yang-Mills theory with four flavours of dynamical quarks in the fundamental representation of SU(2) colour at finite chemical potential, μ for quark number, as a model for QCD at finite baryon number density. In particular we observe that for μ large enough this theory undergoes a phase transition to a state with a diquark condensate which breaks quark-number symmetry. In this phase we examine the spectrum of light scalar and pseudoscalar bosons and see evidence for the Goldstone boson associated with this spontaneous symmetry breaking. This theory is closely related to QCD at finite chemical potential for isospin, a theory which we are now studying for SU(3) colour

  10. Applied Physics of Carbon Nanotubes Fundamentals of Theory, Optics and Transport Devices

    CERN Document Server

    Rotkin, Slava V

    2005-01-01

    The book describes the state of the art in fundamental, applied and device physics of nanotubes, including fabrication, manipulation and characterization for device applications; optics of nanotubes; transport and electromechanical devices; and fundamentals of theory for applications. This information is critical to the field of nanoscience since nanotubes have the potential to become a very significant electronic material for decades to come. The book will benefit all readers interested in the application of nanotubes, either in their theoretical foundations or in newly developed characterization tools that may enable practical device fabrication.

  11. The theory of confidence-building measures

    International Nuclear Information System (INIS)

    Darilek, R.E.

    1992-01-01

    This paper discusses the theory of Confidence-Building Measures (CBMs) in two ways. First, it employs a top-down, deductively oriented approach to explain CBM theory in terms of the arms control goals and objectives to be achieved, the types of measures to be employed, and the problems or limitations likely to be encountered when applying CBMs to conventional or nuclear forces. The chapter as a whole asks how various types of CBMs might function during a political-military escalation from peacetime to a crisis and beyond (i.e. including conflict), as well as how they might operate in a de-escalatory environment. In pursuit of these overarching issues, the second section of the chapter raises a fundamental but complicating question: how might the next all-out war actually come about - by unpremeditated escalation resulting from misunderstanding or miscalculation, or by premeditation resulting in a surprise attack? The second section of the paper addresses this question, explores its various implications for CBMs, and suggests the potential contribution of different types of CBMs toward successful resolution of the issues involved

  12. Measurement and probability a probabilistic theory of measurement with applications

    CERN Document Server

    Rossi, Giovanni Battista

    2014-01-01

    Measurement plays a fundamental role both in physical and behavioral sciences, as well as in engineering and technology: it is the link between abstract models and empirical reality and is a privileged method of gathering information from the real world. Is it possible to develop a single theory of measurement for the various domains of science and technology in which measurement is involved? This book takes the challenge by addressing the following main issues: What is the meaning of measurement? How do we measure? What can be measured? A theoretical framework that could truly be shared by scientists in different fields, ranging from physics and engineering to psychology is developed. The future in fact will require greater collaboration between science and technology and between different sciences. Measurement, which played a key role in the birth of modern science, can act as an essential interdisciplinary tool and language for this new scenario. A sound theoretical basis for addressing key problems in mea...

  13. Fundamental Processes in Plasmas. Final report

    International Nuclear Information System (INIS)

    O'Neil, Thomas M.; Driscoll, C. Fred

    2009-01-01

    This research focuses on fundamental processes in plasmas, and emphasizes problems for which precise experimental tests of theory can be obtained. Experiments are performed on non-neutral plasmas, utilizing three electron traps and one ion trap with a broad range of operating regimes and diagnostics. Theory is focused on fundamental plasma and fluid processes underlying collisional transport and fluid turbulence, using both analytic techniques and medium-scale numerical simulations. The simplicity of these systems allows a depth of understanding and a precision of comparison between theory and experiment which is rarely possible for neutral plasmas in complex geometry. The recent work has focused on three areas in basic plasma physics. First, experiments and theory have probed fundamental characteristics of plasma waves: from the low-amplitude thermal regime, to inviscid damping and fluid echoes, to cold fluid waves in cryogenic ion plasmas. Second, the wide-ranging effects of dissipative separatrices have been studied experimentally and theoretically, finding novel wave damping and coupling effects and important plasma transport effects. Finally, correlated systems have been investigated experimentally and theoretically: UCSD experiments have now measured the Salpeter correlation enhancement, and theory work has characterized the 'guiding center atoms' of antihydrogen created at CERN

  14. The Fundamentals of Kant's Moral Theory

    Directory of Open Access Journals (Sweden)

    Adriana Saraiva Lamounier Rodrigues

    2015-12-01

    The article studies moral thought in the philosophy of Immanuel Kant, considered the first of the philosophers who compose the movement known as German Idealism, especially in the book "Critique of Practical Reason". To achieve this objective the article begins with the traces of moral studies at Kant's time and its fundamental questions, as well as traces of his formation that influenced his writings. Soon after, it analyzes Kantian thought itself, through the work "The Idea of Justice in Kant", by Joaquim Carlos Salgado, the theoretical framework of this research. It analyzes the postulate of freedom and its relationship with the sollen and the moral law, the species of imperatives, the categorical imperative and equality, and the connections that the moral theory makes for the existence of positive law, which the author considers the greatest pillar of the idea of justice from the Prussian philosopher's point of view. The methodology used in the research is theoretical, based on the analysis of the theoretical framework and its relationship to other publications concerning the same subject.

  15. DOE Fundamentals Handbook: Electrical Science, Volume 1

    International Nuclear Information System (INIS)

    1992-06-01

    The Electrical Science Fundamentals Handbook was developed to assist nuclear facility operating contractors provide operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of electrical theory, terminology, and application. The handbook includes information on alternating current (AC) and direct current (DC) theory, circuits, motors, and generators; AC power and reactive components; batteries; AC and DC voltage regulators; transformers; and electrical test instruments and measuring devices. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility electrical equipment

  16. DOE Fundamentals Handbook: Electrical Science, Volume 3

    International Nuclear Information System (INIS)

    1992-06-01

    The Electrical Science Fundamentals Handbook was developed to assist nuclear facility operating contractors provide operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of electrical theory, terminology, and application. The handbook includes information on alternating current (AC) and direct current (DC) theory, circuits, motors and generators; AC power and reactive components; batteries; AC and DC voltage regulators; transformers; and electrical test instruments and measuring devices. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility electrical equipment

  17. DOE Fundamentals Handbook: Electrical Science, Volume 4

    International Nuclear Information System (INIS)

    1992-06-01

    The Electrical Science Fundamentals Handbook was developed to assist nuclear facility operating contractors provide operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of electrical theory, terminology, and application. The handbook includes information on alternating current (AC) and direct current (DC) theory, circuits, motors, and generators; AC power and reactive components; batteries; AC and DC voltage regulators; transformers; and electrical test instruments and measuring devices. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility electrical equipment

  18. Fundamentals of Acoustics. Psychoacoustics and Hearing. Acoustical Measurements

    Science.gov (United States)

    Begault, Durand R.; Ahumada, Al (Technical Monitor)

    1997-01-01

    These are 3 chapters that will appear in a book titled "Building Acoustical Design", edited by Charles Salter. They are designed to introduce the reader to fundamental concepts of acoustics, particularly as they relate to the built environment. "Fundamentals of Acoustics" reviews basic concepts of sound waveform frequency, pressure, and phase. "Psychoacoustics and Hearing" discusses the human interpretation of sound pressure as loudness, particularly as a function of frequency. "Acoustic Measurements" gives a simple overview of the time and frequency weightings for sound pressure measurements that are used in acoustical work.
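
    As a small illustration of the kind of sound pressure measurement these chapters introduce, the sketch below converts an RMS pressure into a sound pressure level in decibels relative to the standard 20 micropascal reference. This is a textbook relation rather than code from the book, and the example pressures are assumed values.

        # Sound pressure level from an RMS pressure (dB re 20 uPa).
        import math

        P_REF = 20e-6  # Pa, standard reference pressure for sound in air

        def spl_db(p_rms):
            """Sound pressure level in dB for an RMS pressure given in pascals."""
            return 20.0 * math.log10(p_rms / P_REF)

        if __name__ == "__main__":
            print(round(spl_db(1.0), 1))    # 1 Pa RMS is roughly 94 dB SPL
            print(round(spl_db(20e-6), 1))  # the reference pressure itself maps to 0 dB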

  19. Is signal detection theory fundamentally flawed? A response to Balakrishnan (1998a, 1998b, 1999).

    Science.gov (United States)

    Treisman, Michel

    2002-12-01

    For nearly 50 years, signal detection theory (SDT; Green & Swets, 1966; Macmillan & Creelman, 1991) has been of central importance in the development of psychophysics and other areas of psychology. The theory has recently been challenged by Balakrishnan (1998b), who argues that, within SDT, an alternative index is "better justified" than d' and who claims to show (1998a, 1999) that SDT is fundamentally flawed and should be rejected. His evidence is based on new nonparametric measures that he has introduced and applied to experimental data. He believes his results show that basic assumptions of SDT are not supported; in particular, that payoff and probability manipulations do not affect the position of the decision criterion. In view of the importance of SDT in psychology, these claims deserve careful examination. They are critically reviewed here. It appears that it is Balakrishnan's arguments that fail, and not SDT.

  20. Positivism and Constitutional Post-positivism: A Debate on Breast Theory of Fundamental Rights

    Directory of Open Access Journals (Sweden)

    Matheus Felipe de Castro

    2016-05-01

    Full Text Available This article, based on the theoretical framework of the philosophy of praxis, aims to discuss the strained relations between power and justice in the enforcement of fundamental rights, making a comparison between the theoretical concepts of Hans Kelsen and Robert Alexy. In this sense, the thoughts of these two authors are compared, emphasizing the central role that power has in the legal conception of the first, as opposed to the theory of justice that animates the legal conceptions of the second. We discuss how the tension that appears in the theoretical confrontation of the two authors is in fact a moment of the real, and constitutes a dialectical interaction which must be observed and deciphered in the concrete application of the law. The article concludes by seeking to separate what is real from what is ideological in this debate, aiming to deepen the discussion of fundamental rights as the core of the modern structural theory of law.

  1. A fundamental study of ''contribution'' transport theory and channel theory applications

    International Nuclear Information System (INIS)

    Williams, M.L.

    1992-01-01

    The objective of this three-year study is to develop a technique called ''channel theory'' that can be used in interpreting particle transport analysis such as is frequently required in radiation shielding design and assessment. Channel theory is a technique used to provide insight into the mechanisms by which particles emitted from a source are transported through a complex system and register a response on some detector. It is based on the behavior of a pseudo particle called a ''contributon,'' which is the response carrier through space and energy channels that connect the source and detector. ''Contributons'' are those particles, among all the ones contained in the system, which will eventually contribute some amount of response to the detector. The specific goals of this project are to provide a more fundamental theoretical understanding of the method, and to develop computer programs to apply the techniques to practical problems encountered in radiation transport analysis. The overall project can be divided into three components to meet these objectives: (a) Theoretical Development, (b) Code Development, and (c) Sample Applications. During the present third year of this study, an application of contributon theory to the analysis of radiation heating in a nuclear rocket has been completed, and a paper on the assessment of radiation damage response of an LWR pressure vessel and analysis of radiation propagation through space and energy channels in air at the Hiroshima weapon burst was accepted for publication. A major effort was devoted to developing a new ''Contributon Monte Carlo'' method, which can improve the efficiency of Monte Carlo calculations of radiation transport by tracking only contributons. The theoretical basis for Contributon Monte Carlo has been completed, and the implementation and testing of the technique is presently being performed

  2. Measure and integration theory

    CERN Document Server

    Burckel, Robert B

    2001-01-01

    This book gives a straightforward introduction to the field as it is nowadays required in many branches of analysis and especially in probability theory. The first three chapters (Measure Theory, Integration Theory, Product Measures) basically follow the clear and approved exposition given in the author's earlier book on "Probability Theory and Measure Theory". Special emphasis is laid on a complete discussion of the transformation of measures and integration with respect to the product measure, convergence theorems, parameter depending integrals, as well as the Radon-Nikodym theorem. The fi

  3. Group theory for chemists fundamental theory and applications

    CERN Document Server

    Molloy, K C

    2010-01-01

    The basics of group theory and its applications to themes such as the analysis of vibrational spectra and molecular orbital theory are essential knowledge for the undergraduate student of inorganic chemistry. The second edition of Group Theory for Chemists uses diagrams and problem-solving to help students test and improve their understanding, including a new section on the application of group theory to electronic spectroscopy.Part one covers the essentials of symmetry and group theory, including symmetry, point groups and representations. Part two deals with the application of group theory t

  4. Accurate Estimation of Low Fundamental Frequencies from Real-Valued Measurements

    DEFF Research Database (Denmark)

    Christensen, Mads Græsbøll

    2013-01-01

    In this paper, the difficult problem of estimating low fundamental frequencies from real-valued measurements is addressed. The methods commonly employed do not take the phenomena encountered in this scenario into account and thus fail to deliver accurate estimates. The reason for this is that they employ asymptotic approximations that are violated when the harmonics are not well-separated in frequency, something that happens when the observed signal is real-valued and the fundamental frequency is low. To mitigate this, we analyze the problem and present some exact fundamental frequency estimators...
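
    To make the estimation problem concrete, the sketch below implements a deliberately naive harmonic-summation estimator over a grid of candidate fundamentals. It is a baseline illustration with assumed signal parameters, not one of the exact estimators proposed in the paper; it relies on the harmonics being well resolved, which is precisely the assumption that breaks down for real-valued signals with a low fundamental frequency.

        # Naive harmonic-summation fundamental frequency estimator (baseline only).
        import numpy as np

        def estimate_f0(x, fs, f0_grid, n_harmonics=5):
            """Return the candidate f0 (Hz) that maximizes the summed harmonic power."""
            n = len(x)
            freqs = np.fft.rfftfreq(n, d=1.0 / fs)
            power = np.abs(np.fft.rfft(x * np.hanning(n))) ** 2
            scores = []
            for f0 in f0_grid:
                bins = [np.argmin(np.abs(freqs - k * f0)) for k in range(1, n_harmonics + 1)]
                scores.append(power[bins].sum())
            return f0_grid[int(np.argmax(scores))]

        if __name__ == "__main__":
            fs = 8000.0
            t = np.arange(0, 0.5, 1.0 / fs)
            # Real-valued test signal: 60 Hz fundamental, two harmonics, white noise.
            x = (np.sin(2 * np.pi * 60 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
                 + 0.3 * np.sin(2 * np.pi * 180 * t) + 0.05 * np.random.randn(t.size))
            print(estimate_f0(x, fs, np.arange(40.0, 400.0, 1.0)))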

  5. New progress of fundamental aspects in quantum mechanics

    International Nuclear Information System (INIS)

    Sun Changpu

    2001-01-01

    The review recalls the conceptual origins of various interpretations of quantum mechanics. With the focus on quantum measurement problems, new developments of fundamental quantum theory are described in association with recent experiments, such as the decoherence process in cavity quantum electrodynamics, 'which-way' detection using the Bragg scattering of cold atoms, and quantum interference using the small quantum system of molecular C60. The fundamental problems include the quantum coherence of a macroscopic object, the von Neumann chain in quantum measurement, the Schroedinger cat paradox, and so on. Many landmark experiments have been accomplished, with possible important applications in quantum information. The most recent research on the new quantum theory by G. 't Hooft is reviewed, as well as future prospects of quantum mechanics

  6. Detailed examination of 'standard elementary particle theories' based on measurement with Tristan

    International Nuclear Information System (INIS)

    Kamae, Tsuneyoshi

    1989-01-01

    The report discusses possible approaches to detailed analysis of 'standard elementary particle theories' on the basis of measurements made with Tristan. The first section of the report addresses major elementary particles involved in the 'standard theories'. The nature of the gauge particles, leptons, quarks and Higgs particle is briefly outlined. The Higgs particle and top quark have not been discovered, though the Higgs particle is essential in the Weinberg-Salam theory. Another important issue in this field is the cause of the violation of CP symmetry. The second section deals with problems which arise in universalizing the concept of the 'standard theories'. Solving these problems will require the discovery of supersymmetric particles, the discovery of conflicts in the 'standard theories', and accurate determination of the fundamental constants used in the 'standard theories' by various different methods. The third and fourth sections address the Weinberg-Salam theory and quantum chromodynamics (QCD). There are four essential parameters for the 'standard theories', three of which are associated with the W-S theory. The mass of the W and Z bosons measured in proton-antiproton collision experiments is compared with that determined by applying the W-S theory to electron-positron experiments. For QCD, it is essential to determine the lambda constant. (N.K.)

  7. A fundamental special-relativistic theory valid for all real-valued speeds

    Directory of Open Access Journals (Sweden)

    Vedprakash Sewjathan

    1984-01-01

    Full Text Available This paper constitutes a fundamental rederivation of special relativity based on the c-invariance postulate but independent of the assumption ds'^2 = ±ds^2 (Einstein [1], Kittel et al [2], Recami [3]), the equivalence principle, homogeneity of space-time, isotropy of space, group properties and linearity of space-time transformations, or the coincidence of the origins of inertial space-time frames. The mathematical formalism is simpler than Einstein's [4] and Recami's [3]. Whilst Einstein's subluminal and Recami's superluminal theories are rederived in this paper by further assuming the equivalence principle and "mathematical inverses" [4,3], this paper derives (independently of these assumptions, with physico-mathematical motivation) an alternate singularity-free special-relativistic theory which replaces Einstein's factor [1/(1 - V^2/c^2)]^(1/2) and Recami's extended-relativistic factor [1/(V^2/c^2 - 1)]^(1/2) by [(1 - (V^2/c^2)^n)/(1 - V^2/c^2)]^(1/2), where n equals the value of (m(V)/m_0)^2 as |V| → c. In this theory both Newton's and Einstein's subluminal theories are experimentally valid on account of negligible terms. This theory implies that non-zero rest mass luxons will not be detected as ordinary non-zero rest mass bradyons because of spatial collapse, and non-zero rest mass tachyons are undetectable because they exist in another cosmos, resulting in a supercosmos of matter, with the possibility of infinitely many such supercosmoses, all moving forward in time. Furthermore this theory is not based on any assumption giving rise to the twin paradox controversy. The paper concludes with a discussion of the implications of this theory for general relativity.
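
    Restating the abstract's replacement factor in display form (the symbol gamma_n is my shorthand, not the paper's notation):

        \[
        \gamma_n(V) \;=\; \left[\frac{1-(V^2/c^2)^n}{1-V^2/c^2}\right]^{1/2} .
        \]

    Writing x = V^2/c^2, the limit (1 - x^n)/(1 - x) -> n as x -> 1 shows that gamma_n stays finite at |V| = c, tending to sqrt(n), which is the singularity-free property claimed above; for |V| << c the extra term (V^2/c^2)^n is negligible and Einstein's factor [1/(1 - V^2/c^2)]^(1/2) is recovered, consistent with the statement that the subluminal theories remain experimentally valid.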

  8. Fundamental characteristics of the QFP measured by the dc SQUID

    International Nuclear Information System (INIS)

    Shimizu, N.; Harada, Y.; Miyamoto, N.; Hosoya, M.; Goto, E.

    1989-01-01

    This paper describes the fundamental characteristics of the Quantum Flux Parametron (QFP) measured by a new method in which the output signals of the QFP are detected with a dc SQUID. The dc SQUID linearly and continuously converts the output current of the QFP to voltage, allowing the output signal of the QFP to be measured as the voltage of the dc SQUID. Thus, the fundamental characteristics of the QFP have been experimentally confirmed in detail

  9. Radiometric temperature measurements fundamentals

    CERN Document Server

    Zhang, Zhuomin M; Machin, Graham

    2009-01-01

    This book describes the theory of radiation thermometry, both at a primary level and for a variety of applications, such as in the materials processing industries and remote sensing. It is written for those who will apply radiation thermometry in industrial practice or use radiation thermometers for scientific research; for the radiation thermometry specialist in a national measurement institute; for developers of radiation thermometers who are working to innovate products for instrument manufacturers; and for developers of non-contact thermometry methods to address challenging thermometry problems.

  10. Two-ion theory of energy coupling in ATP synthesis rectifies a fundamental flaw in the governing equations of the chemiosmotic theory.

    Science.gov (United States)

    Nath, Sunil

    2017-11-01

    The vital coupled processes of oxidative phosphorylation and photosynthetic phosphorylation synthesize molecules of adenosine-5'-triphosphate (ATP), the universal biological energy currency, and sustain all life on our planet. The chemiosmotic theory of energy coupling in oxidative and photophosphorylation was proposed by Mitchell >50 years ago. It has had a contentious history, with part of the accumulated body of experimental evidence supporting it, and part of it in conflict with the theory. Although the theory was strongly criticized by many prominent scientists, the controversy has never been resolved. Here, the mathematical steps of Mitchell's original derivation leading to the principal equation of the chemiosmotic theory are scrutinized, and a fundamental flaw in them has been identified. Surprisingly, this flaw had not been detected earlier. Discovery of such a defect negates, or at least considerably weakens, the theoretical foundations on which the chemiosmotic theory is based. Ad hoc or simplistic ways to remedy this defect are shown to be scientifically unproductive and sterile. A novel two-ion theory of biological energy coupling salvages the situation by rectifying the fundamental flaw in the chemiosmotic theory, and the governing equations of the new theory have been shown to accurately quantify and predict extensive recent experimental data on ATP synthesis by F1FO-ATP synthase without using adjustable parameters. Some major biological implications arising from the new thinking are discussed. The principles of energy transduction and coupling proposed in the new paradigm are shown to be of a very general and universal nature. It is concluded that the timely availability after a 25-year research struggle of Nath's torsional mechanism of energy transduction and ATP synthesis is a rational alternative that has the power to solve the problems arising from the past, and also meet present and future challenges in this important interdisciplinary field

  11. Polarization of electron-positron vacuum by strong magnetic field in theory with fundamental mass

    International Nuclear Information System (INIS)

    Kadyshevskij, V.G.; Rodionov, V.N.

    2003-01-01

    The exact Lagrangian function of an intense constant magnetic field, replacing the Heisenberg-Euler Lagrangian of traditional quantum electrodynamics, is calculated within the framework of the theory with a fundamental mass in the one-loop approximation. It is established that the obtained generalization of the Lagrangian function is substantial for arbitrary values of the magnetic field. In the weak-field limit the calculated Lagrangian coincides with the known Heisenberg-Euler formula. In extremely strong fields the dependence of the Lagrangian on the field disappears and it tends to a threshold value determined by the ratio of the fundamental mass to the lepton mass

  12. Fundamental length

    International Nuclear Information System (INIS)

    Pradhan, T.

    1975-01-01

    The concept of fundamental length was first put forward by Heisenberg from purely dimensional reasons. From a study of the observed masses of the elementary particles known at that time, it is surmised that this length should be of the order of magnitude of approximately 10^-13 cm. It was Heisenberg's belief that the introduction of such a fundamental length would eliminate the divergence difficulties from relativistic quantum field theory by cutting off the high energy regions of the 'proper fields'. Since the divergence difficulties arise primarily due to an infinite number of degrees of freedom, one simple remedy would be the introduction of a principle that limits these degrees of freedom by removing the effectiveness of the waves with a frequency exceeding a certain limit without destroying the relativistic invariance of the theory. The principle can be stated as follows: It is in principle impossible to invent an experiment of any kind that will permit a distinction between the positions of two particles at rest, the distance between which is below a certain limit. A more elegant way of introducing fundamental length into quantum theory is through commutation relations between two position operators. In quantum field theory, such as quantum electrodynamics, it can be introduced through the commutation relation between two interpolating photon fields (vector potentials). (K.B.)

  13. Fundamentals of number theory

    CERN Document Server

    LeVeque, William J

    1996-01-01

    This excellent textbook introduces the basics of number theory, incorporating the language of abstract algebra. A knowledge of such algebraic concepts as group, ring, field, and domain is not assumed, however; all terms are defined and examples are given - making the book self-contained in this respect.The author begins with an introductory chapter on number theory and its early history. Subsequent chapters deal with unique factorization and the GCD, quadratic residues, number-theoretic functions and the distribution of primes, sums of squares, quadratic equations and quadratic fields, diopha

  14. Quantum theory of measurements as quantum decision theory

    International Nuclear Information System (INIS)

    Yukalov, V I; Sornette, D

    2015-01-01

    Theory of quantum measurements is often classified as decision theory. An event in decision theory corresponds to the measurement of an observable. This analogy looks clear for operationally testable simple events. However, the situation is essentially more complicated in the case of composite events. The most difficult point is the relation between decisions under uncertainty and measurements under uncertainty. We suggest a unified language for describing the processes of quantum decision making and quantum measurements. The notion of quantum measurements under uncertainty is introduced. We show that the correct mathematical foundation for the theory of measurements under uncertainty, as well as for quantum decision theory dealing with uncertain events, requires the use of positive operator-valued measure that is a generalization of projection-valued measure. The latter is appropriate for operationally testable events, while the former is necessary for characterizing operationally uncertain events. In both decision making and quantum measurements, one has to distinguish composite nonentangled events from composite entangled events. Quantum probability can be essentially different from classical probability only for entangled events. The necessary condition for the appearance of an interference term in the quantum probability is the occurrence of entangled prospects and the existence of an entangled strategic state of a decision maker or of an entangled statistical state of a measuring device
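
    A standard textbook illustration of the distinction drawn here, not an example taken from the paper: an unsharp qubit measurement can be described by the positive operator-valued measure

        \[
        E_{\pm} \;=\; \tfrac{1}{2}\left(\mathbb{1} \pm \eta\,\sigma_z\right), \qquad 0 \le \eta \le 1 ,
        \]

    whose elements are positive and satisfy E_+ + E_- = 1 for every eta, but are projectors (E_±^2 = E_±) only at eta = 1. The eta = 1 case is the projection-valued measure appropriate to an operationally testable event, while eta < 1 models the operationally uncertain case for which only the more general POVM description is adequate.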

  15. Rho meson decay width in SU(2) gauge theories with 2 fundamental flavours

    CERN Document Server

    Janowski, Tadeusz; Pica, Claudio

    2016-01-01

    SU(2) gauge theories with two quark flavours in the fundamental representation are among the most promising theories of composite dynamics describing the electroweak sector. Three out of five Goldstone bosons in these models become the longitudinal components of the W and Z bosons giving them mass. Like in QCD, we expect a spectrum of excitations which appear as resonances in vector boson scattering, in particular the vector resonance corresponding to the rho-meson in QCD. In this talk I will present the preliminary results of the first calculation of the rho-meson decay width in this theory, which is analogous to rho to two pions decay calculation in QCD. The results presented were calculated in a moving frame with total momentum (0,0,1) on two ensembles. Future plans include using 3 moving frames on a larger set of ensembles to extract the resonance parameters more reliably and also take the chiral and continuum limits.

  16. Introduction to probability and measure theories

    International Nuclear Information System (INIS)

    Partasarati, K.

    1983-01-01

    Chapters on probability and measure theories are presented. The Borel images of spaces with measure into each other and into separate metric spaces are studied. The Kolmogorov theorem on the extension of probabilities is derived from the theorem on the extension of measures to the projective limits of spaces with measure. The integration theory is developed, and measures on products of spaces are studied. The theory of conditional mathematical expectations by projections in Hilbert space is presented. In conclusion, the theory of weak convergence of measures, elements of the theory of characteristic functions, and the theory of invariant and quasi-invariant measures on groups and homogeneous spaces are given

  17. Fundamental papers in wavelet theory

    CERN Document Server

    Walnut, David F

    2006-01-01

    This book traces the prehistory and initial development of wavelet theory, a discipline that has had a profound impact on mathematics, physics, and engineering. Interchanges between these fields during the last fifteen years have led to a number of advances in applications such as image compression, turbulence, machine vision, radar, and earthquake prediction. This book contains the seminal papers that presented the ideas from which wavelet theory evolved, as well as those major papers that developed the theory into its current form. These papers originated in a variety of journals from differ

  18. Evaluating fundamentals of care: The development of a unit-level quality measurement and improvement programme.

    Science.gov (United States)

    Parr, Jenny M; Bell, Jeanette; Koziol-McLain, Jane

    2018-06-01

    The project aimed to develop a unit-level quality measurement and improvement programme using evidence-based fundamentals of care. Feedback from patients, families, whānau, staff and audit data in 2014 indicated variability in the delivery of fundamental aspects of care such as monitoring, nutrition, pain management and environmental cleanliness at a New Zealand District Health Board. A general inductive approach was used to explore the fundamentals of care and design a measurement and improvement programme, the Patient and Whānau Centred Care Standards (PWCCS), focused on fundamental care. Five phases were used to explore the evidence, and design and test a measurement and improvement framework. Nine identified fundamental elements of care were used to define expected standards of care and develop and test a measurement and improvement framework. Four six-monthly peer reviews have been undertaken since June 2015. Charge Nurse Managers used results to identify quality improvements. Significant improvement was demonstrated overall, in six of the 27 units, in seven of the nine standards and three of the four measures. In all, 89% (n = 24) of units improved their overall result. The PWCCS measurement and improvement framework makes visible nursing fundamentals of care in line with continuous quality improvement to increase quality of care. Delivering fundamentals of care is described by nurses as getting 'back to basics'. Patient and family feedback supports the centrality of fundamentals of care to their hospital experience. Implementing a unit-level fundamentals of care quality measurement and improvement programme clarifies expected standards of care, highlights the contribution of fundamentals of care to quality and provides a mechanism for ongoing improvements. © 2018 John Wiley & Sons Ltd.

  19. DOE Fundamentals Handbook: Electrical Science, Volume 2

    International Nuclear Information System (INIS)

    1992-06-01

    The Electrical Science Fundamentals Handbook was developed to assist nuclear facility operating contractors provide operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of electrical theory, terminology, and application. The handbook includes information on alternating current (AC) and direct current (DC) theory, circuits, motors, and generators; AC power and reactive components; batteries; AC and DC voltage regulators; transformers; and electrical test instruments and measuring devices. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility electrical equipment

  20. Fundamentals of ergonomic exoskeleton robots

    NARCIS (Netherlands)

    Schiele, A.

    2008-01-01

    This thesis is the first to provide the fundamentals of ergonomic exoskeleton design. The fundamental theory as well as technology necessary to analyze and develop ergonomic wearable robots interacting with humans is established and validated by experiments and prototypes. The fundamentals are (1) a

  1. Fundamental physics in particle traps

    International Nuclear Information System (INIS)

    Quint, Wolfgang; Vogel, Manuel

    2014-01-01

    The individual topics are covered by leading experts in the respective fields of research. Provides readers with present theory and experiments in this field. A useful reference for researchers. This volume provides detailed insight into the field of precision spectroscopy and fundamental physics with particles confined in traps. It comprises experiments with electrons and positrons, protons and antiprotons, antimatter and highly charged ions, together with corresponding theoretical background. Such investigations represent stringent tests of quantum electrodynamics and the Standard model, antiparticle and antimatter research, test of fundamental symmetries, constants, and their possible variations with time and space. They are key to various aspects within metrology such as mass measurements and time standards, as well as promising to further developments in quantum information processing. The reader obtains a valuable source of information suited for beginners and experts with an interest in fundamental studies using particle traps.

  2. Fundamental problems of gauge field theory

    International Nuclear Information System (INIS)

    Velo, G.; Wightman, A.S.

    1986-01-01

    As a result of the experimental and theoretical developments of the last two decades, gauge field theory, in one form or another, now provides the standard language for the description of Nature; QCD and the standard model of the electroweak interactions illustrate this point. It is a basic task of mathematical physics to provide a solid foundation for these developments by putting the theory in a physically transparent and mathematically rigorous form. The lecture notes collected in this volume concentrate on the many unsolved problems which arise here, and on the general ideas and methods which have been proposed for their solution. In particular, the use of rigorous renormalization group methods to obtain control over the continuum limit of lattice gauge field theories, the exploration of the extraordinary enigmatic connections between Kac-Moody-Virasoro algebras and string theory, and the systematic use of the theory of local algebras and indefinite metric spaces to classify the charged C* states in gauge field theories are mentioned

  3. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    Science.gov (United States)

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

    Conceptually and methodologically distinct models exist for assessing quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently-rated attachment narrative representations and peer nominations. Results indicated that Attachment theory-based and Social Learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms.

  4. Fundamental limits of radio interferometers: calibration and source parameter estimation

    OpenAIRE

    Trott, Cathryn M.; Wayth, Randall B.; Tingay, Steven J.

    2012-01-01

    We use information theory to derive fundamental limits on the capacity to calibrate next-generation radio interferometers, and measure parameters of point sources for instrument calibration, point source subtraction, and data deconvolution. We demonstrate the implications of these fundamental limits, with particular reference to estimation of the 21cm Epoch of Reionization power spectrum with next-generation low-frequency instruments (e.g., the Murchison Widefield Array -- MWA, Precision Arra...
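
    The abstract does not reproduce the bound itself; fundamental-limit analyses of this kind are typically phrased through the Fisher information and the Cramer-Rao inequality, sketched below as background in my own notation rather than as a formula quoted from the paper:

        \[
        F_{ij}(\boldsymbol{\theta}) \;=\; \mathbb{E}\!\left[\frac{\partial \ln \mathcal{L}}{\partial \theta_i}\,\frac{\partial \ln \mathcal{L}}{\partial \theta_j}\right],
        \qquad
        \operatorname{Cov}(\hat{\boldsymbol{\theta}}) \;\succeq\; F^{-1}(\boldsymbol{\theta}) ,
        \]

    so the diagonal of the inverse Fisher matrix lower-bounds the variance with which any unbiased estimator can recover calibration or source parameters from the measured data.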

  5. Mass anomalous dimension in SU(2) with six fundamental fermions

    DEFF Research Database (Denmark)

    Bursa, Francis; Del Debbio, Luigi; Keegan, Liam

    2010-01-01

    We simulate SU(2) gauge theory with six massless fundamental Dirac fermions. We measure the running of the coupling and the mass in the Schroedinger Functional scheme. We observe very slow running of the coupling constant. We measure the mass anomalous dimension gamma, and find it is between 0.13...

  6. Fundamental course of measuring. Pt. 2. 4. enlarged ed.

    International Nuclear Information System (INIS)

    Merz, L.

    1975-01-01

    The fundamental course on the electrical measurement of non-electrical parameters aims to present current knowledge of the basic measuring methods in simple language and illustrative form. The present part II deals especially with measuring methods in heat and process engineering in the industrial field. Following the introduction in part A, the techniques of electrical probes are mainly described, and it is shown which mechanical probes cannot yet be replaced by electrical ones. Part C describes the techniques of measuring transducers. (ORU)

  7. Quantum measurement

    CERN Document Server

    Busch, Paul; Pellonpää, Juha-Pekka; Ylinen, Kari

    2016-01-01

    This is a book about the Hilbert space formulation of quantum mechanics and its measurement theory. It contains a synopsis of what became of the Mathematical Foundations of Quantum Mechanics since von Neumann’s classic treatise with this title. Fundamental non-classical features of quantum mechanics—indeterminacy and incompatibility of observables, unavoidable measurement disturbance, entanglement, nonlocality—are explicated and analysed using the tools of operational quantum theory. The book is divided into four parts: 1. Mathematics provides a systematic exposition of the Hilbert space and operator theoretic tools and relevant measure and integration theory leading to the Naimark and Stinespring dilation theorems; 2. Elements develops the basic concepts of quantum mechanics and measurement theory with a focus on the notion of approximate joint measurability; 3. Realisations offers in-depth studies of the fundamental observables of quantum mechanics and some of their measurement implementations; and 4....

  8. Derivation of binding energies on the basis of fundamental nuclear theory

    International Nuclear Information System (INIS)

    Kouki, Tuomo.

    1975-10-01

    An attempt to assess the degree of consistency between the underlying ideas of two different approaches to nuclear energy relations is described. The fundamental approach, in the form of density dependent Hartree-Fock theory, as well as the method of renormalizing shell model energies, have both met with fair success. Whereas the former method is based on nuclear matter theory, the latter's central idea is to combine shell structure with an average liquid drop behaviour. The shell smoothing procedure employed there has been the subject of intense theoretical study. Only little attention has been paid to the liquid drop aspect of the method. The aim here is to derive the liquid drop mass formula by means of a model force fitted to the results of some nuclear matter calculations. Moreover, the force is tested by applying it to finite nuclei. Because of this, the present work could also be regarded as an attempt to find a very direct way of relating nuclear matter properties to those of finite nuclei. As the results in this respect are worse than expected, we conclude with a discussion of possible directions of improvement. (author)

  9. On the conception of fundamental time asymmetries in physics

    Energy Technology Data Exchange (ETDEWEB)

    Wohlfarth, Daniel

    2013-02-05

    The investigation is divided into 7 chapters and aims to argue for the realizability of a new conception of 'fundamental time asymmetries' in physics. After an introduction (chapter 1) to the field of interest, the investigation continues by developing a conception of fundamentality for time asymmetries in chapter 2. Chapter 3 shows that this conception is realized in classical cosmology, and chapter 4 demonstrates, by taking into account the result from chapter 3, that classical electrodynamics is understandable as a time asymmetric theory. Chapter 5 focuses on time asymmetries in quantum cosmology as well as quantum thermodynamics and demonstrates - as in the classical case - that a fundamental time asymmetry is embedded in those fields. The considerations contained in chapter 6 are focused on non relativistic quantum mechanics (NRQM). Here the main aim is to demonstrate that NRQM can be understood as a time asymmetric theory - even without using the measurement process for that purpose. Chapter 7 summarizes the main arguments and conclusions.

  10. On the conception of fundamental time asymmetries in physics

    International Nuclear Information System (INIS)

    Wohlfarth, Daniel

    2013-01-01

    The investigation is divided into 7 chapters and aims to argue for the realizability of a new conception of 'fundamental time asymmetries' in physics. After an introduction (chapter 1) to the field of interest, the investigation continues by developing a conception of fundamentality for time asymmetries in chapter 2. Chapter 3 shows that this conception is realized in classical cosmology, and chapter 4 demonstrates, by taking into account the result from chapter 3, that classical electrodynamics is understandable as a time asymmetric theory. Chapter 5 focuses on time asymmetries in quantum cosmology as well as quantum thermodynamics and demonstrates - as in the classical case - that a fundamental time asymmetry is embedded in those fields. The considerations contained in chapter 6 are focused on non relativistic quantum mechanics (NRQM). Here the main aim is to demonstrate that NRQM can be understood as a time asymmetric theory - even without using the measurement process for that purpose. Chapter 7 summarizes the main arguments and conclusions.

  11. Relativities of fundamentality

    Science.gov (United States)

    McKenzie, Kerry

    2017-08-01

    S-dualities have been held to have radical implications for our metaphysics of fundamentality. In particular, it has been claimed that they make the fundamentality status of a physical object theory-relative in an important new way. But what physicists have had to say on the issue has not been clear or consistent, and in particular seems to be ambiguous between whether S-dualities demand an anti-realist interpretation of fundamentality talk or merely a revised realism. This paper is an attempt to bring some clarity to the matter. After showing that even antecedently familiar fundamentality claims are true only relative to a raft of metaphysical, physical, and mathematical assumptions, I argue that the relativity of fundamentality inherent in S-duality nevertheless represents something new, and that part of the reason for this is that it has both realist and anti-realist implications for fundamentality talk. I close by discussing the broader significance that S-dualities have for structuralist metaphysics and for fundamentality metaphysics more generally.

  12. Individual differences in fundamental social motives.

    Science.gov (United States)

    Neel, Rebecca; Kenrick, Douglas T; White, Andrew Edward; Neuberg, Steven L

    2016-06-01

    Motivation has long been recognized as an important component of how people both differ from, and are similar to, each other. The current research applies the biologically grounded fundamental social motives framework, which assumes that human motivational systems are functionally shaped to manage the major costs and benefits of social life, to understand individual differences in social motives. Using the Fundamental Social Motives Inventory, we explore the relations among the different fundamental social motives of Self-Protection, Disease Avoidance, Affiliation, Status, Mate Seeking, Mate Retention, and Kin Care; the relationships of the fundamental social motives to other individual difference and personality measures including the Big Five personality traits; the extent to which fundamental social motives are linked to recent life experiences; and the extent to which life history variables (e.g., age, sex, childhood environment) predict individual differences in the fundamental social motives. Results suggest that the fundamental social motives are a powerful lens through which to examine individual differences: They are grounded in theory, have explanatory value beyond that of the Big Five personality traits, and vary meaningfully with a number of life history variables. A fundamental social motives approach provides a generative framework for considering the meaning and implications of individual differences in social motivation. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  13. Einstein Gravity Explorer–a medium-class fundamental physics mission

    NARCIS (Netherlands)

    Schiller, S.; Tino, G.M.; Gill, E.

    2008-01-01

    The Einstein Gravity Explorer mission (EGE) is devoted to a precise measurement of the properties of space-time using atomic clocks. It tests one of the most fundamental predictions of Einstein’s Theory of General Relativity, the gravitational redshift, and thereby searches for hints of quantum

  14. The inductively coupled plasma as a source for the measurement of fundamental spectroscopic constants

    International Nuclear Information System (INIS)

    Farnsworth, P.B.

    1993-01-01

    Inductively coupled plasmas (ICPs) are stable, robust sources for the generation of spectra from neutral and singly ionized atoms. They are used extensively for analytical spectrometry, but have seen limited use for the measurement of fundamental spectroscopic constants. Several properties of the ICP affect its suitability for such fundamental measurements. They include: spatial structure, spectral background, noise characteristics, electron densities and temperatures, and the state of equilibrium in the plasma. These properties are particularly sensitive to the means by which foreign atoms are introduced into the plasma. With some departures from the operating procedures normally used in analytical measurements, the ICP promises to be a useful source for the measurement of fundamental atomic constants. (orig.)

  15. Quantum decision theory as quantum theory of measurement

    International Nuclear Information System (INIS)

    Yukalov, V.I.; Sornette, D.

    2008-01-01

    We present a general theory of quantum information processing devices, that can be applied to human decision makers, to atomic multimode registers, or to molecular high-spin registers. Our quantum decision theory is a generalization of the quantum theory of measurement, endowed with an action ring, a prospect lattice and a probability operator measure. The algebra of probability operators plays the role of the algebra of local observables. Because of the composite nature of prospects and of the entangling properties of the probability operators, quantum interference terms appear, which make actions noncommutative and the prospect probabilities nonadditive. The theory provides the basis for explaining a variety of paradoxes typical of the application of classical utility theory to real human decision making. The principal advantage of our approach is that it is formulated as a self-consistent mathematical theory, which allows us to explain not just one effect but actually all known paradoxes in human decision making. Being general, the approach can serve as a tool for characterizing quantum information processing by means of atomic, molecular, and condensed-matter systems

  16. Fundamental concepts in Particle Physics course

    CERN Multimedia

    CERN. Geneva HR-RFA

    2006-01-01

    The course will provide an introduction to some of the basic theoretical techniques used to describe the fundamental particles and their interactions. Of central importance to our understanding of these forces are the underlying symmetries of nature, and I will review the nature of these symmetries and how they are used to build a predictive theory. I discuss how the combination of quantum mechanics and relativity leads to the quantum field theory (QFT) description of the states of matter and their interactions. The Feynman rules used to determine the QFT predictions for experimentally measurable processes are derived and applied to the calculation of decay widths and cross sections.
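
    As an example of the kind of quantity such Feynman-rule calculations yield (a standard kinematics result, not a formula quoted from the course notes): for a particle of mass M decaying into two daughters with centre-of-mass momentum |p|, a constant spin-averaged matrix element gives

        \[
        \Gamma \;=\; \frac{|\mathbf{p}|}{8\pi M^{2}}\,\overline{|\mathcal{M}|^{2}},
        \qquad
        \tau \;=\; \frac{\hbar}{\Gamma},
        \]

    with the particle's lifetime related to its total decay width in the usual way.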

  17. Essentials of measure theory

    CERN Document Server

    Kubrusly, Carlos S

    2015-01-01

    Classical in its approach, this textbook is thoughtfully designed and composed in two parts. Part I is meant for a one-semester beginning graduate course in measure theory, proposing an “abstract” approach to measure and integration, where the classical concrete cases of Lebesgue measure and Lebesgue integral are presented as an important particular case of general theory. Part II of the text is more advanced and is addressed to a more experienced reader. The material is designed to cover another one-semester graduate course subsequent to a first course, dealing with measure and integration in topological spaces. The final section of each chapter in Part I presents problems that are integral to each chapter, the majority of which consist of auxiliary results, extensions of the theory, examples, and counterexamples. Problems which are highly theoretical have accompanying hints. The last section of each chapter of Part II consists of Additional Propositions containing auxiliary and complementary results. Th...

  18. Fundamentals in hadronic atom theory

    CERN Document Server

    Deloff, A

    2003-01-01

    Hadronic atoms provide a unique laboratory for studying hadronic interactions essentially at threshold. This text is the first book-form exposition of hadronic atom theory with emphasis on recent developments, both theoretical and experimental. Since the underlying Hamiltonian is a non-self-adjoint operator, the theory goes beyond traditional quantum mechanics and this book covers topics that are often glossed over in standard texts on nuclear physics. The material contained here is intended for the advanced student and researcher in nuclear, atomic or elementary-particle physics. A good know

  19. Boolean Approach to Dichotomic Quantum Measurement Theories

    Energy Technology Data Exchange (ETDEWEB)

    Nagata, K. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Nakamura, T. [Keio University, Yokohama (Japan); Batle, J. [Universitat de les Illes Balears, Balearic Islands (Spain); Abdalla, S. [King Abdulaziz University Jeddah, Jeddah (Saudi Arabia); Farouk, A. [Al-Zahra College for Women, Muscat (Egypt)

    2017-02-15

    Recently, a new measurement theory based on truth values was proposed by Nagata and Nakamura [Int. J. Theor. Phys. 55, 3616 (2016)], that is, a theory where the results of measurements are either 0 or 1. The standard measurement theory accepts a hidden variable model for a single Pauli observable. Hence, we can introduce a classical probability space for the measurement theory in this particular case. Additionally, we discuss in the present contribution the fact that projective measurement theories (the results of which are either +1 or −1) imply the Bell, Kochen, and Specker (BKS) paradox for a single Pauli observable. To justify our assertion, we present the BKS theorem in almost all the two-dimensional states by using a projective measurement theory. As an example, we present the BKS theorem in two-dimensions with white noise. Our discussion provides new insight into the quantum measurement problem by using this measurement theory based on the truth values.

  20. Hydromechanics theory and fundamentals

    CERN Document Server

    Sinaiski, Emmanuil G

    2010-01-01

    Written by an experienced author with a strong background in applications of this field, this monograph provides a comprehensive and detailed account of the theory behind hydromechanics. He includes numerous appendices with mathematical tools, backed by extensive illustrations. The result is a must-have for all those needing to apply the methods in their research, be it in industry or academia.

  1. Fundamental structures of M(brane) theory

    International Nuclear Information System (INIS)

    Hoppe, Jens

    2011-01-01

    A dynamical symmetry, as well as special diffeomorphism algebras generalizing the Witt-Virasoro algebra, related to Poincare invariance and crucial with regard to quantization, questions of integrability, and M(atrix) theory, are found to exist in the theory of relativistic extended objects of any dimension.

  2. Arguing against fundamentality

    Science.gov (United States)

    McKenzie, Kerry

    This paper aims to open up discussion on the relationship between fundamentality and naturalism, and in particular on the question of whether fundamentality may be denied on naturalistic grounds. A historico-inductive argument for an anti-fundamentalist conclusion, prominent within the contemporary metaphysical literature, is examined; finding it wanting, an alternative 'internal' strategy is proposed. By means of an example from the history of modern physics - namely S-matrix theory - it is demonstrated that (1) this strategy can generate similar (though not identical) anti-fundamentalist conclusions on more defensible naturalistic grounds, and (2) that fundamentality questions can be empirical questions. Some implications and limitations of the proposed approach are discussed.

  3. Fundamentals of differential beamforming

    CERN Document Server

    Benesty, Jacob; Pan, Chao

    2016-01-01

    This book provides a systematic study of the fundamental theory and methods of beamforming with differential microphone arrays (DMAs), or differential beamforming in short. It begins with a brief overview of differential beamforming and some popularly used DMA beampatterns such as the dipole, cardioid, hypercardioid, and supercardioid, before providing essential background knowledge on orthogonal functions and orthogonal polynomials, which form the basis of differential beamforming. From a physical perspective, a DMA of a given order is defined as an array that measures the differential acoustic pressure field of that order; such an array has a beampattern in the form of a polynomial whose degree is equal to the DMA order. Therefore, the fundamental and core problem of differential beamforming boils down to the design of beampatterns with orthogonal polynomials. But certain constraints also have to be considered so that the resulting beamformer does not seriously amplify the sensors’ self noise and the mism...
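
    For orientation, the named first-order beampatterns are commonly parameterized as (a standard parameterization, not necessarily the book's notation)

        \[
        B(\theta) \;=\; \alpha + (1-\alpha)\cos\theta ,
        \]

    with alpha = 0 for the dipole, 1/2 for the cardioid, 1/4 for the hypercardioid, and (\sqrt{3}-1)/2 (about 0.37) for the supercardioid; an Nth-order DMA generalizes this to a degree-N polynomial in cos(theta), which is the polynomial beampattern referred to above.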

  4. Field theories without fundamental (gauge) symmetries

    International Nuclear Information System (INIS)

    Nielsen, H.B.

    1983-12-01

    By using the lack of dependence of the form of the kinetic energy for a non-relativistic free particle as an example, it is argued that a physical law with a less extended range of application (the non-relativistic energy-momentum relation) often follows from a more extended one (in this case the relativistic relation) without much dependence on the details of the latter. This is extended to the ideal of random dynamics: no fundamental laws need to be known. Almost any random fundamental model will give the correct main features for the range of physical conditions accessible today (energies less than 1000 GeV) even if it is wrong in detail. This suggests the programme of attempting to 'derive' the various symmetries and other features of physics known today from random models, at least without the feature to be derived. The achievements of the programme of random dynamics up till now are briefly reviewed. In particular, Lorentz invariance may be understood as a low energy phenomenon. (Auth.)

  5. Fundamentals of ultrasonic phased arrays

    CERN Document Server

    Schmerr, Lester W

    2014-01-01

    This book describes in detail the physical and mathematical foundations of ultrasonic phased array measurements.?The book uses linear systems theory to develop a comprehensive model of the signals and images that can be formed with phased arrays. Engineers working in the field of ultrasonic nondestructive evaluation (NDE) will find in this approach a wealth of information on how to design, optimize and interpret ultrasonic inspections with phased arrays. The fundamentals and models described in the book will also be of significant interest to other fields, including the medical ultrasound and
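
    As a small companion to the linear-systems view described here, the sketch below computes the per-element steering delays used in delay-and-sum beamforming for a uniform linear array. It is a generic illustration with assumed parameter values (element pitch, sound speed), not the measurement model developed in the book.

        # Steering delays for a uniform linear array (delay-and-sum beamforming).
        import numpy as np

        C = 1540.0  # m/s, assumed speed of sound in soft tissue

        def steering_delays(n_elements, pitch, theta_deg, c=C):
            """Per-element delays (s) that steer the array to the angle theta_deg."""
            theta = np.deg2rad(theta_deg)
            positions = (np.arange(n_elements) - (n_elements - 1) / 2.0) * pitch
            delays = positions * np.sin(theta) / c
            return delays - delays.min()  # shift so all delays are non-negative

        if __name__ == "__main__":
            # 8-element array, 0.3 mm pitch, steered 20 degrees off broadside.
            print(steering_delays(8, 0.3e-3, 20.0))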

  6. Fundamentals of ergonomic exoskeleton robots

    OpenAIRE

    Schiele, A.

    2008-01-01

    This thesis is the first to provide the fundamentals of ergonomic exoskeleton design. The fundamental theory as well as technology necessary to analyze and develop ergonomic wearable robots interacting with humans is established and validated by experiments and prototypes. The fundamentals are (1) a new theoretical framework for analyzing physical human robot interaction (pHRI) with exoskeletons, and (2) a clear set of design rules of how to build wearable, portable exoskeletons to easily and...

  7. Minimalist Program and its fundamental improvements in syntactic theory: evidence from Agreement Asymmetry in Standard Arabic

    Directory of Open Access Journals (Sweden)

    Nasser Al-Horais

    2012-11-01

    Full Text Available The Minimalist Program is a major line of inquiry that has been developing inside Generative Grammar since the early nineties, when it was proposed by Chomsky (1993, 1995). At that time, Chomsky (1998: 5) presented the Minimalist Program as a program, not as a theory, but today the Minimalist Program lays out a very specific view of the basis of syntactic grammar that, when compared to other formalisms, is often taken to look very much like a theory. The prime concern of this paper, however, is to provide a comprehensive and accessible introduction to the minimalist approach to the theory of grammar. In this regard, the paper discusses some new ideas articulated recently by Chomsky that have led to several fundamental improvements in syntactic theory, such as changing the function of movement and the Extended Projection Principle (EPP) feature, or proposing new theories such as Phases and Feature Inheritance. In order to evidence the significance of these fundamental improvements, the current paper provides a minimalist analysis to account for agreement and word-order asymmetry in Standard Arabic. This fresh minimalist account meets the challenges (to the basic tenets of syntactic theory occurred

  8. Geometric measure theory

    CERN Document Server

    Waerden, B

    1996-01-01

    From the reviews: "... Federer's timely and beautiful book indeed fills the need for a comprehensive treatise on geometric measure theory, and his detailed exposition leads from the foundations of the theory to the most recent discoveries. ... The author writes with a distinctive style which is both natural and powerfully economical in treating a complicated subject. This book is a major treatise in mathematics and is essential in the working library of the modern analyst." Bulletin of the London Mathematical Society.

  9. Modeling, Measurements, and Fundamental Database Development for Nonequilibrium Hypersonic Aerothermodynamics

    Science.gov (United States)

    Bose, Deepak

    2012-01-01

    The design of entry vehicles requires predictions of the aerothermal environment during the hypersonic phase of their flight trajectories. These predictions are made using computational fluid dynamics (CFD) codes that often rely on physics and chemistry models of nonequilibrium processes. The primary processes of interest are gas phase chemistry, internal energy relaxation, electronic excitation, nonequilibrium emission and absorption of radiation, and gas-surface interaction leading to surface recession and catalytic recombination. NASA's Hypersonics Project is advancing the state of the art in modeling of nonequilibrium phenomena by making detailed spectroscopic measurements in shock tubes and arcjets, using ab initio quantum mechanical techniques to develop fundamental chemistry and spectroscopic databases, making fundamental measurements of finite-rate gas-surface interactions, and implementing detailed mechanisms in state-of-the-art CFD codes. The development of new models is based on validation against relevant experiments. We will present the latest developments and a roadmap for the technical areas mentioned above.
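    As a schematic example of the kind of finite-rate model such CFD codes implement (a generic modified-Arrhenius form with a Park-style two-temperature controlling temperature; the symbols are placeholders, not coefficients from the cited work):

        k_f(T_a) = A\,T_a^{\eta}\,\exp(-\theta_d/T_a), \qquad T_a = \sqrt{T\,T_v},

    where T is the translational-rotational temperature, T_v the vibrational temperature, θ_d a characteristic dissociation temperature, and A, η fitted constants.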

  10. Field algebras in quantum theory with indefinite metric. III. Spectrum of modular operator and Tomita's fundamental theorem

    International Nuclear Information System (INIS)

    Dadashyan, K.Yu.; Khoruzhii, S.S.

    1987-01-01

    The construction of a modular theory for weakly closed J-involutive algebras of bounded operators on Pontryagin spaces is continued. The spectrum of the modular operator Δ of such an algebra is investigated, the existence of a strongly continuous J-unitary group is established and, under the condition that the spectrum lies in the right half-plane, Tomita's fundamental theorem is proved

  11. Wilson loops in superconformal Chern-Simons theory and fundamental strings in Anti-de Sitter supergravity dual

    International Nuclear Information System (INIS)

    Rey, Soo-Jong; Suyama, Takao; Yamaguchi, Satoshi

    2009-01-01

    We study Wilson loop operators in three-dimensional, N = 6 superconformal Chern-Simons theory dual to IIA superstring theory on AdS_4 x CP^3. A novelty of Wilson loop operators in this theory is that, for a given contour, there are two linear combinations of Wilson loops transforming oppositely under time-reversal transformation. We show that one combination is holographically dual to the IIA fundamental string, while the orthogonal combination is set to zero. We gather supporting evidence from a detailed comparative study of generalized time-reversal transformations in both the D2-brane worldvolume and ABJM theories. We then classify supersymmetric Wilson loops and find at most 1/6 supersymmetry. We next study the Wilson loop expectation value in planar perturbation theory. For the circular Wilson loop, we find features remarkably parallel to the circular Wilson loop in N = 4 super Yang-Mills theory in four dimensions. First, all odd loop diagrams vanish identically and even loops give nontrivial contributions. Second, quantum corrected gauge and scalar propagators take the same form as those of N = 4 super Yang-Mills theory. Combining these results, we propose that the expectation value of the circular Wilson loop is given by the Wilson loop expectation value in pure Chern-Simons theory times a zero-dimensional Gaussian matrix model whose variance is specified by an interpolating function of the 't Hooft coupling. We suggest the function interpolates smoothly between the weak and strong coupling regimes, offering a new test ground for the AdS/CFT correspondence.
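    For orientation, the four-dimensional result the authors draw the parallel with is the planar Gaussian-matrix-model expression for the circular Wilson loop in N = 4 super Yang-Mills,

        \langle W_{\rm circle} \rangle = \frac{2}{\sqrt{\lambda}}\, I_1(\sqrt{\lambda}),

    with λ the 't Hooft coupling and I_1 a modified Bessel function; the proposal above replaces the Gaussian variance λ by an interpolating function of the Chern-Simons 't Hooft coupling.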

  12. Theoretical prediction and impact of fundamental electric dipole moments

    International Nuclear Information System (INIS)

    Ellis, Sebastian A.R.; Kane, Gordon L.

    2016-01-01

    The predicted Standard Model (SM) electric dipole moments (EDMs) of electrons and quarks are tiny, providing an important window to observe new physics. Theories beyond the SM typically allow relatively large EDMs. The EDMs depend on the relative phases of terms in the effective Lagrangian of the extended theory, which are generally unknown. Underlying theories, such as string/M-theories compactified to four dimensions, could predict the phases and thus EDMs in the resulting supersymmetric (SUSY) theory. Earlier one of us, with collaborators, made such a prediction and found, unexpectedly, that the phases were predicted to be zero at tree level in the theory at the unification or string scale ∼O(10^16 GeV). Electroweak (EW) scale EDMs still arise via running from the high scale, and depend only on the SM Yukawa couplings that also give the CKM phase. Here we extend the earlier work by studying the dependence of the low scale EDMs on the constrained but not fully known fundamental Yukawa couplings. The dominant contribution is from two loop diagrams and is not sensitive to the choice of Yukawa texture. The electron EDM should not be found to be larger than about 5×10^−30 e cm, and the neutron EDM should not be larger than about 5×10^−29 e cm. These values are quite a bit smaller than the reported predictions from Split SUSY and typical effective theories, but much larger than the Standard Model prediction. Also, since models with random phases typically give much larger EDMs, it is a significant testable prediction of compactified M-theory that the EDMs should not be above these upper limits. The actual EDMs can be below the limits, so once they are measured they could provide new insight into the fundamental Yukawa couplings of leptons and quarks. We comment also on the role of strong CP violation. EDMs probe fundamental physics near the Planck scale.

  13. Theoretical prediction and impact of fundamental electric dipole moments

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, Sebastian A.R.; Kane, Gordon L. [Michigan Center for Theoretical Physics (MCTP),Department of Physics, University of Michigan,Ann Arbor, MI 48109 (United States)

    2016-01-13

    The predicted Standard Model (SM) electric dipole moments (EDMs) of electrons and quarks are tiny, providing an important window to observe new physics. Theories beyond the SM typically allow relatively large EDMs. The EDMs depend on the relative phases of terms in the effective Lagrangian of the extended theory, which are generally unknown. Underlying theories, such as string/M-theories compactified to four dimensions, could predict the phases and thus EDMs in the resulting supersymmetric (SUSY) theory. Earlier one of us, with collaborators, made such a prediction and found, unexpectedly, that the phases were predicted to be zero at tree level in the theory at the unification or string scale ∼O(10^16 GeV). Electroweak (EW) scale EDMs still arise via running from the high scale, and depend only on the SM Yukawa couplings that also give the CKM phase. Here we extend the earlier work by studying the dependence of the low scale EDMs on the constrained but not fully known fundamental Yukawa couplings. The dominant contribution is from two loop diagrams and is not sensitive to the choice of Yukawa texture. The electron EDM should not be found to be larger than about 5×10^−30 e cm, and the neutron EDM should not be larger than about 5×10^−29 e cm. These values are quite a bit smaller than the reported predictions from Split SUSY and typical effective theories, but much larger than the Standard Model prediction. Also, since models with random phases typically give much larger EDMs, it is a significant testable prediction of compactified M-theory that the EDMs should not be above these upper limits. The actual EDMs can be below the limits, so once they are measured they could provide new insight into the fundamental Yukawa couplings of leptons and quarks. We comment also on the role of strong CP violation. EDMs probe fundamental physics near the Planck scale.

  14. Fundamental problem in the relativistic approach to atomic structure theory

    International Nuclear Information System (INIS)

    Kagawa, Takashi

    1987-01-01

    It is known that the relativistic atomic structure theory contains a serious fundamental problem, the so-called Brown-Ravenhall (BR) problem or variational collapse. This problem arises from the fact that the energy spectrum of the relativistic Hamiltonian for many-electron systems is not bounded from below because the negative-energy solutions as well as the positive-energy ones are obtained from the relativistic equation. This report outlines two methods to avoid the BR problem in the relativistic calculation, that is, the projection operator method and the general variation method. The former method is described first. The use of a modified Hamiltonian containing a projection operator which projects the positive-energy solutions in the relativistic wave equation has been proposed to remove the BR difficulty. The problem in the use of the projection operator method is that the projection operator for the system cannot be determined uniquely. The final part of this report outlines the general variation method. This method can be applied to any system, such as relativistic ones whose Hamiltonian is not bounded from below. (Nogami, K.)

  15. STEP and fundamental physics

    Science.gov (United States)

    Overduin, James; Everitt, Francis; Worden, Paul; Mester, John

    2012-09-01

    The Satellite Test of the Equivalence Principle (STEP) will advance experimental limits on violations of Einstein's equivalence principle from their present sensitivity of two parts in 10^13 to one part in 10^18 through multiple comparison of the motions of four pairs of test masses of different compositions in a drag-free earth-orbiting satellite. We describe the experiment, its current status and its potential implications for fundamental physics. Equivalence is at the heart of general relativity, our governing theory of gravity, and violations are expected in most attempts to unify this theory with the other fundamental interactions of physics, as well as in many theoretical explanations for the phenomenon of dark energy in cosmology. Detection of such a violation would be equivalent to the discovery of a new force of nature. A null result would be almost as profound, pushing upper limits on any coupling between standard-model fields and the new light degrees of freedom generically predicted by these theories down to unnaturally small levels.
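    The sensitivities quoted above are conventionally expressed through the Eötvös parameter comparing the free-fall accelerations a_1 and a_2 of two test masses of different composition,

        \eta = 2\,\frac{|a_1 - a_2|}{|a_1 + a_2|},

    so STEP aims to push the upper limit on η from roughly 10^-13 to 10^-18.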

  16. Accurate Q value measurements for fundamental physics studies at JYFLTRAP

    Energy Technology Data Exchange (ETDEWEB)

    Eronen, T., E-mail: tommi.o.eronen@jyu.fi; Kolhinen, V. S. [University of Jyvaeskylae (Finland); Collaboration: JYFLTRAP collaboration

    2011-07-15

    We have measured several Q values at JYFLTRAP for superallowed β decays that contribute to testing the Standard Model, and for candidate nuclei that one could use in the search for neutrinoless double-β decay. These results play important roles in research on fundamental physics with scope beyond the Standard Model.

  17. Lorenz, Gödel and Penrose: new perspectives on determinism and causality in fundamental physics

    Science.gov (United States)

    Palmer, T. N.

    2014-07-01

    Although meteorologist Ed Lorenz is best known for his pioneering work on chaotic unpredictability, the key discovery at the core of his work is the link between space-time calculus and state-space fractal geometry. Indeed, properties of Lorenz's fractal invariant set relate space-time calculus to deep areas of mathematics such as Gödel's Incompleteness Theorem. Could such properties also provide new perspectives on deep unsolved issues in fundamental physics? Recent developments in cosmology motivate what is referred to as the 'cosmological invariant set postulate': that the universe can be considered a deterministic dynamical system evolving on a causal measure-zero fractal invariant set in its state space. Symbolic representations of this invariant set are constructed explicitly based on permutation representations of quaternions. The resulting 'invariant set theory' provides some new perspectives on determinism and causality in fundamental physics. For example, while the cosmological invariant set appears to have a rich enough structure to allow a description of (quantum) probability, its measure-zero character ensures it is sparse enough to prevent invariant set theory being constrained by the Bell inequality (consistent with a partial violation of the so-called measurement independence postulate). The primacy of geometry as embodied in the proposed theory extends the principles underpinning general relativity. As a result, the physical basis for contemporary programmes which apply standard field quantisation to some putative gravitational Lagrangian is questioned. Consistent with Penrose's suggestion of a deterministic but non-computable theory of fundamental physics, an alternative 'gravitational theory of the quantum' is proposed based on the geometry of this invariant set, with new perspectives on the problem of black-hole information loss and potential observational consequences for the dark universe.

  18. The First Fundamental Theorem of Invariant Theory for the Orthosymplectic Supergroup

    Science.gov (United States)

    Lehrer, G. I.; Zhang, R. B.

    2017-01-01

    We give an elementary and explicit proof of the first fundamental theorem of invariant theory for the orthosymplectic supergroup by generalising the geometric method of Atiyah, Bott and Patodi to the supergroup context. We use methods from super-algebraic geometry to convert invariants of the orthosymplectic supergroup into invariants of the corresponding general linear supergroup on a different space. In this way, super Schur-Weyl-Brauer duality is established between the orthosymplectic supergroup of superdimension (m|2n) and the Brauer algebra with parameter m - 2n. The result may be interpreted either in terms of the group scheme OSp(V) over C, where V is a finite dimensional super space, or as a statement about the orthosymplectic Lie supergroup over the infinite dimensional Grassmann algebra Λ. We take the latter point of view here, and also state a corresponding theorem for the orthosymplectic Lie superalgebra, which involves an extra invariant generator, the super-Pfaffian.

  19. Atom Interferometry for Fundamental Physics and Gravity Measurements in Space

    Science.gov (United States)

    Kohel, James M.

    2012-01-01

    Laser-cooled atoms are used as freefall test masses. The gravitational acceleration on atoms is measured by atom-wave interferometry. The fundamental concept behind atom interferometry is the quantum mechanical particle-wave duality. One can exploit the wave-like nature of atoms to construct an atom interferometer based on matter waves analogous to laser interferometers.
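    A rough sketch of the scale involved, using the standard Mach-Zehnder atom-gravimeter phase formula Δφ = k_eff g T² with illustrative numbers (the wavelength and pulse separation below are assumptions, not parameters of this work):

        # Phase accumulated by a light-pulse atom interferometer in a uniform gravity field.
        import math

        wavelength = 780e-9                        # assumed Rb D2 laser wavelength (m)
        k_eff = 2 * (2 * math.pi / wavelength)     # two-photon effective wavevector (rad/m)
        g = 9.81                                   # local gravitational acceleration (m/s^2)
        T = 0.1                                    # assumed pulse separation time (s)

        delta_phi = k_eff * g * T**2
        print(f"phase shift ~ {delta_phi:.2e} rad")   # ~1.6e6 rad for these numbers

    The large phase accumulated even over 100 ms of free fall is what makes such interferometers attractive as gravimeters.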

  20. Inertial rotation measurement with atomic spins: From angular momentum conservation to quantum phase theory

    Science.gov (United States)

    Zhang, C.; Yuan, H.; Tang, Z.; Quan, W.; Fang, J. C.

    2016-12-01

    Rotation measurement in an inertial frame is an important technology for modern advanced navigation systems and fundamental physics research. Inertial rotation measurement with atomic spin has demonstrated potential in both high-precision applications and small-volume low-cost devices. After rapid development in the last few decades, atomic spin gyroscopes are considered a promising competitor to current conventional gyroscopes—from rate-grade to strategic-grade applications. Although it has been more than a century since the discovery of the relationship between atomic spin and mechanical rotation by Einstein [Naturwissenschaften, 3(19) (1915)], research on the coupling between spin and rotation is still a focus point. The semi-classical Larmor precession model is usually adopted to describe atomic spin gyroscope measurement principles. More recently, the geometric phase theory has provided a different view of the rotation measurement mechanism via atomic spin. The theory has been used to describe a gyroscope based on the nuclear spin ensembles in diamond. A comprehensive understanding of inertial rotation measurement principles based on atomic spin would be helpful for future applications. This work reviews different atomic spin gyroscopes and their rotation measurement principles with a historical overlook. In addition, the spin-rotation coupling mechanism in the context of the quantum phase theory is presented. The geometric phase is assumed to be the origin of the measurable rotation signal from atomic spins. In conclusion, with a complete understanding of inertial rotation measurements using atomic spin and advances in techniques, wide application of high-performance atomic spin gyroscopes is expected in the near future.
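    The semi-classical Larmor picture mentioned above can be summarised in one line (a generic co-magnetometer relation, up to sign conventions, not specific to any device in the review): on a platform rotating at rate Ω about the field axis, the observed spin precession frequency is shifted,

        \omega_{\rm obs} = \gamma B - \Omega,

    so once the magnetic term γB is calibrated, or cancelled between two spin species with different gyromagnetic ratios, the residual shift measures the inertial rotation Ω.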

  1. Aligning the Measurement of Microbial Diversity with Macroecological Theory

    Directory of Open Access Journals (Sweden)

    James C. Stegen

    2016-09-01

    Full Text Available The number of microbial operational taxonomic units (OTUs) within a community is akin to species richness within plant/animal (‘macrobial’) systems. A large literature documents OTU richness patterns, drawing comparisons to macrobial theory. There is, however, an unrecognized fundamental disconnect between OTU richness and macrobial theory: OTU richness is commonly estimated on a per-individual basis, while macrobial richness is estimated per-area. Furthermore, the range or extent of sampled environmental conditions can strongly influence a study’s outcomes and conclusions, but this is not commonly addressed when studying OTU richness. Here we (i) propose a new sampling approach that estimates OTU richness per-mass of soil, which results in strong support for species energy theory, (ii) use data reduction to show how support for niche conservatism emerges when sampling across a restricted range of environmental conditions, and (iii) show how additional insights into drivers of OTU richness can be generated by combining different sampling methods while simultaneously considering patterns that emerge by restricting the range of environmental conditions. We propose that a more rigorous connection between microbial ecology and macrobial theory can be facilitated by exploring how changes in OTU richness units and environmental extent influence outcomes of data analysis. While fundamental differences between microbial and macrobial systems persist (e.g., species concepts), we suggest that closer attention to units and scale provides tangible and immediate improvements to our understanding of the processes governing OTU richness and how those processes relate to drivers of macrobial species richness.
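    A minimal sketch of the distinction the authors draw between per-individual and per-mass richness estimates (toy OTU tables; all names and numbers are illustrative only):

        # Toy contrast between per-individual and per-mass OTU richness estimates.
        import random

        def richness_per_individuals(counts, n_reads, seed=0):
            """Rarefy one sample to a fixed number of reads (per-individual style)."""
            random.seed(seed)
            pool = [otu for otu, c in counts.items() for _ in range(c)]
            return len(set(random.sample(pool, min(n_reads, len(pool)))))

        def richness_per_mass(samples, masses_g, target_mass_g):
            """Pool whole samples until a fixed soil mass is reached (per-mass style)."""
            otus, mass = set(), 0.0
            for counts, m in zip(samples, masses_g):
                if mass >= target_mass_g:
                    break
                otus |= {otu for otu, c in counts.items() if c > 0}
                mass += m
            return len(otus)

        sample_a = {"otu1": 120, "otu2": 30, "otu3": 5}     # reads per OTU in 0.25 g of soil
        sample_b = {"otu2": 60, "otu4": 90}                 # reads per OTU in 0.25 g of soil
        print(richness_per_individuals(sample_a, 100))      # richness at a fixed read depth
        print(richness_per_mass([sample_a, sample_b], [0.25, 0.25], 0.5))  # richness per 0.5 g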

  2. Electromagnetic and quantum measurements a bitemporal neoclassical theory

    CERN Document Server

    Wessel-Berg, Tore

    2001-01-01

    It is a pleasure to write a foreword for Professor Tore Wessel-Berg's book, "Electromagnetic and Quantum Measurements: A Bitemporal Neoclassical Theory." This book appeals to me for several reasons. The most important is that, in this book, Wessel-Berg breaks from the pack. The distinguished astrophysicist Thomas Gold has written about the pressures on scientists to move in tight formation, to avoid having their legs nipped by the sheepdogs of science. This book demonstrates that Wessel-Berg is willing to take that risk. I confess that I do not sufficiently understand this book to be able to either agree or disagree with its thesis. Nevertheless, Wessel-Berg makes very cogent arguments for setting out on his journey. The basic equations of physics are indeed time-reversible. Our experience, that leads us to the concept of an "arrow of time," is derived from macro­ scopic phenomena, not from fundamental microscopic phenomena. For this reason, it makes very good sense to explore the consequences of treating mi...

  3. Fundamentals of functions and measure theory

    CERN Document Server

    Mikhalev, Alexander V; Zakharov, Valeriy K

    2018-01-01

    The series is devoted to the publication of monographs and high-level textbooks in mathematics, mathematical methods and their applications. Apart from covering important areas of current interest, a major aim is to make topics of an interdisciplinary nature accessible to the non-specialist. The works in this series are addressed to advanced students and researchers in mathematics and theoretical physics. In addition, it can serve as a guide for lectures and seminars on a graduate level. The series de Gruyter Studies in Mathematics was founded ca. 30 years ago by the late Professor Heinz Bauer and Professor Peter Gabriel with the aim to establish a series of monographs and textbooks of high standard, written by scholars with an international reputation presenting current fields of research in pure and applied mathematics. While the editorial board of the Studies has changed with the years, the aspirations of the Studies are unchanged. In times of rapid growth of mathematical knowledge carefully written monogr...

  4. Ontic structural realism and quantum field theory: Are there intrinsic properties at the most fundamental level of reality?

    Science.gov (United States)

    Berghofer, Philipp

    2018-05-01

    Ontic structural realism refers to the novel, exciting, and widely discussed basic idea that the structure of physical reality is genuinely relational. In its radical form, the doctrine claims that there are, in fact, no objects but only structure, i.e., relations. More moderate approaches state that objects have only relational but no intrinsic properties. In its most moderate and most tenable form, ontic structural realism assumes that at the most fundamental level of physical reality there are only relational properties. This means that the most fundamental objects only possess relational but no non-reducible intrinsic properties. The present paper will argue that our currently best physics refutes even this most moderate form of ontic structural realism. More precisely, I will claim that 1) according to quantum field theory, the most fundamental objects of matter are quantum fields and not particles, and show that 2) according to the Standard Model, quantum fields have intrinsic non-relational properties.

  5. Fundamentals of photonics

    CERN Document Server

    Saleh, Bahaa E A

    2007-01-01

    Now in a new full-color edition, Fundamentals of Photonics, Second Edition is a self-contained and up-to-date introductory-level textbook that thoroughly surveys this rapidly expanding area of engineering and applied physics. Featuring a logical blend of theory and applications, coverage includes detailed accounts of the primary theories of light, including ray optics, wave optics, electromagnetic optics, and photon optics, as well as the interaction of photons and atoms, and semiconductor optics. Presented at increasing levels of complexity, preliminary sections build toward more advan

  6. STEP and fundamental physics

    International Nuclear Information System (INIS)

    Overduin, James; Everitt, Francis; Worden, Paul; Mester, John

    2012-01-01

    The Satellite Test of the Equivalence Principle (STEP) will advance experimental limits on violations of Einstein's equivalence principle from their present sensitivity of two parts in 10^13 to one part in 10^18 through multiple comparison of the motions of four pairs of test masses of different compositions in a drag-free earth-orbiting satellite. We describe the experiment, its current status and its potential implications for fundamental physics. Equivalence is at the heart of general relativity, our governing theory of gravity, and violations are expected in most attempts to unify this theory with the other fundamental interactions of physics, as well as in many theoretical explanations for the phenomenon of dark energy in cosmology. Detection of such a violation would be equivalent to the discovery of a new force of nature. A null result would be almost as profound, pushing upper limits on any coupling between standard-model fields and the new light degrees of freedom generically predicted by these theories down to unnaturally small levels. (paper)

  7. Quivers, words and fundamentals

    International Nuclear Information System (INIS)

    Mattioli, Paolo; Ramgoolam, Sanjaye

    2015-01-01

    A systematic study of holomorphic gauge invariant operators in general N=1 quiver gauge theories, with unitary gauge groups and bifundamental matter fields, was recently presented in http://dx.doi.org/10.1007/JHEP04(2013)094. For large ranks a simple counting formula in terms of an infinite product was given. We extend this study to quiver gauge theories with fundamental matter fields, deriving an infinite product form for the refined counting in these cases. The infinite products are found to be obtained from substitutions in a simple building block expressed in terms of the weighted adjacency matrix of the quiver. In the case without fundamentals, it is a determinant which itself is found to have a counting interpretation in terms of words formed from partially commuting letters associated with simple closed loops in the quiver. This is a new relation between counting problems in gauge theory and the Cartier-Foata monoid. For finite ranks of the unitary gauge groups, the refined counting is given in terms of expressions involving Littlewood-Richardson coefficients.

  8. Another argument against fundamental scalars

    International Nuclear Information System (INIS)

    Joglekar, S.D.

    1990-01-01

    An argument, perhaps not as strong, is presented, based on the inclusion of the interaction with external gravity in a theory describing the strong, electromagnetic and weak interactions. The argument is related to the basis of the common belief that favours a renormalizable action over a non-renormalizable one as a candidate for a fundamental theory. (author). 12 refs

  9. Fundamentals of structural dynamics

    CERN Document Server

    Craig, Roy R

    2006-01-01

    From theory and fundamentals to the latest advances in computational and experimental modal analysis, this is the definitive, updated reference on structural dynamics.This edition updates Professor Craig's classic introduction to structural dynamics, which has been an invaluable resource for practicing engineers and a textbook for undergraduate and graduate courses in vibrations and/or structural dynamics. Along with comprehensive coverage of structural dynamics fundamentals, finite-element-based computational methods, and dynamic testing methods, this Second Edition includes new and e

  10. Fundamentals and advanced techniques in derivatives hedging

    CERN Document Server

    Bouchard, Bruno

    2016-01-01

    This book covers the theory of derivatives pricing and hedging as well as techniques used in mathematical finance. The authors use a top-down approach, starting with fundamentals before moving to applications, and present theoretical developments alongside various exercises, providing many examples of practical interest. A large spectrum of concepts and mathematical tools that are usually found in separate monographs are presented here. In addition to the no-arbitrage theory in full generality, this book also explores models and practical hedging and pricing issues. Fundamentals and Advanced Techniques in Derivatives Hedging further introduces advanced methods in probability and analysis, including Malliavin calculus and the theory of viscosity solutions, as well as the recent theory of stochastic targets and its use in risk management, making it the first textbook covering this topic. Graduate students in applied mathematics with an understanding of probability theory and stochastic calculus will find this b...
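    For readers new to the area, the no-arbitrage pricing rule at the heart of such texts can be stated in one line (a standard result quoted for orientation, not an excerpt from the book): under a risk-neutral measure Q, the time-t price of a claim paying Φ(S_T) at maturity T is

        V_t = \mathbb{E}^{\mathbb{Q}}\!\left[ e^{-r(T-t)}\,\Phi(S_T) \,\middle|\, \mathcal{F}_t \right],

    and hedging amounts to replicating this value with a self-financing portfolio.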

  11. The issue of phases in quantum measurement theory

    International Nuclear Information System (INIS)

    Pati, Arun Kumar

    1999-01-01

    The issue of phases is always very subtle in the quantum world and many of the curious phenomena are due to the existence of the phase of the quantum mechanical wave function. We investigate the issue of phases in quantum measurement theory and predict a new effect of fundamental importance. We say that a quantum system undergoes quantum Zeno dynamics when its unitary evolution is interrupted by a sequence of measurements. In particular, we investigate the effect of repeated measurements on the geometric phase and show that the quantum Zeno dynamics can inhibit its development under a large number of measurement pulses. It is interesting to see that neither the total phase nor the dynamical phase goes to zero under a large number of measurements. We call this new effect the 'quantum Zeno phase effect', in analogy to the quantum Zeno effect where repeated measurements inhibit the transition probability. This 'quantum Zeno phase effect' can be proved within von Neumann's collapse mechanism as well as using a continuous measurement model, so the effect is independent of the particular measurement model considered. Since the geometric phase attributes a memory to a quantum system, our results also prove that the path-dependent memory of a system can be erased by a sequence of measurements. The quantum Zeno phase effect provides a way to control and manipulate the phase of a wave function in an interference setup. Finally, we stress that the quantum Zeno phase effect can be tested using neutron, photon and atom interference experiments with the presently available technology. (Author)
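    For comparison, the ordinary quantum Zeno effect that the authors contrast with their phase effect is captured by the standard survival-probability estimate (a generic two-level illustration, not taken from the paper): for N equally spaced projective measurements during a total time T, a system Rabi-oscillating at frequency Ω survives with probability

        P_N(T) \approx \left[\cos^2\!\left(\frac{\Omega T}{2N}\right)\right]^{N} \xrightarrow[N \to \infty]{} 1,

    so frequent measurement freezes the transition probability, whereas the result above concerns the geometric phase rather than the populations.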

  12. Immersed in media telepresence theory, measurement & technology

    CERN Document Server

    Lombard, Matthew; Freeman, Jonathan; IJsselsteijn, Wijnand; Schaevitz, Rachel J

    2015-01-01

    Highlights key research currently being undertaken within the field of telepresence, providing the most detailed account of the field to date, advancing our understanding of a fundamental property of all media - the illusion of presence; the sense of "being there" inside a virtual environment, with actual or virtual others. This collection has been put together by leading international scholars from America, Europe, and Asia. Together, they describe the state-of-the-art in presence theory, research and technology design for an advanced academic audience. Immersed in Media provides research t

  13. Fundamentals of plasma physics

    CERN Document Server

    Bittencourt, J A

    1986-01-01

    A general introduction designed to present a comprehensive, logical and unified treatment of the fundamentals of plasma physics based on statistical kinetic theory. Its clarity and completeness make it suitable for self-learning and self-paced courses. Problems are included.

  14. Solar-System Bodies as Teaching Tools in Fundamental Physics

    Science.gov (United States)

    Genus, Amelia; Overduin, James

    2018-01-01

    We show how asteroids can be used as teaching tools in fundamental physics. Current gravitational theory assumes that all bodies fall with the same acceleration in the same gravitational field. But this assumption, known as the Equivalence Principle, is violated to some degree in nearly all theories that attempt to unify gravitation with the other fundamental forces of nature. In such theories, bodies with different compositions can fall at different rates, producing small non-Keplerian distortions in their orbits. We focus on the unique all-metal asteroid 16 Psyche as a test case. Using Kepler’s laws of planetary motion together with recent observational data on the orbital motions of Psyche and its neighbors, students are able to derive new constraints on current theories in fundamental physics. These constraints take on particular interest since NASA has just announced plans to visit Psyche in 2026.
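    As a worked example of the kind of classroom calculation described (using an approximate semi-major axis for 16 Psyche of about 2.92 AU, quoted here from memory and best checked against current ephemerides): for a small body orbiting the Sun, Kepler's third law with a in AU and T in years gives

        T^2 = a^3 \;\Rightarrow\; T \approx (2.92)^{3/2} \approx 5.0\ \text{yr},

    and composition-dependent departures from such Keplerian predictions are precisely what an equivalence-principle violation would introduce.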

  15. Fundamental situations in teaching biology: The case of parthenogenesis

    Directory of Open Access Journals (Sweden)

    Robert Evans

    2012-06-01

    Full Text Available This theoretical paper considers the notion of fundamental situation in the sense of Brousseau’s theory of didactical situations. It introduces some precise elements of this theory in which a teacher provides an environment for student work that aims to enable students, through constructive inquiry, to acquire well defined pieces of scientific knowledge. Situations become fundamental if they not only allow, but force students to construct the target knowledge. A classical example from mathematics is presented, where the target knowledge is a theorem of plane geometry presented as a puzzle. Then a new fundamental situation in biology is described for parthenogenetic reproduction, which has recently turned out to occur in Komodo dragons. An explicit demand to generate and test hypotheses that could explain the given example of dragon reproduction, using authentic DNA data, is given to students. The paper concludes with an analysis of the extent to which this fundamental situation in biology is authentic to the theory of didactical situations.

  16. Introduction to Measure Theory and Integration

    CERN Document Server

    Ambrosio, Luigi; Mennucci, Andrea

    2011-01-01

    This textbook collects the notes for an introductory course in measure theory and integration. The course was taught by the authors to undergraduate students of the Scuola Normale Superiore, in the years 2000-2011. The goal of the course was to present, in a quick but rigorous way, the modern point of view on measure theory and integration, putting Lebesgue's Euclidean space theory into a more general context and presenting the basic applications to Fourier series, calculus and real analysis. The text can also pave the way to more advanced courses in probability, stochastic processes or geomet

  17. Fundamental structures of algebra and discrete mathematics

    CERN Document Server

    Foldes, Stephan

    2011-01-01

    Introduces and clarifies the basic theories of 12 structural concepts, offering a fundamental theory of groups, rings and other algebraic structures. Identifies essentials and describes interrelationships between particular theories. Selected classical theorems and results relevant to current research are proved rigorously within the theory of each structure. Throughout the text the reader is frequently prompted to perform integrated exercises of verification and to explore examples.

  18. Fundamental superstrings as holograms

    International Nuclear Information System (INIS)

    Dabholkar, A.; Murthy, S.

    2007-06-01

    The worldsheet of a macroscopic fundamental superstring in the Green-Schwarz light-cone gauge is viewed as a possible boundary hologram of the near horizon region of a small black string. For toroidally compactified strings, the hologram has global symmetries of AdS_3 x S^(d-1) x T^(8-d) (d = 3, ..., 8), only some of which extend to local conformal symmetries. We construct the bulk string theory in detail for the particular case of d = 3. The symmetries of the hologram are correctly reproduced from this exact worldsheet description in the bulk. Moreover, the central charge of the boundary Virasoro algebra obtained from the bulk agrees with the Wald entropy of the associated small black holes. This construction provides an exact CFT description of the near horizon region of small black holes both in Type-II and heterotic string theory arising from multiply wound fundamental superstrings. (author)

  19. Fundamental superstrings as holograms

    International Nuclear Information System (INIS)

    Dabholkar, Atish; Murthy, Sameer

    2008-01-01

    The worldsheet of a macroscopic fundamental superstring in the Green-Schwarz light-cone gauge is viewed as a possible boundary hologram of the near horizon region of a small black string. For toroidally compactified strings, the hologram has global symmetries of AdS_3 x S^(d-1) x T^(8-d) (d = 3, ..., 8), only some of which extend to local conformal symmetries. We construct the bulk string theory in detail for the particular case of d = 3. The symmetries of the hologram are correctly reproduced from this exact worldsheet description in the bulk. Moreover, the central charge of the boundary Virasoro algebra obtained from the bulk agrees with the Wald entropy of the associated small black holes. This construction provides an exact CFT description of the near horizon region of small black holes both in Type-II and heterotic string theory arising from multiply wound fundamental superstrings

  20. Fundamentals of electronic image processing

    CERN Document Server

    Weeks, Arthur R

    1996-01-01

    This book is directed to practicing engineers and scientists who need to understand the fundamentals of image processing theory and algorithms to perform their technical tasks. It is intended to fill the gap between existing high-level texts dedicated to specialists in the field and the need for a more practical, fundamental text on image processing. A variety of example images are used to enhance reader understanding of how particular image processing algorithms work.

  1. Qualitative insights on fundamental mechanics

    OpenAIRE

    Mardari, G. N.

    2002-01-01

    The gap between classical mechanics and quantum mechanics has an important interpretive implication: the Universe must have an irreducible fundamental level, which determines the properties of matter at higher levels of organization. We show that the main parameters of any fundamental model must be theory-independent. They cannot be predicted, because they cannot have internal causes. However, it is possible to describe them in the language of classical mechanics. We invoke philosophical reas...

  2. Five fundamental constraints on theories of the origins of music.

    Science.gov (United States)

    Merker, Bjorn; Morley, Iain; Zuidema, Willem

    2015-03-19

    The diverse forms and functions of human music place obstacles in the way of an evolutionary reconstruction of its origins. In the absence of any obvious homologues of human music among our closest primate relatives, theorizing about its origins, in order to make progress, needs constraints from the nature of music, the capacities it engages, and the contexts in which it occurs. Here we propose and examine five fundamental constraints that bear on theories of how music and some of its features may have originated. First, cultural transmission, bringing the formal powers of cultural as contrasted with Darwinian evolution to bear on its contents. Second, generativity, i.e. the fact that music generates infinite pattern diversity by finite means. Third, vocal production learning, without which there can be no human singing. Fourth, entrainment with perfect synchrony, without which there is neither rhythmic ensemble music nor rhythmic dancing to music. And fifth, the universal propensity of humans to gather occasionally to sing and dance together in a group, which suggests a motivational basis endemic to our biology. We end by considering the evolutionary context within which these constraints had to be met in the genesis of human musicality. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  3. Fundamentals of turbomachines

    CERN Document Server

    Dick, Erik

    2015-01-01

    This book explores the working principles of all kinds of turbomachines. The same theoretical framework is used to analyse the different machine types. Fundamentals are first presented and theoretical concepts are then elaborated for particular machine types, starting with the simplest ones.For each machine type, the author strikes a balance between building basic understanding and exploring knowledge of practical aspects. Readers are invited through challenging exercises to consider how the theory applies to particular cases and how it can be generalised.   The book is primarily meant as a course book. It teaches fundamentals and explores applications. It will appeal to senior undergraduate and graduate students in mechanical engineering and to professional engineers seeking to understand the operation of turbomachines. Readers will gain a fundamental understanding of turbomachines. They will also be able to make a reasoned choice of turbomachine for a particular application and to understand its operation...

  4. Fundamental and composite scalars from extra dimensions

    International Nuclear Information System (INIS)

    Aranda, Alfredo; Diaz-Cruz, J.L.; Hernandez-Sanchez, J.; Noriega-Papaqui, R.

    2007-01-01

    We discuss a scenario consisting of an effective 4D theory containing fundamental and composite fields. The strong dynamics sector responsible for the compositeness is assumed to be of extra-dimensional origin. In the 4D effective theory the SM fermion and gauge fields are taken as fundamental fields. The scalar sector of the theory resembles a bosonic topcolor in the sense that there are two scalar Higgs fields, a composite scalar field and a fundamental gauge-Higgs unification scalar. A detailed analysis of the scalar spectrum is presented in order to explore the parameter space consistent with experiment. It is found that, under the model assumptions, the acceptable parameter space is quite constrained. As a part of our phenomenological study of the model, we evaluate the branching ratio of the lightest Higgs boson and find that our model predicts a large FCNC mode h→tc, which can be as large as O(10^−3). Similarly, a large BR for the top FCNC decay is obtained, namely BR(t→c+H)≅10^−4

  5. Ultracold atoms for precision measurement of fundamental physical quantities

    CERN Multimedia

    CERN. Geneva

    2003-01-01

    Cooling and trapping of neutral atoms has been one of the most active fields of research in physics in recent years. Several methods were demonstrated to reach temperatures as low as a few nanokelvin allowing, for example, the investigation of quantum degenerate gases. The ability to control the quantum degrees of freedom of atoms opens the way to applications for precision measurement of fundamental physical quantities. Experiments in progress, planned or being considered using new quantum devices based on ultracold atoms, namely atom interferometers and atomic clocks, will be discussed.

  6. Theories and measures of elder abuse.

    Science.gov (United States)

    Abolfathi Momtaz, Yadollah; Hamid, Tengku Aizan; Ibrahim, Rahimah

    2013-09-01

    Elder abuse is a pervasive phenomenon around the world with devastating effects on the victims. Although it is not a new phenomenon, interest in examining elder abuse is relatively new. This paper aims to provide an overview of the aetiological theories and measures of elder abuse. The paper briefly reviews theories to explain causes of elder abuse and then discusses the most commonly used measures of elder abuse. Based on the reviewed theories, it can be concluded that elder abuse is a multifactorial problem that may affect elderly people from different backgrounds and involve a wide variety of potential perpetrators, including caregivers, adult children, and partners. The review of existing measurement instruments notes that many different screening and assessment instruments have been developed to identify elders who are at risk for or are victims of abuse. However, there is a real need for more measurements of elder abuse, as the current instruments are limited in scope. © 2013 The Authors. Psychogeriatrics © 2013 Japanese Psychogeriatric Society.

  7. Fundamentals of statistics

    CERN Document Server

    Mulholland, Henry

    1968-01-01

    Fundamentals of Statistics covers topics on the introduction, fundamentals, and science of statistics. The book discusses the collection, organization and representation of numerical data; elementary probability; the binomial and Poisson distributions; and the measures of central tendency. The text describes measures of dispersion for measuring the spread of a distribution; continuous distributions for measuring on a continuous scale; the properties and use of the normal distribution; and tests involving the normal or Student's 't' distributions. The use of control charts for sample means; the ranges

  8. Theory of Bessel Functions of High Rank - I: Fundamental Bessel Functions

    OpenAIRE

    Qi, Zhi

    2014-01-01

    In this article we introduce a new category of special functions called fundamental Bessel functions arising from the Voronoi summation formula for $\mathrm{GL}_n(\mathbb{R})$. The fundamental Bessel functions of rank one and two are the oscillatory exponential functions $e^{\pm i x}$ and the classical Bessel functions respectively. The main implements and subjects of our study of fundamental Bessel functions are their formal integral representations and Bessel equations.
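    For context, the rank-two case mentioned above reduces to the classical Bessel equation in its standard form,

        x^2 y'' + x y' + (x^2 - \nu^2)\,y = 0,

    whose solutions J_\nu and Y_\nu are the classical Bessel functions; the 'Bessel equations' of the paper generalise this to higher rank.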

  9. Fundamentals of the relativistic theory of gravitation

    International Nuclear Information System (INIS)

    Logunov, A.A.; Mestvirishvili, M.A.

    1986-01-01

    An extended exposition of the relativistic theory of gravitation (RTG) proposed by Logunov, Vlasov, and Mestvirishvili is presented. The RTG was constructed uniquely on the basis of the relativity principle and the geometrization principle by regarding the gravitational field as a physical field in the spirit of Faraday and Maxwell possessing energy, momentum, and spins 2 and 0. In the theory, conservation laws for the energy, momentum, and angular momentum for the matter and gravitational field taken together are strictly satisfied. The theory explains all the existing gravitational experiments. When the evolution of the universe is analyzed, the theory leads to the conclusion that the universe is infinite and flat, and it is predicted to contain a large amount of hidden mass. This missing mass exceeds by almost 40 times the amount of matter currently observed in the universe. The RTG predicts that gravitational collapse, which for a comoving observer occurs after a finite proper time, does not lead to infinite compression of matter but is halted at a certain finite density of the collapsing body. Therefore, according to the RTG there cannot be any objects in nature in which the gravitational contraction of matter to infinite density occurs, i.e., there are no black holes

  10. Ocean Ambient Noise Measurement and Theory

    CERN Document Server

    Carey, William M

    2011-01-01

    This book develops the theory of ocean ambient noise mechanisms and measurements, and also describes general noise characteristics and computational methods.  It concisely summarizes the vast ambient noise literature using theory combined with key representative results.  The air-sea boundary interaction zone is described in terms of non-dimensional variables requisite for future experiments.  Noise field coherency, rare directional measurements, and unique basin scale computations and methods are presented.  The use of satellite measurements in these basin scale models is demonstrated.  Finally, this book provides a series of appendices giving in-depth mathematical treatments.  With its complete and careful discussions of both theory and experimental results, this book will be of the greatest interest to graduate students and active researchers working in fields related to ambient noise in the ocean.

  11. Contiguity and quantum theory of measurement

    Energy Technology Data Exchange (ETDEWEB)

    Green, H.S. [Adelaide Univ., SA (Australia). Dept. of Mathematical Physics; Adelaide Univ., SA (Australia). Dept. of Physics]

    1995-12-31

    This paper presents a comprehensive treatment of the problem of measurement in microscopic physics, consistent with the indeterministic Copenhagen interpretation of quantum mechanics and information theory. It is pointed out that there are serious difficulties in reconciling the deterministic interpretations of quantum mechanics, based on the concepts of a universal wave function or hidden variables, with the principle of contiguity. Quantum mechanics is reformulated entirely in terms of observables, represented by matrices, including the statistical matrix, and the utility of information theory is illustrated by a discussion of the EPR paradox. The principle of contiguity is satisfied by all conserved quantities. A theory of the operation of macroscopic measuring devices is given in the interaction representation, and the attenuation of the indeterminacy of a microscopic observable in the process of measurement is related to observable changes of entropy. 28 refs.

  12. Contiguity and quantum theory of measurement

    International Nuclear Information System (INIS)

    Green, H.S.; Adelaide Univ., SA

    1995-01-01

    This paper presents a comprehensive treatment of the problem of measurement in microscopic physics, consistent with the indeterministic Copenhagen interpretation of quantum mechanics and information theory. It is pointed out that there are serious difficulties in reconciling the deterministic interpretations of quantum mechanics, based on the concepts of a universal wave function or hidden variables, with the principle of contiguity. Quantum mechanics is reformulated entirely in terms of observables, represented by matrices, including the statistical matrix, and the utility of information theory is illustrated by a discussion of the EPR paradox. The principle of contiguity is satisfied by all conserved quantities. A theory of the operation of macroscopic measuring devices is given in the interaction representation, and the attenuation of the indeterminacy of a microscopic observable in the process of measurement is related to observable changes of entropy. 28 refs

  13. Antenna theory: Analysis and design

    Science.gov (United States)

    Balanis, C. A.

    The book's main objective is to introduce the fundamental principles of antenna theory and to apply them to the analysis, design, and measurements of antennas. In a description of antennas, the radiation mechanism is discussed along with the current distribution on a thin wire. Fundamental parameters of antennas are examined, taking into account the radiation pattern, radiation power density, radiation intensity, directivity, numerical techniques, gain, antenna efficiency, half-power beamwidth, beam efficiency, bandwidth, polarization, input impedance, and antenna temperature. Attention is given to radiation integrals and auxiliary potential functions, linear wire antennas, loop antennas, linear and circular arrays, self- and mutual impedances of linear elements and arrays, broadband dipoles and matching techniques, traveling wave and broadband antennas, frequency independent antennas and antenna miniaturization, the geometrical theory of diffraction, horns, reflectors and lens antennas, antenna synthesis and continuous sources, and antenna measurements.
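    As an example of the fundamental parameters listed, the directivity treated in the book is conventionally defined as the ratio of the maximum radiation intensity to the intensity averaged over all directions,

        D_0 = \frac{4\pi\, U_{\max}}{P_{\rm rad}},

    where U is the radiation intensity in W/sr and P_rad is the total radiated power.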

  14. Testing Fundamental Gravitation in Space

    Energy Technology Data Exchange (ETDEWEB)

    Turyshev, Slava G.

    2013-10-15

    The general theory of relativity is the standard theory of gravitation; as such, it is used to describe gravity in problems of astronomy, astrophysics, cosmology, and fundamental physics. The theory is also relied upon in many modern applications involving spacecraft navigation, geodesy, and time transfer. Here we review the foundations of general relativity and discuss its current empirical status. We describe both the theoretical motivation and the scientific progress that may result from the new generation of high-precision tests that are anticipated in the near future.

  15. The quantum theory of measurement

    CERN Document Server

    Busch, Paul; Mittelstaedt, Peter

    1996-01-01

    The amazing accuracy in verifying quantum effects experimentally has recently renewed interest in quantum mechanical measurement theory. In this book the authors give within the Hilbert space formulation of quantum mechanics a systematic exposition of the quantum theory of measurement. Their approach includes the concepts of unsharp objectification and of nonunitary transformations needed for a unifying description of various detailed investigations. The book addresses advanced students and researchers in physics and philosophy of science. In this second edition Chaps. II-IV have been substantially rewritten. In particular, an insolubility theorem for the objectification problem has been formulated in full generality, which includes unsharp object observables and unsharp pointers.

  16. A philosophical assessment of decision theory

    DEFF Research Database (Denmark)

    Jensen, Karsten Klint

    2012-01-01

    modern axiomatic decision theory is an instance of fundamental measurement theory. This is then followed by a thorough introduction to Savage’s version of modern axiomatic decision theory. Turning to the interpretation of the theory, the maxim “maximize expected utility,” which stems from classical...... assignments. In the modern approach, the action guidance is to conform to the axioms. Analyzing decision theory as a theory of good, the maxim “maximize expected goodness” repeats the misunderstanding. Moreover, it implies risk neutrality about good and a cardinal measure of good, and both are problematic......The significance of decision theory consists of giving an account of rational decision making under circumstances of uncertainty. This question is important both from the point of view of what is in our personal interest and from the point of view of what is ethically right. But decision theory...
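    The maxim discussed above has a compact formal statement (a standard Savage-style formulation given only for orientation): choose the act a that maximises

        EU(a) = \sum_{s \in S} p(s)\, u\big(o(a, s)\big),

    where p is a subjective probability over states s and u a utility over the outcomes o(a, s); the point made above is that in the modern axiomatic approach the normative content lies in the axioms guaranteeing such a representation rather than in the maxim itself.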

  17. Consistent Quantum Theory

    Science.gov (United States)

    Griffiths, Robert B.

    2001-11-01

    Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics. A comprehensive account, written by one of the main figures in the field; paperback edition of a successful work on the philosophy of quantum mechanics.

  18. Measuring uncertainty within the theory of evidence

    CERN Document Server

    Salicone, Simona

    2018-01-01

    This monograph considers the evaluation and expression of measurement uncertainty within the mathematical framework of the Theory of Evidence. With a new perspective on the metrology science, the text paves the way for innovative applications in a wide range of areas. Building on Simona Salicone’s Measurement Uncertainty: An Approach via the Mathematical Theory of Evidence, the material covers further developments of the Random Fuzzy Variable (RFV) approach to uncertainty and provides a more robust mathematical and metrological background to the combination of measurement results that leads to a more effective RFV combination method. While the first part of the book introduces measurement uncertainty, the Theory of Evidence, and fuzzy sets, the following parts bring together these concepts and derive an effective methodology for the evaluation and expression of measurement uncertainty. A supplementary downloadable program allows the readers to interact with the proposed approach by generating and combining ...

  19. Measure theory and fine properties of functions

    CERN Document Server

    Evans, Lawrence Craig

    2015-01-01

    Measure Theory and Fine Properties of Functions, Revised Edition provides a detailed examination of the central assertions of measure theory in n-dimensional Euclidean space. The book emphasizes the roles of Hausdorff measure and capacity in characterizing the fine properties of sets and functions. Topics covered include a quick review of abstract measure theory, theorems and differentiation in ℝn, Hausdorff measures, area and coarea formulas for Lipschitz mappings and related change-of-variable formulas, and Sobolev functions as well as functions of bounded variation.The text provides complete proofs of many key results omitted from other books, including Besicovitch's covering theorem, Rademacher's theorem (on the differentiability a.e. of Lipschitz functions), area and coarea formulas, the precise structure of Sobolev and BV functions, the precise structure of sets of finite perimeter, and Aleksandrov's theorem (on the twice differentiability a.e. of convex functions).This revised edition includes countl...

  20. Fundamentals of the physical theory of diffraction

    CERN Document Server

    Ufimtsev, Pyotr Ya

    2014-01-01

    A complete presentation of the modern physical theory of diffraction and its applications, by the world's leading authority on the topicExtensive revisions and additions to the first edition yield a second edition that is 492 pages in length, with 122 figuresNew sections examine the nature of polarization coupling, and extend the theory of shadow radiation and reflection to opaque objectsThis book features end-of-chapter problems and a solutions manual for university professors and graduate studentsMATLAB codes presented in appendices allow for quick numeric calculations of diffracted waves

  1. Experimental measurements of competition between fundamental and second harmonic emission in a quasi-optical gyrotron

    International Nuclear Information System (INIS)

    Alberti, S.; Pedrozzi, M.; Tran, M.Q.; Hogge, J.P.; Tran, T.M.; Muggli, P.; Joedicke, B.; Mathews, H.G.

    1990-04-01

A quasi-optical gyrotron (QOG) designed for operation at the fundamental (Ω_ce ≅ 100 GHz) exhibits simultaneous emission at Ω_ce and 2Ω_ce (second harmonic). For a beam current of 4 A, 20% of the total RF power is emitted at the second harmonic. The experimental measurements show that the excitation of the second harmonic is only possible when the fundamental is present. The frequency of the second harmonic is locked by the frequency of the fundamental. Experimental evidence shows that when the second harmonic is not excited, total efficiency is enhanced. (author) 6 refs., 5 figs., 1 tab.

  2. The New Unified Theory of ATP Synthesis/Hydrolysis and Muscle Contraction, Its Manifold Fundamental Consequences and Mechanistic Implications and Its Applications in Health and Disease

    Directory of Open Access Journals (Sweden)

    Sunil Nath

    2008-09-01

Complete details of the thermodynamics and molecular mechanisms of ATP synthesis/hydrolysis and muscle contraction are offered from the standpoint of the torsional mechanism of energy transduction and ATP synthesis and the rotation-uncoiling-tilt (RUT) energy storage mechanism of muscle contraction. The manifold fundamental consequences and mechanistic implications of the unified theory for oxidative phosphorylation and muscle contraction are explained. The consistency of current mechanisms of ATP synthesis and muscle contraction with experiment is assessed, and the novel insights of the unified theory are shown to take us beyond the binding change mechanism, the chemiosmotic theory and the lever arm model. It is shown from first principles how previous theories of ATP synthesis and muscle contraction violate both the first and second laws of thermodynamics, necessitating their revision. It is concluded that the new paradigm, ten years after making its first appearance, is now perfectly poised to replace the older theories. Finally, applications of the unified theory in cell life and cell death are outlined and prospects for future research are explored. While it is impossible to cover each and every specific aspect of the above, an attempt has been made here to address all the pertinent details and what is presented should be sufficient to convince the reader of the novelty, originality, breakthrough nature and power of the unified theory, its manifold fundamental consequences and mechanistic implications, and its applications in health and disease.

  3. Fundamental volatility and stock returns : does fundamental volatility explain stock returns?

    OpenAIRE

    Selboe, Guner K.; Virdee, Jaspal Singh

    2017-01-01

In this thesis, we investigate whether the fundamental uncertainty can explain the cross-section of stock returns. To measure the fundamental uncertainty, we estimate rolling standard deviations and accounting betas of four different fundamentals: revenues, gross profit, earnings and cash flows. The standard deviation and the beta of revenues significantly explain returns in the Fama-MacBeth procedure, but only appear significant among smaller stocks in the portfolio formation ...

  4. String theory and fundamental interactions. Gabriele Veneziano and theoretical physics - Historical and contemporary perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Gasperini, M. [Bari Univ. (Italy). Dipt. di Fisica; Maharana, J. (eds.) [Institute of Physics, Orissa (India)

    2008-07-01

This volume, dedicated to Prof. Gabriele Veneziano on the occasion of his retirement from CERN, starts as a broad historico-scientific study on the work on string theory and nonperturbative QCD that has been pioneered by Prof. Veneziano in the late 60s and early 70s. It goes on to examine the many ramifications this and similar early work has spawned over the past decades and the reader will find state-of-the-art tutorial reviews on string cosmology, string dualities and symmetries, and much more. The book includes a concise updated scientific biography of, and an interview with, Prof. Veneziano, in which he relates his personal views about the present and future of fundamental physics. This is followed by the commented draft of an unpublished paper of 1973 of his, anticipating interesting results which were rediscovered and published more than a decade later. Overall, this volume is a vast and unique canvas where the re-examination of older and the presentation of newer results and insights are skillfully mixed with personal recollections of the contributing authors, most of them involved in the early days of string and quantum field theory, about Prof. Veneziano and the many interrelated topics considered. (orig.)

  5. String theory and fundamental interactions. Gabriele Veneziano and theoretical physics - Historical and contemporary perspectives

    International Nuclear Information System (INIS)

    Gasperini, M.

    2008-01-01

This volume, dedicated to Prof. Gabriele Veneziano on the occasion of his retirement from CERN, starts as a broad historico-scientific study on the work on string theory and nonperturbative QCD that has been pioneered by Prof. Veneziano in the late 60s and early 70s. It goes on to examine the many ramifications this and similar early work has spawned over the past decades and the reader will find state-of-the-art tutorial reviews on string cosmology, string dualities and symmetries, and much more. The book includes a concise updated scientific biography of, and an interview with, Prof. Veneziano, in which he relates his personal views about the present and future of fundamental physics. This is followed by the commented draft of an unpublished paper of 1973 of his, anticipating interesting results which were rediscovered and published more than a decade later. Overall, this volume is a vast and unique canvas where the re-examination of older and the presentation of newer results and insights are skillfully mixed with personal recollections of the contributing authors, most of them involved in the early days of string and quantum field theory, about Prof. Veneziano and the many interrelated topics considered. (orig.)

  6. An Analysis of Fundamental Mode Surface Wave Amplitude Measurements

    Science.gov (United States)

    Schardong, L.; Ferreira, A. M.; van Heijst, H. J.; Ritsema, J.

    2014-12-01

    Seismic tomography is a powerful tool to decipher the Earth's interior structure at various scales. Traveltimes of seismic waves are widely used to build velocity models, whereas amplitudes are still only seldomly accounted for. This mainly results from our limited ability to separate the various physical effects responsible for observed amplitude variations, such as focussing/defocussing, scattering and source effects. We present new measurements from 50 global earthquakes of fundamental-mode Rayleigh and Love wave amplitude anomalies measured in the period range 35-275 seconds using two different schemes: (i) a standard time-domain amplitude power ratio technique; and (ii) a mode-branch stripping scheme. For minor-arc data, we observe amplitude anomalies with respect to PREM in the range of 0-4, for which the two measurement techniques show a very good overall agreement. We present here a statistical analysis and comparison of these datasets, as well as comparisons with theoretical calculations for a variety of 3-D Earth models. We assess the geographical coherency of the measurements, and investigate the impact of source, path and receiver effects on surface wave amplitudes, as well as their variations with frequency in a wider range than previously studied.
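As a minimal illustration of scheme (i), an amplitude anomaly can be formed as the ratio of root-mean-square amplitudes of the observed and synthetic fundamental-mode wave trains in a common time window; the sketch below shows only this basic power-ratio idea and omits the actual processing (windowing, instrument response, mode isolation) behind the measurements reported here.

```python
import numpy as np

def amplitude_anomaly(observed, synthetic):
    """Ratio of RMS amplitudes of the observed and synthetic wave trains
    over the same time window; a value of 1 means the 1-D reference model
    (e.g. PREM) predicts the observed amplitude exactly."""
    obs = np.asarray(observed, dtype=float)
    syn = np.asarray(synthetic, dtype=float)
    return np.sqrt(np.mean(obs ** 2)) / np.sqrt(np.mean(syn ** 2))
```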

  7. The quality of measurements a metrological reference

    CERN Document Server

    Fridman, A E

    2012-01-01

    This book provides a detailed discussion and commentary on the fundamentals of metrology. The fundamentals of metrology, the principles underlying the design of the SI International System of units, the theory of measurement error, a new methodology for estimation of measurement accuracy based on uncertainty, and methods for reduction of measured results and estimation of measurement uncertainty are all discussed from a modern point of view. The concept of uncertainty is shown to be consistent with the classical theory of accuracy. The theory of random measurement errors is supplemented by a very general description based on the generalized normal distribution; systematic instrumental error is described in terms of a methodology for normalizing the metrological characteristics of measuring instruments. A new international system for assuring uniformity of measurements based on agreements between national metrological institutes is discussed, in addition to the role and procedure for performance of key compari...

  8. Qualitative insights on fundamental mechanics

    International Nuclear Information System (INIS)

    Mardari, Ghenadie N

    2007-01-01

    The gap between classical mechanics and quantum mechanics has an important interpretive implication: the Universe must have an irreducible fundamental level, which determines the properties of matter at higher levels of organization. We show that the main parameters of any fundamental model must be theory-independent. Moreover, such models must also contain discrete identical entities with constant properties. These conclusions appear to support the work of Kaniadakis on subquantum mechanics. A qualitative analysis is offered to suggest compatibility with relevant phenomena, as well as to propose new means for verification

  9. The Kadomtsev-Petviashvili equations and fundamental string theory

    International Nuclear Information System (INIS)

    Gilbert, G.

    1988-01-01

    In this paper the infinite sequence of non-linear partial differential equations known as the Kadomtsev-Petviashvili equations is described in simple terms and possible applications to a fundamental description of interacting strings are addressed. Lines of research likely to prove useful in formulating a description of non-perturbative string configurations are indicated. (orig.)

  10. Summary: fundamental interactions and processes

    International Nuclear Information System (INIS)

    Koltun, D.S.

    1982-01-01

    The subjects of the talks of the first day of the workshop are discussed in terms of fundamental interactions, dynamical theory, and relevant degrees of freedom. Some general considerations are introduced and are used to confront the various approaches taken in the earlier talks

  11. SU (2) with fundamental fermions and scalars

    DEFF Research Database (Denmark)

    Hansen, Martin; Janowski, Tadeusz; Pica, Claudio

    2018-01-01

    We present preliminary results on the lattice simulation of an SU(2) gauge theory with two fermion flavors and one strongly interacting scalar field, all in the fundamental representation of SU(2). The motivation for this study comes from the recent proposal of "fundamental" partial compositeness...... the properties of light meson resonances previously obtained for the SU(2) model. Preprint: CP3-Origins-2017-047 DNRF90...

  12. Impact of Neutrino Oscillation Measurements on Theory

    International Nuclear Information System (INIS)

    Murayama, Hitoshi

    2003-01-01

    Neutrino oscillation data had been a big surprise to theorists, and indeed they have ongoing impact on theory. I review what the impact has been, and what measurements will have critical impact on theory in the future.

  13. Inequivalent quantizations and fundamentally perfect spaces

    International Nuclear Information System (INIS)

    Imbo, T.D.; Sudarshan, E.C.G.

    1987-06-01

We investigate the problem of inequivalent quantizations of a physical system with multiply connected configuration space X. For scalar quantum theory on X we show that state vectors must be single-valued if and only if the first homology group H₁(X) is trivial, or equivalently the fundamental group π₁(X) is perfect. The θ-structure of quantum gauge and gravitational theories is discussed in light of this result.

  14. Quantum measurement and algebraic quantum field theories

    International Nuclear Information System (INIS)

    DeFacio, B.

    1976-01-01

It is shown that the physics and semantics of quantum measurement provide a natural interpretation of the weak neighborhoods of the states on observable algebras without invoking any ideas of "a reading error" or "a measured range." Then the state preparation process in quantum measurement theory is shown to give the normal (or locally normal) states on the observable algebra. Some remarks are made concerning the physical implications of normal states for systems with an infinite number of degrees of freedom, including questions on open and closed algebraic theories.

  15. Determination of the detection limit and decision threshold for ionizing radiation measurements. Part 2: Fundamentals and application to counting measurements with the influence of sample treatment

    International Nuclear Information System (INIS)

    2000-01-01

This part of ISO 11929 addresses the field of ionizing radiation measurements in which events (in particular pulses) on samples are counted after treating them (e.g. aliquotation, solution, enrichment, separation). It considers, besides the random character of radioactive decay and of pulse counting, all other influences arising from sample treatment (e.g. weighing, enrichment, calibration or the instability of the test setup). ISO 11929 consists of the following parts, under the general title Determination of the detection limit and decision threshold for ionizing radiation measurements: Part 1: Fundamentals and application to counting measurements without the influence of sample treatment; Part 2: Fundamentals and application to counting measurements with the influence of sample treatment; Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment; Part 4: Fundamentals and application to measurements by use of linear scale analogue ratemeters, without the influence of sample treatment. This part of ISO 11929 was prepared in parallel with other International Standards prepared by WG 2 (now WG 17): ISO 11932:1996, Activity measurements of solid materials considered for recycling, re-use or disposal as non-radioactive waste, and ISO 11929-1, ISO 11929-3 and ISO 11929-4 and is, consequently, complementary to these documents.

  16. Fuel ion rotation measurement and its implications on H-mode theories

    International Nuclear Information System (INIS)

    Kim, J.; Burrell, K.H.; Gohil, P.; Groebner, R.J.; Hinton, F.L.; Kim, Y.B.; Seraydarian, R.; Mandl, W.

    1993-10-01

Poloidal and toroidal rotation of the fuel ions (He²⁺) and the impurity ions (C⁶⁺ and B⁵⁺) in H-mode helium plasmas have been investigated in the DIII-D tokamak by means of charge exchange recombination spectroscopy, resulting in the discovery that the fuel ion poloidal rotation is in the ion diamagnetic drift direction while the impurity ion rotation is in the electron diamagnetic drift direction. The radial electric field obtained from radial force balance analysis of the measured pressure gradients and rotation velocities is shown to be the same regardless of which ion species is used and therefore is a more fundamental parameter than the rotation flows in studying H-mode phenomena. It is shown that the three contributions to the radial electric field (diamagnetic, poloidal rotation, and toroidal rotation terms) are comparable and consequently the poloidal flow does not solely represent the E × B flow. In the high-shear edge region, the density scale length is comparable to the ion poloidal gyroradius, and thus neoclassical theory is not valid there. In view of this new discovery that the fuel and impurity ions rotate in opposite sense, L-H transition theories based on the poloidal rotation may require improvement.
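For context, the radial force balance used in such analyses has the standard textbook form below, written for an ion species i with charge number Z_i, density n_i and pressure p_i; this is background, not an expression quoted from the paper:

```latex
E_r \;=\; \frac{1}{Z_i e\, n_i}\,\frac{\partial p_i}{\partial r}
\;-\; v_{\theta i} B_{\phi} \;+\; v_{\phi i} B_{\theta}
```

Each of the three terms (diamagnetic, poloidal-rotation and toroidal-rotation) can be evaluated from either the fuel-ion or the impurity-ion measurements, which is why the same E_r should be recovered from both species.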

  17. Pilot-wave approaches to quantum field theory

    Energy Technology Data Exchange (ETDEWEB)

    Struyve, Ward, E-mail: Ward.Struyve@fys.kuleuven.be [Institute of Theoretical Physics, K.U.Leuven, Celestijnenlaan 200D, B-3001 Leuven (Belgium); Institute of Philosophy, K.U.Leuven, Kardinaal Mercierplein 2, B-3000 Leuven (Belgium)

    2011-07-08

The purpose of this paper is to present an overview of recent work on pilot-wave approaches to quantum field theory. In such approaches, systems are not only described by their wave function, as in standard quantum theory, but also by some additional variables. In the non-relativistic pilot-wave theory of de Broglie and Bohm those variables are particle positions. In the context of quantum field theory, there are two natural choices, namely particle positions and fields. The incorporation of those variables makes it possible to provide an objective description of nature in which rather ambiguous notions such as 'measurement' and 'observer' play no fundamental role. As such, the theory is free of the conceptual difficulties, such as the measurement problem, that plague standard quantum theory.

  18. Mass anomalous dimension and running of the coupling in SU(2) with six fundamental fermions

    DEFF Research Database (Denmark)

    Bursa, Francis; Del Debbio, Luigi; Keegan, Liam

    2010-01-01

We simulate SU(2) gauge theory with six massless fundamental Dirac fermions. By using the Schrödinger Functional method we measure the running of the coupling and the fermion mass over a wide range of length scales. We observe very slow running of the coupling and construct an estimator for the...

  19. Proceedings of the 2003 NASA/JPL Workshop on Fundamental Physics in Space

    Science.gov (United States)

    Strayer, Don (Editor)

    2003-01-01

The 2003 Fundamental Physics workshop included presentations ranging from forces acting on RNA to properties of clouds of degenerate Fermi atoms, to techniques to probe for added space-time dimensions, and to flight hardware for low temperature experiments, amongst others. Mark Lee from NASA Headquarters described the new strategic plan that NASA has developed under Administrator Sean O'Keefe's leadership. Mark explained that the Fundamental Physics community now needs to align its research program and the roadmap describing the long-term goals of the program with the NASA plan. Ulf Israelsson of JPL discussed how the rewrite of the roadmap will be implemented under the leadership of the Fundamental Physics Discipline Working Group (DWG). Nick Bigelow, chair of the DWG, outlined how investigators can contribute to the writing of the roadmap. Results of measurements on very cold clouds of Fermi atoms near a Feshbach resonance were described by three investigators. Also, new measurements relating to tests of the Einstein equivalence principle were discussed. Investigators also described methods to test other aspects of Einstein's relativity theories.

  20. The perturbation theory in the fundamental mode. Its application to the analysis of neutronic experiments involving small amounts of materials in fast neutron multiplying media

    International Nuclear Information System (INIS)

    Remsak, Stanislav.

    1975-01-01

The formalism of the perturbation theory at the first order is developed in its simplest form: diffusion theory in the fundamental mode, and then the more complex formalism of the transport theory in the fundamental mode. A comparison shows the effect of the angular correlation between the fine structures of the flux and its adjoint function, the difference in the treatment of neutron leakage phenomena, and the existence of new terms in the perturbation formula, entailing a reactivity representation in the diffusion theory that is not quite exact. Problems of using the formalism developed are considered: application of the multigroup formalism, transients of the flux and its adjoint function, validity of the first-order approximation, etc. A detailed analysis allows the formulation of a criterion specifying the validity range. Transients occurring in the reference medium are also treated. A set of numerical tests for determining a method of elimination of transient effects is presented. Some differential experiments are then discussed: sodium blowdown in enriched uranium or plutonium cores, experiments utilizing some structural materials (iron and oxygen) and plutonium sample oscillations. The Cadarache version II program was systematically used, but the analysis of the experiments of plutonium sample oscillation in Ermine required the Cadarache version III program.

  1. Recent Advances and Future Prospects in Fundamental Symmetries

    Science.gov (United States)

    Plaster, Brad

    2017-09-01

    A broad program of initiatives in fundamental symmetries seeks answers to several of the most pressing open questions in nuclear physics, ranging from the scale of the neutrino mass, to the particle-antiparticle nature of the neutrino, to the origin of the matter-antimatter asymmetry, to the limits of Standard Model interactions. Although the experimental program is quite broad, with efforts ranging from precision measurements of neutrino properties; to searches for electric dipole moments; to precision measurements of magnetic dipole moments; and to precision measurements of couplings, particle properties, and decays; all of these seemingly disparate initiatives are unified by several common threads. These include the use and exploitation of symmetry principles, novel cross-disciplinary experimental work at the forefront of the precision frontier, and the need for accompanying breakthroughs in development of the theory necessary for an interpretation of the anticipated results from these experiments. This talk will highlight recent accomplishments and advances in fundamental symmetries and point to the extraordinary level of ongoing activity aimed at realizing the development and interpretation of next-generation experiments. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Nuclear Physics, under Award Number DE-SC-0014622.

  2. Measurement of pitch in speech : an implementation of Goldstein's theory of pitch perception

    NARCIS (Netherlands)

    Duifhuis, H.; Willems, L.F.; Sluyter, R.J.

    1982-01-01

    Recent developments in hearing theory have resulted in the rather general acceptance of the idea that the perception of pitch of complex sounds is the result of the psychological pattern recognition process. The pitch is supposedly mediated by the fundamental of the harmonic spectrum which fits the

  3. Testing the Standard Model and Fundamental Symmetries in Nuclear Physics with Lattice QCD and Effective Field Theory

    Energy Technology Data Exchange (ETDEWEB)

    Walker-Loud, Andre [College of William and Mary, Williamsburg, VA (United States)

    2016-10-14

    The research supported by this grant is aimed at probing the limits of the Standard Model through precision low-energy nuclear physics. The work of the PI (AWL) and additional personnel is to provide theory input needed for a number of potentially high-impact experiments, notably, hadronic parity violation, Dark Matter direct detection and searches for permanent electric dipole moments (EDMs) in nucleons and nuclei. In all these examples, a quantitative understanding of low-energy nuclear physics from the fundamental theory of strong interactions, Quantum Chromo-Dynamics (QCD), is necessary to interpret the experimental results. The main theoretical tools used and developed in this work are the numerical solution to QCD known as lattice QCD (LQCD) and Effective Field Theory (EFT). This grant is supporting a new research program for the PI, and as such, needed to be developed from the ground up. Therefore, the first fiscal year of this grant, 08/01/2014-07/31/2015, has been spent predominantly establishing this new research effort. Very good progress has been made, although, at this time, there are not many publications to show for the effort. After one year, the PI accepted a job at Lawrence Berkeley National Laboratory, so this final report covers just a single year of five years of the grant.

  4. Fundamental safety principles. Safety fundamentals

    International Nuclear Information System (INIS)

    2007-01-01

    This publication states the fundamental safety objective and ten associated safety principles, and briefly describes their intent and purpose. The fundamental safety objective - to protect people and the environment from harmful effects of ionizing radiation - applies to all circumstances that give rise to radiation risks. The safety principles are applicable, as relevant, throughout the entire lifetime of all facilities and activities - existing and new - utilized for peaceful purposes, and to protective actions to reduce existing radiation risks. They provide the basis for requirements and measures for the protection of people and the environment against radiation risks and for the safety of facilities and activities that give rise to radiation risks, including, in particular, nuclear installations and uses of radiation and radioactive sources, the transport of radioactive material and the management of radioactive waste

  5. Fundamental safety principles. Safety fundamentals

    International Nuclear Information System (INIS)

    2006-01-01

    This publication states the fundamental safety objective and ten associated safety principles, and briefly describes their intent and purpose. The fundamental safety objective - to protect people and the environment from harmful effects of ionizing radiation - applies to all circumstances that give rise to radiation risks. The safety principles are applicable, as relevant, throughout the entire lifetime of all facilities and activities - existing and new - utilized for peaceful purposes, and to protective actions to reduce existing radiation risks. They provide the basis for requirements and measures for the protection of people and the environment against radiation risks and for the safety of facilities and activities that give rise to radiation risks, including, in particular, nuclear installations and uses of radiation and radioactive sources, the transport of radioactive material and the management of radioactive waste

  6. A relativistic theory for continuous measurement of quantum fields

    International Nuclear Information System (INIS)

    Diosi, L.

    1990-04-01

A formal theory for the continuous measurement of relativistic quantum fields is proposed. The corresponding scattering equations are derived. The proposed formalism reduces to known equations in the Markovian case. Two recent models for spontaneous quantum state reduction have been recovered in the framework of this theory. A possible example of the relativistic continuous measurement has been outlined in standard Quantum Electrodynamics. The continuous measurement theory possesses an alternative formulation in terms of interacting quantum and stochastic fields. (author) 23 refs.

  7. Molecular imaging. Fundamentals and applications

    International Nuclear Information System (INIS)

    Tian, Jie

    2013-01-01

Covers a wide range of new theory, new techniques and new applications. Contributed by many experts in China. The editor has obtained the National Science and Technology Progress Award twice. "Molecular Imaging: Fundamentals and Applications" is a comprehensive monograph which describes not only the theory of the underlying algorithms and key technologies but also introduces a prototype system and its applications, bringing together theory, technology and applications. By explaining the basic concepts and principles of molecular imaging, imaging techniques, as well as research and applications in detail, the book provides both detailed theoretical background information and technical methods for researchers working in medical imaging and the life sciences. Clinical doctors and graduate students will also benefit from this book.

  8. Toward the fundamental theory of nuclear matter physics: The microscopic theory of nuclear collective dynamics

    International Nuclear Information System (INIS)

    Sakata, F.; Marumori, T.; Hashimoto, Y.; Tsukuma, H.; Yamamoto, Y.; Terasaki, J.; Iwasawa, Y.; Itabashi, H.

    1992-01-01

Since the research field of nuclear physics is expanding rapidly, it is becoming more imperative to develop the microscopic theory of nuclear matter physics which provides us with a unified understanding of diverse phenomena exhibited by nuclei. The establishment of various stable mean-fields in nuclei allows us to develop the microscopic theory of nuclear collective dynamics within the mean-field approximation. The classical-level theory of nuclear collective dynamics is developed by exploiting the symplectic structure of the time-dependent Hartree-Fock (TDHF) manifold. The importance of exploring the single-particle dynamics, e.g. the level-crossing dynamics in connection with the classical order-to-chaos transition mechanism, is pointed out. Since the classical-level theory is directly related to the full quantum mechanical boson expansion theory via the symplectic structure of the TDHF manifold, the quantum theory of nuclear collective dynamics is developed at the dictation of what is developed on the classical level. The quantum theory thus formulated enables us to introduce the quantum integrability and quantum chaoticity for individual eigenstates. The inter-relationship between the classical-level and quantum theories of nuclear collective dynamics might play a decisive role in developing the quantum theory of many-body problems. (orig.)

  9. arXiv Minimal Fundamental Partial Compositeness

    CERN Document Server

    Cacciapaglia, Giacomo; Sannino, Francesco; Thomsen, Anders Eller

    Building upon the fundamental partial compositeness framework we provide consistent and complete composite extensions of the standard model. These are used to determine the effective operators emerging at the electroweak scale in terms of the standard model fields upon consistently integrating out the heavy composite dynamics. We exhibit the first effective field theories matching these complete composite theories of flavour and analyse their physical consequences for the third generation quarks. Relations with other approaches, ranging from effective analyses for partial compositeness to extra dimensions as well as purely fermionic extensions, are briefly discussed. Our methodology is applicable to any composite theory of dynamical electroweak symmetry breaking featuring a complete theory of flavour.

  10. Macroscopic Fundamental Diagram for pedestrian networks : Theory and applications

    NARCIS (Netherlands)

    Hoogendoorn, S.P.; Daamen, W.; Knoop, V.L.; Steenbakkers, Jeroen; Sarvi, Majid

    2017-01-01

    The Macroscopic Fundamental diagram (MFD) has proven to be a powerful concept in understanding and managing vehicular network dynamics, both from a theoretical angle and from a more application-oriented perspective. In this contribution, we explore the existence and the characteristics of the

  11. Measure-valued differentiation for finite products of measures : theory and applications

    NARCIS (Netherlands)

    Leahu, H.

    2008-01-01

    In this dissertation we perform a comprehensive analysis of measure-valued differentiation, in which weak differentiation of parameter-dependent probability measures plays a central role. We develop a theory of weak differentiation of measures and show that classical concepts such as differential

  12. Explaining crude oil prices using fundamental measures

    International Nuclear Information System (INIS)

    Coleman, Les

    2012-01-01

    Oil is the world's most important commodity, and improving the understanding of drivers of its price is a longstanding research objective. This article analyses real oil prices during 1984–2007 using a monthly dataset of fundamental and market parameters that cover financial markets, global economic growth, demand and supply of oil, and geopolitical measures. The innovation is to incorporate proxies for speculative and terrorist activity and dummies for major industry events, and quantify price impacts of each. New findings are positive links between oil prices and speculative activity, bond yields, an interaction term incorporating OPEC market share and OECD import dependence, and the number of US troops and frequency of terrorist attacks in the Middle East. Shocks also prove significant with a $6–18 per barrel impact on price for several months. - Highlights: ► Article introduces new variables to the study of oil prices. ► New variables are terrorist incidents and military activity, and oil futures market size. ► Shocks prove important affecting prices by $6–18 per barrel for several months. ► OPEC market influence rises with OECD import dependence.
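A minimal sketch of the kind of monthly regression of real oil prices on fundamental and market parameters described above is given below; the file name, column names and specification are illustrative placeholders, not the paper's actual dataset or model.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical monthly dataset covering 1984-2007; variable names are
# illustrative only and do not reproduce the paper's definitions.
df = pd.read_csv("oil_monthly.csv", index_col="month", parse_dates=True)
regressors = ["speculative_activity", "bond_yield",
              "opec_share_x_oecd_import_dependence",
              "us_troops_middle_east", "terror_attacks_middle_east"]
X = sm.add_constant(df[regressors])
model = sm.OLS(df["real_oil_price"], X, missing="drop").fit()
print(model.summary())  # coefficient signs show the direction of each link
```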

  13. Relativistic quantum chemistry the fundamental theory of molecular science

    CERN Document Server

    Reiher, Markus

    2014-01-01

Einstein proposed his theory of special relativity in 1905. For a long time it was believed that this theory has no significant impact on chemistry. This view changed in the 1970s when it was realized that (nonrelativistic) Schrödinger quantum mechanics yields results on molecular properties that depart significantly from experimental results. Especially when heavy elements are involved, these quantitative deviations can be so large that qualitative chemical reasoning and understanding is affected. To grasp this, the appropriate many-electron theory has rapidly evolved. Nowadays relativist

  14. A laboratory scale fundamental time?

    International Nuclear Information System (INIS)

    Mendes, R.V.

    2012-01-01

The existence of a fundamental time (or fundamental length) has been conjectured in many contexts. However, the "stability of physical theories principle" seems to be the one that provides, through the tools of algebraic deformation theory, an unambiguous derivation of the stable structures that Nature might have chosen for its algebraic framework. It is well-known that c and ℎ are the deformation parameters that stabilize the Galilean and the Poisson algebra. When the stability principle is applied to the Poincaré-Heisenberg algebra, two deformation parameters emerge which define two time (or length) scales. In addition there are, for each of them, a plus or minus sign possibility in the relevant commutators. One of the deformation length scales, related to non-commutativity of momenta, is probably related to the Planck length scale but the other might be much larger and already detectable in laboratory experiments. In this paper, this is used as a working hypothesis to look for physical effects that might settle this question. Phase-space modifications, resonances, interference, electron spin resonance and non-commutative QED are considered. (orig.)

  15. Microwave engineering concepts and fundamentals

    CERN Document Server

    Khan, Ahmad Shahid

    2014-01-01

Detailing the active and passive aspects of microwaves, Microwave Engineering: Concepts and Fundamentals covers everything from wave propagation to reflection and refraction, guided waves, and transmission lines, providing a comprehensive understanding of the underlying principles at the core of microwave engineering. This encyclopedic text not only encompasses nearly all facets of microwave engineering, but also gives all topics—including microwave generation, measurement, and processing—equal emphasis. Packed with illustrations to aid in comprehension, the book: • Describes the mathematical theory of waveguides and ferrite devices, devoting an entire chapter to the Smith chart and its applications • Discusses different types of microwave components, antennas, tubes, transistors, diodes, and parametric devices • Examines various attributes of cavity resonators, semiconductor and RF/microwave devices, and microwave integrated circuits • Addresses scattering parameters and their properties, as well a...

  16. Discrete time interval measurement system: fundamentals, resolution and errors in the measurement of angular vibrations

    International Nuclear Information System (INIS)

    Gómez de León, F C; Meroño Pérez, P A

    2010-01-01

    The traditional method for measuring the velocity and the angular vibration in the shaft of rotating machines using incremental encoders is based on counting the pulses at given time intervals. This method is generically called the time interval measurement system (TIMS). A variant of this method that we have developed in this work consists of measuring the corresponding time of each pulse from the encoder and sampling the signal by means of an A/D converter as if it were an analog signal, that is to say, in discrete time. For this reason, we have denominated this method as the discrete time interval measurement system (DTIMS). This measurement system provides a substantial improvement in the precision and frequency resolution compared with the traditional method of counting pulses. In addition, this method permits modification of the width of some pulses in order to obtain a mark-phase on every lap. This paper explains the theoretical fundamentals of the DTIMS and its application for measuring the angular vibrations of rotating machines. It also displays the required relationship between the sampling rate of the signal, the number of pulses of the encoder and the rotating velocity in order to obtain the required resolution and to delimit the methodological errors in the measurement
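The principle, timing every encoder pulse instead of counting pulses over fixed intervals, can be illustrated with the short sketch below; it is a schematic of the idea only, not the authors' DTIMS implementation, and it omits the mark-phase handling and the extraction of the angular-vibration component.

```python
import numpy as np

def instantaneous_speed(pulse_times, pulses_per_rev):
    """Angular velocity (rad/s) from encoder pulse arrival times:
    each inter-pulse interval corresponds to a rotation of 2*pi/N rad."""
    dt = np.diff(np.asarray(pulse_times, dtype=float))
    return (2.0 * np.pi / pulses_per_rev) / dt

# Synthetic example: a 1024-pulse encoder on a shaft at a nominal 3000 rpm
# with a small superimposed angular vibration.
N = 1024
omega0 = 3000 * 2 * np.pi / 60.0                  # nominal speed, rad/s
angles = 2 * np.pi / N * np.arange(10 * N)        # pulse angles over 10 revs
times = angles / omega0 + 1e-5 * np.sin(angles)   # perturbed arrival times
omega = instantaneous_speed(times, N)
```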

  17. Thermodynamics and the structure of quantum theory

    International Nuclear Information System (INIS)

    Krumm, Marius; Müller, Markus P; Barnum, Howard; Barrett, Jonathan

    2017-01-01

    Despite its enormous empirical success, the formalism of quantum theory still raises fundamental questions: why is nature described in terms of complex Hilbert spaces, and what modifications of it could we reasonably expect to find in some regimes of physics? Here we address these questions by studying how compatibility with thermodynamics constrains the structure of quantum theory. We employ two postulates that any probabilistic theory with reasonable thermodynamic behaviour should arguably satisfy. In the framework of generalised probabilistic theories, we show that these postulates already imply important aspects of quantum theory, like self-duality and analogues of projective measurements, subspaces and eigenvalues. However, they may still admit a class of theories beyond quantum mechanics. Using a thought experiment by von Neumann, we show that these theories admit a consistent thermodynamic notion of entropy, and prove that the second law holds for projective measurements and mixing procedures. Furthermore, we study additional entropy-like quantities based on measurement probabilities and convex decomposition probabilities, and uncover a relation between one of these quantities and Sorkin’s notion of higher-order interference. (paper)

  18. Measurement theory and the Schroedinger equation

    International Nuclear Information System (INIS)

    Schwarz, A.S.; Tyupkin, Yu.S.

    1987-01-01

    The paper is an analysis of the measuring process in quantum mechanics based on the Schroedinger equation. The arguments employed use an assumption reflecting, to some extent, the statistical properties of the vacuum. A description is given of the cases in which different incoherent superpositions of pure states in quantum mechanics are physically equivalent. The fundamental difference between quantum and classical mechanics as explained by the existence of unobservable variables is discussed. (U.K.)

  19. Fundamentals of quantum mechanics

    CERN Document Server

    House, J E

    2017-01-01

Fundamentals of Quantum Mechanics, Third Edition is a clear and detailed introduction to quantum mechanics and its applications in chemistry and physics. All required math is clearly explained, including intermediate steps in derivations, and a concise review of the math is included in the text at appropriate points. Most of the elementary quantum mechanical models, including particles in boxes, rigid rotor, harmonic oscillator, barrier penetration, and the hydrogen atom, are clearly and completely presented. Applications of these models to selected “real world” topics are also included. This new edition includes many new topics such as band theory and heat capacity of solids, spectroscopy of molecules and complexes (including applications to ligand field theory), and small molecules of astrophysical interest.

  20. Elementary Concepts and Fundamental Laws of the Theory of Heat

    Science.gov (United States)

    de Oliveira, Mário J.

    2018-06-01

    The elementary concepts and fundamental laws concerning the science of heat are examined from the point of view of its development with special attention to its theoretical structure. The development is divided into four periods, each one characterized by the concept that was attributed to heat. The transition from one to the next period was marked by the emergence of new concepts and new laws, and by singular events. We point out that thermodynamics, as it emerged, is founded on the elementary concepts of temperature and adiabatic wall, and on the fundamental laws: Mayer-Joule principle, or law of conservation of energy; Carnot principle, which leads to the definition of entropy; and the Clausius principle, or law of increase in entropy.
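In modern notation the three fundamental laws named above are usually written as follows; these are the standard textbook statements, added here only for orientation rather than taken from the article:

```latex
\mathrm{d}U = \delta Q - \delta W \quad \text{(Mayer--Joule: conservation of energy)}, \qquad
\mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T} \quad \text{(Carnot: definition of entropy)}, \qquad
\Delta S \ge 0 \ \text{for an isolated system} \quad \text{(Clausius: increase of entropy)}.
```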

  1. Elementary Concepts and Fundamental Laws of the Theory of Heat

    Science.gov (United States)

    de Oliveira, Mário J.

    2018-03-01

    The elementary concepts and fundamental laws concerning the science of heat are examined from the point of view of its development with special attention to its theoretical structure. The development is divided into four periods, each one characterized by the concept that was attributed to heat. The transition from one to the next period was marked by the emergence of new concepts and new laws, and by singular events. We point out that thermodynamics, as it emerged, is founded on the elementary concepts of temperature and adiabatic wall, and on the fundamental laws: Mayer-Joule principle, or law of conservation of energy; Carnot principle, which leads to the definition of entropy; and the Clausius principle, or law of increase in entropy.

  2. ŽAMPA’S SYSTEMS THEORY: A COMPREHENSIVE THEORY OF MEASUREMENT IN DYNAMIC SYSTEMS

    Directory of Open Access Journals (Sweden)

    Renata Rychtáriková

    2018-04-01

The article outlines in memoriam Prof. Pavel Žampa’s concepts of system theory which enable us to devise a measurement in dynamic systems independently of the particular system behaviour. From the point of view of Žampa’s theory, terms like system time, system attributes, system link, system element, input, output, sub-systems, and state variables are defined. In Conclusions, Žampa’s theory is discussed together with other mathematical approaches to qualitative dynamics known since the 19th century. In Appendices, we present applications of Žampa’s technical approach to measurement of complex dynamical (chemical and biological) systems at the Institute of Complex Systems, University of South Bohemia in České Budějovice.

  3. Fundamentals of wireless sensor networks theory and practice

    CERN Document Server

    Dargie, Waltenegus

    2010-01-01

    In this book, the authors describe the fundamental concepts and practical aspects of wireless sensor networks. The book provides a comprehensive view to this rapidly evolving field, including its many novel applications, ranging from protecting civil infrastructure to pervasive health monitoring. Using detailed examples and illustrations, this book provides an inside track on the current state of the technology. The book is divided into three parts. In Part I, several node architectures, applications and operating systems are discussed. In Part II, the basic architectural frameworks, including

  4. Search for fundamental 'God Particle' speeds up

    CERN Multimedia

    Spotts, P N

    2000-01-01

    This month researchers at CERN are driving the accelerator to its limits and beyond to find the missing Higgs boson. Finding it would confirm a 30-yr-old theory about why matter's most fundamental particles have mass (1 page).

  5. Overview: Parity Violation and Fundamental Symmetries

    Science.gov (United States)

    Carlini, Roger

    2017-09-01

The fields of nuclear and particle physics have undertaken extensive programs of research to search for evidence of new phenomena via the precision measurement of observables that are well predicted within the standard model of electroweak interaction. It is already known that the standard model is incomplete as it does not include gravity and dark matter/energy and is therefore likely the low-energy approximation of a more complex theory. This talk will be an overview of the motivation, experimental methods and status of some of these efforts (past and future) related to precision indirect searches that are complementary to the direct searches underway at the Large Hadron Collider. This abstract is for the invited talk associated with the Mini-symposium titled "Electro-weak Physics and Fundamental Symmetries" organized by Julie Roche.

  6. Environmental pollution measurements and countermeasures

    International Nuclear Information System (INIS)

    Schuetz, M.

    1994-01-01

This book gives the interested layman an insight into fundamental processes of ecology and closes the gap between theory and practice. The practical part shows how measuring instruments for environmental applications work, how errors of measurement can be avoided, and how to make use of the measured results. (orig./EF)

  7. Fundamental frequency and voice perturbation measures in smokers and non-smokers: An acoustic and perceptual study

    Science.gov (United States)

    Freeman, Allison

This research examined the fundamental frequency and perturbation (jitter % and shimmer %) measures in young adult (20-30 year-old) and middle-aged adult (40-55 year-old) smokers and non-smokers; there were 36 smokers and 36 non-smokers. Acoustic analysis was carried out utilizing one task: production of sustained /a/. These voice samples were analyzed utilizing Multi-Dimensional Voice Program (MDVP) software, which provided values for fundamental frequency, jitter %, and shimmer %. These values were analyzed for trends regarding smoking status, age, and gender. Statistical significance was found regarding the fundamental frequency, jitter %, and shimmer % for smokers as compared to non-smokers; smokers were found to have significantly lower fundamental frequency values, and significantly higher jitter % and shimmer % values. Statistical significance was not found regarding fundamental frequency, jitter %, and shimmer % for age group comparisons. With regard to gender, statistical significance was found regarding fundamental frequency; females were found to have statistically higher fundamental frequencies as compared to males. However, the relationships between gender and jitter % and shimmer % lacked statistical significance. These results indicate that smoking negatively affects voice quality. This study also examined the ability of untrained listeners to identify smokers and non-smokers based on their voices. Results of this voice perception task suggest that listeners are not accurately able to identify smokers and non-smokers, as statistical significance was not reached. However, despite a lack of significance, trends in data suggest that listeners are able to utilize voice quality to identify smokers and non-smokers.
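For orientation, the usual local definitions of these perturbation measures, computed from cycle-to-cycle pitch periods and peak amplitudes, are sketched below; MDVP's own implementation may differ in detail (smoothing, outlier handling), so this is an approximation rather than the exact algorithm used in the study.

```python
import numpy as np

def jitter_percent(periods):
    """Local jitter (%): mean absolute difference between consecutive
    pitch periods, divided by the mean period."""
    p = np.asarray(periods, dtype=float)
    return 100.0 * np.mean(np.abs(np.diff(p))) / np.mean(p)

def shimmer_percent(peak_amplitudes):
    """Local shimmer (%): the same construction applied to the
    cycle peak amplitudes."""
    a = np.asarray(peak_amplitudes, dtype=float)
    return 100.0 * np.mean(np.abs(np.diff(a))) / np.mean(a)
```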

  8. Hybrid Fundamental Solution Based Finite Element Method: Theory and Applications

    Directory of Open Access Journals (Sweden)

    Changyong Cao

    2015-01-01

An overview of the development of the hybrid fundamental solution based finite element method (HFS-FEM) and its application in engineering problems is presented in this paper. The framework and formulations of HFS-FEM for potential problem, plane elasticity, three-dimensional elasticity, thermoelasticity, anisotropic elasticity, and plane piezoelectricity are presented. In this method, two independent assumed fields (intraelement field and auxiliary frame field) are employed. The formulations for all cases are derived from the modified variational functionals and the fundamental solutions to a given problem. Generation of elemental stiffness equations from the modified variational principle is also described. Typical numerical examples are given to demonstrate the validity and performance of the HFS-FEM. Finally, a brief summary of the approach is provided and future trends in this field are identified.
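The two assumed fields mentioned above are commonly written in the generic form below: an intraelement field expanded in fundamental solutions centred at source points placed outside the element, and an independent frame field defined on the element boundary that ties neighbouring elements together (the symbols are chosen here for illustration, not quoted from the paper):

```latex
u_e(\mathbf{x}) \;=\; \sum_{j=1}^{n_s} N^{*}(\mathbf{x},\mathbf{x}_{sj})\, c_{ej}
\;=\; \mathbf{N}_e(\mathbf{x})\,\mathbf{c}_e ,
\qquad
\tilde{u}_e(\mathbf{x}) \;=\; \tilde{\mathbf{N}}_e(\mathbf{x})\,\mathbf{d}_e
\quad \text{on } \partial\Omega_e ,
```

where N* is the fundamental solution of the governing operator, the internal coefficients c_e are condensed out at element level, and the frame unknowns d_e enter the global stiffness equations obtained from the modified variational functional.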

  9. Absence of a fundamental acceleration scale in galaxies

    Science.gov (United States)

    Rodrigues, Davi C.; Marra, Valerio; del Popolo, Antonino; Davari, Zahra

    2018-06-01

Dark matter is currently one of the main mysteries of the Universe. There is much strong indirect evidence that supports its existence, but there is yet no sign of a direct detection [1-3]. Moreover, at the scale of galaxies, there is tension between the theoretically expected dark matter distribution and its indirectly observed distribution [4-7]. Therefore, phenomena associated with dark matter have a chance of serving as a window towards new physics. The radial acceleration relation [8,9] confirms that a non-trivial acceleration scale a₀ can be found from the internal dynamics of several galaxies. The existence of such a scale is not obvious as far as the standard cosmological model is concerned [10,11], and it has been interpreted as a possible sign of modified gravity [12,13]. Here, we consider 193 high-quality disk galaxies and, using Bayesian inference, show that the probability of existence of a fundamental acceleration is essentially 0: the null hypothesis is rejected at more than 10σ. We conclude that a₀ is of emergent nature. In particular, the modified Newtonian dynamics theory [14-17]—a well-known alternative to dark matter based on the existence of a fundamental acceleration scale—or any other theory that behaves like it at galactic scales, is ruled out as a fundamental theory for galaxies at more than 10σ.
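For context, the radial acceleration relation referred to here is commonly summarised by a one-parameter fitting function linking the observed centripetal acceleration to the acceleration expected from the baryons alone; the expression below is the form used in that literature and is quoted only as background, with a₀ the acceleration scale whose fundamental status the paper tests:

```latex
g_{\mathrm{obs}}(g_{\mathrm{bar}}) \;=\; \frac{g_{\mathrm{bar}}}{1 - e^{-\sqrt{g_{\mathrm{bar}}/a_{0}}}} ,
\qquad a_{0} \simeq 1.2 \times 10^{-10}\ \mathrm{m\,s^{-2}} .
```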

  10. Elementary number theory with programming

    CERN Document Server

    Lewinter, Marty

    2015-01-01

A successful presentation of the fundamental concepts of number theory and computer programming. Bridging an existing gap between mathematics and programming, Elementary Number Theory with Programming provides a unique introduction to elementary number theory with fundamental coverage of computer programming. Written by highly-qualified experts in the fields of computer science and mathematics, the book features accessible coverage for readers with various levels of experience and explores number theory in the context of programming without relying on advanced prerequisite knowledge and con

  11. Quantum Theory and Beyond

    Science.gov (United States)

    Bastin, Ted

    2009-07-01

    List of participants; Preface; Part I. Introduction: 1. The function of the colloquium - editorial; 2. The conceptual problem of quantum theory from the experimentalist's point of view O. R. Frisch; Part II. Niels Bohr and Complementarity: The Place of the Classical Language: 3. The Copenhagen interpretation C. F. von Weizsäcker; 4. On Bohr's views concerning the quantum theory D. Bohm; Part III. The Measurement Problem: 5. Quantal observation in statistical interpretation H. J. Groenewold; 6. Macroscopic physics, quantum mechanics and quantum theory of measurement G. M. Prosperi; 7. Comment on the Daneri-Loinger-Prosperi quantum theory of measurement Jeffrey Bub; 8. The phenomenology of observation and explanation in quantum theory J. H. M. Whiteman; 9. Measurement theory and complex systems M. A. Garstens; Part IV. New Directions within Quantum Theory: What does the Quantum Theoretical Formalism Really Tell Us?: 10. On the role of hidden variables in the fundamental structure of physics D. Bohm; 11. Beyond what? Discussion: space-time order within existing quantum theory C. W. Kilmister; 12. Definability and measurability in quantum theory Yakir Aharonov and Aage Petersen; 13. The bootstrap idea and the foundations of quantum theory Geoffrey F. Chew; Part V. A Fresh Start?: 14. Angular momentum: an approach to combinatorial space-time Roger Penrose; 15. A note on discreteness, phase space and cohomology theory B. J. Hiley; 16. Cohomology of observations R. H. Atkin; 17. The origin of half-integral spin in a discrete physical space Ted Bastin; Part VI. Philosophical Papers: 18. The unity of physics C. F. von Weizsäcker; 19. A philosophical obstacle to the rise of new theories in microphysics Mario Bunge; 20. The incompleteness of quantum mechanics or the emperor's missing clothes H. R. Post; 21. How does a particle get from A to B?; Ted Bastin; 22. Informational generalization of entropy in physics Jerome Rothstein; 23. Can life explain quantum mechanics? H. H

  12. Atomic spectroscopy and highly accurate measurement: determination of fundamental constants

    International Nuclear Information System (INIS)

    Schwob, C.

    2006-12-01

This document reviews the theoretical and experimental achievements of the author concerning highly accurate atomic spectroscopy applied to the determination of fundamental constants. A pure optical frequency measurement of the 2S-12D 2-photon transitions in atomic hydrogen and deuterium has been performed. The experimental setup is described, as well as the data analysis. Optimized values for the Rydberg constant and Lamb shifts have been deduced (R = 109737.31568516 (84) cm⁻¹). An experiment devoted to the determination of the fine structure constant with an aimed relative uncertainty of 10⁻⁹ began in 1999. This experiment is based on the fact that Bloch oscillations in a frequency-chirped optical lattice are a powerful tool to transfer coherently many photon momenta to the atoms. We have used this method to measure accurately the ratio h/m(Rb). The measured value of the fine structure constant is α⁻¹ = 137.03599884 (91) with a relative uncertainty of 6.7×10⁻⁹. The future and perspectives of this experiment are presented. This document presented before an academic board will allow its author to manage research work and particularly to tutor thesis students. (A.C.)
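The step from the photon-recoil measurement to the fine structure constant rests on the standard relation below (given for context; R∞ is the Rydberg constant and m_Rb/m_e the rubidium-to-electron mass ratio, both known to high accuracy):

```latex
\alpha^{2} \;=\; \frac{2 R_{\infty}}{c}\,\frac{m_{\mathrm{Rb}}}{m_{e}}\,\frac{h}{m_{\mathrm{Rb}}} ,
```

so an accurate value of h/m(Rb) from Bloch-oscillation recoil measurements translates directly into a value of α.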

  13. Foundations of quantum theory and thermodynamics

    International Nuclear Information System (INIS)

    Olkhov, Victor

    1998-01-01

    Physical reasons are presented to support the statement that Quantum theory (Quantum Gravity in particular, as well as Classical Gravity) loses applicability due to Thermodynamical effects. The statement is based on several points: 1. N. Bohr's requirement that measuring units must have macroscopic size is one of the common fundamentals of Quantum theory. 2. The Reference System--the base notion of Classical and Quantum theory and of any observation process--must be protected from any external Thermal influence to provide precise measurements of Time and Distance. 3. No physical screen or process that can reduce or reflect the action of Gravity is known, and hence nothing can cool or protect the measuring units of the Reference System from heating by Thermal Gravity fluctuations. 4. Thermal Gravity fluctuations--Thermal fluctuations of the Gravity free fall acceleration--are induced by the Thermal behavior of matter and the Thermal properties of Electromagnetic fields, but are usually neglected as near-zero values. Matter heats Gravity and Gravity heats Matter. Thermal fluctuations of the Gravity free fall acceleration act as a Universal Heater on any kind of Matter or Field. 5. Although the usual Thermal properties of Gravity are negligible, they can be dramatically increased by the Gravity Blue Shift (near the Gravitational Radius) or by ordinary Doppler effects. 6. If the Thermal action of Gravity becomes significant, all measurements of Time and Distance that determine the Reference System notion must depend on the Thermal properties of Gravity, such as Temperature or Entropy, and that violates the applicability of the Reference System notion and of Quantum and Classical theories as well. If so, Thermal notions such as Temperature or Entropy become more fundamental than the common Time and Distance characteristics. A definition of the Temperature of the Gravity fluctuations and its possible measurement are suggested

  14. Fundamentals of thermophotovoltaic energy conversion

    CERN Document Server

    Chubb, Donald L

    2007-01-01

    This is a textbook presenting the fundamentals of thermophotovoltaic (TPV) energy conversion, suitable for an upper undergraduate or first year graduate course. In addition it can serve as a reference or design aid for engineers developing TPV systems. Mathematica design programs for interference filters and a planar TPV system are included on a CD-ROM. Each chapter includes a summary and concludes with a set of problems. The first chapter presents the electromagnetic theory and radiation transfer theory necessary to calculate the optical properties of the components in a TPV optical cavity. Using a simplified model, Chapter 2 develops expressions for the maximum efficiency and power density for an ideal TPV system. The next three chapters consider the three major components in a TPV system: the emitter, filter and photovoltaic (PV) array. Chapter 3 applies the electromagnetic theory and radiation transfer theory presented in Chapter 1 in the calculation of spectral emittance. From the spectral emittance t...

  15. Foam engineering fundamentals and applications

    CERN Document Server

    2012-01-01

    Containing contributions from leading academic and industrial researchers, this book provides a much-needed update of foam science research. The first section of the book presents an accessible summary of the theory and fundamentals of foams. This includes chapters on morphology, drainage, Ostwald ripening, coalescence, rheology, and pneumatic foams. The second section demonstrates how this theory is used in a wide range of industrial applications, including foam fractionation, froth flotation and foam mitigation. It includes chapters on suprafroths, flotation of oil sands, foams in enhancing petroleum recovery, gas-liquid mass transfer in foams, foams in glass manufacturing, fire-fighting foam technology and consumer product foams.

  16. Quantum field theory III. Gauge theory. A bridge between mathematicians and physicists

    International Nuclear Information System (INIS)

    Zeidler, Eberhard

    2011-01-01

    In this third volume of his modern introduction to quantum field theory, Eberhard Zeidler examines the mathematical and physical aspects of gauge theory as a principal tool for describing the four fundamental forces which act in the universe: gravitation, electromagnetism, the weak interaction and the strong interaction. Volume III concentrates on the classical aspects of gauge theory, describing the four fundamental forces by the curvature of appropriate fiber bundles. This must be supplemented by the crucial, but elusive, quantization procedure. The book is arranged in four sections, devoted to realizing the universal principle that force equals curvature: Part I: The Euclidean Manifold as a Paradigm; Part II: Ariadne's Thread in Gauge Theory; Part III: Einstein's Theory of Special Relativity; Part IV: Ariadne's Thread in Cohomology. For students of mathematics the book is designed to demonstrate that detailed knowledge of the physical background helps to reveal interesting interrelationships among diverse mathematical topics. Physics students will be exposed to a fairly advanced mathematics, beyond the level covered in the typical physics curriculum. Quantum Field Theory builds a bridge between mathematicians and physicists, based on challenging questions about the fundamental forces in the universe (macrocosmos) and in the world of elementary particles (microcosmos). (orig.)

  17. Stochastic processes inference theory

    CERN Document Server

    Rao, Malempati M

    2014-01-01

    This is the revised and enlarged 2nd edition of the authors’ original text, which was intended to be a modest complement to Grenander's fundamental memoir on stochastic processes and related inference theory. The present volume gives a substantial account of regression analysis, both for stochastic processes and measures, and includes recent material on Ridge regression with some unexpected applications, for example in econometrics. The first three chapters can be used for a quarter or semester graduate course on inference on stochastic processes. The remaining chapters provide more advanced material on stochastic analysis suitable for graduate seminars and discussions, leading to dissertation or research work. In general, the book will be of interest to researchers in probability theory, mathematical statistics and electrical and information theory.

  18. On the relation of the theoretical foundations of quantum theory and general relativity theory

    International Nuclear Information System (INIS)

    Kober, Martin

    2010-01-01

    The content of the present thesis is presented as follows. First, the most important elements of quantum theory and general relativity theory are reviewed. For general relativity, the mathematical property of diffeomorphism invariance plays the decisive role; for quantum theory, starting from the Copenhagen interpretation, the measurement problem is treated first, before nonlocality is brought into focus as an important property on the basis of concrete phenomena and the mathematical apparatus of the theory. Both theories thus suggest a relational view of the nature of space. This analysis of the theoretical foundations of quantum theory and general relativity in relation to the nature of space gains its full persuasive power only when Kant's philosophy and his analysis of space and time as fundamental forms of perception are included. Von Weizsaecker's quantum theory of ur-alternatives is then presented. Finally, attempts are made to apply the knowledge obtained to the question of a quantum-theoretical formulation of general relativity theory.

  19. Topological Field Theory of Time-Reversal Invariant Insulators

    Energy Technology Data Exchange (ETDEWEB)

    Qi, Xiao-Liang; Hughes, Taylor; Zhang, Shou-Cheng; /Stanford U., Phys. Dept.

    2010-03-19

    We show that the fundamental time reversal invariant (TRI) insulator exists in 4 + 1 dimensions, where the effective field theory is described by the 4 + 1 dimensional Chern-Simons theory and the topological properties of the electronic structure are classified by the second Chern number. These topological properties are the natural generalizations of the time reversal breaking (TRB) quantum Hall insulator in 2 + 1 dimensions. The TRI quantum spin Hall insulator in 2 + 1 dimensions and the topological insulator in 3 + 1 dimensions can be obtained as descendants from the fundamental TRI insulator in 4 + 1 dimensions through a dimensional reduction procedure. The effective topological field theory and the Z₂ topological classification for the TRI insulators in 2 + 1 and 3 + 1 dimensions are naturally obtained from this procedure. All physically measurable topological response functions of the TRI insulators are completely described by the effective topological field theory. Our effective topological field theory predicts a number of novel and measurable phenomena, the most striking of which is the topological magneto-electric effect, where an electric field generates a magnetic field in the same direction, with a universal constant of proportionality quantized in odd multiples of the fine structure constant α = e²/ħc. Finally, we present a general classification of all topological insulators in various dimensions, and describe them in terms of a unified topological Chern-Simons field theory in phase space.
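
    For orientation, a minimal sketch of the 3 + 1 dimensional effective theory referred to above (the standard axion-electrodynamics form in Gaussian units; the details go beyond what the record itself states): the TRI insulator adds to the Maxwell Lagrangian the term

    \mathcal{L}_\theta = \frac{\theta}{2\pi}\,\frac{e^2}{2\pi\hbar c}\,\mathbf{E}\cdot\mathbf{B}, \qquad \theta = \pi \ (\mathrm{mod}\ 2\pi),

    where the value θ = π is protected by time reversal symmetry; this term generates a magnetization proportional to an applied electric field (and a polarization proportional to an applied magnetic field), with the quantized coefficient described in the abstract.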

  20. Nonperturbative theory of weak pre- and post-selected measurements

    Energy Technology Data Exchange (ETDEWEB)

    Kofman, Abraham G., E-mail: kofmana@gmail.com; Ashhab, Sahel; Nori, Franco

    2012-11-01

    This paper starts with a brief review of the topic of strong and weak pre- and post-selected (PPS) quantum measurements, as well as weak values, and afterwards presents original work. In particular, we develop a nonperturbative theory of weak PPS measurements of an arbitrary system with an arbitrary meter, for arbitrary initial states of the system and the meter. New and simple analytical formulas are obtained for the average and the distribution of the meter pointer variable. These formulas hold to all orders in the weak value. In the case of a mixed preselected state, in addition to the standard weak value, an associated weak value is required to describe weak PPS measurements. In the linear regime, the theory provides the generalized Aharonov–Albert–Vaidman formula. Moreover, we reveal two new regimes of weak PPS measurements: the strongly-nonlinear regime and the inverted region (the regime with a very large weak value), where the system-dependent contribution to the pointer deflection decreases with increasing measurement strength. The optimal conditions for weak PPS measurements are obtained in the strongly-nonlinear regime, where the magnitude of the average pointer deflection is equal to or close to the maximum. This maximum is independent of the measurement strength, being typically of the order of the pointer uncertainty. In the optimal regime, the small parameter of the theory is comparable to the overlap of the pre- and post-selected states. We show that the amplification coefficient in weak PPS measurements is generally a product of two qualitatively different factors. The effects of the free system and meter Hamiltonians are discussed. We also estimate the size of the ensemble required for a measurement and identify optimal and efficient meters for weak measurements. Exact solutions are obtained for a certain class of the measured observables. These solutions are used for numerical calculations, the results of which agree with the theory.
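
    For context, a compact sketch of the linear-regime result generalized by this paper (the standard Aharonov-Albert-Vaidman limit; notation ours): for a pre-selected state |ψ⟩ and a post-selected state |φ⟩, the weak value of an observable A is

    A_w = \frac{\langle\phi|\hat{A}|\psi\rangle}{\langle\phi|\psi\rangle},

    and for a weak coupling of strength g to a position pointer the mean pointer deflection is approximately g\,\mathrm{Re}\,A_w, while \mathrm{Im}\,A_w appears as a shift of the conjugate (momentum) pointer variable. The nonperturbative theory described above keeps all orders in A_w rather than stopping at this linear term.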

  1. Nonperturbative theory of weak pre- and post-selected measurements

    International Nuclear Information System (INIS)

    Kofman, Abraham G.; Ashhab, Sahel; Nori, Franco

    2012-01-01

    This paper starts with a brief review of the topic of strong and weak pre- and post-selected (PPS) quantum measurements, as well as weak values, and afterwards presents original work. In particular, we develop a nonperturbative theory of weak PPS measurements of an arbitrary system with an arbitrary meter, for arbitrary initial states of the system and the meter. New and simple analytical formulas are obtained for the average and the distribution of the meter pointer variable. These formulas hold to all orders in the weak value. In the case of a mixed preselected state, in addition to the standard weak value, an associated weak value is required to describe weak PPS measurements. In the linear regime, the theory provides the generalized Aharonov–Albert–Vaidman formula. Moreover, we reveal two new regimes of weak PPS measurements: the strongly-nonlinear regime and the inverted region (the regime with a very large weak value), where the system-dependent contribution to the pointer deflection decreases with increasing measurement strength. The optimal conditions for weak PPS measurements are obtained in the strongly-nonlinear regime, where the magnitude of the average pointer deflection is equal to or close to the maximum. This maximum is independent of the measurement strength, being typically of the order of the pointer uncertainty. In the optimal regime, the small parameter of the theory is comparable to the overlap of the pre- and post-selected states. We show that the amplification coefficient in weak PPS measurements is generally a product of two qualitatively different factors. The effects of the free system and meter Hamiltonians are discussed. We also estimate the size of the ensemble required for a measurement and identify optimal and efficient meters for weak measurements. Exact solutions are obtained for a certain class of the measured observables. These solutions are used for numerical calculations, the results of which agree with the theory.

  2. A Local Approximation of Fundamental Measure Theory Incorporated into Three Dimensional Poisson-Nernst-Planck Equations to Account for Hard Sphere Repulsion Among Ions

    Science.gov (United States)

    Qiao, Yu; Liu, Xuejiao; Chen, Minxin; Lu, Benzhuo

    2016-04-01

    The hard sphere repulsion among ions can be considered in the Poisson-Nernst-Planck (PNP) equations by combining them with the fundamental measure theory (FMT). To reduce the nonlocal computational complexity in 3D simulation of biological systems, a local approximation of FMT is derived, which forms a local hard sphere PNP (LHSPNP) model. In the derivation, the excess chemical potential from hard sphere repulsion is obtained with the FMT and has six integration components. For the integrands and weighted densities in each component, Taylor expansions are performed and the lowest order approximations are taken, which result in the final local hard sphere (LHS) excess chemical potential with four components. By plugging the LHS excess chemical potential into the ionic flux expression in the Nernst-Planck equation, the three dimensional LHSPNP is obtained. Interestingly, it is found that the essential part of the free energy term of the previous size modified model (Borukhov et al. in Phys Rev Lett 79:435-438, 1997; Kilic et al. in Phys Rev E 75:021502, 2007; Lu and Zhou in Biophys J 100:2475-2485, 2011; Liu and Eisenberg in J Chem Phys 141:22D532, 2014) has a very similar form to one term of the LHS model, but LHSPNP has additional terms accounting for size effects. The equation of state for a one component homogeneous fluid is studied for the local hard sphere approximation of FMT and is proved to be exact for the first two virial coefficients, while the previous size modified model only presents the first virial coefficient accurately. To investigate the effects of the LHS model and the competition among different counterion species, numerical experiments are performed for the traditional PNP model, the LHSPNP model, the previous size modified PNP (SMPNP) model and the Monte Carlo simulation. It is observed that in steady state the LHSPNP results are quite different from the PNP results, but are close to the SMPNP results under a wide range of boundary conditions. Besides, in both
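
    For readers unfamiliar with the framework, the general structure assumed here is the standard one (the specific four-component LHS excess chemical potential of the paper is not reproduced): the Nernst-Planck flux of ionic species i is

    \mathbf{J}_i = -D_i\left[\nabla c_i + \frac{c_i}{k_B T}\left(q_i\nabla\phi + \nabla\mu_i^{\mathrm{ex}}\right)\right], \qquad \partial_t c_i = -\nabla\cdot\mathbf{J}_i,

    coupled to the Poisson equation -\nabla\cdot(\epsilon\nabla\phi) = \sum_i q_i c_i + \rho_f. The LHSPNP model of the record enters through the hard-sphere contribution to \mu_i^{\mathrm{ex}}, obtained by localizing the FMT functional.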

  3. Fundamental Fields of Post-Schumpeterian Evolutionary Economics

    DEFF Research Database (Denmark)

    Andersen, Esben Sloth

    2014-01-01

    economic evolution as a process of the innovative renewal of business routines. He also explored the idea that the development of economics requires coordinated efforts within the “fundamental fields” of theory, history, statistics, and economic sociology. The paper applies this idea in an analysis...

  4. Fundamental fields of post-Schumpeterian evolutionary economics

    DEFF Research Database (Denmark)

    Andersen, Esben Sloth

    economic evolution as a process of the innovative renewal of business routines. He also explored the idea that the development of economics requires coordinated efforts within the “fundamental fields” of theory, history, statistics, and economic sociology. The paper applies this idea in an analysis...

  5. Some aspects of fundamental symmetries and interactions

    NARCIS (Netherlands)

    Jungmann, KP; Grzonka, D; Czyzykiewicz, R; Oelert, W; Rozek, T; Winter, P

    2005-01-01

    The known fundamental symmetries and interactions are well described by the Standard Model. Features of this powerful theory, which are described but not explained at a deeper level, are addressed in a variety of speculative models. Experimental tests of the predictions in such approaches can be either through

  6. Field theories with multiple fermionic excitations

    International Nuclear Information System (INIS)

    Crawford, J.P.

    1978-01-01

    The reason for the existence of the muon has been an enigma since its discovery. Since that time there has been a continuing proliferation of elementary particles. It is proposed that this proliferation of leptons and quarks is comprehensible if there are only four fundamental particles, the leptons ν_e and e⁻, and the quarks u and d. All other leptons and quarks are imagined to be excited states of these four fundamental entities. Attention is restricted to the charged leptons and the electromagnetic interactions only. A detailed study of a field theory in which there is only one fundamental charged fermionic field having two (or more) excitations is made. When the electromagnetic interactions are introduced and the theory is second quantized, under certain conditions this theory reproduces the S matrix obtained from usual QED. In this case no electromagnetic transitions are allowed. A leptonic charge operator is defined and a superselection rule for this leptonic charge is found. Unfortunately, the mass spectrum cannot be obtained. This theory has many renormalizable generalizations including non-abelian gauge theories, Yukawa-type theories, and Fermi-type theories. Under certain circumstances the Yukawa- and Fermi-type theories are finite in perturbation theory. It is concluded that there are no fundamental objections to having fermionic fields with more than one excitation

  7. On the unification of all fundamental forces in a fundamentally fuzzy Cantorian ε(∞) manifold and high energy particle physics

    International Nuclear Information System (INIS)

    Marek-Crnjac, L.

    2004-01-01

    Quantum space time as given by the topology and geometry of El Naschie's ε(∞) theory must be regarded as fundamentally fuzzy. Its geometry and topology belong to the mathematical category of fuzzy logic and fuzzy set theory. All lines are fuzzy fractal lines in fuzzy spaces and all exact values are exact fuzzy expectation values. In this way we remove many paradoxes and contradictions in the standard model of high energy particle physics

  8. Pedagogical Review of Quantum Measurement Theory with an Emphasis on Weak Measurements

    Directory of Open Access Journals (Sweden)

    Bengt E. Y. Svensson

    2013-05-01

    Full Text Available The quantum theory of measurement has been with us since quantum mechanics was invented. It has recently been invigorated, partly due to the increasing interest in quantum information science. In this partly pedagogical review I attempt to give a self-contained overview of non-relativistic quantum theory of measurement expressed in density matrix formalism. I will not dwell on the applications in quantum information theory; it is well covered by several books in that field. The focus is instead on applications to the theory of weak measurement, as developed by Aharonov and collaborators. Their development of weak measurement combined with what they call post-selection - judiciously choosing not only the initial state of a system (pre-selection) but also its final state - has received much attention recently. Not least, it has opened up new, fruitful experimental vistas, like novel approaches to amplification. But the approach has also attached to it some air of mystery. I will attempt to demystify it by showing that (almost) all results can be derived in a straightforward way from conventional quantum mechanics. Among other things, I develop the formalism not only to first order but also to second order in the weak interaction responsible for the measurement. I apply it to the so-called Leggett-Garg inequalities, also known as Bell inequalities in time. I also give an outline, even if rough, of some of the ingenious experiments that the work by Aharonov and collaborators has inspired. As an application of weak measurement, not related to the approach by Aharonov and collaborators, the formalism also allows me to derive the master equation for the density matrix of an open system in interaction with an environment. An issue that remains in the weak measurement plus post-selection approach is the interpretation of the so-called weak value of an observable. Is it a bona fide property of the system considered? I have no definite answer to this
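
    For reference, one common form of the Leggett-Garg (Bell-in-time) inequality mentioned in the review is the three-time version for a dichotomic observable Q = ±1 (the review itself may use a different variant):

    K_3 = C_{21} + C_{32} - C_{31} \le 1, \qquad C_{ij} = \langle Q(t_i)\,Q(t_j)\rangle,

    which holds for macrorealistic theories with non-invasive measurability and can be violated by quantum systems.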

  9. Theoretical physics. Field theory

    International Nuclear Information System (INIS)

    Landau, L.; Lifchitz, E.

    2004-01-01

    This book is the fifth French edition of the famous course written by Landau and Lifchitz and devoted to both the theory of electromagnetic fields and the theory of gravity. The treatment of the theory of electromagnetic fields is based on special relativity and covers only electrodynamics in vacuum and that of point-like electric charges. On the basis of the fundamental notions of the principle of relativity and of relativistic mechanics, and by using variational principles, the authors develop the fundamental equations of the electromagnetic field, the wave equation and the processes of emission and propagation of light. The theory of gravitational fields, i.e. the general theory of relativity, is exposed in the last five chapters. The fundamentals of tensor calculus and all that is related to it are progressively introduced just when needed (electromagnetic field tensor, energy-momentum tensor, curvature tensor...). The worldwide reputation of this book is generally attributed to the clarity, simplicity and rigorous logic of its demonstrations. (A.C.)

  10. Theory of precision electroweak measurements

    International Nuclear Information System (INIS)

    Peskin, M.E.

    1990-03-01

    In these lectures, I will review the theoretical concepts needed to understand the goals and implications of experiments in this new era of weak interactions. I will explain how to compute the most important order-α radiative corrections to weak interaction processes and discuss the physical implications of these correction terms. I hope that this discussion will be useful to those --- experimentalists and theorists --- who will try to interpret the new data that we will soon receive. This paper is organized as follows: I will review the structure of the standard weak interaction model at zeroth order. I will discuss the measurement of the Z⁰ boson mass in e⁺e⁻ annihilation. This measurement is affected by radiative corrections to the form of the Z⁰ resonance, and so I will review the theory of the resonance line shape. I will briefly review the modifications of the properties of the Z⁰ which would be produced by additional neutral gauge bosons. I will review the theory of the renormalization of weak interaction parameters such as sin²θ_W, concentrating especially on the contributions of the top quark and other heavy, undiscovered particles
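
    As a concrete illustration of the heavy-particle contributions mentioned above (a standard textbook result for the leading one-loop top quark effect, not a quotation from the lecture notes), the dominant correction to the ρ parameter grows quadratically with the top mass,

    \Delta\rho \simeq \frac{3\, G_F\, m_t^2}{8\sqrt{2}\,\pi^2},

    which feeds into the relation between sin²θ_W, the W and Z masses and G_F, and is one reason why precision electroweak data were sensitive to the top quark before its direct discovery.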

  11. The finite element method its basis and fundamentals

    CERN Document Server

    Zienkiewicz, Olek C; Zhu, JZ

    2013-01-01

    The Finite Element Method: Its Basis and Fundamentals offers a complete introduction to the basis of the finite element method, covering fundamental theory and worked examples in the detail required for readers to apply the knowledge to their own engineering problems and understand more advanced applications. This edition sees a significant rearrangement of the book's content to enable clearer development of the finite element method, with major new chapters and sections added to cover: Weak forms Variational forms Multi-dimensional field prob

  12. Fundamentals of machine theory and mechanisms

    CERN Document Server

    Simón Mata, Antonio; Cabrera Carrillo, Juan Antonio; Ezquerro Juanco, Francisco; Guerra Fernández, Antonio Jesús; Nadal Martínez, Fernando; Ortiz Fernández, Antonio

    2016-01-01

    This book covers the basic contents for an introductory course in Mechanism and Machine Theory. The topics dealt with are: kinematic and dynamic analysis of machines, introduction to vibratory behaviour, rotor and piston balance, kinematics of gears, ordinary and planetary gear trains and synthesis of planar mechanisms. A new approach to dimensional synthesis of mechanisms based on turning functions has been added for closed and open path generation using an optimization method based on evolutionary algorithms. The text, developed by a group of experts in kinematics and dynamics of mechanisms at the University of Málaga, Spain, is clear and is supported by more than 350 images. More than 60 outlined and explained problems have been included to clarify the theoretical concepts.

  13. Geometric Measure Theory and Minimal Surfaces

    CERN Document Server

    Bombieri, Enrico

    2011-01-01

    W.K. ALLARD: On the first variation of area and generalized mean curvature.- F.J. ALMGREN Jr.: Geometric measure theory and elliptic variational problems.- E. GIUSTI: Minimal surfaces with obstacles.- J. GUCKENHEIMER: Singularities in soap-bubble-like and soap-film-like surfaces.- D. KINDERLEHRER: The analyticity of the coincidence set in variational inequalities.- M. MIRANDA: Boundaries of Caccioppoli sets in the calculus of variations.- L. PICCININI: De Giorgi's measure and thin obstacles.

  14. Measurements of Fundamental Fluid Physics of SNF Storage Canisters

    Energy Technology Data Exchange (ETDEWEB)

    Condie, Keith Glenn; Mc Creery, Glenn Ernest; McEligot, Donald Marinus

    2001-09-01

    With the University of Idaho, Ohio State University and Clarksean Associates, this research program has the long-term goal to develop reliable predictive techniques for the energy, mass and momentum transfer plus chemical reactions in drying / passivation (surface oxidation) operations in the transfer and storage of spent nuclear fuel (SNF) from wet to dry storage. Such techniques are needed to assist in design of future transfer and storage systems, prediction of the performance of existing and proposed systems and safety (re)evaluation of systems as necessary at later dates. Many fuel element geometries and configurations are accommodated in the storage of spent nuclear fuel. Consequently, there is no one generic fuel element / assembly, storage basket or canister and, therefore, no single generic fuel storage configuration. One can, however, identify generic flow phenomena or processes which may be present during drying or passivation in SNF canisters. The objective of the INEEL tasks was to obtain fundamental measurements of these flow processes in appropriate parameter ranges.

  15. String theory or field theory?

    International Nuclear Information System (INIS)

    Marshakov, A.V.

    2002-01-01

    The status of string theory is reviewed, and major recent developments - especially those in going beyond perturbation theory in the string theory and quantum field theory frameworks - are analyzed. This analysis helps better understand the role and place of string theory in the modern picture of the physical world. Even though quantum field theory describes a wide range of experimental phenomena, it is emphasized that there are some insurmountable problems inherent in it - notably the impossibility to formulate the quantum theory of gravity on its basis - which prevent it from being a fundamental physical theory of the world of microscopic distances. It is this task, the creation of such a theory, which string theory, currently far from completion, is expected to solve. In spite of its somewhat vague current form, string theory has already led to a number of serious results and greatly contributed to progress in the understanding of quantum field theory. It is these developments which are our concern in this review [ru

  16. Quantum field theory III. Gauge theory. A bridge between mathematicians and physicists

    Energy Technology Data Exchange (ETDEWEB)

    Zeidler, Eberhard [Max Planck Institute for Mathematics in the Sciences, Leipzig (Germany)

    2011-07-01

    In this third volume of his modern introduction to quantum field theory, Eberhard Zeidler examines the mathematical and physical aspects of gauge theory as a principal tool for describing the four fundamental forces which act in the universe: gravitation, electromagnetism, the weak interaction and the strong interaction. Volume III concentrates on the classical aspects of gauge theory, describing the four fundamental forces by the curvature of appropriate fiber bundles. This must be supplemented by the crucial, but elusive, quantization procedure. The book is arranged in four sections, devoted to realizing the universal principle that force equals curvature: Part I: The Euclidean Manifold as a Paradigm; Part II: Ariadne's Thread in Gauge Theory; Part III: Einstein's Theory of Special Relativity; Part IV: Ariadne's Thread in Cohomology. For students of mathematics the book is designed to demonstrate that detailed knowledge of the physical background helps to reveal interesting interrelationships among diverse mathematical topics. Physics students will be exposed to a fairly advanced mathematics, beyond the level covered in the typical physics curriculum. Quantum Field Theory builds a bridge between mathematicians and physicists, based on challenging questions about the fundamental forces in the universe (macrocosmos) and in the world of elementary particles (microcosmos). (orig.)

  17. Fundamental course of measuring. II. The electrical measuring of non-electrical parameters. Grundkurs der Messtechnik. T. 2. Das elektrische Messen nichtelektrischer Groessen

    Energy Technology Data Exchange (ETDEWEB)

    Merz, L [Technische Univ. Muenchen (F.R. Germany). Lehrstuhl und Lab. fuer Steuerungs- und Regelungstechnik

    1975-01-01

    The fundamental course on the electrical measuring of non-electrical parameters aims to present current knowledge of the basic measuring methods in simple language and illustrative form. The present part II deals especially with measuring methods in heat and process engineering in the industrial field. Following the introduction in part A, the techniques of electrical probes are mainly described, and it is shown which mechanical probes cannot yet be replaced by electrical ones. Part C describes the techniques of measuring transducers.

  18. Energy and Entropy as the Fundaments of Theoretical Physics

    Directory of Open Access Journals (Sweden)

    Pharis E. Williams

    2002-05-01

    Full Text Available Einstein's article titled, "The Fundaments of Theoretical Physics", from Science, Washington, D.C., May 24, 1940, is presented in its entirety as it is an outstanding presentation of the history and status of the foundations of theoretical physics as it stood in 1940. Further, it provides the background for discussing the new view of the fundaments of theoretical physics provided by the energy and entropy foundation of the Dynamic Theory.

  19. Higher spin gauge theories

    CERN Document Server

    Henneaux, Marc; Vasiliev, Mikhail A

    2017-01-01

    Symmetries play a fundamental role in physics. Non-Abelian gauge symmetries are the symmetries behind theories for massless spin-1 particles, while the reparametrization symmetry is behind Einstein's gravity theory for massless spin-2 particles. In supersymmetric theories these particles can also be connected to massless fermionic particles. Does Nature stop at spin-2, or can there also be massless higher spin theories? In the past, strong indications have been given that such theories do not exist. However, in recent times ways to evade those constraints have been found and higher spin gauge theories have been constructed. With the advent of the AdS/CFT duality correspondence even stronger indications have been given that higher spin gauge theories play an important role in fundamental physics. All these issues were discussed at an international workshop in Singapore in November 2015 where the leading scientists in the field participated. This volume presents an up-to-date, detailed overview of the theories i...

  20. Loose ends of the theory of everything

    International Nuclear Information System (INIS)

    Linden, Noah

    1990-01-01

    This article examines to what extent string theory has achieved its objective of being a 'theory of everything' and unifying the four fundamental forces of nature, gravity, the strong and weak nuclear forces and electromagnetism. String theory uses one-dimensional strings, rather than points, as the fundamental objects. String theory, unlike previous models, provides a quantum theory of gravitation which has a meaningful perturbative expansion. However our present understanding of string theory does not match up with our observed spectrum of particles, nor answer questions about spacetime at the Planck scale. (UK)

  1. Fundamentals of attosecond optics

    CERN Document Server

    Chang, Zenghu

    2011-01-01

    Attosecond optical pulse generation, along with the related process of high-order harmonic generation, is redefining ultrafast physics and chemistry. A practical understanding of attosecond optics requires significant background information and foundational theory to make full use of these cutting-edge lasers and advance the technology toward the next generation of ultrafast lasers. Fundamentals of Attosecond Optics provides the first focused introduction to the field. The author presents the underlying concepts and techniques required to enter the field, as well as recent research advances th

  2. Fundamental quadratic variational principle underlying general relativity

    International Nuclear Information System (INIS)

    Atkins, W.K.

    1983-01-01

    The fundamental result of Lanczos is used in a new type of quadratic variational principle whose field equations are the Einstein field equations together with the Yang-Mills type equations for the Riemann curvature. Additionally, a spin-2 theory of gravity for the special case of the Einstein vacuum is discussed

  3. Laser Resonators and Beam Propagation Fundamentals, Advanced Concepts and Applications

    CERN Document Server

    Hodgson, Norman

    2005-01-01

    Optical Resonators provides a detailed discussion of the properties of optical resonators for lasers, from basic theory to recent research. In addition to describing the fundamental theories of resonators, such as geometrical optics, diffraction, and polarisation, the characteristics of all important resonator schemes and their calculation are presented. Experimental examples, practical problems and a collection of measurement techniques support the comprehensive treatment of the subject. Optical Resonators is the only book currently available that provides a comprehensive overview of the subject. Combined with the structure of the text and the autonomous nature of the chapters this work will be as suitable for those new to the field as it will be invaluable to specialists conducting research. This second edition has been enlarged by new sections on Q-switching and resonators with internal phase/amplitude control. In addition, the whole book has been brought up-to-date.

  4. Strings and fundamental physics

    International Nuclear Information System (INIS)

    Baumgartl, Marco; Brunner, Ilka; Haack, Michael

    2012-01-01

    The basic idea, simple and revolutionary at the same time, to replace the concept of a point particle with a one-dimensional string, has opened up a whole new field of research. Even today, four decades later, its multifaceted consequences are still not fully conceivable. Up to now string theory has offered a new way to view particles as different excitations of the same fundamental object. It has celebrated success in discovering the graviton in its spectrum, and it has naturally led scientists to posit space-times with more than four dimensions - which in turn has triggered numerous interesting developments in fields as varied as condensed matter physics and pure mathematics. This book collects pedagogical lectures by leading experts in string theory, introducing the non-specialist reader to some of the newest developments in the field. The carefully selected topics are at the cutting edge of research in string theory and include new developments in topological strings, AdS/CFT dualities, as well as newly emerging subfields such as doubled field theory and holography in the hydrodynamic regime. The contributions to this book have been selected and arranged in such a way as to form a self-contained, graduate level textbook. (orig.)

  5. Strings and fundamental physics

    Energy Technology Data Exchange (ETDEWEB)

    Baumgartl, Marco [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Brunner, Ilka; Haack, Michael (eds.) [Muenchen Univ. (Germany). Fakultaet fuer Physik

    2012-07-01

    The basic idea, simple and revolutionary at the same time, to replace the concept of a point particle with a one-dimensional string, has opened up a whole new field of research. Even today, four decades later, its multifaceted consequences are still not fully conceivable. Up to now string theory has offered a new way to view particles as different excitations of the same fundamental object. It has celebrated success in discovering the graviton in its spectrum, and it has naturally led scientists to posit space-times with more than four dimensions - which in turn has triggered numerous interesting developments in fields as varied as condensed matter physics and pure mathematics. This book collects pedagogical lectures by leading experts in string theory, introducing the non-specialist reader to some of the newest developments in the field. The carefully selected topics are at the cutting edge of research in string theory and include new developments in topological strings, AdS/CFT dualities, as well as newly emerging subfields such as doubled field theory and holography in the hydrodynamic regime. The contributions to this book have been selected and arranged in such a way as to form a self-contained, graduate level textbook. (orig.)

  6. Educational measurement for applied researchers theory into practice

    CERN Document Server

    Wu, Margaret; Jen, Tsung-Hau

    2016-01-01

    This book is a valuable read for a diverse group of researchers and practitioners who analyze assessment data and construct test instruments. It focuses on the use of classical test theory (CTT) and item response theory (IRT), which are often required in the fields of psychology (e.g. for measuring psychological traits), health (e.g. for measuring the severity of disorders), and education (e.g. for measuring student performance), and makes these analytical tools accessible to a broader audience. Having taught assessment subjects to students from diverse backgrounds for a number of years, the three authors have a wealth of experience in presenting educational measurement topics, in-depth concepts and applications in an accessible format. As such, the book addresses the needs of readers who use CTT and IRT in their work but do not necessarily have an extensive mathematical background. The book also sheds light on common misconceptions in applying measurement models, and presents an integrated approach to differ...
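
    As a minimal illustration of the IRT models treated in such a text (the two-parameter logistic model; notation ours, not quoted from the book), the probability that person j answers item i correctly is

    P(X_{ij}=1\mid\theta_j) = \frac{1}{1+\exp\left[-a_i(\theta_j-b_i)\right]},

    where θ_j is the latent trait, b_i the item difficulty and a_i the item discrimination; setting a_i = 1 for all items gives the Rasch model.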

  7. Theory of thermal stresses

    CERN Document Server

    Boley, Bruno A

    1997-01-01

    Highly regarded text presents detailed discussion of fundamental aspects of theory, background, problems with detailed solutions. Basics of thermoelasticity, heat transfer theory, thermal stress analysis, more. 1985 edition.

  8. 167th International School of Physics "Enrico Fermi" : Strangeness and Spin in Fundamental Physics

    CERN Document Server

    Bressani, T; Feliciello, A; Ratcliffe, Ph G

    2008-01-01

    Strangeness and Spin in Fundamental Physics is dedicated to the discussion of the role played by two subtle and somewhat puzzling quantum numbers, strangeness and spin, in fundamental physics. They both relate to basic properties of the fundamental quantum field theories describing strong and electro-weak interactions and to their phenomenological applications. In some instances, like the partonic spin structure of the proton, they are deeply correlated. The many puzzling results recently obtained by measuring several spin asymmetries have stimulated enormous progress in the study of the spin structure of protons and neutrons. Intense theoretical activity has uncovered new features of non-perturbative QCD, like strong correlations between the spin and the intrinsic motions of quarks inside the nucleons. The purpose of this publication is that of providing a complete, updated and critical account of the most recent and relevant discoveries in the above fields, both from the experimental and theoretic...

  9. Multiphase flow dynamics 1 fundamentals

    CERN Document Server

    Kolev, Nikolay Ivanov

    2004-01-01

    Multi-phase flows are part of our natural environment such as tornadoes, typhoons, air and water pollution and volcanic activities as well as part of industrial technology such as power plants, combustion engines, propulsion systems, or chemical and biological industry. The industrial use of multi-phase systems requires analytical and numerical strategies for predicting their behavior. In its third extended edition this monograph contains theory, methods and practical experience for describing complex transient multi-phase processes in arbitrary geometrical configurations, providing a systematic presentation of the theory and practice of numerical multi-phase fluid dynamics. In the present first volume the fundamentals of multiphase dynamics are provided. This third edition includes various updates, extensions and improvements in all book chapters.

  10. Theory of calorimetry

    CERN Document Server

    Zielenkiewicz, Wojciech

    2004-01-01

    The purpose of this book is to give a comprehensive description of the theoretical fundamentals of calorimetry. The considerations are based on the relations deduced from the laws and general equations of heat exchange theory and steering theory.

  11. Fundamental concepts of mathematics

    CERN Document Server

    Goodstein, R L

    Fundamental Concepts of Mathematics, 2nd Edition provides an account of some basic concepts in modern mathematics. The book is primarily intended for mathematics teachers and lay people who want to improve their skills in mathematics. The concepts and problems presented in the book include the determination of which integral polynomials have integral solutions; sentence logic and informal set theory; and why four colors are enough to color a map. Unlike the first edition, the second edition provides detailed solutions to exercises contained in the text. Mathematics teachers and people

  12. Electron spectroscopy in the fundamental process of electron-nucleus bremsstrahlung

    International Nuclear Information System (INIS)

    Hillenbrand, Pierre-Michel

    2013-07-01

    Within the scope of this thesis the fundamental process of electron-nucleus bremsstrahlung was studied in inverse kinematics at the Experimental Storage Ring ESR at GSI. For the system U⁸⁸⁺ + N₂ at 90 MeV/u it was shown that, by using inverse kinematics, coincidence measurements between the scattered electron and the emitted photon can be performed for the case in which the incoming electron transfers almost all of its kinetic energy onto the emitted photon. The sensitivity to the fundamental process could be achieved by measuring triple differential cross sections as a function of the emission angle of the photon and the scattered electron as well as the energy of the scattered electron. The optics of the magnetic electron spectrometer used were thoroughly revised and optimized to the experimental requirements. Analyzing different coincidences in this collision system, it was possible to determine the contributions to the electron distribution arising from radiative electron capture to the projectile continuum, nonradiative electron capture to the projectile continuum, and electron loss to the projectile continuum. The experimental results of each of these processes were compared to theoretical calculations. The electron spectra for the radiative and the nonradiative electron capture to continuum clearly reproduce the opposite asymmetry predicted by theory. Furthermore, electron spectra for collisions of U²⁸⁺ with different gases were measured.

  13. String theory or field theory?

    International Nuclear Information System (INIS)

    Marshakov, Andrei V

    2002-01-01

    The status of string theory is reviewed, and major recent developments - especially those in going beyond perturbation theory in the string theory and quantum field theory frameworks - are analyzed. This analysis helps better understand the role and place of string theory in the modern picture of the physical world. Even though quantum field theory describes a wide range of experimental phenomena, it is emphasized that there are some insurmountable problems inherent in it - notably the impossibility to formulate the quantum theory of gravity on its basis - which prevent it from being a fundamental physical theory of the world of microscopic distances. It is this task, the creation of such a theory, which string theory, currently far from completion, is expected to solve. In spite of its somewhat vague current form, string theory has already led to a number of serious results and greatly contributed to progress in the understanding of quantum field theory. It is these developments which are our concern in this review. (reviews of topical problems)

  14. Fundamentals of estuarine physical oceanography

    CERN Document Server

    Bruner de Miranda, Luiz; Kjerfve, Björn; Castro Filho, Belmiro Mendes de

    2017-01-01

    This book provides an introduction to the complex system functions, variability and human interference in ecosystem between the continent and the ocean. It focuses on circulation, transport and mixing of estuarine and coastal water masses, which is ultimately related to an understanding of the hydrographic and hydrodynamic characteristics (salinity, temperature, density and circulation), mixing processes (advection and diffusion), transport timescales such as the residence time and the exposure time. In the area of physical oceanography, experiments using these water bodies as a natural laboratory and interpreting their circulation and mixing processes using theoretical and semi-theoretical knowledge are of fundamental importance. Small-scale physical models may also be used together with analytical and numerical models. The book highlights the fact that research and theory are interactive, and the results provide the fundamentals for the development of the estuarine research.

  15. A systems approach to theoretical fluid mechanics: Fundamentals

    Science.gov (United States)

    Anyiwo, J. C.

    1978-01-01

    A preliminary application of the underlying principles of the investigator's general system theory to the description and analysis of the fluid flow system is presented. An attempt is made to establish practical models, or elements, of the general fluid flow system from the point of view of the fundamental principles of general system theory. Results obtained are applied to a simple experimental fluid flow system, as a test case, with particular emphasis on the understanding of fluid flow instability, transition and turbulence.

  16. Fundamental limits of measurement in telecommunications: Experimental and modeling studies in a test optical network on proposal for the reform of telecommunication quantitations

    International Nuclear Information System (INIS)

    Egan, James; McMillan, Normal; Denieffe, David

    2011-01-01

    Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the area of chemical metrology for the field of telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits of measurement. These are the critical level, the detection limit and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures and discussion is given of the advantages of using these fundamental quantitations over existing methods.
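
    For orientation, the three Currie limits referred to above are commonly written (assuming 5% false-positive and false-negative rates and a net-signal standard deviation σ₀ in the absence of analyte; the record does not spell out these factors) as

    L_C \approx 1.645\,\sigma_0, \qquad L_D \approx 3.29\,\sigma_0, \qquad L_Q \approx 10\,\sigma_0,

    that is, the critical level above which a response is deemed detected, the detection limit a true signal must exceed to be detected reliably, and the determination (quantification) limit at which the relative uncertainty becomes acceptably small.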

  17. Fundamental limits of measurement in telecommunications: Experimental and modeling studies in a test optical network on proposal for the reform of telecommunication quantitations

    Energy Technology Data Exchange (ETDEWEB)

    Egan, James; McMillan, Normal; Denieffe, David, E-mail: eganj@itcarlow.ie [IT Carlow (Ireland)

    2011-08-17

    Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the area of chemical metrology for the field of telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits of measurement. These are the critical level, the detection limit and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures and discussion is given of the advantages of using these fundamental quantitations over existing methods.

  18. An introduction to single-user information theory

    CERN Document Server

    Alajaji, Fady

    2018-01-01

    This book presents a succinct and mathematically rigorous treatment of the main pillars of Shannon’s information theory, discussing the fundamental concepts and indispensable results of Shannon’s mathematical theory of communications. It includes five meticulously written core chapters (with accompanying problems), emphasizing the key topics of information measures; lossless and lossy data compression; channel coding; and joint source-channel coding for single-user (point-to-point) communications systems. It also features two appendices covering necessary background material in real analysis and in probability theory and stochastic processes. The book is ideal for a one-semester foundational course on information theory for senior undergraduate and entry-level graduate students in mathematics, statistics, engineering, and computing and information sciences. A comprehensive instructor’s solutions manual is available.
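
    The central quantities of the chapters on information measures and channel coding are the standard Shannon ones; as a brief reminder (not a quotation from the book),

    H(X) = -\sum_{x} p(x)\log p(x), \qquad C = \max_{p(x)} I(X;Y),

    where H(X) is the entropy of a discrete source and C the capacity of a channel, the largest mutual information between input and output over all input distributions.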

  19. Infrared fixed point of SU(2) gauge theory with six flavors

    Science.gov (United States)

    Leino, Viljami; Rummukainen, Kari; Suorsa, Joni; Tuominen, Kimmo; Tähtinen, Sara

    2018-06-01

    We compute the running of the coupling in SU(2) gauge theory with six fermions in the fundamental representation of the gauge group. We find strong evidence that this theory has an infrared stable fixed point at strong coupling and measure also the anomalous dimension of the fermion mass operator at the fixed point. This theory therefore likely lies close to the boundary of the conformal window and will display novel infrared dynamics if coupled with the electroweak sector of the Standard Model.
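
    A rough perturbative sketch of why a fixed point is plausible here (two-loop running with standard group-theory factors; the study in the record is fully nonperturbative): with

    \beta(g) = -\frac{g^3}{16\pi^2}\left(\beta_0 + \beta_1\,\frac{g^2}{16\pi^2}\right), \qquad \beta_0 = \tfrac{11}{3}C_A - \tfrac{4}{3}T_F N_f, \qquad \beta_1 = \tfrac{34}{3}C_A^2 - \left(\tfrac{20}{3}C_A + 4 C_F\right)T_F N_f,

    SU(2) with N_f = 6 fundamental Dirac flavors (C_A = 2, C_F = 3/4, T_F = 1/2) gives β₀ = 10/3 > 0 and β₁ = -11/3 < 0, so the two-loop beta function vanishes at g_*²/(16π²) = 10/11, i.e. at strong coupling, consistent with the nonperturbative fixed point reported above.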

  20. M-theory and Dualities

    International Nuclear Information System (INIS)

    Paulot, Louis

    2003-01-01

    In their search for a unified theory of fundamental interactions, including quantum gravity, physicists introduced superstring theories. In addition to the fundamental strings, these theories contain extended objects of diverse dimensions, which are mapped into one another by U-duality groups. There is also a conjectured mother theory, called 'M-theory', which would give eleven-dimensional supergravity in the low energy limit. In this work, we show that one can construct from del Pezzo surfaces generalized Kac-Moody super-algebras which extend the U-duality groups. These super-algebras give the bosonic field content of the dimensional reductions of M-theory. We recover the field equations of motion as a self-duality condition, related to a symmetry of the Picard lattice of the corresponding del Pezzo surface. This makes it possible to explain the symmetry of the 'magic triangle' of Cremmer, Julia, Lue and Pope. (author) [fr

  1. Unitary representations of the fundamental group of orbifolds

    Indian Academy of Sciences (India)

    … in Theorem 1.2 are topological, taking values in rational cohomology … this is the fundamental group defined using Galois theory of covering stacks of Y … natural action of G := Z/mZ on T given by the action of G_m on L; by the choice of the …

  2. Astrophysical probes of fundamental physics

    International Nuclear Information System (INIS)

    Martins, C.J.A.P.

    2009-01-01

    I review the motivation for varying fundamental couplings and discuss how these measurements can be used to constrain fundamental physics scenarios that would otherwise be inaccessible to experiment. I highlight the current controversial evidence for varying couplings and present some new results. Finally I focus on the relation between varying couplings and dark energy, and explain how varying coupling measurements might be used to probe the nature of dark energy, with some advantages over standard methods. In particular I discuss what can be achieved with future spectrographs such as ESPRESSO and CODEX.

  3. Astrophysical probes of fundamental physics

    Energy Technology Data Exchange (ETDEWEB)

    Martins, C.J.A.P. [Centro de Astrofisica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal); DAMTP, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA (United Kingdom)

    2009-10-15

    I review the motivation for varying fundamental couplings and discuss how these measurements can be used to constrain fundamental physics scenarios that would otherwise be inaccessible to experiment. I highlight the current controversial evidence for varying couplings and present some new results. Finally I focus on the relation between varying couplings and dark energy, and explain how varying coupling measurements might be used to probe the nature of dark energy, with some advantages over standard methods. In particular I discuss what can be achieved with future spectrographs such as ESPRESSO and CODEX.

  4. Constructivist Grounded Theory?

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser, PhD, Hon. PhD

    2012-06-01

    Full Text Available I refer to and use as scholarly inspiration Charmaz’s excellent article on constructivist grounded theory as a tool for getting to the fundamental issues of why grounded theory is not constructivist. I show that constructivist data, if it exists at all, is a very, very small part of the data that grounded theory uses.

  5. Understanding band gaps of solids in generalized Kohn-Sham theory.

    Science.gov (United States)

    Perdew, John P; Yang, Weitao; Burke, Kieron; Yang, Zenghui; Gross, Eberhard K U; Scheffler, Matthias; Scuseria, Gustavo E; Henderson, Thomas M; Zhang, Igor Ying; Ruzsinszky, Adrienn; Peng, Haowei; Sun, Jianwei; Trushin, Egor; Görling, Andreas

    2017-03-14

    The fundamental energy gap of a periodic solid distinguishes insulators from metals and characterizes low-energy single-electron excitations. However, the gap in the band structure of the exact multiplicative Kohn-Sham (KS) potential substantially underestimates the fundamental gap, a major limitation of KS density-functional theory. Here, we give a simple proof of a theorem: In generalized KS theory (GKS), the band gap of an extended system equals the fundamental gap for the approximate functional if the GKS potential operator is continuous and the density change is delocalized when an electron or hole is added. Our theorem explains how GKS band gaps from metageneralized gradient approximations (meta-GGAs) and hybrid functionals can be more realistic than those from GGAs or even from the exact KS potential. The theorem also follows from earlier work. The band edges in the GKS one-electron spectrum are also related to measurable energies. A linear chain of hydrogen molecules, solid aluminum arsenide, and solid argon provide numerical illustrations.

  6. The measurement theory of radioactivity in building materials

    International Nuclear Information System (INIS)

    Qu Jinhui; Wang Renbo; Zhang Xiongjie; Tan Hai; Zhu Zhipu; Man Zaigang

    2010-01-01

    Radioactivity in building materials is the main source of the natural radiation dose received by individuals, and it has caused serious concern across all sectors of society. The paper comprehensively introduces the measurement theory of radioactivity in building materials, along with the measurement principle of natural radioactivity, the design of the shielding facility, the choice of measurement time, sample preparation, and spectrum analysis. (authors)

  7. Ergodic theory and dynamical systems

    CERN Document Server

    Coudène, Yves

    2016-01-01

    This textbook is a self-contained and easy-to-read introduction to ergodic theory and the theory of dynamical systems, with a particular emphasis on chaotic dynamics. This book contains a broad selection of topics and explores the fundamental ideas of the subject. Starting with basic notions such as ergodicity, mixing, and isomorphisms of dynamical systems, the book then focuses on several chaotic transformations with hyperbolic dynamics, before moving on to topics such as entropy, information theory, ergodic decomposition and measurable partitions. Detailed explanations are accompanied by numerous examples, including interval maps, Bernoulli shifts, toral endomorphisms, geodesic flow on negatively curved manifolds, Morse-Smale systems, rational maps on the Riemann sphere and strange attractors. Ergodic Theory and Dynamical Systems will appeal to graduate students as well as researchers looking for an introduction to the subject. While gentle on the beginning student, the book also contains a number of commen...

  8. Testing the time-invariance of fundamental constants using microwave spectroscopy on cold diatomic radicals

    NARCIS (Netherlands)

    Bethlem, H.L.; Ubachs, W.M.G.

    2009-01-01

    The recently demonstrated methods to cool and manipulate neutral molecules offer new possibilities for precision tests of fundamental physics theories. We here discuss the possibility of testing the time-invariance of fundamental constants using near degeneracies between rotational levels in the

  9. Fundamentals of human resource management : emerging experiences from Africa

    NARCIS (Netherlands)

    Itika, J.

    2011-01-01

    The fundamentals of human resource management are extensively described in European and American literature. This book summarises the general human resource management philosophies, theories, strategies and techniques and links them to the specific African context. The usefulness of these general

  10. Decompositional equivalence: A fundamental symmetry underlying quantum theory

    OpenAIRE

    Fields, Chris

    2014-01-01

    Decompositional equivalence is the principle that there is no preferred decomposition of the universe into subsystems. It is shown here, by using simple thought experiments, that quantum theory follows from decompositional equivalence together with Landauer's principle. This demonstration raises within physics a question previously left to psychology: how do human - or any - observers agree about what constitutes a "system of interest"?

  11. Fundamentals of computational intelligence neural networks, fuzzy systems, and evolutionary computation

    CERN Document Server

    Keller, James M; Fogel, David B

    2016-01-01

    This book covers the three fundamental topics that form the basis of computational intelligence: neural networks, fuzzy systems, and evolutionary computation. The text focuses on inspiration, design, theory, and practical aspects of implementing procedures to solve real-world problems. While other books in the three fields that comprise computational intelligence are written by specialists in one discipline, this book is co-written by a current and a former Editor-in-Chief of IEEE Transactions on Neural Networks and Learning Systems, a former Editor-in-Chief of IEEE Transactions on Fuzzy Systems, and the founding Editor-in-Chief of IEEE Transactions on Evolutionary Computation. The coverage across the three topics is both uniform and consistent in style and notation. Discusses single-layer and multilayer neural networks, radial-basis function networks, and recurrent neural networks. Covers fuzzy set theory, fuzzy relations, fuzzy logic inference, fuzzy clustering and classification, fuzzy measures and fuzz...

  12. Generalized G-theory

    International Nuclear Information System (INIS)

    Sladkowski, J.

    1991-01-01

    Various attempts to formulate the fundamental physical interactions in the framework of unified geometric theories have recently gained considerable success (Kaluza, 1921; Klein, 1926; Trautmann, 1970; Cho, 1975). Symmetries of the spacetime and so-called internal spaces seem to play a key role in investigating both the fundamental interactions and the abundance of elementary particles. The author presents a category-theoretic description of a generalization of the G-theory concept and its application to geometric compactification and dimensional reduction. The main reasons for using categories and functors as tools are the clearness and the level of generalization one can obtain

  13. Introductory photoemission theory

    International Nuclear Information System (INIS)

    Arai, Hiroko; Fujikawa, Takashi

    2010-01-01

    An introductory review is presented on the basis of many-body scattering theory. Some fundamental aspects of photoemission theory are discussed in detail. A few applications, namely photoelectron diffraction, the depth distribution function, and multi-atom resonant photoemission, are also discussed briefly. (author)

  14. Global anomalies in chiral gauge theories on the lattice

    International Nuclear Information System (INIS)

    Baer, O.; Campos, I.

    2000-01-01

    We discuss the issue of global anomalies in chiral gauge theories on the lattice. In Luescher's approach, these obstructions make it impossible to define consistently a fermionic measure for the path integral. We show that an SU(2) theory has such a global anomaly if the Weyl fermion is in the fundamental representation. The anomaly in higher representations is also discussed. We finally show that this obstruction is the lattice analogue of the SU(2) anomaly first discovered by Witten. (orig.)

  15. SU(2) with fundamental fermions and scalars

    Science.gov (United States)

    Hansen, Martin; Janowski, Tadeusz; Pica, Claudio; Toniato, Arianna

    2018-03-01

    We present preliminary results on the lattice simulation of an SU(2) gauge theory with two fermion flavors and one strongly interacting scalar field, all in the fundamental representation of SU(2). The motivation for this study comes from the recent proposal of "fundamental" partial compositeness models featuring strongly interacting scalar fields in addition to fermions. Here we describe the lattice setup for our study of this class of models and a first exploration of the lattice phase diagram. In particular we then investigate how the presence of a strongly coupled scalar field affects the properties of light meson resonances previously obtained for the SU(2) model. Preprint: CP3-Origins-2017-047 DNRF90

  16. Fundamentals of the fuzzy logic-based generalized theory of decisions

    CERN Document Server

    Aliev, Rafik Aziz

    2013-01-01

    Everyday decision making and decision making in complex human-centric systems are characterized by imperfect decision-relevant information. The main drawback of existing decision theories is their inability to deal with imperfect information and to model vague preferences. Actually, a paradigm of non-numerical probabilities in decision making has a long history and arose also in Keynes's analysis of uncertainty. There is a need for further generalization, a move to decision theories with perception-based imperfect information described in natural language (NL). The languages of new decision models for human-centric systems should not be based on binary logic but should rather be human-centric computational schemes able to operate on NL-described information. Development of new theories is now possible due to the increased computational power of information processing systems, which allows for computations with imperfect information, particularly imprecise and partially true information, which are much more complex than comput...

  17. Historical-systematic fundaments of the Trinitarian theory of the liturgical event

    Directory of Open Access Journals (Sweden)

    Robert Woźniak

    2011-12-01

    Full Text Available The object of the present research is to develop some fundamental traces of the Trinitarian understanding of the Christian liturgy. The article attempts to point out the fundamental coordinates of a Trinitarian comprehension of the liturgy from the historical perspective. In order to do this, it traces the links between the first formulations of Trinitarian faith and the early development of the Christian liturgy. The argument starts with a consideration of some new biblical approaches to the phenomena of early Christian cult, seen in their theological (Christological and Trinitarian) constellation (Bauckham, Hurtado). After this preliminary biblical-theological inquiry, some fundamental patristic texts are taken into account. The last stage of the investigation is a presentation of the Second Vatican Council's account of the theology of liturgy, which proves itself to be openly Trinitarian.

  18. A new fundamental type of conformational isomerism

    Science.gov (United States)

    Canfield, Peter J.; Blake, Iain M.; Cai, Zheng-Li; Luck, Ian J.; Krausz, Elmars; Kobayashi, Rika; Reimers, Jeffrey R.; Crossley, Maxwell J.

    2018-06-01

    Isomerism is a fundamental chemical concept, reflecting the fact that the arrangement of atoms in a molecular entity has a profound influence on its chemical and physical properties. Here we describe a previously unclassified fundamental form of conformational isomerism through four resolved stereoisomers of a transoid (BF)O(BF)-quinoxalinoporphyrin. These comprise two pairs of enantiomers that manifest structural relationships not describable within existing IUPAC nomenclature and terminology. They undergo thermal diastereomeric interconversion over a barrier of 104 ± 2 kJ mol-1, which we term `akamptisomerization'. Feasible interconversion processes between conceivable synthesis products and reaction intermediates were mapped out by density functional theory calculations, identifying bond-angle inversion (BAI) at a singly bonded atom as the reaction mechanism. We also introduce the necessary BAI stereodescriptors parvo and amplo. Based on an extended polytope formalism of molecular structure and stereoisomerization, BAI-driven akamptisomerization is shown to be the final fundamental type of conformational isomerization.

  19. Applications of measure theory to statistics

    CERN Document Server

    Pantsulaia, Gogi

    2016-01-01

    This book aims to put strong reasonable mathematical senses in notions of objectivity and subjectivity for consistent estimations in a Polish group by using the concept of Haar null sets in the corresponding group. This new approach – naturally dividing the class of all consistent estimates of an unknown parameter in a Polish group into disjoint classes of subjective and objective estimates – helps the reader to clarify some conjectures arising in the criticism of null hypothesis significance testing. The book also acquaints readers with the theory of infinite-dimensional Monte Carlo integration recently developed for estimation of the value of infinite-dimensional Riemann integrals over infinite-dimensional rectangles. The book is addressed both to graduate students and to researchers active in the fields of analysis, measure theory, and mathematical statistics.

  20. The Parliamentary Council's understanding of fundamental rights, and how fundamental rights are protected against nuclear power stations in operation

    International Nuclear Information System (INIS)

    Roth-Stielow, K.

    1980-01-01

    The author explains fundamental rights in terms of protection of the individual, giving quotations from sittings of the Parliamentary Council. Fundamental rights are to be integrated completely into Atomic Energy Law. Experts' opinions ought to be scrutinized in depth by the courts, and experts have to engage with the opinions of minorities. The author advocates the theory of margins: experts' opinions or established values are nothing but referential values if they do not represent the well-balanced result obtained after considering all relevant, qualified experts' opinions, including the opinions of minorities. Following precedents set by the Federal Constitutional Court, findings of a minority of natural scientists have to be considered the state of the art or a realized danger. (HSCH) [de]

  1. Scoring and Classifying Examinees Using Measurement Decision Theory

    Directory of Open Access Journals (Sweden)

    Lawrence M. Rudner

    2009-04-01

    Full Text Available This paper describes and evaluates the use of measurement decision theory (MDT) to classify examinees based on their item response patterns. The model has a simple framework that starts with the conditional probabilities of examinees in each category or mastery state responding correctly to each item. The presented evaluation investigates: (1) the classification accuracy of tests scored using decision theory; (2) the effectiveness of different sequential testing procedures; and (3) the number of items needed to make a classification. A large percentage of examinees can be classified accurately with very few items using decision theory. A Java Applet for self-instruction and software for generating, calibrating and scoring MDT data are provided.
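
    The scoring rule sketched above is essentially a Bayes classification over mastery states. The following minimal sketch (with hypothetical item probabilities and priors, not Rudner's applet or software) shows how a single response pattern would be scored under local independence:

```python
import numpy as np

# Hypothetical conditional probabilities P(correct | state) for 5 items and
# two mastery states: "non-master" and "master". All numbers are illustrative.
p_correct = np.array([
    [0.3, 0.8],   # item 1
    [0.4, 0.9],   # item 2
    [0.2, 0.7],   # item 3
    [0.5, 0.9],   # item 4
    [0.3, 0.8],   # item 5
])
prior = np.array([0.5, 0.5])           # prior probability of each state
responses = np.array([1, 1, 0, 1, 1])  # observed item scores (1 = correct)

# Likelihood of the response pattern under each state (local independence).
likelihood = np.prod(
    np.where(responses[:, None] == 1, p_correct, 1.0 - p_correct), axis=0
)
posterior = prior * likelihood
posterior /= posterior.sum()

print("posterior over states:", posterior)
print("classified as:", ["non-master", "master"][int(posterior.argmax())])
```

    Sequential testing then amounts to updating the posterior after each administered item and stopping once one state is sufficiently probable.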

  2. Macroscopic fundamental strings in cosmology

    Energy Technology Data Exchange (ETDEWEB)

    Aharonov, Y; Englert, F; Orloff, J

    1987-12-24

    We show that, when D ≥ 4, theories of closed strings in D non-compact space-time dimensions exhibit a phase transition. The high-temperature phase is characterized by a condensate of arbitrarily long strings with Hausdorff dimension two (area-filling curves). We suggest that this stringy phase is the ancestor of the adiabatic era. Fundamental strings could then both drive the inflation and seed, in a way reminiscent of the cosmic string mechanism, the large structures in the universe.

  3. Quantum Measurement Theory in Gravitational-Wave Detectors

    Directory of Open Access Journals (Sweden)

    Stefan L. Danilishin

    2012-04-01

    Full Text Available The fast progress in improving the sensitivity of gravitational-wave detectors that we have all witnessed in recent years has propelled the scientific community to the point at which the quantum behavior of such immense measurement devices as kilometer-long interferometers starts to matter. The time when their sensitivity will be mainly limited by the quantum noise of light is around the corner, and finding ways to reduce it will become a necessity. Therefore, the primary goal we pursued in this review was to familiarize a broad spectrum of readers with the theory of quantum measurements in the very form in which it finds application in the area of gravitational-wave detection. We focus on how quantum noise arises in gravitational-wave interferometers and what limitations it imposes on the achievable sensitivity. We start from the very basic concepts and gradually advance to the general linear quantum measurement theory and its application to the calculation of quantum noise in the contemporary and planned interferometric detectors of gravitational radiation of the first and second generation. Special attention is paid to the concept of the Standard Quantum Limit and the methods of surmounting it.

  4. Quantum Measurement Theory in Gravitational-Wave Detectors.

    Science.gov (United States)

    Danilishin, Stefan L; Khalili, Farid Ya

    2012-01-01

    The fast progress in improving the sensitivity of gravitational-wave detectors that we have all witnessed in recent years has propelled the scientific community to the point at which the quantum behavior of such immense measurement devices as kilometer-long interferometers starts to matter. The time when their sensitivity will be mainly limited by the quantum noise of light is around the corner, and finding ways to reduce it will become a necessity. Therefore, the primary goal we pursued in this review was to familiarize a broad spectrum of readers with the theory of quantum measurements in the very form in which it finds application in the area of gravitational-wave detection. We focus on how quantum noise arises in gravitational-wave interferometers and what limitations it imposes on the achievable sensitivity. We start from the very basic concepts and gradually advance to the general linear quantum measurement theory and its application to the calculation of quantum noise in the contemporary and planned interferometric detectors of gravitational radiation of the first and second generation. Special attention is paid to the concept of the Standard Quantum Limit and the methods of surmounting it.

  5. Variation of nonlinearity parameter at low fundamental amplitudes

    Science.gov (United States)

    Barnard, Daniel J.

    1999-04-01

    Recent harmonic generation measurements of the nonlinearity parameter β in polycrystalline Cu-Al alloys have shown a transition to lower values at low fundamental amplitude levels. Values for β at high (>10 Å) fundamental levels are in the range predicted by single-crystal second- and third-order elastic constants, while lower fundamental levels (... alloy by others. The source of the effect is unclear, but the initial results may require a reexamination of current methods for the measurement of third-order elastic constants.
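
    For background, in harmonic-generation experiments the nonlinearity parameter is commonly extracted from the fundamental and second-harmonic displacement amplitudes via the standard plane-wave relation below (quoted as general context, not necessarily the exact expression used in this work):

```latex
\beta \;=\; \frac{8\,A_2}{k^{2}\,x\,A_1^{2}} ,
```

    where A_1 and A_2 are the fundamental and second-harmonic displacement amplitudes, k the wavenumber of the fundamental, and x the propagation distance.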

  6. Effective field theories

    International Nuclear Information System (INIS)

    Mack, G.; Kalkreuter, T.; Palma, G.; Speh, M.

    1992-05-01

    Effective field theories encode the predictions of a quantum field theory at low energy. The effective theory has a fairly low ultraviolet cutoff. As a result, loop corrections are small, at least if the effective action contains a term which is quadratic in the fields, and physical predictions can be read straight from the effective Lagrangian. Methods will be discussed for computing an effective low-energy action from a given fundamental action, either analytically or numerically, or by a combination of both methods. Basically, the idea is to integrate out the high-frequency components of the fields. This requires the choice of a 'blockspin', i.e. the specification of a low-frequency field as a function of the fundamental fields. These blockspins will be the fields of the effective field theory. The blockspin need not be a field of the same type as one of the fundamental fields, and it may be composite. Special features of blockspins in nonabelian gauge theories will be discussed in some detail. In analytical work and in multigrid updating schemes one needs interpolation kernels A from the coarse to the fine grid in addition to the averaging kernels C which determine the blockspin. A neural net strategy for finding optimal kernels is presented. Numerical methods are applicable to obtain actions of effective theories on lattices of finite volume. The special case of a 'lattice' with a single site (the constraint effective potential) is of particular interest. In a Higgs model, the effective action reduces in this case to the free energy, considered as a function of a gauge-covariant magnetization. Its shape determines the phase structure of the theory. Its loop expansion with and without gauge fields can be used to determine finite-size corrections to numerical data. (orig.)
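
    The blockspin construction described here can be illustrated with a deliberately simple toy example (a scalar Ising-like field with a 2x2 averaging kernel, not the nonabelian gauge-theory case discussed in the abstract):

```python
import numpy as np

rng = np.random.default_rng(0)
L = 8
fine_field = rng.choice([-1.0, 1.0], size=(L, L))   # fundamental (fine-grid) field

# Averaging kernel C: the blockspin is the mean of the fine field over 2x2 blocks.
# This low-frequency field is the variable of the effective theory.
blockspin = fine_field.reshape(L // 2, 2, L // 2, 2).mean(axis=(1, 3))

# A majority-rule variant keeps the blockspin Ising-valued.
blockspin_sign = np.sign(blockspin + 1e-12)

print(blockspin)
print(blockspin_sign)
```

    An interpolation kernel A would go the other way, reconstructing a smooth fine-grid field from a given blockspin configuration.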

  7. Fundamental Limit of Nanophotonic Light-trapping in Solar Cells

    OpenAIRE

    Yu, Zongfu; Raman, Aaswath; Fan, Shanhui

    2010-01-01

    Establishing the fundamental limit of nanophotonic light-trapping schemes is of paramount importance and is becoming increasingly urgent for current solar cell research. The standard theory of light trapping demonstrated that absorption enhancement in a medium cannot exceed a factor of 4n²/sin²θ, where n is the refractive index of the active layer, and θ is the angle of the emission cone in the medium surrounding the cell. This theory, however, is not applicable in the nanophot...

  8. Spectral theory and quantum mechanics mathematical foundations of quantum theories, symmetries and introduction to the algebraic formulation

    CERN Document Server

    Moretti, Valter

    2017-01-01

    This book discusses the mathematical foundations of quantum theories. It offers an introductory text on linear functional analysis with a focus on Hilbert spaces, highlighting the spectral theory features that are relevant in physics. After exploring physical phenomenology, it then turns its attention to the formal and logical aspects of the theory. Further, this Second Edition collects in one volume a number of useful rigorous results on the mathematical structure of quantum mechanics focusing in particular on von Neumann algebras, Superselection rules, the various notions of Quantum Symmetry and Symmetry Groups, and including a number of fundamental results on the algebraic formulation of quantum theories. Intended for Master's and PhD students, both in physics and mathematics, the material is designed to be self-contained: it includes a summary of point-set topology and abstract measure theory, together with an appendix on differential geometry. The book also benefits established researchers by organizing ...

  9. Coupled variations of fundamental couplings and primordial nucleosynthesis

    International Nuclear Information System (INIS)

    Coc, Alain; Nunes, Nelson J.; Olive, Keith A.; Uzan, Jean-Philippe; Vangioni, Elisabeth

    2006-10-01

    The effect of variations of the fundamental nuclear parameters on big-bang nucleosynthesis is modeled and discussed in detail, taking into account the interrelations between the fundamental parameters arising in unified theories. Considering only ⁴He, strong constraints on the variation of the neutron lifetime and the neutron-proton mass difference are set. These constraints are then translated into constraints on the time variation of the Yukawa couplings and the fine structure constant. Furthermore, we show that a variation of the deuterium binding energy is able to reconcile the ⁷Li abundance deduced from the WMAP analysis with its spectroscopically determined value while maintaining concordance with D and ⁴He. (authors)

  10. Institute for Nuclear Theory

    International Nuclear Information System (INIS)

    Haxton, W.; Bertsch, G.; Henley, E.M.

    1993-01-01

    This report briefly discusses the following programs of the Institute for Nuclear Theory: fundamental interactions in nuclei; strangeness in hadrons and nuclei; microscopic nuclear structure theory; nuclear physics in atoms and molecules; phenomenology and lattice QCD; and large amplitude collective motion

  11. Examining Teacher Grades Using Rasch Measurement Theory

    Science.gov (United States)

    Randall, Jennifer; Engelhard, George, Jr.

    2009-01-01

    In this study, we present an approach to questionnaire design within educational research based on Guttman's mapping sentences and Many-Facet Rasch Measurement Theory. We designed a 54-item questionnaire using Guttman's mapping sentences to examine the grading practices of teachers. Each item in the questionnaire represented a unique student…

  12. The emergence of spacetime in string theory

    CERN Document Server

    Vistarini, Tiziana

    2018-01-01

    The nature of space and time is one of the most fascinating and fundamental philosophical issues which presently engages at the deepest level with physics. During the last thirty years this notion has been the object of an intense critical review in the light of new scientific theories which try to combine the principles of both general relativity and quantum theory, the so-called theories of quantum gravity. This book considers the way string theory shapes its own account of the disappearance of spacetime at the fundamental level.

  13. Fundamental Design Principles for Transcription-Factor-Based Metabolite Biosensors.

    Science.gov (United States)

    Mannan, Ahmad A; Liu, Di; Zhang, Fuzhong; Oyarzún, Diego A

    2017-10-20

    Metabolite biosensors are central to current efforts toward precision engineering of metabolism. Although most research has focused on building new biosensors, their tunability remains poorly understood and is fundamental for their broad applicability. Here we asked how genetic modifications shape the dose-response curve of biosensors based on metabolite-responsive transcription factors. Using the lac system in Escherichia coli as a model system, we built promoter libraries with variable operator sites that reveal interdependencies between biosensor dynamic range and response threshold. We developed a phenomenological theory to quantify such design constraints in biosensors with various architectures and tunable parameters. Our theory reveals a maximal achievable dynamic range and exposes tunable parameters for orthogonal control of dynamic range and response threshold. Our work sheds light on fundamental limits of synthetic biology designs and provides quantitative guidelines for biosensor design in applications such as dynamic pathway control, strain optimization, and real-time monitoring of metabolism.
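
    Dose-response curves of transcription-factor biosensors are often summarized by a Hill-type function; the sketch below is a generic illustration with invented parameter values (not the authors' model of the lac system), showing how dynamic range and response threshold enter such a curve:

```python
import numpy as np

def dose_response(metabolite, basal=10.0, maximal=1000.0, k_half=50.0, hill=2.0):
    """Hill-type biosensor output (arbitrary units).

    basal/maximal set the dynamic range, k_half sets the response threshold,
    and hill sets the steepness. All numbers here are illustrative.
    """
    x = (metabolite / k_half) ** hill
    return basal + (maximal - basal) * x / (1.0 + x)

for m in [0, 10, 50, 200, 1000]:
    print(f"metabolite = {m:5.0f}  output = {dose_response(m):8.1f}")
```

    Here the ratio maximal/basal plays the role of the dynamic range and k_half sets the response threshold.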

  14. The theory of particle interactions

    International Nuclear Information System (INIS)

    Belokurov, V.V.; Shirkov, D.V.

    1991-01-01

    The Theory of Particle Interactions introduces students and physicists to the chronological development, concepts, main methods, and results of modern quantum field theory -- the most fundamental, abstract, and mathematical branch of theoretical physics. Belokurov and Shirkov, two prominent Soviet theoretical physicists, carefully describe the many facets of modern quantum theory including: renormalization theory and renormalization group; gauge theories and spontaneous symmetry breaking; the electroweak interaction theory and quantum chromodynamics; the schemes of the unification of the fundamental interactions; and super-symmetry and super-strings. The authors use a minimum of mathematical concepts and equations in describing the historical development, the current status, and the role of quantum field theory in modern theoretical physics. Because readers will be able to comprehend the main concepts of modern quantum theory without having to master its rather difficult apparatus, The Theory of Particle Interactions is ideal for those who seek a conceptual understanding of the subject. Students, physicists, mathematicians, and theoreticians involved in astrophysics, cosmology, and nuclear physics, as well as those interested in the philosophy and history of natural sciences will find The Theory of Particle Interactions invaluable and an important addition to their reading list

  15. Lithium-ion batteries fundamentals and applications

    CERN Document Server

    Wu, Yuping

    2015-01-01

    Lithium-Ion Batteries: Fundamentals and Applications offers a comprehensive treatment of the principles, background, design, production, and use of lithium-ion batteries. Based on a solid foundation of long-term research work, this authoritative monograph:Introduces the underlying theory and history of lithium-ion batteriesDescribes the key components of lithium-ion batteries, including negative and positive electrode materials, electrolytes, and separatorsDiscusses electronic conductive agents, binders, solvents for slurry preparation, positive thermal coefficient (PTC) materials, current col

  16. The relationship between theory of mind and relational frame theory : convergence of perspective-taking measures

    OpenAIRE

    Hendriks, A.; Barnes-Holmes, Yvonne; McEnteggart, Ciara; de Mey, H.; Witteman, C.; Janssen, G.; Egger, J.

    2016-01-01

    Objective: Perspective-taking difficulties have been demonstrated in autism and schizophrenia spectrum disorders, among other clinical presentations, and are traditionally examined from a Theory of Mind (ToM) point of view. Relational Frame Theory (RFT) offers a behavioural and contextual interpretation of perspective-taking, proposing that this ability can be studied in more detail by examining specific perspective-taking relations. To implement relational perspective-taking measures in clin...

  17. Quantifying and handling errors in instrumental measurements using the measurement error theory

    DEFF Research Database (Denmark)

    Andersen, Charlotte Møller; Bro, R.; Brockhoff, P.B.

    2003-01-01

    This is a new way of using the measurement error theory. Reliability ratios illustrate that the models for the two fish species are influenced differently by the error. However, the error seems to influence the predictions of the two reference measures in the same way. The effect of using replicated x...... measurements. A new general formula is given for how to correct the least squares regression coefficient when a different number of replicated x-measurements is used for prediction than for calibration. It is shown that the correction should be applied when the number of replicates in prediction is less than
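
    The kind of correction referred to can be illustrated with the classical attenuation result for simple linear regression with additive measurement error in x (a textbook relation, not necessarily the new formula derived in the paper). Averaging k replicate x-measurements gives a reliability ratio

```latex
\lambda_k \;=\; \frac{\sigma_x^{2}}{\sigma_x^{2} + \sigma_u^{2}/k},
\qquad
\mathbb{E}\bigl[\hat\beta_{\mathrm{LS}}\bigr] \;\approx\; \lambda_k\,\beta ,
```

    so a least squares slope estimated with one number of replicates must be rescaled before being used for prediction with another.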

  18. A signal detection-item response theory model for evaluating neuropsychological measures.

    Science.gov (United States)

    Thomas, Michael L; Brown, Gregory G; Gur, Ruben C; Moore, Tyler M; Patt, Virginie M; Risbrough, Victoria B; Baker, Dewleen G

    2018-02-05

    Models from signal detection theory are commonly used to score neuropsychological test data, especially tests of recognition memory. Here we show that certain item response theory models can be formulated as signal detection theory models, thus linking two complementary but distinct methodologies. We then use the approach to evaluate the validity (construct representation) of commonly used research measures, demonstrate the impact of conditional error on neuropsychological outcomes, and evaluate measurement bias. Signal detection-item response theory (SD-IRT) models were fitted to recognition memory data for words, faces, and objects. The sample consisted of U.S. Infantry Marines and Navy Corpsmen participating in the Marine Resiliency Study. Data comprised item responses to the Penn Face Memory Test (PFMT; N = 1,338), Penn Word Memory Test (PWMT; N = 1,331), and Visual Object Learning Test (VOLT; N = 1,249), and self-report of past head injury with loss of consciousness. SD-IRT models adequately fitted recognition memory item data across all modalities. Error varied systematically with ability estimates, and distributions of residuals from the regression of memory discrimination onto self-report of past head injury were positively skewed towards regions of larger measurement error. Analyses of differential item functioning revealed little evidence of systematic bias by level of education. SD-IRT models benefit from the measurement rigor of item response theory-which permits the modeling of item difficulty and examinee ability-and from signal detection theory-which provides an interpretive framework encompassing the experimentally validated constructs of memory discrimination and response bias. We used this approach to validate the construct representation of commonly used research measures and to demonstrate how nonoptimized item parameters can lead to erroneous conclusions when interpreting neuropsychological test data. Future work might include the
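
    For readers less familiar with the signal detection side of such models, the equal-variance Gaussian indices that SD-IRT builds on can be computed directly from hit and false-alarm counts; this is a generic SDT calculation, not the authors' full SD-IRT model:

```python
from scipy.stats import norm

def sdt_indices(hits, misses, false_alarms, correct_rejections):
    """Equal-variance Gaussian SDT: discrimination d' and criterion c."""
    # Log-linear correction avoids infinite z-scores for perfect rates.
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa
    criterion = -0.5 * (z_hit + z_fa)
    return d_prime, criterion

print(sdt_indices(hits=35, misses=5, false_alarms=10, correct_rejections=30))
```

    d' corresponds to memory discrimination and c to response bias, the two experimentally validated constructs the abstract refers to.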

  19. Noncontractible hyperloops in gauge models with Higgs fields in the fundamental representation

    Science.gov (United States)

    Burzlaff, Jürgen

    1984-11-01

    We study finite-energy configurations in SO( N) gauge theories with Higgs fields in the fundamental representation. For all winding numbers, noncontractible hyperloops are constructed. The corresponding energy density is spherically symmetric, and the configuration with maximal energy on each hyperloop can be determined. Noncontractible hyperloops with an arbitrary winding number for SU(2) gauge theory are also given.

  20. Noncontractible hyperloops in gauge models with Higgs fields in the fundamental representation

    Energy Technology Data Exchange (ETDEWEB)

    Burzlaff, J. (Dublin Inst. for Advanced Studies (Ireland). School of Theoretical Physics)

    1984-11-01

    We study finite-energy configurations in SO(N) gauge theories with Higgs fields in the fundamental representation. For all winding numbers, noncontractible hyperloops are constructed. The corresponding energy density is spherically symmetric, and the configuration with maximal energy on each hyperloop can be determined. Noncontractible hyperloops with an arbitrary winding number for SU(2) gauge theory are also given.

  1. Fundamental Theories and Key Technologies for Smart and Optimal Manufacturing in the Process Industry

    Directory of Open Access Journals (Sweden)

    Feng Qian

    2017-04-01

    Full Text Available Given the significant requirements for transforming and promoting the process industry, we present the major limitations of current petrochemical enterprises, including limitations in decision-making, production operation, efficiency and security, information integration, and so forth. To promote a vision of the process industry with efficient, green, and smart production, modern information technology should be utilized throughout the entire optimization process for production, management, and marketing. To focus on smart equipment in manufacturing processes, as well as on the adaptive intelligent optimization of the manufacturing process, operating mode, and supply chain management, we put forward several key scientific problems in engineering in a demand-driven and application-oriented manner, namely: ① intelligent sensing and integration of all process information, including production and management information; ② collaborative decision-making in the supply chain, industry chain, and value chain, driven by knowledge; ③ cooperative control and optimization of plant-wide production processes via human-cyber-physical interaction; and ④ life-cycle assessments for safety and environmental footprint monitoring, in addition to tracing analysis and risk control. In order to solve these limitations and core scientific problems, we further present fundamental theories and key technologies for smart and optimal manufacturing in the process industry. Although this paper discusses the process industry in China, the conclusions in this paper can be extended to the process industry around the world.

  2. Fundamentals and approximations of multilevel resonance theory for reactor physics applications

    International Nuclear Information System (INIS)

    Moore, M.S.

    1980-01-01

    The formal theory of nuclear reactions leads to any of a number of alternative representations for describing resonance behavior. None of these is satisfactory for applications, and, depending on the problem to be addressed, approximate expressions are used. The specializations and approximations found to be most useful by evaluators are derived from R-matrix theory and are discussed from the viewpoint of convenience in numerical calculations. Finally, we illustrate the application of the theory by reviewing a particular example: the spin-separated neutron-induced cross sections of ²³⁵U in the resolved and unresolved resonance regions and the use of these results in the U.S. evaluated nuclear data file ENDF/B. (author)

  3. Measurement Models for Reasoned Action Theory

    OpenAIRE

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-01-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are ...

  4. Proof of the fundamental BCJ relations for QCD amplitudes

    International Nuclear Information System (INIS)

    Cruz, Leonardo de la; Kniss, Alexander; Weinzierl, Stefan

    2015-01-01

    The fundamental BCJ-relation is a linear relation between primitive tree amplitudes with different cyclic orderings. The cyclic orderings differ by the insertion place of one gluon. The coefficients of the fundamental BCJ-relation are linear in the Lorentz invariants 2p_i·p_j. The BCJ-relations are well established for pure gluonic amplitudes as well as for amplitudes in N=4 super-Yang-Mills theory. Recently, it has been conjectured that the BCJ-relations hold also for QCD amplitudes. In this paper we give a proof of this conjecture. The proof is valid for massless and massive quarks.
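
    In one common convention for massless partons, writing s_{ij} = 2p_i·p_j, the fundamental BCJ relation takes the schematic form below (the labelling of the moved leg and the grouping of the invariants vary between papers):

```latex
\sum_{i=3}^{n}\Bigl(\sum_{j=3}^{i} s_{2j}\Bigr)\,
A_n(1,3,\dots,i,2,i+1,\dots,n) \;=\; 0 ,
\qquad s_{ij} \equiv 2\,p_i\cdot p_j .
```

    For n = 4 this reduces to s_{23} A(1,3,2,4) + (s_{23}+s_{24}) A(1,3,4,2) = 0, i.e. s_{23} A(1,3,2,4) = s_{12} A(1,3,4,2) after using momentum conservation.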

  5. Renormalized plasma turbulence theory: A quasiparticle picture

    International Nuclear Information System (INIS)

    DuBois, D.F.

    1981-01-01

    A general renormalized statistical theory of Vlasov turbulence is given which proceeds directly from the Vlasov equation and does not assume prior knowledge of sophisticated field-theoretic techniques. Quasiparticles are the linear excitations of the turbulent system away from its instantaneous mean (ensemble-averaged) state or background; the properties of this background state ''dress'' or renormalize the quasiparticle responses. It is shown that all two-point responses (including the dielectric) and all two-point correlation functions can be completely described by the mean distribution function and three fundamental quantities. Two of these are the quasiparticle responses: the propagator and the potential source: which measure, respectively, the separate responses of the mean distribution function and the mean electrostatic potential to functional changes in an external phase-space source added to Vlasov's equation. The third quantity is the two-point correlation function of the incoherent part of the phase-space density which acts as a self-consistent source of quasiparticle and potential fluctuations. This theory explicitly takes into account the self-consistent nature of the electrostatic-field fluctuations which introduces new effects not found in the usual ''test-particle'' theories. Explicit equations for the fundamental quantities are derived in the direct interaction approximation. Special attention is paid to the two-point correlations and the relation to theories of phase-space granulation

  6. Determination of the detection limit and decision threshold for ionizing radiation measurements. Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment

    International Nuclear Information System (INIS)

    2000-01-01

    This part of ISO 11929 addresses the field of ionizing radiation measurements in which events (in particular pulses) are counted by high resolution gamma spectrometry registering a pulse-height distribution (acquisition of a multichannel spectrum), for example on samples. It considers exclusively the random character of radioactive decay and of pulse counting and ignores all other influences (e.g. arising from sample treatment, weighing, enrichment or the instability of the test setup). It assumes that the distance between neighbouring gamma-line peaks is not smaller than four times the full width at half maximum (FWHM) of the gamma line and that the background near the gamma line is nearly a straight line. Otherwise ISO 11929-1 or ISO 11929-2 should be used. ISO 11929 consists of the following parts, under the general title Determination of the detection limit and decision threshold for ionizing radiation measurements: Part 1: Fundamentals and application to counting measurements without the influence of sample treatment; Part 2: Fundamentals and application to counting measurements with the influence of sample treatment; Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment; Part 4: Fundamentals and application to measurements by use of linear scale analogue ratemeters, without the influence of sample treatment. This part of ISO 11929 was prepared in parallel with other International Standards prepared by WG2 (now WG 17): ISO 11932:1996, Activity measurements of solid materials considered for recycling, re-use or disposal as nonradioactive waste, and ISO 11929-1, ISO 11929-2 and ISO 11929-4, and is, consequently, complementary to these documents
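
    As a simplified illustration of the quantities defined by the standard, the sketch below computes a decision threshold and detection limit for a plain gross/background counting measurement with Poisson statistics only (it is not the full gamma-spectrometry treatment of Part 3, and the coverage factor is illustrative):

```python
import math

def decision_threshold_and_detection_limit(bkg_rate, t_gross, t_bkg, k=1.645):
    """Simplified counting case: net rate = gross/t_gross - background/t_bkg,
    Poisson statistics only (not the full gamma-spectrometry treatment)."""
    # Decision threshold: k times the uncertainty of the net rate when the
    # true net rate is zero (gross counts then come from background alone).
    u0 = math.sqrt(bkg_rate / t_gross + bkg_rate / t_bkg)
    threshold = k * u0

    # Detection limit: smallest true net rate detected with probability 1 - beta,
    # found by fixed-point iteration of  y = threshold + k * u(y).
    y = threshold
    for _ in range(50):
        u_y = math.sqrt((y + bkg_rate) / t_gross + bkg_rate / t_bkg)
        y = threshold + k * u_y
    return threshold, y

print(decision_threshold_and_detection_limit(bkg_rate=0.5, t_gross=3600, t_bkg=3600))
```

    With k = 1.645 both error probabilities α and β are set to about 5%.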

  7. Theory of “Weak Value" and Quantum Mechanical Measurements

    OpenAIRE

    Shikano, Yutaka

    2012-01-01

    Comment: to be published from "Measurements in Quantum Mechanics", edited by M. R. Pahlavani (InTech, 2012) Chapter 4 page 75. Yutaka Shikano (2012). ISBN: 978-953-51-0058-4 Available from: http://www.intechopen.com/articles/show/title/theory-of-weak-value-and-quantum-mechanical-measurement

  8. Particles, fields and quantum theory

    International Nuclear Information System (INIS)

    Bongaarts, P.J.M.

    1982-01-01

    The author gives an introduction to the development of gauge theories of the fundamental interactions. Starting from classical mechanics and quantum mechanics the development of quantum electrodynamics and non-abelian gauge theories is described. (HSI)

  9. General physical fundamentals of isotope hydrology

    International Nuclear Information System (INIS)

    Moser, H.; Rauert, W.

    1976-01-01

    A description is given of the measurement and measuring units of stable isotopes, the physical properties, measurement and measuring units of radioactive isotopes, the fundamentals of the tracer technique, the environmental isotope distribution in the hydrosphere and the radiation protection in isotope hydrological investigations. (HK) [de
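
    The customary measuring unit for stable-isotope abundances in hydrology is the per-mil delta value relative to an agreed standard such as VSMOW (a standard definition, included here for orientation):

```latex
\delta \;=\; \left(\frac{R_{\mathrm{sample}}}{R_{\mathrm{standard}}} - 1\right)\times 10^{3}
\quad\text{(expressed in per mil)} ,
```

    where R denotes the isotope ratio of interest, for example 18O/16O or 2H/1H.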

  10. Fundamental Structure of Loop Quantum Gravity

    Science.gov (United States)

    Han, Muxin; Ma, Yongge; Huang, Weiming

    In the last twenty years, loop quantum gravity, a background independent approach to unify general relativity and quantum mechanics, has been widely investigated. The aim of loop quantum gravity is to construct a mathematically rigorous, background independent, non-perturbative quantum theory for a Lorentzian gravitational field on a four-dimensional manifold. In the approach, the principles of quantum mechanics are combined with those of general relativity naturally. Such a combination provides us with a picture of, so-called, quantum Riemannian geometry, which is discrete at the fundamental scale. Imposing the quantum constraints in analogy with the classical ones, the quantum dynamics of gravity is being studied as one of the most important issues in loop quantum gravity. On the other hand, semi-classical analysis is being carried out to test the classical limit of the quantum theory. In this review, the fundamental structure of loop quantum gravity is presented pedagogically. Our main aim is to help non-experts understand the motivations, basic structures, as well as general results. It may also be beneficial to practitioners to gain insights from different perspectives on the theory. We will focus on the theoretical framework itself, rather than its applications, and do our best to write it in modern and precise language while keeping the presentation accessible for beginners. After reviewing the classical connection dynamical formalism of general relativity, as a foundation, the construction of the kinematical Ashtekar-Isham-Lewandowski representation is introduced in the context of quantum kinematics. The algebraic structure of quantum kinematics is also discussed. In the context of quantum dynamics, we mainly introduce the construction of a Hamiltonian constraint operator and the master constraint project. Finally, some applications and recent advances are outlined. It should be noted that this strategy of quantizing gravity can also be extended to

  11. Fundamental limit of nanophotonic light trapping in solar cells.

    Science.gov (United States)

    Yu, Zongfu; Raman, Aaswath; Fan, Shanhui

    2010-10-12

    Establishing the fundamental limit of nanophotonic light-trapping schemes is of paramount importance and is becoming increasingly urgent for current solar cell research. The standard theory of light trapping demonstrated that absorption enhancement in a medium cannot exceed a factor of 4n²/sin²θ, where n is the refractive index of the active layer, and θ is the angle of the emission cone in the medium surrounding the cell. This theory, however, is not applicable in the nanophotonic regime. Here we develop a statistical temporal coupled-mode theory of light trapping based on a rigorous electromagnetic approach. Our theory reveals that the conventional limit can be substantially surpassed when optical modes exhibit deep-subwavelength-scale field confinement, opening new avenues for highly efficient next-generation solar cells.
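
    For orientation, the conventional ray-optic bound quoted in the abstract, together with a worked number for a silicon-like index, reads:

```latex
F_{\max} \;=\; \frac{4 n^{2}}{\sin^{2}\theta},
\qquad
F_{\max} \;=\; 4 \times 3.5^{2} \;=\; 49
\quad\text{for } n = 3.5,\ \theta = 90^{\circ} .
```

    Here θ = 90° corresponds to a cell surrounded by free space with no angular restriction on the emission cone.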

  12. Construct Measurement in Management Research

    DEFF Research Database (Denmark)

    Nielsen, Bo Bernhard

    2014-01-01

    Far too often do management scholars resort to crude and often inappropriate measures of fundamental constructs in their research; an approach which calls into question the interpretation and validity of their findings. Scholars often legitimize poor choices in measurement with a lack of availability......, this research note raises important questions about the use of proxies in management research and argues for greater care in operationalizing constructs, with particular attention to matching levels of theory and measurement.

  13. Fundamental plasma emission involving ion sound waves

    International Nuclear Information System (INIS)

    Cairns, I.H.

    1987-01-01

    The theory for fundamental plasma emission by the three-wave processes L ± S → T (where L, S and T denote Langmuir, ion sound and transverse waves, respectively) is developed. Kinematic constraints on the characteristics and growth lengths of waves participating in the wave processes are identified. In addition the rates, path-integrated wave temperatures, and limits on the brightness temperature of the radiation are derived. (author)

  14. Compatible quantum theory

    International Nuclear Information System (INIS)

    Friedberg, R; Hohenberg, P C

    2014-01-01

    completion of the theory requires a macroscopic mechanism for selecting a physical framework, which is part of the macroscopic theory (MAQM). The selection of a physical framework involves the breaking of the microscopic ‘framework symmetry’, which can proceed either phenomenologically as in the standard quantum measurement theory, or more fundamentally by considering the quantum system under study to be a subsystem of a macroscopic quantum system. The decoherent histories formulation of Gell-Mann and Hartle, as well as that of Omnès, are theories of this fundamental type, where the physical framework is selected by a coarse-graining procedure in which the physical phenomenon of decoherence plays an essential role. Various well-known interpretations of QM are described from the perspective of CQT. Detailed definitions and proofs are presented in the appendices. (key issues reviews)

  15. Geometric group theory

    CERN Document Server

    Druţu, Cornelia

    2018-01-01

    The key idea in geometric group theory is to study infinite groups by endowing them with a metric and treating them as geometric spaces. This applies to many groups naturally appearing in topology, geometry, and algebra, such as fundamental groups of manifolds, groups of matrices with integer coefficients, etc. The primary focus of this book is to cover the foundations of geometric group theory, including coarse topology, ultralimits and asymptotic cones, hyperbolic groups, isoperimetric inequalities, growth of groups, amenability, Kazhdan's Property (T) and the Haagerup property, as well as their characterizations in terms of group actions on median spaces and spaces with walls. The book contains proofs of several fundamental results of geometric group theory, such as Gromov's theorem on groups of polynomial growth, Tits's alternative, Stallings's theorem on ends of groups, Dunwoody's accessibility theorem, the Mostow Rigidity Theorem, and quasiisometric rigidity theorems of Tukia and Schwartz. This is the f...

  16. ACADEMIC TRAINING: Low Energy Experiments that Measure Fundamental Constants and Test Basic Symmetries

    CERN Multimedia

    Françoise Benz

    2002-01-01

    17, 18, 19, 21 June LECTURE SERIES from 11.00 to 12.00 hrs - Auditorium, bldg. 500 Low Energy Experiments that Measure Fundamental Constants and Test Basic Symmetries by G. GABRIELSE / Professor of Physics and Chair of the Harvard Physics Department, Spokesperson for the ATRAP Collaboration Lecture 1: Particle Traps: the World's Tiniest Accelerators A single elementary particle, or a single ion, can be confined in a tiny accelerator called a particle trap. A single electron was held this way for more than ten months, and antiprotons for months. Mass spectroscopy of exquisite precision is possible with such systems. CERN's TRAP Collaboration thereby compared the charge-to-mass ratios of the antiproton and proton to a precision of 90 parts per trillion, by far the most stringent CPT test done with a baryon system. The important ratio of the masses of the electron and proton has been similarly measured, as have a variety of ion masses, and the neutron mass is most accurately known from such measurements. An i...

  17. A measurement theory of illusory conjunctions.

    Science.gov (United States)

    Prinzmetal, William; Ivry, Richard B; Beck, Diane; Shimizu, Naomi

    2002-04-01

    Illusory conjunctions refer to the incorrect perceptual combination of correctly perceived features, such as color and shape. Research on the phenomenon has been hampered by the lack of a measurement theory that accounts for guessing features, as well as the incorrect combination of correctly perceived features. Recently, several investigators have suggested using multinomial models as a tool for measuring feature integration. The authors examined the adequacy of these models in 2 experiments by testing whether model parameters reflect changes in stimulus factors. In a third experiment, confidence ratings were used as a tool for testing the model. Multinomial models accurately reflected both variations in stimulus factors and observers' trial-by-trial confidence ratings.
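
    A multinomial (processing-tree) account of conjunction reports can be made concrete with a small sketch; the tree and parameter values below are illustrative only and are not the exact model fitted in the cited experiments:

```python
def report_probabilities(c=0.7, f=0.5, m=6, k=1):
    """Generic multinomial (processing-tree) sketch for conjunction reports.

    c: probability the target feature (e.g. colour) is correctly perceived and bound
    f: probability that, when binding fails, the reported feature is taken from
       another display item (a true illusory conjunction) rather than guessed
    m: number of response alternatives; k: nontarget features present in the display
    """
    guess = (1 - c) * (1 - f)
    p_correct = c + guess / m                    # right feature, by perception or guess
    p_conjunction = (1 - c) * f + guess * k / m  # feature present elsewhere in the display
    p_other = guess * (m - 1 - k) / m            # feature not shown at all
    return p_correct, p_conjunction, p_other

probs = report_probabilities()
print(probs, "sum =", sum(probs))
```

    Fitting a tree of this kind to observed response counts separates genuine binding errors from feature guessing, which is the measurement problem the abstract describes.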

  18. Fundamental ecology is fundamental.

    Science.gov (United States)

    Courchamp, Franck; Dunne, Jennifer A; Le Maho, Yvon; May, Robert M; Thébaud, Christophe; Hochberg, Michael E

    2015-01-01

    The primary reasons for conducting fundamental research are satisfying curiosity, acquiring knowledge, and achieving understanding. Here we explain why we believe it is essential to promote basic ecological research, despite increased impetus for ecologists to conduct and present their research in the light of potential applications. This includes the understanding of our environment, for intellectual, economic, social, and political reasons, and as a major source of innovation. We contend that we should focus less on short-term, objective-driven research and more on creativity and exploratory analyses, quantitatively estimate the benefits of fundamental research for society, and better explain the nature and importance of fundamental ecology to students, politicians, decision makers, and the general public. Our perspective and underlying arguments should also apply to evolutionary biology and to many of the other biological and physical sciences. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Overview of Classical Test Theory and Item Response Theory for Quantitative Assessment of Items in Developing Patient-Reported Outcome Measures

    Science.gov (United States)

    Cappelleri, Joseph C.; Lundy, J. Jason; Hays, Ron D.

    2014-01-01

    Introduction: The U.S. Food and Drug Administration’s patient-reported outcome (PRO) guidance document defines content validity as “the extent to which the instrument measures the concept of interest” (FDA, 2009, p. 12). “Construct validity is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity” (Strauss & Smith, 2009, p. 7). Hence both qualitative and quantitative information are essential in evaluating the validity of measures. Methods: We review classical test theory and item response theory approaches to evaluating PRO measures, including the frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which a hypothesized “difficulty” (severity) order of items is represented by observed responses. Conclusion: Classical test theory and item response theory can be useful in providing a quantitative assessment of items and scales during the content validity phase of patient-reported outcome measures. Depending on the particular type of measure and the specific circumstances, either one or both approaches should be considered to help maximize the content validity of PRO measures. PMID:24811753
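
    As a reminder of the item response theory side of such evaluations, a widely used item model is the two-parameter logistic curve (a standard formula, not specific to this review):

```latex
P\bigl(X_{ij}=1 \mid \theta_i\bigr) \;=\;
\frac{1}{1+\exp\!\bigl[-a_j\,(\theta_i - b_j)\bigr]} ,
```

    where θ_i is the examinee's latent trait, b_j the item difficulty, and a_j the item discrimination; floor and ceiling effects and item ordering can be read off from curves of this kind.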

  20. Noncontractible hyperloops in gauge models with Higgs fields in the fundamental representation

    International Nuclear Information System (INIS)

    Burzlaff, J.

    1984-01-01

    We study finite-energy configurations in SO(N) gauge theories with Higgs fields in the fundamental representation. For all winding numbers, noncontractible hyperloops are constructed. The corresponding energy density is spherically symmetric, and the configuration with maximal energy on each hyperloop can be determined. Noncontractible hyperloops with an arbitrary winding number for SU(2) gauge theory are also given. (orig.)

  1. Quantum dissipative systems from theory of continuous measurements

    International Nuclear Information System (INIS)

    Mensky, Michael B.; Stenholm, Stig

    2003-01-01

    We apply the restricted-path-integral (RPI) theory of non-minimally disturbing continuous measurements for the correct description of frictional Brownian motion. The resulting master equation is automatically of the Lindblad form, so that the difficulties typical of other approaches do not arise. In the special case of the harmonic oscillator, the familiar master equation describing its frictionally driven Brownian motion is obtained. A thermal reservoir as a measuring environment is considered

  2. Investigating fundamental physics and space environment with a dedicated Earth-orbiting spacecraft

    Science.gov (United States)

    Peron, Roberto

    -year requirement and thus they need specific arrangements for deorbiting at the end of life, or they can simply rely on mother nature for reentry. The goal of this proposed approach is to utilize existing technology developed for acceleration measurement in space and state-of-the-art satellite tracking to precisely determine the orbit of a satellite with well-defined geometrical and mass characteristics (i.e., A/m ratio), while at the same time accurately measuring over a long period of time the drag deceleration (as well as other non-gravitational effects) acting on the satellite. This will result in a virtually drag-free object that can be exploited to: 1. perform fundamental physics tests by verifying the equation of motion of a test mass in the general relativistic context and placing limits on alternative theories of gravitation; 2. improve the knowledge of selected tidal terms; 3. map, through acceleration measurements, the atmospheric density in the orbital region of interest. In its preliminary incarnation, the satellite would be cylindrical in shape and spinning about its cylinder axis, which would also be orthogonal to the orbital plane. The satellite should be placed on a dawn-dusk, sun-synchronous, elliptical orbit spanning the orbital altitudes of interest (e.g., between 500 and 1200 km of altitude). The satellite should be equipped with a 3-axis accelerometer package with an acceleration resolution better than 10⁻¹¹ g (with g the acceleration at the Earth's surface). The expected measurement range is 10⁻⁸ to 10⁻¹¹ g, considering estimates of drag forces at minimum and maximum solar activity conditions in the altitude range of interest and a preliminary estimate of the satellite A/m ratio. The overall concept of the mission will be discussed, concentrating on the fundamental aspects and main scientific return. The main instrumentation to be hosted on board the spacecraft will then be reviewed, with a focus on current and projected capabilities.

  3. Estimating security betas using prior information based on firm fundamentals

    NARCIS (Netherlands)

    Cosemans, M.; Frehen, R.; Schotman, P.C.; Bauer, R.

    2010-01-01

    This paper proposes a novel approach for estimating time-varying betas of individual stocks that incorporates prior information based on fundamentals. We shrink the rolling window estimate of beta towards a firm-specific prior that is motivated by asset pricing theory. The prior captures structural

  4. Theory of colours

    CERN Document Server

    Goethe, Johann Wolfgang von

    2006-01-01

    The wavelength theory of light and color had been firmly established by the time the great German poet published his Theory of Colours in 1810. Nevertheless, Goethe believed that the theory derived from a fundamental error, in which an incidental result was mistaken for an elemental principle. Far from affecting a knowledge of physics, he maintained that such a background would inhibit understanding. The conclusions Goethe draws here rest entirely upon his personal observations. This volume does not have to be studied to be appreciated. The author's subjective theory of colors permits him to spe

  5. Chronology protection in string theory

    International Nuclear Information System (INIS)

    Dyson, Lisa

    2004-01-01

    Many solutions of General Relativity appear to allow the possibility of time travel. This was initially a fascinating discovery, but geometries of this type violate causality, a basic physical law which is believed to be fundamental. Although string theory is a proposed fundamental theory of quantum gravity, geometries with closed timelike curves have resurfaced as solutions to its low energy equations of motion. In this paper, we will study the class of solutions to low energy effective supergravity theories related to the BMPV black hole and the rotating wave-D1-D5-brane system. Time travel appears to be possible in these geometries. We will attempt to build the causality violating regions and propose that stringy effects prohibit their construction. The proposed chronology protection agent for these geometries mirrors a mechanism string theory employs to resolve a class of naked singularities. (author)

  6. Moessbauer spectroscopy and transition metal chemistry. Fundamentals and applications

    International Nuclear Information System (INIS)

    Guetlich, Philipp; Trautwein, Alfred X.

    2011-01-01

    Moessbauer spectroscopy is a profound analytical method which has nevertheless continued to develop. The authors now present a state-of-the-art book which consists of two parts. The first part details the fundamentals of Moessbauer spectroscopy and is based on a book published in 1978 in the Springer series 'Inorganic Chemistry Concepts' by P. Guetlich, R. Link and A.X. Trautwein. The second part covers useful practical aspects of measurements, and the application of the techniques to many problems of materials characterization. The update includes the use of synchrotron radiation and many instructive and illustrative examples in fields such as solid state chemistry, biology and physics, materials and the geosciences, as well as industrial applications. Special chapters on magnetic relaxation phenomena (S. Morup) and computation of hyperfine interaction parameters (F. Neese) are also included. An attached CD-ROM with more than 400 full-color PowerPoint images provides self-explanatory examples. The book concentrates on teaching the technique using theory as much as needed and as little as possible. The reader will learn the fundamentals of the technique and how to apply it to many problems of materials characterization. Transition metal chemistry, studied on the basis of the most widely used Moessbauer isotopes, is in the foreground. (orig.)

  7. Probabilities and Shannon's Entropy in the Everett Many-Worlds Theory

    Directory of Open Access Journals (Sweden)

    Andreas Wichert

    2016-12-01

    Following a controversial suggestion by David Deutsch that decision theory can solve the problem of probabilities in the Everett many-worlds theory, we suggest that the probabilities are induced by Shannon's entropy, which measures the uncertainty of events. We argue that a rational person prefers certainty to uncertainty due to the fundamental biological principle of homeostasis.
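
    For reference, the Shannon entropy invoked here is the standard uncertainty measure over the branch probabilities p_i (a textbook definition, not notation taken from the paper):

        H(p_1, \dots, p_n) = -\sum_{i=1}^{n} p_i \log_2 p_i , \qquad \sum_{i} p_i = 1 ,

    which vanishes when one outcome is certain and is maximal for uniform p_i = 1/n.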

  8. Fundamentals of electro-engineering I

    International Nuclear Information System (INIS)

    Rapsik, M.; Smola, M.; Bohac, M.; Mucha, M.

    2004-01-01

    This is a textbook on the fundamentals of electro-engineering. It contains the following chapters: (1) Selected terms in electro-engineering; (2) Fundamental electric values; (3) Energy and its transformations; (4) Water, hydro-energy and hydro-energetic potential of the Slovak Republic; (5) Nuclear power engineering; (6) Conventional thermal power plants; (7) Heating and cogeneration of electric power and heat production; (8) Equipment of electricity supply system; (9) Measurements in electro-engineering; (10) Regulation of frequency and voltage, electric power quality

  9. Measuring Theory of Mind in Adults with Autism Spectrum Disorder

    Science.gov (United States)

    Brewer, Neil; Young, Robyn L.; Barnett, Emily

    2017-01-01

    Deficits in Theory of Mind (ToM)--the ability to interpret others' beliefs, intentions and emotions--undermine the ability of individuals with Autism Spectrum Disorder (ASD) to interact in socially normative ways. This study provides psychometric data for the Adult-Theory of Mind (A-ToM) measure using video-scenarios based in part on Happé's…

  10. Is education a fundamental right? People's lay theories about intellectual potential drive their positions on education

    OpenAIRE

    Savani, K; Rattan, A; Dweck, C S

    2017-01-01

    Does every child have a fundamental right to receive a high quality education? We propose that people’s beliefs about whether “nearly everyone” or “only some people” have high intellectual potential drive their positions on education. Three studies found that the more people believed that nearly everyone has high potential, the more they viewed education as a fundamental human right. Further, people who viewed education as a fundamental right, in turn, (1) were more likely to support the inst...

  11. Measurement Invariance: A Foundational Principle for Quantitative Theory Building

    Science.gov (United States)

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    This article describes why measurement invariance is a critical issue to quantitative theory building within the field of human resource development. Readers will learn what measurement invariance is and how to test for its presence using techniques that are accessible to applied researchers. Using data from a LibQUAL+[TM] study of user…

  12. Historical Views of Invariance: Evidence from the Measurement Theories of Thorndike, Thurstone, and Rasch.

    Science.gov (United States)

    Engelhard, George, Jr.

    1992-01-01

    A historical perspective is provided of the concept of invariance in measurement theory, describing sample-invariant item calibration and item-invariant measurement of individuals. Invariance as a key measurement concept is illustrated through the measurement theories of E. L. Thorndike, L. L. Thurstone, and G. Rasch. (SLD)

  13. Global anomalies in chiral lattice gauge theories

    International Nuclear Information System (INIS)

    Baer, O.

    2000-07-01

    We study global anomalies in a new approach to chiral gauge theories on the lattice, which is based on the Ginsparg-Wilson relation. In this approach, global anomalies make it impossible to define consistently a fermionic measure for the functional integral. We show that a global anomaly occurs in an SU(2) theory if the fundamental representation is used for the fermion fields. The generalization to higher representations is also discussed. In addition we establish a close relation between global anomalies and the spectral flow of the Dirac operator and employ it in a numerical computation to prove the existence of the global SU(2) anomaly in a different way. This method is inspired by an earlier work of Witten who first discovered this type of anomalies in continuum field theory. (orig.)

  14. Overview of classical test theory and item response theory for the quantitative assessment of items in developing patient-reported outcomes measures.

    Science.gov (United States)

    Cappelleri, Joseph C; Jason Lundy, J; Hays, Ron D

    2014-05-01

    The US Food and Drug Administration's guidance for industry document on patient-reported outcomes (PRO) defines content validity as "the extent to which the instrument measures the concept of interest" (FDA, 2009, p. 12). According to Strauss and Smith (2009), construct validity "is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity" (p. 7). Hence, both qualitative and quantitative information are essential in evaluating the validity of measures. We review classical test theory and item response theory (IRT) approaches to evaluating PRO measures, including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized "difficulty" (severity) order of items is represented by observed responses. If a researcher has limited qualitative data and wants to get preliminary information about the content validity of the instrument, then descriptive assessments using classical test theory should be the first step. As the sample size grows during subsequent stages of instrument development, confidence in the numerical estimates from Rasch and other IRT models (as well as those of classical test theory) would also grow. Classical test theory and IRT can be useful in providing a quantitative assessment of items and scales during the content-validity phase of PRO-measure development. Depending on the particular type of measure and the specific circumstances, classical test theory and/or IRT should be considered to help maximize the content validity of PRO measures. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.
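
    The descriptive checks listed above (category frequencies, floor and ceiling effects, item-total relationships) are straightforward to compute; the sketch below assumes a hypothetical matrix of integer item scores and is not code from the article.

        import numpy as np

        def ctt_descriptives(items):
            """items: (n_respondents, n_items) array of integer item scores."""
            items = np.asarray(items, dtype=float)
            total = items.sum(axis=1)
            floor = np.mean(total == total.min())    # share of respondents at the lowest observed score
            ceiling = np.mean(total == total.max())  # share at the highest observed score
            # Corrected item-total correlation: each item against the sum of the remaining items
            item_total_r = [np.corrcoef(items[:, j], total - items[:, j])[0, 1]
                            for j in range(items.shape[1])]
            freqs = [dict(zip(*np.unique(items[:, j], return_counts=True)))
                     for j in range(items.shape[1])]
            return {"floor": floor, "ceiling": ceiling,
                    "item_total_r": item_total_r, "category_frequencies": freqs}

        # Hypothetical 5-item scale scored 0-4, 200 simulated respondents
        rng = np.random.default_rng(0)
        print(ctt_descriptives(rng.integers(0, 5, size=(200, 5))))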

  15. Geometric measure theory a beginner's guide

    CERN Document Server

    Morgan, Frank

    1995-01-01

    Geometric measure theory is the mathematical framework for the study of crystal growth, clusters of soap bubbles, and similar structures involving minimization of energy. Morgan emphasizes geometry over proofs and technicalities, and includes a bibliography and abundant illustrations and examples. This Second Edition features a new chapter on soap bubbles as well as updated sections addressing volume constraints, surfaces in manifolds, free boundaries, and Besicovitch constant results. The text will introduce newcomers to the field and appeal to mathematicians working in the field.

  16. DOE Fundamentals Handbook: Classical Physics

    International Nuclear Information System (INIS)

    1992-06-01

    The Classical Physics Fundamentals Handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of physical forces and their properties. The handbook includes information on the units used to measure physical properties; vectors, and how they are used to show the net effect of various forces; Newton's Laws of motion, and how to use these laws in force and motion applications; and the concepts of energy, work, and power, and how to measure and calculate the energy involved in various applications. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility systems and equipment

  17. The Power of M Theory

    OpenAIRE

    Schwarz, John H.

    1995-01-01

    A proposed duality between type IIB superstring theory on R^9 X S^1 and a conjectured 11D fundamental theory (``M theory'') on R^9 X T^2 is investigated. Simple heuristic reasoning leads to a consistent picture relating the various p-branes and their tensions in each theory. Identifying the M theory on R^{10} X S^1 with type IIA superstring theory on R^{10}, in a similar fashion, leads to various relations among the p-branes of the IIA theory.

  18. A theory of the coherent fundamental plasma emission in Tokamaks

    International Nuclear Information System (INIS)

    Alves, M.V.; Chian, A.C.-L.

    1987-01-01

    A theoretical model of coherent radiation near the fundamental plasma frequency in tokamaks is proposed. It is shown that, in the presence of runaway electrons, the beam-generated Langmuir waves (L) can be parametrically converted into electromagnetic waves (T) through ponderomotive coupling to ion acoustic waves (S). Two types of pumps are considered: travelling wave pump and standing wave pump. Expressions are derived for the excitation conditions and the growth rates of electromagnetic decay instabilities (L -> T + S), electromagnetic fusion instabilities (L + S -> T) and electromagnetic oscillating two-stream instabilities (L -> T ± S*, where S* is a purely growing mode). (author) [pt

  19. A theory of the coherent fundamental plasma emission in Tokamaks

    International Nuclear Information System (INIS)

    Alves, M.V.; Chian, A.C.-L.

    1987-07-01

    A theoretical model of coherent radiation near the fundamental plasma frequency in Tokamaks is proposed. It is shown that, in the presence of runaway electrons, the beam-generated Langmuir waves (L) can be parametrically converted into electromagnetic waves (T) through ponderomotive coupling to ion acoustic waves (S). Two types of pumps are considered: a travelling wave pump and a standing wave pump. Expressions are derived for the excitation conditions and the growth rates of electromagnetic decay instabilities (L → T + S), electromagnetic fusion instabilities (L + S → T) and electromagnetic oscillating two-stream instabilities (L → T ± S*, where S* is a purely growing mode). (author) [pt

  20. Measure and integration an advanced course in basic procedures and applications

    CERN Document Server

    König, Heinz

    1997-01-01

    This book sets out to restructure certain fundamentals in measure and integration theory, and thus to free the theory from some notorious drawbacks. It centers around the ubiquitous task of producing appropriate contents and measures from more primitive data, in order to extend elementary contents and to represent elementary integrals. This task has not been met with adequate unified means so far. The traditional main tools, the Carathéodory and Daniell-Stone theorems, are too restrictive and had to be supplemented by other ad-hoc procedures. Around 1970 a new approach emerged, based on the notion of regularity, which in traditional measure theory is linked to topology. The present book develops the new approach into a systematic theory. The theory unifies the entire context and is much more powerful than the former means. It has striking implications all over measure theory and beyond. Thus it extends the Riesz representation theorem in terms of Radon measures from locally compact to arbitrary Hausdorff top...

  1. SUSY field theories in higher dimensions and integrable spin chains

    International Nuclear Information System (INIS)

    Gorsky, A.; Gukov, S.; Mironov, A.

    1998-01-01

    Five- and six-dimensional SUSY gauge theories, with one or two compactified directions, are discussed. The 5d theories with the matter hypermultiplets in the fundamental representation are associated with the twisted XXZ spin chain, while the group product case with bi-fundamental matter corresponds to the higher rank spin chains. The Riemann surfaces for 6d theories with fundamental matter and two compact directions are proposed to correspond to the XYZ spin chain based on the Sklyanin algebra. We also discuss the obtained results within the brane and geometrical engineering frameworks and explain the relation to the toric diagrams. (orig.)

  2. Improving measurement of injection drug risk behavior using item response theory.

    Science.gov (United States)

    Janulis, Patrick

    2014-03-01

    Recent research highlights the multiple steps to preparing and injecting drugs and the resultant viral threats faced by drug users. This research suggests that more sensitive measurement of injection drug HIV risk behavior is required. In addition, growing evidence suggests there are gender differences in injection risk behavior. However, the potential for differential item functioning between genders has not been explored. To explore item response theory as an improved measurement modeling technique that provides empirically justified scaling of injection risk behavior and to examine for potential gender-based differential item functioning. Data is used from three studies in the National Institute on Drug Abuse's Criminal Justice Drug Abuse Treatment Studies. A two-parameter item response theory model was used to scale injection risk behavior and logistic regression was used to examine for differential item functioning. Item fit statistics suggest that item response theory can be used to scale injection risk behavior and these models can provide more sensitive estimates of risk behavior. Additionally, gender-based differential item functioning is present in the current data. Improved measurement of injection risk behavior using item response theory should be encouraged as these models provide increased congruence between construct measurement and the complexity of injection-related HIV risk. Suggestions are made to further improve injection risk behavior measurement. Furthermore, results suggest direct comparisons of composite scores between males and females may be misleading and future work should account for differential item functioning before comparing levels of injection risk behavior.
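
    For orientation, the two-parameter logistic (2PL) model referred to above gives the probability of endorsing a risk item as a function of the latent trait; the discrimination and severity values below are hypothetical, not estimates from the studies.

        import numpy as np

        def two_pl(theta, a, b):
            """2PL item response function: P(endorse) = 1 / (1 + exp(-a * (theta - b)))."""
            return 1.0 / (1.0 + np.exp(-a * (theta - b)))

        theta = np.linspace(-3, 3, 7)        # latent injection-risk propensity
        print(two_pl(theta, a=1.5, b=0.5))   # hypothetical discrimination a and severity b

        # Uniform differential item functioning can then be probed by logistic regression of the
        # item response on the trait estimate plus a gender indicator; a significant gender term
        # signals different endorsement probabilities at the same underlying risk level.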

  3. Practical application of the theory of errors in measurement

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    This chapter addresses the practical application of the theory of errors in measurement. The topics of the chapter include fixing on a maximum desired error, selecting a maximum error, the procedure for limiting the error, utilizing a standard procedure, setting specifications for a standard procedure, and selecting the number of measurements to be made
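
    A worked instance of the last step (selecting the number of measurements): if single readings have standard deviation sigma and the mean must lie within a maximum error E at roughly 95% confidence, the usual normal-theory estimate is n >= (1.96 sigma / E)^2. The numbers in the sketch below are illustrative only.

        import math

        def measurements_needed(sigma, max_error, z=1.96):
            """Smallest n with z * sigma / sqrt(n) <= max_error (normal-theory approximation)."""
            return math.ceil((z * sigma / max_error) ** 2)

        # Illustrative: sigma = 0.5 units per reading, desired maximum error of the mean = 0.2 units
        print(measurements_needed(0.5, 0.2))  # -> 25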

  4. Selfish goals serve more fundamental social and biological goals.

    Science.gov (United States)

    Becker, D Vaughn; Kenrick, Douglas T

    2014-04-01

    Proximate selfish goals reflect the machinations of more fundamental goals such as self-protection and reproduction. Evolutionary life history theory allows us to make predictions about which goals are prioritized over others, which stimuli release which goals, and how the stages of cognitive processing are selectively influenced to better achieve the aims of those goals.

  5. A theory of the strong interactions

    International Nuclear Information System (INIS)

    Gross, D.J.

    1979-01-01

    The most promising candidate for a fundamental microscopic theory of the strong interactions is a gauge theory of colored quarks-Quantum Chromodynamics (QCD). There are many excellent reasons for believing in this theory. It embodies the broken symmetries, SU(3) and chiral SU(3)xSU(3), of the strong interactions and reflects the success of (albeit crude) quark models in explaining the spectrum of the observed hadrons. The hidden quantum number of color, necessary to account for the quantum numbers of the low lying hadrons, plays a fundamental role in this theory as the SU(3) color gauge vector 'gluons' are the mediators of the strong interactions. The absence of physical quark states can be 'explained' by the hypothesis of color confinement, i.e. that hadrons are permanently bound in color singlet bound states. Finally this theory is unique in being asymptotically free, thus accounting for the almost free field theory behavior of quarks observed at short distances. (Auth.)

  6. When fast is better: protein folding fundamentals and mechanisms from ultrafast approaches.

    Science.gov (United States)

    Muñoz, Victor; Cerminara, Michele

    2016-09-01

    Protein folding research stalled for decades because conventional experiments indicated that proteins fold slowly and in single strokes, whereas theory predicted a complex interplay between dynamics and energetics resulting in myriad microscopic pathways. Ultrafast kinetic methods turned the field upside down by providing the means to probe fundamental aspects of folding, test theoretical predictions and benchmark simulations. Accordingly, experimentalists could measure the timescales for all relevant folding motions, determine the folding speed limit and confirm that folding barriers are entropic bottlenecks. Moreover, a catalogue of proteins that fold extremely fast (microseconds) could be identified. Such fast-folding proteins cross shallow free energy barriers or fold downhill, and thus unfold with minimal co-operativity (gradually). A new generation of thermodynamic methods has exploited this property to map folding landscapes, interaction networks and mechanisms at nearly atomic resolution. In parallel, modern molecular dynamics simulations have finally reached the timescales required to watch fast-folding proteins fold and unfold in silico. All of these findings have buttressed the fundamentals of protein folding predicted by theory, and are now offering the first glimpses at the underlying mechanisms. Fast folding appears to also have functional implications as recent results connect downhill folding with intrinsically disordered proteins, their complex binding modes and ability to moonlight. These connections suggest that the coupling between downhill (un)folding and binding enables such protein domains to operate analogically as conformational rheostats. © 2016 The Author(s).

  7. Real analysis an introduction to the theory of real functions and integration

    CERN Document Server

    Dshalalow, Jewgeni H

    2000-01-01

    Designed for use in a two-semester course on abstract analysis, REAL ANALYSIS: An Introduction to the Theory of Real Functions and Integration illuminates the principal topics that constitute real analysis. Self-contained, with coverage of topology, measure theory, and integration, it offers a thorough elaboration of major theorems, notions, and constructions needed not only by mathematics students but also by students of statistics and probability, operations research, physics, and engineering. Structured logically and flexibly through the author's many years of teaching experience, the material is presented in three main sections: Part I, chapters 1 through 3, covers the preliminaries of set theory and the fundamentals of metric spaces and topology. This section can also serve as a text for first courses in topology. Part II, chapters 4 through 7, details the basics of measure and integration and stands independently for use in a separate measure theory course. Part III addresses more advanced topics, includin...

  8. Quantum information and relativity theory

    International Nuclear Information System (INIS)

    Peres, Asher; Terno, Daniel R.

    2004-01-01

    This article discusses the intimate relationship between quantum mechanics, information theory, and relativity theory. Taken together these are the foundations of present-day theoretical physics, and their interrelationship is an essential part of the theory. The acquisition of information from a quantum system by an observer occurs at the interface of classical and quantum physics. The authors review the essential tools needed to describe this interface, i.e., Kraus matrices and positive-operator-valued measures. They then discuss how special relativity imposes severe restrictions on the transfer of information between distant systems and the implications of the fact that quantum entropy is not a Lorentz-covariant concept. This leads to a discussion of how it comes about that Lorentz transformations of reduced density matrices for entangled systems may not be completely positive maps. Quantum field theory is, of course, necessary for a consistent description of interactions. Its structure implies a fundamental tradeoff between detector reliability and localizability. Moreover, general relativity produces new and counterintuitive effects, particularly when black holes (or, more generally, event horizons) are involved. In this more general context the authors discuss how most of the current concepts in quantum information theory may require a reassessment
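
    For reference, a positive-operator-valued measure as mentioned here is a set of positive operators summing to the identity, with outcome probabilities given by the Born rule; in terms of Kraus operators K_i (standard definitions, not notation taken from the review):

        E_i \ge 0, \qquad \sum_i E_i = \mathbb{1}, \qquad p_i = \operatorname{Tr}(\rho E_i), \qquad
        E_i = K_i^{\dagger} K_i, \qquad \rho \mapsto \frac{K_i \rho K_i^{\dagger}}{p_i}.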

  9. Quantum Mechanics of Fundamental Systems: The Quest for Beauty and Simplicity

    CERN Document Server

    Zanelli, Jorge

    2009-01-01

    A collection of contributed papers by former collaborators and colleagues of the author. It includes such topics as General Relativity, Quantum Gravity, and String Theory, ranging from mathematical structures underlying the fundamental interactions to cosmological scenarios describing the universe at its birth

  10. Information theory and rate distortion theory for communications and compression

    CERN Document Server

    Gibson, Jerry

    2013-01-01

    This book is very specifically targeted to problems in communications and compression by providing the fundamental principles and results in information theory and rate distortion theory for these applications and presenting methods that have proved and will prove useful in analyzing and designing real systems. The chapters contain treatments of entropy, mutual information, lossless source coding, channel capacity, and rate distortion theory; however, it is the selection, ordering, and presentation of the topics within these broad categories that is unique to this concise book. While the cover
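
    As an example of the rate distortion results treated in the book, the rate-distortion function of a memoryless Gaussian source with variance \sigma^2 under squared-error distortion has the standard closed form

        R(D) = \tfrac{1}{2} \log_2\!\left( \sigma^2 / D \right) \quad \text{for } 0 < D \le \sigma^2, \qquad R(D) = 0 \quad \text{for } D > \sigma^2 .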

  11. Pure Gravities via Color-Kinematics Duality for Fundamental Matter

    CERN Document Server

    Johansson, Henrik

    2015-01-01

    We give a prescription for the computation of loop-level scattering amplitudes in pure Einstein gravity, and four-dimensional pure supergravities, using the color-kinematics duality. Amplitudes are constructed using double copies of pure (super-)Yang-Mills parts and additional contributions from double copies of fundamental matter, which are treated as ghosts. The opposite-statistics states cancel the unwanted dilaton and axion in the bosonic theory, as well as the extra matter supermultiplets in supergravities. As a spinoff, we obtain a prescription for obtaining amplitudes in supergravities with arbitrary non-self-interacting matter. As a prerequisite, we extend the color-kinematics duality from the adjoint to the fundamental representation of the gauge group. We explain the numerator relations that the fundamental kinematic Lie algebra should satisfy. We give nontrivial evidence supporting our construction using explicit tree and loop amplitudes, as well as more general arguments.

  12. Fundamentals of convex analysis duality, separation, representation, and resolution

    CERN Document Server

    Panik, Michael J

    1993-01-01

    Fundamentals of Convex Analysis offers an in-depth look at some of the fundamental themes covered within an area of mathematical analysis called convex analysis. In particular, it explores the topics of duality, separation, representation, and resolution. The work is intended for students of economics, management science, engineering, and mathematics who need exposure to the mathematical foundations of matrix games, optimization, and general equilibrium analysis. It is written at the advanced undergraduate to beginning graduate level and the only formal preparation required is some familiarity with set operations and with linear algebra and matrix theory. Fundamentals of Convex Analysis is self-contained in that a brief review of the essentials of these tool areas is provided in Chapter 1. Chapter exercises are also provided. Topics covered include: convex sets and their properties; separation and support theorems; theorems of the alternative; convex cones; dual homogeneous systems; basic solutions and comple...

  13. BOOK REVIEW: Evaluating the Measurement Uncertainty: Fundamentals and practical guidance

    Science.gov (United States)

    Lira, Ignacio

    2003-08-01

    on to treat evaluation of expanded uncertainty, joint treatment of several measurands, least-squares adjustment, curve fitting and more. Chapter 6 is devoted to Bayesian inference. Perhaps one can say that Evaluating the Measurement Uncertainty caters to a wider reader-base than the GUM; however, a mathematical or statistical background is still advantageous. Also, this is not a book with a library of worked overall uncertainty evaluations for various measurements; the feel of the book is rather theoretical. The novice will still have some work to do—but this is a good place to start. I think this book is a fitting companion to the GUM because the text complements the GUM, from fundamental principles to more sophisticated measurement situations, and moreover includes intelligent discussion regarding intent and interpretation. Evaluating the Measurement Uncertainty is detailed, and I think most metrologists will really enjoy the detail and care put into this book. Jennifer Decker
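
    The evaluation steps discussed in the review (combining standard uncertainties and forming an expanded uncertainty) follow the familiar GUM recipe; the sensitivity coefficients and uncertainties in the sketch below are generic placeholders, not examples from the book.

        import math

        def combined_standard_uncertainty(contributions):
            """contributions: list of (sensitivity_coefficient, standard_uncertainty) pairs.
            Returns u_c = sqrt(sum((c_i * u_i)**2)), assuming uncorrelated input quantities."""
            return math.sqrt(sum((c * u) ** 2 for c, u in contributions))

        # Placeholder input quantities for some measurand y = f(x1, x2, x3)
        contribs = [(1.0, 0.02), (0.5, 0.05), (2.0, 0.01)]
        u_c = combined_standard_uncertainty(contribs)
        U = 2.0 * u_c  # expanded uncertainty with coverage factor k = 2 (~95% coverage)
        print(f"u_c = {u_c:.4f}, U (k=2) = {U:.4f}")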

  14. Fundamentals of semiconductor manufacturing and process control

    CERN Document Server

    May, Gary S

    2006-01-01

    A practical guide to semiconductor manufacturing from process control to yield modeling and experimental design Fundamentals of Semiconductor Manufacturing and Process Control covers all issues involved in manufacturing microelectronic devices and circuits, including fabrication sequences, process control, experimental design, process modeling, yield modeling, and CIM/CAM systems. Readers are introduced to both the theory and practice of all basic manufacturing concepts. Following an overview of manufacturing and technology, the text explores process monitoring methods, including those that focus on product wafers and those that focus on the equipment used to produce wafers. Next, the text sets forth some fundamentals of statistics and yield modeling, which set the foundation for a detailed discussion of how statistical process control is used to analyze quality and improve yields. The discussion of statistical experimental design offers readers a powerful approach for systematically varying controllable p...

  15. Fundamental care and knowledge interests: Implications for nursing science.

    Science.gov (United States)

    Granero-Molina, José; Fernández-Sola, Cayetano; Mateo-Aguilar, Ester; Aranda-Torres, Cayetano; Román-López, Pablo; Hernández-Padilla, José Manuel

    2017-11-09

    To characterise the intratheoretical interests of knowledge in nursing science as an epistemological framework for fundamental care. For Jürgen Habermas, theory does not separate knowledge interests from life. All knowledge, understanding and human research is always interested. Habermas formulated the knowledge interests in empirical-analytical, historical-hermeneutic and critical social sciences; but said nothing about health sciences and nursing science. Discursive paper. The article is organised into five sections that develop our argument about the implications of the Habermasian intratheoretical interests in nursing science and fundamental care: the persistence of a technical interest, the predominance of a practical interest, the importance of an emancipatory interest, "being there" to understand individuals' experience and an "existential crisis" that uncovers the individual's subjectivity. The nursing discipline can take on practical and emancipatory interests (together with a technical interest) as its fundamental knowledge interests. Nurses' privileged position in the delivery of fundamental care gives them the opportunity to gain a deep understanding of the patient's experience and illness process through physical contact and empathic communication. In clinical, academic and research environments, nurses should highlight the importance of fundamental care, showcasing the value of practical and emancipatory knowledge. This process could help to improve nursing science's leadership, social visibility and idiosyncrasy. © 2017 John Wiley & Sons Ltd.

  16. Polarization asymmetries and gauge theory interactions at short distances

    International Nuclear Information System (INIS)

    Craigie, N.S.

    1983-01-01

    In this talk, we give the arguments as to why spin asymmetries test fundamental properties of the underlying gauge theories of elementary particles, concentrating mainly on electro-weak and QCD interactions, but also looking at the future and possible signatures for supersymmetric strong interactions. We also mention briefly the role helicity asymmetry measurements can play as regards higher order corrections, including higher twist, in QCD. (orig./HSI)

  17. A short course on measure and probability theories

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre

    2004-02-01

    This brief Introduction to Measure Theory, and its applications to Probabilities, corresponds to the lecture notes of a seminar series given at Sandia National Laboratories in Livermore, during the spring of 2003. The goal of these seminars was to provide a minimal background to Computational Combustion scientists interested in using more advanced stochastic concepts and methods, e.g., in the context of uncertainty quantification. Indeed, most mechanical engineering curricula do not provide students with formal training in the field of probability, and even less in measure theory. However, stochastic methods have been used more and more extensively in the past decade, and have provided more successful computational tools. Scientists at the Combustion Research Facility of Sandia National Laboratories have been using computational stochastic methods for years. Addressing more and more complex applications, and facing difficult problems that arose in applications showed the need for a better understanding of theoretical foundations. This is why the seminar series was launched, and these notes summarize most of the concepts which have been discussed. The goal of the seminars was to bring a group of mechanical engineers and computational combustion scientists to a full understanding of N. Wiener's polynomial chaos theory. Therefore, these lecture notes are built along those lines, and are not intended to be exhaustive. In particular, the author welcomes any comments or criticisms.
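
    The Wiener polynomial chaos toward which the notes build expands a square-integrable random quantity in Hermite polynomials of a standard Gaussian variable \xi (a textbook form, not taken from the lecture notes):

        u(\xi) = \sum_{k \ge 0} u_k \, \mathrm{He}_k(\xi), \qquad u_k = \frac{1}{k!} \operatorname{E}\!\left[ u(\xi) \, \mathrm{He}_k(\xi) \right],

    where the probabilists' Hermite polynomials satisfy \operatorname{E}[\mathrm{He}_j(\xi)\,\mathrm{He}_k(\xi)] = k! \, \delta_{jk}.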

  18. Mermin Non-Locality in Abstract Process Theories

    Directory of Open Access Journals (Sweden)

    Stefano Gogioso

    2015-11-01

    The study of non-locality is fundamental to the understanding of quantum mechanics. The past 50 years have seen a number of non-locality proofs, but its fundamental building blocks, and the exact role it plays in quantum protocols, have remained elusive. In this paper, we focus on a particular flavour of non-locality, generalising Mermin's argument on the GHZ state. Using strongly complementary observables, we provide necessary and sufficient conditions for Mermin non-locality in abstract process theories. We show that the existence of more phases than classical points (aka eigenstates) is not sufficient, and that the key to Mermin non-locality lies in the presence of certain algebraically non-trivial phases. This allows us to show that fRel, a favourite toy model for categorical quantum mechanics, is Mermin local. We show Mermin non-locality to be the key resource ensuring the device-independent security of the HBB CQ(N,N) family of Quantum Secret Sharing protocols. Finally, we challenge the unspoken assumption that the measurements involved in Mermin-type scenarios should be complementary (like the pair X,Y), opening the doors to a much wider class of potential experimental setups than currently employed. In short, we give conditions for Mermin non-locality tests on any number of systems, where each party has an arbitrary number of measurement choices, where each measurement has an arbitrary number of outcomes, and that, further, work in any abstract process theory.
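
    The Mermin argument that the paper generalizes can be checked numerically for the three-qubit GHZ state: the quantum expectation of the Mermin operator reaches 4, while any local hidden-variable assignment is bounded by 2. The snippet below is an illustration, not code from the paper.

        import numpy as np

        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

        def kron3(a, b, c):
            return np.kron(np.kron(a, b), c)

        # Mermin operator M = XXX - XYY - YXY - YYX
        M = kron3(X, X, X) - kron3(X, Y, Y) - kron3(Y, X, Y) - kron3(Y, Y, X)

        ghz = np.zeros(8, dtype=complex)      # (|000> + |111>) / sqrt(2)
        ghz[0] = ghz[7] = 1 / np.sqrt(2)

        print(np.real(ghz.conj() @ M @ ghz))  # -> 4.0, versus the local realistic bound of 2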

  19. Problems in probability theory, mathematical statistics and theory of random functions

    CERN Document Server

    Sveshnikov, A A

    1979-01-01

    Problem solving is the main thrust of this excellent, well-organized workbook. Suitable for students at all levels in probability theory and statistics, the book presents over 1,000 problems and their solutions, illustrating fundamental theory and representative applications in the following fields: Random Events; Distribution Laws; Correlation Theory; Random Variables; Entropy & Information; Markov Processes; Systems of Random Variables; Limit Theorems; Data Processing; and more.The coverage of topics is both broad and deep, ranging from the most elementary combinatorial problems through lim

  20. Image and video compression for multimedia engineering fundamentals, algorithms, and standards

    CERN Document Server

    Shi, Yun Q

    2008-01-01

    Part I: Fundamentals Introduction Quantization Differential Coding Transform Coding Variable-Length Coding: Information Theory Results (II) Run-Length and Dictionary Coding: Information Theory Results (III) Part II: Still Image Compression Still Image Coding: Standard JPEG Wavelet Transform for Image Coding: JPEG2000 Nonstandard Still Image Coding Part III: Motion Estimation and Compensation Motion Analysis and Motion Compensation Block Matching Pel-Recursive Technique Optical Flow Further Discussion and Summary on 2-D Motion Estimation Part IV: Video Compression Fundam

  1. Fundamental trends in fluid-structure interaction

    CERN Document Server

    Galdi, Giovanni P

    2010-01-01

    The interaction of a fluid with a solid body is a widespread phenomenon in nature, occurring at different scales and different applied disciplines. Interestingly enough, even though the mathematical theory of the motion of bodies in a liquid is one of the oldest and most classical problems in fluid mechanics, mathematicians have, only very recently, become interested in a systematic study of the basic problems related to fluid-structure interaction, from both analytical and numerical viewpoints. ""Fundamental Trends in Fluid-Structure Interaction"" is a unique collection of important papers wr

  2. Metric-independent measures for supersymmetric extended object theories on curved backgrounds

    International Nuclear Information System (INIS)

    Nishino, Hitoshi; Rajpoot, Subhash

    2014-01-01

    For Green–Schwarz superstring σ-model on curved backgrounds, we introduce a non-metric measure Φ ≡ ϵ^{ij} ϵ_{IJ} (∂_i φ^I)(∂_j φ^J) with two scalars φ^I (I=1,2) used in ‘Two-Measure Theory’ (TMT). As in the flat-background case, the string tension T = (2πα′)^{−1} emerges as an integration constant for the A_i-field equation. This mechanism is further generalized to supermembrane theory, and to super-p-brane theory, both on general curved backgrounds. This shows the universal applications of dynamical measure of TMT to general supersymmetric extended objects on general curved backgrounds

  3. String theory considered as a local gauge theory of an extended object

    International Nuclear Information System (INIS)

    Chan Hongmo; Tsou Sheungtsun.

    1986-11-01

    In attempting to understand more about the physical origin of the so-called 'chordal gauge symmetry' in string field theory it is found that one can, at least formally, consider the theory as a generalised local gauge theory. However, the fundamental object is no longer a point, as in ordinary gauge theory, but a point with a tail, and it is the motion of this tail which represents the internal gauge degree of freedom. Moreover, the differential geometry is based on the non-abelian conformal group instead of the usual translation group. (author)

  4. Determination of the potential for fundamental- and adjoint-representation sources in SU(2) in three dimensions

    International Nuclear Information System (INIS)

    Mawhinney, R.D.

    1990-01-01

    Pure SU(2) lattice gauge theory in three dimensions is studied by Monte Carlo simulation with a determination of the potential between fundamental- and adjoint-representation sources as a major goal. A 32^3 lattice is used and Wilson loops up to 16 by 16 are measured using a modification to the standard multihit variance reduction which improves the statistics by at least a factor of 3 at β=6.0. The integrated autocorrelation times measured for the loops show a peak for loops of size β by β. The fundamental- and adjoint-representation potentials are seen to have the same functional form to very high accuracy and their numerical values are in the ratio of their Casimir operators. The string tension is extracted and scaling is seen to within a few percent over a range of couplings which correspond to a factor of 2 change in the glueball mass. Correlated errors have been taken into account in the extraction of the potentials from the Wilson-loop values
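
    For orientation, the static potential is obtained from the large-T behaviour of rectangular Wilson loops, and the quoted "ratio of their Casimir operators" evaluates for SU(2) to 8/3 (standard relations, not expressions quoted from the paper):

        V(R) = -\lim_{T \to \infty} \frac{1}{T} \ln \langle W(R, T) \rangle, \qquad
        \frac{V_{\mathrm{adj}}(R)}{V_{\mathrm{fund}}(R)} \simeq \frac{C_{\mathrm{adj}}}{C_{\mathrm{fund}}} = \frac{N}{(N^2 - 1)/(2N)} \Big|_{N=2} = \frac{8}{3}.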

  5. Discrete field theories and spatial properties of strings

    International Nuclear Information System (INIS)

    Klebanov, I.; Susskind, L.

    1988-10-01

    We use the ground-state wave function in the light-cone gauge to study the spatial properties of fundamental strings. We find that, as the cut-off in the parameter space is removed, the strings are smooth and have a divergent size. Guided by these properties, we consider a large-N lattice gauge theory which has an unstable phase where the size of strings diverges. We show that this phase exactly describes free fundamental strings. The lattice spacing does not have to be taken to zero for this equivalence to hold. Thus, exact rotation and translation invariance is restored in a discrete space. This suggests that the number of fundamental short-distance degrees of freedom in string theory is much smaller than in a conventional field theory. 11 refs., 4 figs

  6. Scattering amplitudes in gauge theories

    Energy Technology Data Exchange (ETDEWEB)

    Henn, Johannes M. [Institute for Advanced Study, Princeton, NJ (United States). School of Natural Sciences; Plefka, Jan C. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik

    2014-03-01

    First monographical text on this fundamental topic. Course-tested, pedagogical and self-contained exposition. Includes exercises and solutions. At the fundamental level, the interactions of elementary particles are described by quantum gauge field theory. The quantitative implications of these interactions are captured by scattering amplitudes, traditionally computed using Feynman diagrams. In the past decade tremendous progress has been made in our understanding of and computational abilities with regard to scattering amplitudes in gauge theories, going beyond the traditional textbook approach. These advances build upon on-shell methods that focus on the analytic structure of the amplitudes, as well as on their recently discovered hidden symmetries. In fact, when expressed in suitable variables the amplitudes are much simpler than anticipated and hidden patterns emerge. These modern methods are of increasing importance in phenomenological applications arising from the need for high-precision predictions for the experiments carried out at the Large Hadron Collider, as well as in foundational mathematical physics studies on the S-matrix in quantum field theory. Bridging the gap between introductory courses on quantum field theory and state-of-the-art research, these concise yet self-contained and course-tested lecture notes are well-suited for a one-semester graduate level course or as a self-study guide for anyone interested in fundamental aspects of quantum field theory and its applications. The numerous exercises and solutions included will help readers to embrace and apply the material presented in the main text.

  7. Scattering amplitudes in gauge theories

    International Nuclear Information System (INIS)

    Henn, Johannes M.; Plefka, Jan C.

    2014-01-01

    First monographical text on this fundamental topic. Course-tested, pedagogical and self-contained exposition. Includes exercises and solutions. At the fundamental level, the interactions of elementary particles are described by quantum gauge field theory. The quantitative implications of these interactions are captured by scattering amplitudes, traditionally computed using Feynman diagrams. In the past decade tremendous progress has been made in our understanding of and computational abilities with regard to scattering amplitudes in gauge theories, going beyond the traditional textbook approach. These advances build upon on-shell methods that focus on the analytic structure of the amplitudes, as well as on their recently discovered hidden symmetries. In fact, when expressed in suitable variables the amplitudes are much simpler than anticipated and hidden patterns emerge. These modern methods are of increasing importance in phenomenological applications arising from the need for high-precision predictions for the experiments carried out at the Large Hadron Collider, as well as in foundational mathematical physics studies on the S-matrix in quantum field theory. Bridging the gap between introductory courses on quantum field theory and state-of-the-art research, these concise yet self-contained and course-tested lecture notes are well-suited for a one-semester graduate level course or as a self-study guide for anyone interested in fundamental aspects of quantum field theory and its applications. The numerous exercises and solutions included will help readers to embrace and apply the material presented in the main text.

  8. Measuring up advances in how we assess reading ability

    CERN Document Server

    Sabatini, John; O'Reilly, Tenaha

    2012-01-01

    Sabatini, Albro and O'Reilly believe that in light of federal legislation towards common core standards and assessments, as well as significant national investments in reading and literacy education, it is a critical and opportune time to bring together the research and measurement community to address fundamental issues of measuring reading comprehension, in theory and in practice.

  9. All the fundamental massless bosonic fields in superstring theory

    International Nuclear Information System (INIS)

    Manoukian, E.B.

    2012-01-01

    A systematic analysis of all the massless bosonic fields in superstring theory is carried out. Emphasis is put on the derivation of their propagators, their polarization aspects and the investigation of their underlying constraints as well as their number of degrees of freedom. The treatment is given in the presence of external sources, in the celebrated Coulomb gauge, ensuring the positivity of the formalism - a result which is also established in the process. The challenge here is the investigation involved in the self-dual fourth rank anti-symmetric tensor field. No constraints are imposed on the external sources so that their components may be varied independently, thus the complete expressions of the propagators may be obtained. As emphasized in our earlier work, the latter condition is an important one in dynamical theories with constraints giving rise to modifications such as Faddeev-Popov factors. The analysis is carried out in 10 dimensions, not only because of the consistency requirement by the superstrings, but also in order to take into account the self-duality character of the fourth rank anti-symmetric tensor field as spelled out in the paper. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  10. String Theory in a Nutshell

    CERN Document Server

    Kiritsis, Elias

    2007-01-01

    This book is the essential new introduction to modern string theory, by one of the world's authorities on the subject. Concise, clearly presented, and up-to-date, String Theory in a Nutshell brings together the best understood and most important aspects of a theory that has been evolving since the early 1980s. A core model of physics that substitutes one-dimensional extended "strings" for zero-dimensional point-like particles (as in quantum field theory), string theory has been the leading candidate for a theory that would successfully unify all fundamental forces of nature, includin

  11. Critical investigation of Jauch's approach to the quantum theory of measurement

    International Nuclear Information System (INIS)

    Herbut, Fedor

    1986-01-01

    To make Jauch's approach more realistic, his assumptions are modified in two ways: (1) On the quantum system plus the measuring apparatus (S + MA) after the measuring interaction has ceased, one can actually measure only operators of the form given. (2) Measurement is defined in the most general way (including, besides first-kind, also second-kind and third-kind or indirect measurements). It is shown that Jauch's basic result that the microstates (statistical operators) of S + MA before and after the collapse correspond to the same macrostate (belong to the same equivalence class of microstates) remains valid under the above modifications, and that the significance of this result goes beyond measurement theory. On the other hand, it is argued that taking the orthodox (i.e. uncompromisingly quantum) view of quantum mechanics, it is not the collapse, but the Jauch-type macrostates that are spurious in a Jauch-type theory. (author)

  12. Fundamentals of functional analysis

    CERN Document Server

    Farenick, Douglas

    2016-01-01

    This book provides a unique path for graduate or advanced undergraduate students to begin studying the rich subject of functional analysis with fewer prerequisites than is normally required. The text begins with a self-contained and highly efficient introduction to topology and measure theory, which focuses on the essential notions required for the study of functional analysis, and which are often buried within full-length overviews of the subjects. This is particularly useful for those in applied mathematics, engineering, or physics who need to have a firm grasp of functional analysis, but not necessarily some of the more abstruse aspects of topology and measure theory normally encountered. The reader is assumed to only have knowledge of basic real analysis, complex analysis, and algebra. The latter part of the text provides an outstanding treatment of Banach space theory and operator theory, covering topics not usually found together in other books on functional analysis. Written in a clear, concise manner,...

  13. A new theory of gravitation

    International Nuclear Information System (INIS)

    Logunov, A.A.

    1989-01-01

    The author believes that the General Relativity Theory (GRT) suffers from a substantial deficiency since it ignores the fundamental laws of conservation of energy. Einstein neglected the classical concept of the field due to his belief in the truth of the principle of equivalence between forces of inertia and gravitation. This equivalence leads, as the author says, to nonequivalence of these forces, making GRT logically contradictory from the physical point of view. The author considers GRT as a certain stage in the course of the study of space-time and gravitation, and suggests a new theory called the Relativistic Theory of Gravitation (RTG) which obeys the fundamental laws of conservation, and which is justified in some of its aspects by astronomical observations. RTG does not suffer from some deficiencies encountered in Einstein's theory. One is nonunique predictions of gravitation effects within the boundaries of the solar system. Also, RTG rejects some hypotheses, such as that of black holes. 7 refs.

  14. Random vibrations theory and practice

    CERN Document Server

    Wirsching, Paul H; Ortiz, Keith

    1995-01-01

    Random Vibrations: Theory and Practice covers the theory and analysis of mechanical and structural systems undergoing random oscillations due to any number of phenomena— from engine noise, turbulent flow, and acoustic noise to wind, ocean waves, earthquakes, and rough pavement. For systems operating in such environments, a random vibration analysis is essential to the safety and reliability of the system. By far the most comprehensive text available on random vibrations, Random Vibrations: Theory and Practice is designed for readers who are new to the subject as well as those who are familiar with the fundamentals and wish to study a particular topic or use the text as an authoritative reference. It is divided into three major sections: fundamental background, random vibration development and applications to design, and random signal analysis. Introductory chapters cover topics in probability, statistics, and random processes that prepare the reader for the development of the theory of random vibrations a...

  15. A FUNDAMENTAL PLANE OF SPIRAL STRUCTURE IN DISK GALAXIES

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Benjamin L.; Kennefick, Daniel; Kennefick, Julia; Shields, Douglas W. [Arkansas Center for Space and Planetary Sciences, University of Arkansas, 346 1/2 North Arkansas Avenue, Fayetteville, AR 72701 (United States); Westfall, Kyle B. [Kapteyn Astronomical Institute, University of Groningen, P.O. Box 800, NL-9700 AV Groningen (Netherlands); Flatman, Russell [School of Physics, Georgia Institute of Technology, 837 State Street, Atlanta, GA 30332 (United States); Hartley, Matthew T. [Department of Physics, University of Arkansas, 226 Physics Building, 835 West Dickson Street, Fayetteville, AR 72701 (United States); Berrier, Joel C. [Department of Physics and Astronomy, Rutgers, The State University of New Jersey, 136 Frelinghuysen Road, Piscataway, NJ 08854-8019 (United States); Martinsson, Thomas P. K. [Leiden Observatory, P.O. Box 9513, NL-2300 RA Leiden (Netherlands); Swaters, Rob A., E-mail: bld002@email.uark.edu [National Optical Astronomy Observatory, 950 North Cherry Avenue, Tucson, AZ 85719 (United States)

    2015-03-20

    Spiral structure is the most distinctive feature of disk galaxies and yet debate persists about which theory of spiral structure is correct. Many versions of the density wave theory demand that the pitch angle be uniquely determined by the distribution of mass in the bulge and disk of the galaxy. We present evidence that the tangent of the pitch angle of logarithmic spiral arms in disk galaxies correlates strongly with the density of neutral atomic hydrogen in the disk and with the central stellar bulge mass of the galaxy. These three quantities, when plotted against each other, form a planar relationship that we argue should be fundamental to our understanding of spiral structure in disk galaxies. We further argue that any successful theory of spiral structure must be able to explain this relationship.
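
    Schematically, the planar relationship described above can be written as a linear relation among the three quantities; the coefficients a, b and c are placeholders to be fitted, not values reported in the paper:

        \tan |\phi| = a + b \, \log_{10}\!\left( M_{\mathrm{bulge}} / M_{\odot} \right) + c \, \rho_{\mathrm{HI}},

    with \phi the logarithmic-spiral pitch angle, M_{\mathrm{bulge}} the central stellar bulge mass and \rho_{\mathrm{HI}} the density of neutral atomic hydrogen in the disk.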

  16. The Scientific Value of Cognitive Load Theory: A Research Agenda Based on the Structuralist View of Theories

    Science.gov (United States)

    Gerjets, Peter; Scheiter, Katharina; Cierniak, Gabriele

    2009-01-01

    In this paper, two methodological perspectives are used to elaborate on the value of cognitive load theory (CLT) as a scientific theory. According to the more traditional critical rationalism of Karl Popper, CLT cannot be considered a scientific theory because some of its fundamental assumptions cannot be tested empirically and are thus not…

  17. Solid state physics. Introduction to the fundamentals. 7. ed.

    International Nuclear Information System (INIS)

    Ibach, Harald; Lueth, Hans

    2009-01-01

    The present seventh edition of solid-state physics accommodates the trend toward nanophysics in research and teaching. The book is suitable for the study and teaching of physics and materials science, as well as micro- and nanoelectronics. It treats experiment and theory equally. Tables with fundamental experiments, preparation methods, and special physical effects as well as exercise problems round the book off [de

  18. Theory and Application of an Economic Performance Measure of Risk

    NARCIS (Netherlands)

    C. Niu (Cuizhen); X. Guo (Xu); M.J. McAleer (Michael); W.-K. Wong (Wing-Keung)

    2017-01-01

    Homm and Pigorsch (2012a) use the Aumann and Serrano index to develop a new economic performance measure (EPM), which is well known to have advantages over other measures. In this paper, we extend the theory by constructing a one-sample confidence interval of EPM, and construct

  19. Tales of the quantum understanding physics' most fundamental theory

    CERN Document Server

    Hobson, Art

    2017-01-01

    Everybody has heard that we live in a world made of atoms. But far more fundamentally, we live in a universe made of quanta. Many things are not made of atoms: light, radio waves, electric current, magnetic fields, Earth's gravitational field, not to mention exotica such as neutron stars, black holes, dark energy, and dark matter. But everything, including atoms, is made of highly unified or "coherent" bundles of energy called "quanta" that (like everything else) obey certain rules. In the case of the quantum, these rules are called "quantum physics." This is a book about quanta and their unexpected, some would say peculiar, behavior--tales, if you will, of the quantum. The quantum has developed the reputation of being capricious, bewildering, even impossible to understand. The peculiar habits of quanta are certainly not what we would have expected to find at the foundation of physical reality, but these habits are not necessarily bewildering and not at all impossible or paradoxical. This book explains those h...

  20. The renaissance of gauge theory

    International Nuclear Information System (INIS)

    Moriyasu, K.

    1982-01-01

    Gauge theory is a classic example of a good idea proposed before its time. A brief historical review of gauge theory is presented to see why it required over 50 years for gauge invariance to be rediscovered as the basic principle governing the fundamental forces of Nature. (author)

  1. Theory of relations

    CERN Document Server

    Fraïssé, R

    2011-01-01

    The first part of this book concerns the present state of the theory of chains (= total or linear orderings), in connection with some refinements of Ramsey's theorem, due to Galvin and Nash-Williams. This leads to Laver's fundamental embeddability theorem for scattered chains, using Nash-Williams' better quasi-orderings, barriers and forerunning. The second part (chapters 9 to 12) extends to general relations the main notions and results from order-type theory. An important connection appears with permutation theory (Cameron, Pouzet, Livingstone and Wagner) and with logics (existence criter

  2. The Price Equation, Gradient Dynamics, and Continuous Trait Game Theory.

    Science.gov (United States)

    Lehtonen, Jussi

    2018-01-01

    A recent article convincingly nominated the Price equation as the fundamental theorem of evolution and used it as a foundation to derive several other theorems. A major section of evolutionary theory that was not addressed is that of game theory and gradient dynamics of continuous traits with frequency-dependent fitness. Deriving fundamental results in these fields under the unifying framework of the Price equation illuminates similarities and differences between approaches and allows a simple, unified view of game-theoretical and dynamic concepts. Using Taylor polynomials and the Price equation, I derive a dynamic measure of evolutionary change, a condition for singular points, the convergence stability criterion, and an alternative interpretation of evolutionary stability. Furthermore, by applying the Price equation to a multivariable Taylor polynomial, the direct fitness approach to kin selection emerges. Finally, I compare these results to the mean gradient equation of quantitative genetics and the canonical equation of adaptive dynamics.
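    For readers unfamiliar with the starting point of this derivation, the standard textbook form of the Price equation is reproduced below; the paper's specific Taylor-expansion steps are not shown, and the notation is generic rather than the article's own.

```latex
% Standard form of the Price equation: change in the mean value of a trait z
% between generations, where w_i is the fitness of type i and \bar{w} the mean fitness.
\Delta \bar{z} \;=\;
\underbrace{\frac{\mathrm{Cov}(w_i, z_i)}{\bar{w}}}_{\text{selection}}
\;+\;
\underbrace{\frac{\mathrm{E}\!\left(w_i\,\Delta z_i\right)}{\bar{w}}}_{\text{transmission}}
```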

  3. String Theory: Big Problem for Small Size

    Science.gov (United States)

    Sahoo, S.

    2009-01-01

    String theory is the most promising candidate theory for a unified description of all the fundamental forces that exist in nature. It provides a mathematical framework that combines quantum theory with Einstein's general theory of relativity. The typical size of a string is of the order of 10^(-33) cm, called the Planck length. But due…

  4. Graph theory

    CERN Document Server

    Gould, Ronald

    2012-01-01

    This introduction to graph theory focuses on well-established topics, covering primary techniques and including both algorithmic and theoretical problems. The algorithms are presented with a minimum of advanced data structures and programming details. This thoroughly corrected 1988 edition provides insights to computer scientists as well as advanced undergraduates and graduate students of topology, algebra, and matrix theory. Fundamental concepts and notation and elementary properties and operations are the first subjects, followed by examinations of paths and searching, trees, and networks. S

  5. An Introduction to Item Response Theory for Patient-Reported Outcome Measurement

    Science.gov (United States)

    Nguyen, Tam H.; Han, Hae-Ra; Kim, Miyong T.

    2015-01-01

    The growing emphasis on patient-centered care has accelerated the demand for high-quality data from patient-reported outcome (PRO) measures. Traditionally, the development and validation of these measures has been guided by classical test theory. However, item response theory (IRT), an alternate measurement framework, offers promise for addressing practical measurement problems found in health-related research that have been difficult to solve through classical methods. This paper introduces foundational concepts in IRT, as well as commonly used models and their assumptions. Existing data on a combined sample (n = 636) of Korean American and Vietnamese American adults who responded to the High Blood Pressure Health Literacy Scale and the Patient Health Questionnaire-9 are used to exemplify typical applications of IRT. These examples illustrate how IRT can be used to improve the development, refinement, and evaluation of PRO measures. Greater use of methods based on this framework can increase the accuracy and efficiency with which PROs are measured. PMID:24403095
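    As a concrete illustration of the kind of model introduced in the paper, the sketch below evaluates the two-parameter logistic (2PL) item response function; the item parameters are invented for illustration and are not taken from the High Blood Pressure Health Literacy Scale or PHQ-9 data.

```python
# Minimal sketch of a 2PL item response function, a standard IRT model:
# P(endorse | theta) = 1 / (1 + exp(-a * (theta - b)))
import numpy as np

def irf_2pl(theta, a, b):
    """Probability of endorsing an item given latent trait theta,
    item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)           # latent trait (e.g., health literacy)
print(irf_2pl(theta, a=1.5, b=0.0))     # illustrative item parameters only
```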

  6. Peaks and Valleys : Experimental Asset Markets With Non-Monotonic Fundamentals

    NARCIS (Netherlands)

    Noussair, C.N.; Powell, O.R.

    2008-01-01

    We report the results of an experiment designed to measure how well asset market prices track fundamentals when the latter experience peaks and troughs. We observe greater price efficiency in markets in which fundamentals rise to a peak and then decline, than in markets in which fundamentals decline

  7. Density measurement using gamma radiation - theory and application

    International Nuclear Information System (INIS)

    Springer, E.K.

    1979-01-01

    There are still widespread uncertainties about the use and safety of gamma radiation in industry. This paper describes, using the example of radiometric density measurement, the theory of gamma radiation. The differences and advantages of the two types of detectors, the ionization chamber and the scintillation counter, are discussed. The degree of accuracy which can be expected from a radiometric density meter is defined, and the inter-relationship between source strength, measuring range and measuring length (normally the pipe diameter) in relation to the required measuring accuracy is explained in detail. The use of radioactive material requires the permission of the Atomic Energy Board. The formalities involved in obtaining a user's licence and the implementation of the safety standards set by the local authorities are discussed in depth.
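    The basic relation behind radiometric density gauging is Beer-Lambert attenuation; the sketch below inverts it for density. The attenuation coefficient, count rates and pipe geometry are illustrative assumptions, not values from the paper.

```python
# Gamma-ray transmission density measurement (Beer-Lambert law):
#   I = I0 * exp(-mu_m * rho * d)   =>   rho = ln(I0 / I) / (mu_m * d)
import math

def density_from_counts(i0, i, mu_mass, path_length):
    """Infer density (kg/m^3) from empty-pipe and full-pipe count rates.
    mu_mass: mass attenuation coefficient (m^2/kg); path_length: m."""
    return math.log(i0 / i) / (mu_mass * path_length)

# Illustrative numbers only (roughly a Cs-137 source across a 0.2 m pipe,
# with an assumed mass attenuation coefficient):
print(density_from_counts(i0=1.0e5, i=2.4e4, mu_mass=0.0077, path_length=0.2))
```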

  8. Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics

    Science.gov (United States)

    Wolpert, David H.

    2005-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.
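    The mathematical structure referred to here is, in essence, a maximum-entropy/mean-field variational problem. A schematic (and deliberately simplified) statement is given below; the notation and the free-energy form are our own shorthand, not the paper's.

```latex
% Schematic of a product-distribution (mean-field) approximation:
% approximate a joint distribution by a product of per-player marginals
% q(x) = \prod_i q_i(x_i), chosen to minimise a free-energy-like functional
% (expected cost G minus temperature T times Shannon entropy S).
q^\star \;=\; \arg\min_{q = \prod_i q_i}
\Big[\, \mathrm{E}_{q}\!\left[G(x)\right] \;-\; T\, S(q) \,\Big],
\qquad S(q) = -\sum_x q(x)\ln q(x)
```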

  9. Correlator of fundamental and anti-symmetric Wilson loops in AdS/CFT correspondence

    International Nuclear Information System (INIS)

    Tai, T.-S.; Yamaguchi, Satoshi

    2007-01-01

    We study the correlator of two circular Wilson loops, one in the anti-symmetric representation and the other in the fundamental representation, in 4-dimensional N = 4 super Yang-Mills theory. This correlator has a good AdS dual, which is a system of a D5-brane and a fundamental string. We calculate the on-shell action of the string and clarify the Gross-Ooguri transition in this correlator. Some limiting cases are also examined

  10. Real-Time Cosmology with Gaia: Developing the Theory to Use Extragalactic Proper Motions to Make Dynamical Cosmological Tests, to Measure Geometric Distances, and to Detect Primordial Gravitational Waves

    Science.gov (United States)

    Darling, Jeremy

    isotropy of the Hubble expansion and recent claims of a vertical (out-of-Galactic-plane) acceleration of the Solar System. Ultimately, we will develop the general theory and methods for measuring the proper motion power spectrum (the vector analog to the scalar Cosmic Microwave Background temperature fluctuation power spectrum) and provide some theoretical guidance as to the meaning of various features expected in this new power spectrum. Most of these cosmological measurements will be the first of their kind, and are distinguishable from observer-induced effects such as aberration, aberration drift, rotation, or gravitational acceleration. This work will show how one can illuminate the nature of the very early universe, based on the primordial gravitational wave background measurements (spanning 10 orders of magnitude in frequency), as well as the nature of the universe we live in today. Astrometry is one of five science frontier discovery areas highlighted by the New Worlds New Horizons Decadal Survey of Astronomy and Astrophysics. The proposed research will build a scientific foundation for ground-breaking work of a fundamental nature in both physics and astrophysics. The main thrust of this proposal is to create unique new opportunities for studying the universe and for testing cosmological theories in a direct and nearly model-free manner. This work may significantly impact the larger fields of astrophysics, cosmology, and gravitation. The gravitational wave and alternative theories of gravity work is of particularly broad interest and impact because it bears on fundamental issues of particle physics, gravity, and the very early universe. This program will train new scientists, it will create synergy between theoretical and observational studies of the universe in novel research directions, and it will bring the proposed research into the classroom and the public domain.

  11. Astrophysics of black holes from fundamental aspects to latest developments

    CERN Document Server

    2016-01-01

    This book discusses the state of the art of the basic theoretical and observational topics related to black hole astrophysics. It covers all the main topics in this wide field, from the theory of accretion disks and formation mechanisms of jets and outflows, to their observed electromagnetic spectrum, and attempts to measure the spin of these objects. Black holes are one of the most fascinating predictions of general relativity and are currently a very hot topic in both physics and astrophysics. In the last five years there have been significant advances in our understanding of these systems, and in the next five years it should become possible to use them to test fundamental physics, in particular general relativity in the strong-field regime. The book is both a reference work for researchers and a textbook for graduate students.

  12. On the Interpretation of Measurement Within the Quantum Theory

    Science.gov (United States)

    Cooper, Leon N.; Van Vechten, Deborah

    1969-01-01

    An interpretation of the process of measurement is proposed which can be placed wholly within the quantum theory. The entire system, including the apparatus and even the mind of the observer, can be considered to develop according to the Schrödinger equation. (RR)

  13. Different Variants of Fundamental Portfolio

    Directory of Open Access Journals (Sweden)

    Tarczyński Waldemar

    2014-06-01

    The paper proposes a fundamental portfolio of securities. This portfolio is an alternative to the classic Markowitz model, combining fundamental analysis with portfolio analysis. The method's main idea is based on the use of the TMAI synthetic measure and, in the limiting conditions, the use of risk and the portfolio's rate of return in the objective function. Different variants of the fundamental portfolio have been considered in an empirical study. The effectiveness of the proposed solutions has been related to the classic portfolio constructed with the help of the Markowitz model and to the rate of return of the WIG20 market index. All portfolios were constructed with data on rates of return for 2005. Their effectiveness in 2006-2013 was then evaluated. The studied period comprises the end of the bull market, the 2007-2009 crisis, the 2010 bull market and the 2011 crisis. This allows for an evaluation of the solutions' flexibility in various extreme situations. For the construction of the fundamental portfolio's objective function and the TMAI, the study made use of financial and economic data on selected indicators retrieved from Notoria Serwis for 2005.
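    One way to read the structure described above is as an optimisation that maximises an aggregate fundamental score subject to risk and return constraints. The sketch below is a hypothetical interpretation only: the stand-in "score" plays the role of a TMAI-like synthetic measure, and all numbers, bounds and the exact objective are invented, not the paper's specification.

```python
# Hypothetical sketch of a "fundamental portfolio": maximise the portfolio's
# aggregate fundamental score subject to caps on variance and a floor on
# expected return. All numbers are invented for illustration.
import numpy as np
from scipy.optimize import minimize

score = np.array([0.7, 0.5, 0.9, 0.4])            # per-asset fundamental scores
mu = np.array([0.08, 0.06, 0.12, 0.05])           # expected returns
cov = np.diag([0.04, 0.02, 0.09, 0.01])           # simplified covariance matrix

def neg_score(w):
    return -score @ w                              # minimise the negative score

cons = [
    {"type": "eq",   "fun": lambda w: w.sum() - 1.0},          # fully invested
    {"type": "ineq", "fun": lambda w: 0.05 - w @ cov @ w},     # variance cap
    {"type": "ineq", "fun": lambda w: mu @ w - 0.07},          # return floor
]
res = minimize(neg_score, x0=np.full(4, 0.25), bounds=[(0, 1)] * 4, constraints=cons)
print(np.round(res.x, 3))                          # optimal portfolio weights
```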

  14. Post-modern portfolio theory supports diversification in an investment portfolio to measure investment's performance

    OpenAIRE

    Rasiah, Devinaga

    2012-01-01

    This study looks at Post-Modern Portfolio Theory, which maintains greater diversification in an investment portfolio by using the alpha and the beta coefficient to measure investment performance. Post-Modern Portfolio Theory recognizes that investment risk should be tied to each investor's goals, and that outcomes above this goal do not represent economic or financial risk. Post-Modern Portfolio Theory's downside measure generated a noticeable distinction between downside and upside volatil...
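    The "downside measure" mentioned here is usually implemented as downside deviation relative to a target return, which also underlies the Sortino ratio. A minimal sketch with invented return data follows; the target and the return series are assumptions for illustration.

```python
# Downside deviation and Sortino ratio, the usual downside-risk measures in
# post-modern portfolio theory. Returns and target are invented examples.
import numpy as np

def downside_deviation(returns, target):
    shortfall = np.minimum(returns - target, 0.0)   # only below-target outcomes count
    return np.sqrt(np.mean(shortfall ** 2))

def sortino_ratio(returns, target):
    return (np.mean(returns) - target) / downside_deviation(returns, target)

r = np.array([0.04, -0.02, 0.07, 0.01, -0.05, 0.10])
print(downside_deviation(r, target=0.0), sortino_ratio(r, target=0.0))
```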

  15. Thermodynamics Fundamental Equation of a "Non-Ideal" Rubber Band from Experiments

    Science.gov (United States)

    Ritacco, Hernán A.; Fortunatti, Juan C.; Devoto, Walter; Fernández-Miconi, Eugenio; Dominguez, Claudia; Sanchez, Miguel D.

    2014-01-01

    In this paper, we describe laboratory and classroom exercises designed to obtain the "fundamental" equation of a rubber band by combining experiments and theory. The procedure shows students how classical thermodynamics formalism can help to obtain empirical equations of state by constraining and guiding in the construction of the…

  16. Hidden QCD in Chiral Gauge Theories

    DEFF Research Database (Denmark)

    Ryttov, Thomas; Sannino, Francesco

    2005-01-01

    The 't Hooft and Corrigan-Ramond limits of massless one-flavor QCD consider the two Weyl fermions to be respectively in the fundamental representation or the two index antisymmetric representation of the gauge group. We introduce a limit in which one of the two Weyl fermions is in the fundamental representation and the other in the two index antisymmetric representation of a generic SU(N) gauge group. This theory is chiral and to avoid gauge anomalies a more complicated chiral theory is needed. This is the generalized Georgi-Glashow model with one vector-like fermion. We show that there is an interesting phase in which the considered chiral gauge theory, for any N, Higgses via a bilinear condensate: The gauge interactions break spontaneously to ordinary massless one-flavor SU(3) QCD. The additional elementary fermionic matter is uncharged under this SU(3) gauge theory. It is also seen that when...

  17. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique.Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  18. On the relation of the theoretical foundations of quantum theory and general relativity theory; Ueber die Beziehung der begrifflichen Grundlagen der Quantentheorie und der Allgemeinen Relativitaetstheorie

    Energy Technology Data Exchange (ETDEWEB)

    Kober, Martin

    2010-07-01

    The content of the present thesis is organised as follows. First, the most important elements of quantum theory and of general relativity are presented. For general relativity the mathematical property of diffeomorphism invariance plays the decisive role, while for quantum theory, starting from the Copenhagen interpretation, the measurement problem is treated first, before nonlocality is brought into focus as an important property on the basis of concrete phenomena and the mathematical apparatus of the theory. Both theories thus suggest a relational view of the nature of space. This analysis of the theoretical foundations of quantum theory and general relativity in relation to the nature of space acquires its full persuasive power only when Kant's philosophy and his analysis of space and time as fundamental forms of perception are included. Then von Weizsäcker's quantum theory of ur-alternatives is presented. Finally, attempts are made to apply the knowledge obtained to the question of a quantum-theoretical formulation of general relativity.

  19. Intelligence: is it the epidemiologists' elusive "fundamental cause" of social class inequalities in health?

    Science.gov (United States)

    Gottfredson, Linda S

    2004-01-01

    Virtually all indicators of physical health and mental competence favor persons of higher socioeconomic status (SES). Conventional theories in the social sciences assume that the material disadvantages of lower SES are primarily responsible for these inequalities, either directly or by inducing psychosocial harm. These theories cannot explain, however, why the relation between SES and health outcomes (knowledge, behavior, morbidity, and mortality) is not only remarkably general across time, place, disease, and kind of health system but also so finely graded up the entire SES continuum. Epidemiologists have therefore posited, but not yet identified, a more general "fundamental cause" of health inequalities. This article concatenates various bodies of evidence to demonstrate that differences in general intelligence (g) may be that fundamental cause.

  20. New foundation of quantum theory

    International Nuclear Information System (INIS)

    Schmutzer, E.

    1976-01-01

    A new foundation of quantum theory is given on the basis of the formulated 'Principle of Fundamental Covariance', combining the 'Principle of General Relativity' (coordinate-covariance in space-time) and the 'Principle of Operator-Covariance' (in Hilbert space). The fundamental quantum laws proposed are: (1) time-dependent simultaneous laws of motion for the operators, general states and eigenstates, (2) commutation relations, (3) time-dependent eigenvalue equations. All these laws fulfill the Principle of Fundamental Covariance (in non-relativistic quantum mechanics with restricted coordinate transformations). (author)

  1. Annual report 1986 Foundation for fundamental research on matter

    International Nuclear Information System (INIS)

    Eggen, J.J.H.; Ebbing, G.E.G.

    1987-01-01

    The Dutch Foundation for Fundamental Research on Matter (FOM) aims to stimulate fundamental scientific research on matter in the Netherlands. It pursues this aim by coordinating existing research projects and by involving its institutes and research groups in the education of young physicists. The research groups are classified in eight so-called research communities: nuclear physics, atomic physics, metals, semiconductors, solid state, thermonuclear research and plasma physics, theoretical high-energy physics. Besides accounts of management, financial and personnel affairs, and professional/organizational reports of the aforementioned research communities and corresponding research groups, this annual report presents a number of trend articles, of which one, treating superstring theory, is within the INIS scope. (H.W.) refs.; figs.; tabs

  2. The Foundations of Einstein's Theory of Gravitation

    Science.gov (United States)

    Freundlich, Erwin; translated by Henry L. Brose; preface by Albert Einstein; introduction by H. H. Turner

    2011-06-01

    Introduction; 1. The special theory of relativity as a stepping-stone to the general theory of relativity; 2. Two fundamental postulates in the mathematical formulation of physical laws; 3. Concerning the fulfilment of the two postulates; 4. The difficulties in the principles of classical mechanics; 5. Einstein's theory of gravitation; 6. The verification of the new theory by actual experience; Appendix; Index.

  3. Current challenges in fundamental physics

    Science.gov (United States)

    Egana Ugrinovic, Daniel

    The discovery of the Higgs boson at the Large Hadron Collider completed the Standard Model of particle physics. The Standard Model is a remarkably successful theory of fundamental physics, but it suffers from severe problems. It does not provide an explanation for the origin or stability of the electroweak scale nor for the origin and structure of flavor and CP violation. It predicts vanishing neutrino masses, in disagreement with experimental observations. It also fails to explain the matter-antimatter asymmetry of the universe, and it does not provide a particle candidate for dark matter. In this thesis we provide experimentally testable solutions for most of these problems and we study their phenomenology.

  4. A dynamical theory for the Rishon model

    International Nuclear Information System (INIS)

    Harari, H.; Seiberg, N.

    1980-09-01

    We propose a composite model for quarks and leptons based on an exact SU(3)_C x SU(3)_H gauge theory and two fundamental J=1/2 fermions: a charged T-rishon and a neutral V-rishon. Quarks, leptons and W-bosons are SU(3)_H-singlet composites of rishons. A dynamically broken effective SU(3)_C x SU(2)_L x SU(2)_R x U(1)_(B-L) gauge theory emerges at the composite level. The theory is "natural", anomaly-free, has no fundamental scalar particles, and describes at least three generations of quarks and leptons. Several "technicolor" mechanisms are automatically present. (Author)

  5. Concerning the equivalence of Lorentz's and Einstein's theories

    International Nuclear Information System (INIS)

    Clube, S.V.M.

    1978-01-01

    A clear distinction is drawn between derivations of the Lorentz transformations by Lorentz and Einstein. The choice as to which derivation is correct is still open to experimental test. Possible reasons are given for preferring the Lorentz derivation in terms of a material aether, and the role of covariance in physical theory is considered to be heuristic rather than fundamental. The existence of a material aether also permits one to question the fundamental role of fields in modern theory

  6. Covariance operator of functional measure in P(φ)2-quantum field theory

    International Nuclear Information System (INIS)

    Lobanov, Yu.Yu.; Zhidkov, E.P.

    1988-01-01

    The functional integration measure in Euclidean quantum field theory with polynomial interactions of spin-zero boson fields in two-dimensional space-time is investigated. A representation for the kernel of the covariance operator of the measure is obtained in the form of an expansion over the eigenfunctions of a boundary problem for the heat equation. Two cases of integration domains with different configurations are considered. Some trends and perspectives of employing the functional integration method in quantum field theory are also discussed. 43 refs
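    As a schematic illustration of the kind of object discussed (not the paper's specific result), the kernel of a covariance operator can be written as an eigenfunction expansion; for the free Euclidean field the covariance operator is the inverse of the massive Laplacian. The notation below is generic.

```latex
% Schematic eigenfunction expansion of a covariance kernel.
% For the free Euclidean field, C = (-\Delta + m^2)^{-1} on the chosen domain.
C(x,y) \;=\; \sum_{n} \lambda_n\, e_n(x)\, e_n(y),
\qquad C\, e_n = \lambda_n\, e_n
```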

  7. Fundamental Movement Skill Proficiency and Body Composition Measured by Dual Energy X-Ray Absorptiometry in Eight-Year-Old Children

    Science.gov (United States)

    Slotte, Sari; Sääkslahti, Arja; Metsämuuronen, Jari; Rintala, Pauli

    2015-01-01

    Objective: The main aim was to examine the association between fundamental movement skills (FMS) and objectively measured body composition using dual energy X-ray absorptiometry (DXA). Methods: A study of 304 eight-year-old children in Finland. FMS were assessed with the "Test of gross motor development," 2nd ed. Total body fat…

  8. Measuring coherence with entanglement concurrence

    Science.gov (United States)

    Qi, Xianfei; Gao, Ting; Yan, Fengli

    2017-07-01

    Quantum coherence is a fundamental manifestation of the quantum superposition principle. Recently, Baumgratz et al (2014 Phys. Rev. Lett. 113 140401) presented a rigorous framework to quantify coherence from the viewpoint of the theory of physical resources. Here we propose a new valid quantum coherence measure, a convex-roof measure for a quantum system of arbitrary dimension, constructed essentially from the generalized Gell-Mann matrices. A rigorous proof shows that the proposed coherence measure, the coherence concurrence, fulfills all the requirements dictated by the resource theory of quantum coherence measures. Moreover, strong links between the resource frameworks of coherence concurrence and entanglement concurrence are derived, which shows that any degree of coherence with respect to some reference basis can be converted to entanglement via incoherent operations. Our work provides a clear quantitative and operational connection between coherence and entanglement based on two kinds of concurrence. This new coherence measure, the coherence concurrence, may also be beneficial to the study of quantum coherence.
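    To make the notion of a "coherence measure" concrete, the sketch below evaluates a standard, simpler quantifier, the l1-norm of coherence, on two qubit states. This is only an illustration of basis-dependent coherence quantification; it is not the coherence concurrence constructed in the paper.

```python
# l1-norm of coherence: sum of absolute values of the off-diagonal elements of
# a density matrix in the chosen reference basis. A standard coherence
# quantifier shown for illustration; NOT the paper's coherence concurrence.
import numpy as np

def l1_coherence(rho):
    off_diag = rho - np.diag(np.diag(rho))
    return np.abs(off_diag).sum()

plus = np.array([[0.5, 0.5], [0.5, 0.5]])   # |+><+|, maximally coherent qubit state
mixed = np.eye(2) / 2                        # maximally mixed state, zero coherence
print(l1_coherence(plus), l1_coherence(mixed))
```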

  9. Measurement incompatibility and Schrödinger-Einstein-Podolsky-Rosen steering in a class of probabilistic theories

    International Nuclear Information System (INIS)

    Banik, Manik

    2015-01-01

    Steering is one of the most counterintuitive non-classical features of bipartite quantum systems, first noticed by Schrödinger in the early days of quantum theory. On the other hand, measurement incompatibility is another non-classical feature of quantum theory, initially pointed out by Bohr. Recently, Quintino et al. [Phys. Rev. Lett. 113, 160402 (2014)] and Uola et al. [Phys. Rev. Lett. 113, 160403 (2014)] have investigated the relation between these two distinct non-classical features. They have shown that a set of measurements is not jointly measurable (i.e., incompatible) if and only if it can be used for demonstrating Schrödinger-Einstein-Podolsky-Rosen steering. The concept of steering has been generalized to more general abstract tensor-product theories rather than just Hilbert-space quantum mechanics. In this article, we discuss how the notion of measurement incompatibility can be extended to general probability theories. Further, we show that the connection between steering and measurement incompatibility holds in a broader class of tensor-product theories rather than just quantum theory
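    For reference, the standard definition of joint measurability (compatibility) used in this line of work can be stated as follows; the notation is generic rather than the article's own.

```latex
% A set of POVMs {M_{a|x}} (outcome a, setting x) is jointly measurable if there
% exists a single "parent" POVM {G_\lambda} and conditional probabilities
% p(a|x,\lambda) such that every measurement in the set is a classical
% post-processing of the parent measurement:
M_{a|x} \;=\; \sum_{\lambda} p(a\,|\,x,\lambda)\, G_\lambda ,
\qquad G_\lambda \ge 0,\quad \sum_\lambda G_\lambda = \mathbb{1}
```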

  10. Theory of semiconductor junction devices a textbook for electrical and electronic engineers

    CERN Document Server

    Leck, J H

    1967-01-01

    Theory of Semiconductor Junction Devices: A Textbook for Electrical and Electronic Engineers presents the simplified numerical computation of the fundamental electrical equations, specifically Poisson's and the Hall effect equations. This book provides the fundamental theory relevant for the understanding of semiconductor device theory. Comprised of 10 chapters, this book starts with an overview of the application of band theory to the special case of semiconductors, both intrinsic and extrinsic. This text then describes the electrical properties of conductivity, semiconductors, and Hall effe
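    As a taste of the "simplified numerical computation" the book refers to, the sketch below solves the one-dimensional Poisson equation by finite differences for an assumed fixed space-charge profile with grounded boundaries. It illustrates the numerical method only; the doping level, geometry and boundary conditions are invented and it is not a self-consistent junction solution.

```python
# 1D Poisson equation  d^2(phi)/dx^2 = -rho/eps  solved by finite differences
# for an abrupt two-sided space-charge profile (illustration only).
import numpy as np

n = 201
L = 0.6e-6                                 # assumed device length: 0.6 micrometres
x = np.linspace(0.0, L, n)
h = x[1] - x[0]
eps = 11.7 * 8.854e-12                     # permittivity of silicon (F/m)
q = 1.602e-19

rho = np.where(x < L / 2, -q * 1e22, q * 1e22)   # ionised acceptors / donors (C/m^3)

# Tridiagonal second-difference operator with phi(0) = phi(L) = 0.
A = (np.diag(-2.0 * np.ones(n - 2)) +
     np.diag(np.ones(n - 3), 1) +
     np.diag(np.ones(n - 3), -1)) / h**2
phi = np.zeros(n)
phi[1:-1] = np.linalg.solve(A, -rho[1:-1] / eps)
print(f"peak |potential| = {np.abs(phi).max():.3f} V")
```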

  11. Using classical test theory, item response theory, and Rasch measurement theory to evaluate patient-reported outcome measures: a comparison of worked examples.

    Science.gov (United States)

    Petrillo, Jennifer; Cano, Stefan J; McLeod, Lori D; Coon, Cheryl D

    2015-01-01

    To provide comparisons and a worked example of item- and scale-level evaluations based on three psychometric methods used in patient-reported outcome development-classical test theory (CTT), item response theory (IRT), and Rasch measurement theory (RMT)-in an analysis of the National Eye Institute Visual Functioning Questionnaire (VFQ-25). Baseline VFQ-25 data from 240 participants with diabetic macular edema from a randomized, double-masked, multicenter clinical trial were used to evaluate the VFQ at the total score level. CTT, RMT, and IRT evaluations were conducted, and results were assessed in a head-to-head comparison. Results were similar across the three methods, with IRT and RMT providing more detailed diagnostic information on how to improve the scale. CTT led to the identification of two problematic items that threaten the validity of the overall scale score, sets of redundant items, and skewed response categories. IRT and RMT additionally identified poor fit for one item, many locally dependent items, poor targeting, and disordering of over half the response categories. Selection of a psychometric approach depends on many factors. Researchers should justify their evaluation method and consider the intended audience. If the instrument is being developed for descriptive purposes and on a restricted budget, a cursory examination of the CTT-based psychometric properties may be all that is possible. In a high-stakes situation, such as the development of a patient-reported outcome instrument for consideration in pharmaceutical labeling, however, a thorough psychometric evaluation including IRT or RMT should be considered, with final item-level decisions made on the basis of both quantitative and qualitative results. Copyright © 2015. Published by Elsevier Inc.

  12. Theories at 10^(-17) and 10^(-33) cm

    International Nuclear Information System (INIS)

    Bars, I.

    1985-01-01

    Rapid progress is reported in the areas of Superstring Theory, Composite Quarks and Leptons, Supergravity and Kaluza-Klein Theories. We have shifted our interest heavily toward the Superstring Theory since it has become the most promising unified theory for solving the fundamental questions in the standard model as well as quantum gravity. 23 refs

  13. Gauge field theories

    International Nuclear Information System (INIS)

    Pokorski, S.

    1987-01-01

    Quantum field theory forms the present theoretical framework for the understanding of the fundamental interactions of particle physics. This book examines gauge theories and their symmetries with an emphasis on their physical and technical aspects. The author discusses field-theoretical techniques and encourages the reader to perform many of the calculations presented. This book includes a brief introduction to perturbation theory, the renormalization programme, and the use of the renormalization group equation. Several topics of current research interest are covered, including chiral symmetry and its breaking, anomalies, and low energy effective lagrangians and some basics of supersymmetry

  14. A subjective utilitarian theory of moral judgment.

    Science.gov (United States)

    Cohen, Dale J; Ahn, Minwoo

    2016-10-01

    Current theories hypothesize that moral judgments are difficult because rational and emotional decision processes compete. We present a fundamentally different theory of moral judgment: the Subjective Utilitarian Theory of moral judgment. The Subjective Utilitarian Theory posits that people try to identify and save the competing item with the greatest "personal value." Moral judgments become difficult only when the competing items have similar personal values. In Experiment 1, we estimate the personal values of 104 items. In Experiments 2-5, we show that the distributional overlaps of the estimated personal values account for over 90% of the variance in reaction times (RTs) and response choices in a moral judgment task. Our model fundamentally restructures our understanding of moral judgments from a competition between decision processes to a competition between similarly valued items. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  15. Some remarks on general covariance of quantum theory

    International Nuclear Information System (INIS)

    Schmutzer, E.

    1977-01-01

    If one accepts Einstein's general principle of relativity (covariance principle) also for the sphere of microphysics (quantum mechanics, quantum field theory, the theory of elementary particles), one has to ask how far the fundamental laws of traditional quantum physics fulfil this principle. Attention is here drawn to a series of papers that have appeared during the last years, in which the author criticized the usual scheme of quantum theory (Heisenberg picture, Schrödinger picture etc.) and presented a new foundation of the basic laws of quantum physics, obeying the 'principle of fundamental covariance' (Einstein's covariance principle in space-time and a covariance principle in the Hilbert space of quantum operators and states). (author)

  16. Statistical quasi-particle theory for open quantum systems

    Science.gov (United States)

    Zhang, Hou-Dao; Xu, Rui-Xue; Zheng, Xiao; Yan, YiJing

    2018-04-01

    This paper presents a comprehensive account of the recently developed dissipaton-equation-of-motion (DEOM) theory. This is a statistical quasi-particle theory for quantum dissipative dynamics. It accurately describes the influence of bulk environments with a small number of quasi-particles, the dissipatons. A novel dissipaton algebra then follows, which readily bridges the Schrödinger equation to the DEOM theory. As a fundamental theory of quantum mechanics in open systems, DEOM characterizes both the stationary and dynamic properties of system-and-bath interferences. It treats not only the quantum dissipative systems of primary interest, but also the hybrid environment dynamics that could be experimentally measurable. Examples are the linear or nonlinear Fano interferences and the Herzberg-Teller vibronic couplings in optical spectroscopies. This review covers the DEOM construction, the underlying dissipaton algebra and theorems, the physical meanings of dynamical variables, the possible identifications of dissipatons, and some recent advancements in efficient DEOM evaluations on various problems. The relations of the present theory to other nonperturbative methods are also critically presented.

  17. Differential formalism aspects of the gauge classical theories

    International Nuclear Information System (INIS)

    Stedile, E.

    1982-01-01

    The classical aspects of gauge theories are presented using differential geometry as the fundamental tool. Some comments are made about Maxwell electrodynamics and the classical Yang-Mills and gravitation theories. (L.C.)

  18. Real analysis measure theory, integration, and Hilbert spaces

    CERN Document Server

    Stein, Elias M

    2005-01-01

    Real Analysis is the third volume in the Princeton Lectures in Analysis, a series of four textbooks that aim to present, in an integrated manner, the core areas of analysis. Here the focus is on the development of measure and integration theory, differentiation and integration, Hilbert spaces, and Hausdorff measure and fractals. This book reflects the objective of the series as a whole: to make plain the organic unity that exists between the various parts of the subject, and to illustrate the wide applicability of ideas of analysis to other fields of mathematics and science. After

  19. Amorphous gauge glass theory

    International Nuclear Information System (INIS)

    Nielsen, H.B.; Bennett, D.L.

    1987-08-01

    Assuming that a lattice gauge theory describes a fundamental attribute of Nature, it should be pointed out that such a theory in the form of a gauge glass rests on a weaker assumption than a regular lattice model, inasmuch as it is not constrained by the imposition of translational invariance; translational invariance is, however, recovered approximately in the long-wavelength or continuum limit. (orig./WL)

  20. Electricity markets theories and applications

    CERN Document Server

    Lin, Jeremy

    2017-01-01

    Electricity Markets: Theories and Applications offers students and practitioners a clear understanding of the fundamental concepts of the economic theories, particularly microeconomic theories, as well as information on some advanced optimization methods of electricity markets. The authors--noted experts in the field--cover the basic drivers for the transformation of the electricity industry in both the United States and around the world and discuss the fundamentals of power system operation, electricity market design and structures, and electricity market operations. The text also explores advanced topics of power system operations and electricity market design and structure including zonal versus nodal pricing, market performance and market power issues, transmission pricing, and the emerging problems electricity markets face in smart grid and micro-grid environments. The authors also examine system planning under the context of electricity market regime. They explain the new ways to solve problems with t...

  1. Building theory through design

    DEFF Research Database (Denmark)

    Markussen, Thomas

    2017-01-01

    This chapter deals with a fundamental matter of concern in research through design: how can design work lead to the building of new theory? Controversy exists about the balance between theory and design work in research through design. While some researchers see theory production as the scientific hallmark of this type of research, others argue for design work being the primary achievement, with theory serving the auxiliary function of inspiring new designs. This paper demonstrates how design work and theory can be appreciated as two equally important outcomes of research through design. To set the scene, it starts out by briefly examining ideas on this issue presented in existing research literature. Hereafter, it introduces three basic forms in which design work can lead to theory that is referred to as extending theories, scaffolding theories and blending theories. Finally, it is discussed how...

  2. Gauge theories

    International Nuclear Information System (INIS)

    Kenyon, I.R.

    1986-01-01

    Modern theories of the interactions between fundamental particles are all gauge theories. In the case of gravitation, application of this principle to space-time leads to Einstein's theory of general relativity. All the other interactions involve the application of the gauge principle to internal spaces. Electromagnetism serves to introduce the idea of a gauge field, in this case the electromagnetic field. The next example, the strong force, shows unique features at long and short range which have their origin in the self-coupling of the gauge fields. Finally the unification of the description of the superficially dissimilar electromagnetic and weak nuclear forces completes the picture of successes of the gauge principle. (author)

  3. Proceedings of Waseda international symposium on fundamental physics. New perspectives in quantum physics

    International Nuclear Information System (INIS)

    Ohba, Ichiro; Aizawa, Yoji; Daishido, Tsuneaki; Kurihara, Susumu; Maeda, Kei-ichi; Nakazato, Hiromichi; Tasaki, Shuichi; Yuasa, Kazuya

    2003-11-01

    Waseda International Symposium on Fundamental Physics - New Perspectives in Quantum Physics - was held on November 12-15, 2002 at the International Conference Hall (IBUKA HALL), Waseda University, Tokyo, Japan. The symposium was organized to provide an opportunity to review attainments in fundamental physics and to discuss new perspectives in quantum physics in the 21st century. These themes were reexamined from all aspects in terms of the symposium's key words: fundamental quantum theory, quantum coherence and decoherence, quantum chaos, time-symmetry breaking, Bose-Einstein condensation, and quantum information and computation. Separate abstracts were presented for 12 of the papers in this report. The remaining 40 were considered outside the subject scope of INIS. (J.P.N.)

  4. Performance Measurement, Expectancy and Agency Theory: An Experimental Study

    OpenAIRE

    Randolph Sloof; Mirjam van Praag

    2007-01-01

    Theoretical analyses of (optimal) performance measures are typically performed within the realm of the linear agency model. An important implication of this model is that, for a given compensation scheme, the agent's optimal effort choice is unrelated to the amount of noise in the performance measure. In contrast, expectancy theory as developed by psychologists predicts that effort levels are increasing in the signal-to-noise ratio. We conduct a real effort laboratory experiment to assess the...

  5. From electroweak theory to the primordial universe. A synthesis of some experimental results; De la theorie electrofaible a l'univers primordial. Synthese de quelques resultats experimentaux

    Energy Technology Data Exchange (ETDEWEB)

    Ealet, A

    2004-12-15

    Particle physics is based on a theory that can be tested at the current large colliders. Measurements are in very good agreement with this electroweak theory and no deviation is observed that would indicate new physics. What is surprising today is that none of its results agrees with what is known about our universe: the theory can explain neither the primordial baryogenesis nor the acceleration of the expansion of the Universe. In this work, I come back to some results obtained at the LEP collider to test the electroweak theory (Higgs and W boson production) and to some measurements of CP violation. I compare them with what can be extrapolated in terms of primordial baryogenesis and dark energy density and show that no agreement is possible within the Standard Model. I finish with some experimental and theoretical prospects for answering this fundamental question. (author)

  6. Processes at superhigh energies and hypothesis on fundamental length

    International Nuclear Information System (INIS)

    Mateev, M.D.

    1977-01-01

    The possibility of introducing a fundamental length (FL) into the apparatus of relativistic quantum field theory (QFT) without contradiction is considered. The approach connected with a change in the space-time geometry is treated in detail. The most adequate apparatus for describing phenomena in high-energy physics is taken to be QFT in momentum space. The analysis of the basic quantities of the theory is carried out in terms of the momentum representation. The consideration of free particles, the Feynman propagator of free particles and its properties, the uncertainty relation and the Planck formula shows that quite new physics of processes at superhigh energies appears

  7. Scattering theory

    International Nuclear Information System (INIS)

    Sitenko, A.

    1991-01-01

    This book emerged out of graduate lectures given by the author at the University of Kiev and is intended as a graduate text. The fundamentals of non-relativistic quantum scattering theory are covered, including some topics, such as the phase-function formalism, separable potentials, and inverse scattering, which are not always covered in textbooks on scattering theory. Criticisms of the text are minor, but the reviewer feels an inadequate index is provided and the citing of references in the Russian language is a hindrance in a graduate text

  8. Quantum Opportunities and Challenges for Fundamental Sciences in Space

    Science.gov (United States)

    Yu, Nan

    2012-01-01

    Space platforms offer a unique environment for, and unique measurements of, the quantum world and fundamental physics. Quantum technology and quantum measurement enhance measurement capabilities in space and result in greater science returns.

  9. Measurement sum theory and application - Application to low level measurements

    International Nuclear Information System (INIS)

    Puydarrieux, S.; Bruel, V.; Rivier, C.; Crozet, M.; Vivier, A.; Manificat, G.; Thaurel, B.; Mokili, M.; Philippot, B.; Bohaud, E.

    2015-09-01

    In laboratories, most of the Total Sum methods implemented today use substitution or censoring methods for nonsignificant or negative values, and thus create biases which can sometimes be quite large. They are usually positive, and generate, for example, becquerel (Bq) counts or 'administrative' (i.e. virtual) quantities of materials, thus artificially falsifying the records kept by the laboratories under regulatory requirements (environment release records, waste records, etc.). This document suggests a methodology which will enable the user to avoid such biases. It is based on the following two fundamental rules: - The Total Sum of measurement values must be established based on all the individual measurement values, even those considered non-significant, including the negative values. Any modification of these values, under the pretext that they are not significant, will inevitably lead to biases in the accumulated result and falsify the evaluation of its uncertainty. - In Total Sum operations, the decision thresholds are arrived at in a similar way to the approach used for uncertainties. The document deals with four essential aspects of the notion of 'measurement Total Sums': - The expression of results and associated uncertainties close to Decision Thresholds, and Detection or Quantification Limits, - The Total Sum of these measurements: sum or mean, - The calculation of the uncertainties associated with the Total Sums, - Result presentation (particularly when preparing balance sheets or reports, etc.). Several case studies arising from different situations are used to illustrate the methodology: environmental monitoring reports, release reports, and chemical impurity Total Sums for the qualification of a finished product. The special case of material balances, in which the measurements are usually all significant and correlated (the covariance term cannot then be ignored) will be the subject of a future second document. This
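    The first rule above - keep every individual value, including negative ones, and propagate uncertainties rather than censoring - can be illustrated with a short sketch. The numbers are invented, and the decision-threshold factor (k = 1.645, one-sided ~95 %) is a simplifying assumption rather than the document's prescription.

```python
# Summing low-level activity measurements without censoring negative values.
# Each entry is (value, standard uncertainty); the combined uncertainty is the
# quadratic sum, and an assumed, simplified decision threshold uses k = 1.645.
import math

measurements = [(0.8, 0.4), (-0.3, 0.5), (1.2, 0.6), (-0.1, 0.3)]  # Bq, invented

total = sum(v for v, _ in measurements)                # keep negatives as they are
u_total = math.sqrt(sum(u**2 for _, u in measurements))
decision_threshold = 1.645 * u_total                   # one-sided, ~95 % (assumption)

print(f"total = {total:.2f} Bq, u = {u_total:.2f} Bq, "
      f"significant: {total > decision_threshold}")

# A censored sum (setting negative results to zero) would give a positive bias:
biased = sum(max(v, 0.0) for v, _ in measurements)
print(f"censored total = {biased:.2f} Bq (biased high by {biased - total:.2f} Bq)")
```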

  10. The theory and measurement of partial discharge transients

    DEFF Research Database (Denmark)

    Pedersen, Aage; Crichton, George C; McAllister, Iain Wilson

    1991-01-01

    A theoretical approach to partial discharge transients is presented. This approach is based on the relation between the charge induced on the measurement electrode and the charges created in the interelectrode volume during partial discharge activity. The primary sources for these induced charges are … The application of the theory to electrode systems of practical interest is illustrated. A discussion of the salient features and practical aspects of the theory is included...

  11. Fundamental Safety Principles

    International Nuclear Information System (INIS)

    Abdelmalik, W.E.Y.

    2011-01-01

    This work presents a summary of the IAEA Safety Standards Series publication No. SF-1, entitled Fundamental Safety Principles, published in 2006. This publication states the fundamental safety objective and ten associated safety principles, and briefly describes their intent and purpose. Safety measures and security measures have in common the aim of protecting human life and health and the environment. These safety principles are: 1) Responsibility for safety, 2) Role of the government, 3) Leadership and management for safety, 4) Justification of facilities and activities, 5) Optimization of protection, 6) Limitation of risks to individuals, 7) Protection of present and future generations, 8) Prevention of accidents, 9) Emergency preparedness and response and 10) Protective action to reduce existing or unregulated radiation risks. The safety principles concern the security of facilities and activities to the extent that they apply to measures that contribute to both safety and security. Safety measures and security measures must be designed and implemented in an integrated manner so that security measures do not compromise safety and safety measures do not compromise security.

  12. Bayesian modeling of measurement error in predictor variables using item response theory

    NARCIS (Netherlands)

    Fox, Gerardus J.A.; Glas, Cornelis A.W.

    2000-01-01

    This paper focuses on handling measurement error in predictor variables using item response theory (IRT). Measurement error is of great importance in the assessment of theoretical constructs, such as intelligence or the school climate. Measurement error is modeled by treating the predictors as unobserved

  13. Fundamentals, Misvaluation, and Investment: The Real Story

    OpenAIRE

    Chirinko, Robert S.; Schaller, Huntley

    2006-01-01

    Is real investment fully determined by fundamentals or is it sometimes affected by stock market misvaluation? We introduce three new tests that: measure the reaction of investment to sales shocks for firms that may be overvalued; use Fama-MacBeth regressions to determine whether "overinvestment" affects subsequent returns; and analyze the time path of the marginal product of capital in reaction to fundamental and misvaluation shocks. Besides these qualitative tests, we introduce a me...

  14. The theory of groups

    CERN Document Server

    Hall, Marshall

    2018-01-01

    This 1959 text offers an unsurpassed resource for learning and reviewing the basics of a fundamental and ever-expanding area. "This remarkable book undoubtedly will become a standard text on group theory." - American Scientist.

  15. The Yang-Mills gradient flow and SU(3) gauge theory with 12 massless fundamental fermions in a colour-twisted box

    CERN Document Server

    Lin, C -J David; Ramos, Alberto

    2015-01-01

    We perform the step-scaling investigation of the running coupling constant, using the gradient-flow scheme, in SU(3) gauge theory with twelve massless fermions in the fundamental representation. The Wilson plaquette gauge action and massless unimproved staggered fermions are used in the simulations. Our lattice data are prepared at high accuracy, such that the statistical error for the renormalised coupling, g_GF, is at the subpercentage level. To investigate the reliability of the continuum extrapolation, we employ two different lattice discretisations to obtain g_GF. For our simulation setting, the corresponding gauge-field averaging radius in the gradient flow has to be almost half of the lattice size, in order to have this extrapolation under control. We can determine the renormalisation group evolution of the coupling up to g^2_GF ~ 6, before the onset of the bulk phase structure. In this infrared regime, the running of the coupling is significantly slower than the two-loop perturbative prediction, altho...
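    For context on what "the two-loop perturbative prediction" means here, the sketch below integrates the universal two-loop beta function for SU(3) with Nf = 12 fundamental flavours, whose negative second coefficient produces an infrared fixed point. The scheme identification, starting value and step size are arbitrary choices for illustration and are not the paper's gradient-flow analysis.

```python
# Two-loop running of the SU(3) coupling with Nf = 12 fundamental flavours.
# Universal coefficients: b0 = (11 - 2*Nf/3)/(16*pi^2), b1 = (102 - 38*Nf/3)/(16*pi^2)^2.
# With Nf = 12, b1 < 0, so the two-loop beta function has an infrared fixed point.
import math

Nf = 12
b0 = (11 - 2 * Nf / 3) / (16 * math.pi**2)
b1 = (102 - 38 * Nf / 3) / (16 * math.pi**2) ** 2

def beta(g):
    return -(b0 * g**3 + b1 * g**5)

# Integrate dg/dt = beta(g) towards the infrared (t = ln(mu) decreasing).
g, dt = 1.0, -0.01                        # arbitrary UV starting value (assumption)
for _ in range(200_000):
    g += dt * beta(g)

print(f"g^2 in the deep IR        : {g**2:.2f}")
print(f"two-loop fixed point g*^2 : {-b0 / b1:.2f}")
```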

  16. Uses of solid state analogies in elementary particle theory

    International Nuclear Information System (INIS)

    Anderson, P.W.

    1976-01-01

    The solid state background of some of the modern ideas of field theory is reviewed, and additional examples of model situations in solid state or many-body theory which may have relevance to fundamental theories of elementary particles are adduced

  17. On divergence of finite measures and their applicability in statistics and information theory

    Czech Academy of Sciences Publication Activity Database

    Vajda, Igor; Stummer, W.

    2009-01-01

    Roč. 44, č. 2 (2009), s. 169-187 ISSN 0233-1888 R&D Projects: GA MŠk(CZ) 1M0572; GA ČR(CZ) GA102/07/1131 Institutional research plan: CEZ:AV0Z10750506 Keywords : Local and global divergences of finite measures * Divergences of sigma-finite measures * Statistical censoring * Pinsker's inequality, Ornstein's distance * Differential power entropies Subject RIV: BD - Theory of Information Impact factor: 0.759, year: 2009 http://library.utia.cas.cz/separaty/2009/SI/vajda-on divergence of finite measures and their applicability in statistics and information theory.pdf

  18. Cognitive load theory: Practical implications and an important challenge

    Directory of Open Access Journals (Sweden)

    Jimmie Leppink, Ph.D.

    2017-10-01

    The field of medical education has adopted a wide variety of theories from other fields. A fairly recent example is cognitive load theory, which originated in educational psychology. Several empirical studies inspired by cognitive load theory and reviews of practical implications of cognitive load theory have contributed to guidelines for the design of medical education. Simultaneously, several research groups have developed instruments for the measurement of cognitive load in a medical education context. These developments notwithstanding, obtaining evidence for different types of cognitive load remains an important challenge. Therefore, the aim of this article is twofold: to provide medical educators with three key guidelines for the design of instruction and assessment and to discuss several fundamental issues in the remaining challenges presented by different types of cognitive load. The guidelines revolve around minimizing cognitive activity that does not contribute to learning, working with specific learning goals in mind, and appreciating the multifaceted relation between learning and assessment. Key issues around the types of cognitive load include the context in which learning occurs, the continued use of single-item mental effort ratings, and the timing of cognitive load and learning outcome measurements.

  19. The Basics: What's Essential about Theory for Community Development Practice?

    Science.gov (United States)

    Hustedde, Ronald J.; Ganowicz, Jacek

    2002-01-01

    Relates three classical theories (structural functionalism, conflict theory, symbolic interactionism) to fundamental concerns of community development (structure, power, and shared meaning). Links these theories to Giddens' structuration theory, which connects macro and micro structures and community influence on change through cultural norms.…

  20. Laser measurement technology fundamentals and applications

    CERN Document Server

    Donges, Axel

    2015-01-01

    Laser measurement technology has evolved in recent years in a versatile and revolutionary way. Today, its methods are indispensable for research and development activities as well as for production technology. Every physicist and engineer should therefore gain a working knowledge of laser measurement technology. This book closes the gap left by existing textbooks. It introduces laser measurement technology in all its aspects in a comprehensible presentation. Numerous figures, graphs and tables allow for fast access to the matter. In the first part of the book the important physical and optical basics are described which are necessary to understand laser measurement technology. In the second part technically significant measuring methods are explained and application examples are presented. Target groups of this textbook are students of natural and engineering sciences as well as working physicists and engineers, who are interested to make themselves familiar with laser measurement technology and its fascinating p...

  1. The analytic foundations of Regge theory

    International Nuclear Information System (INIS)

    White, A.R.

    1976-01-01

    Regge poles were first introduced into relativistic scattering theory nearly fifteen years ago. The necessity for accompanying Regge cuts was discovered within two years. The intervening years have seen a gradual improvement of our understanding of Regge theory, but, particularly at the multiparticle level, the theory has remained incomplete with its fundamental status unclear. However, on the basis of recent progress a complete and systematic development of the Regge theory of elastic and multiparticle amplitudes is given. (Auth.)

  2. Open problems in Banach spaces and measure theory | Rodríguez ...

    African Journals Online (AJOL)

    We collect several open questions in Banach spaces, mostly related to measure theoretic aspects of the theory. The problems are divided into five categories: miscellaneous problems in Banach spaces (non-separable Lp spaces, compactness in Banach spaces, w*-null sequences in dual spaces), measurability in Banach ...

  3. Using Generalizability Theory to Disattenuate Correlation Coefficients for Multiple Sources of Measurement Error.

    Science.gov (United States)

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2018-05-02

    Over the years, research in the social sciences has been dominated by reporting of reliability coefficients that fail to account for key sources of measurement error. Use of these coefficients, in turn, to correct for measurement error can hinder scientific progress by misrepresenting true relationships among the underlying constructs being investigated. In the research reported here, we addressed these issues using generalizability theory (G-theory) in both traditional and new ways to account for the three key sources of measurement error (random-response, specific-factor, and transient) that affect scores from objectively scored measures. Results from 20 widely used measures of personality, self-concept, and socially desirable responding showed that conventional indices consistently misrepresented reliability and relationships among psychological constructs by failing to account for key sources of measurement error and correlated transient errors within occasions. The results further revealed that G-theory served as an effective framework for remedying these problems. We discuss possible extensions in future research and provide code from the computer package R in an online supplement to enable readers to apply the procedures we demonstrate to their own research.
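
    The correction this record describes rests on the classical disattenuation formula, r_true ≈ r_observed / sqrt(rel_x * rel_y), with the reliabilities supplied in the study by G-theory generalizability coefficients. The sketch below is only an illustration of that formula; the observed correlation and reliability values are placeholders, not figures from the study, and the G-theory variance decomposition itself is not reproduced.

```python
import math

def disattenuate(r_xy: float, rel_x: float, rel_y: float) -> float:
    """Correct an observed correlation for measurement error in both variables.

    r_xy  : observed correlation between measures X and Y
    rel_x : reliability of X (e.g., a generalizability coefficient)
    rel_y : reliability of Y
    """
    return r_xy / math.sqrt(rel_x * rel_y)

# Illustrative (made-up) numbers: an observed correlation of .40 between two
# scales whose reliability estimates are .70 and .75.
print(round(disattenuate(0.40, 0.70, 0.75), 3))  # ~0.552
```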

  4. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory-until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct

  5. Measuring Boltzmann's Constant with Carbon Dioxide

    Science.gov (United States)

    Ivanov, Dragia; Nikolov, Stefan

    2013-01-01

    In this paper we present two experiments to measure Boltzmann's constant--one of the fundamental constants of modern-day physics, which lies at the base of statistical mechanics and thermodynamics. The experiments use very basic theory, simple equipment and cheap and safe materials yet provide very precise results. They are very easy and…
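
    The abstract does not detail the experimental procedures, but the working relation in any such measurement is the ideal-gas law p V = N k_B T. The snippet below is a hedged illustration of that arithmetic only: the pressure, volume, temperature and sample mass are placeholder values, and determining N from the mass of CO2 already presupposes Avogadro's number.

```python
# Illustrative only: k_B from the ideal-gas law p*V = N*k_B*T,
# assuming the number of molecules N can be determined independently
# (here from the sample mass and the molar mass of CO2).
AVOGADRO = 6.02214076e23      # 1/mol (exact in the 2019 SI)

pressure_pa   = 101_325.0     # Pa      (placeholder measurement)
volume_m3     = 1.0e-3        # m^3     (1 litre, placeholder)
temperature_k = 293.15        # K       (placeholder)
mass_g        = 1.80          # g of CO2 (placeholder)
molar_mass_g  = 44.01         # g/mol for CO2

n_molecules = mass_g / molar_mass_g * AVOGADRO
k_boltzmann = pressure_pa * volume_m3 / (n_molecules * temperature_k)
print(f"k_B ~ {k_boltzmann:.3e} J/K")   # should come out near 1.38e-23 J/K
```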

  6. RCFT with defects: Factorization and fundamental world sheets

    International Nuclear Information System (INIS)

    Fjelstad, Jens; Fuchs, Jürgen; Stigner, Carl

    2012-01-01

    It is known that for any full rational conformal field theory, the correlation functions that are obtained by the TFT construction satisfy all locality, modular invariance and factorization conditions, and that there is a small set of fundamental correlators to which all others are related via factorization - provided that the world sheets considered do not contain any non-trivial defect lines. In this paper we generalize both results to oriented world sheets with an arbitrary network of topological defect lines.

  7. A new non-specificity measure in evidence theory based on belief intervals

    Institute of Scientific and Technical Information of China (English)

    Yang Yi; Han Deqiang; Jean Dezert

    2016-01-01

    In the theory of belief functions, the measure of uncertainty is an important concept, used for representing types of uncertainty incorporated in bodies of evidence such as discord and non-specificity. For the non-specificity part, some traditional measures take the Hartley measure in classical set theory as their reference; other traditional measures use a simple and heuristic function that jointly uses the mass assignments and the cardinality of the focal elements. In this paper, a new non-specificity measure is proposed using the lengths of belief intervals, which represent the degree of imprecision. Therefore, it has a more intuitive physical meaning. It can be proved that our new measure can be rewritten in a general form for non-specificity. Our new measure is also proved to be a strict non-specificity measure with some desired properties. Numerical examples, simulations, related analyses and proofs are provided to show the characteristics and good properties of the new non-specificity definition. An example of an application of the new non-specificity measure is also presented.
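
    The paper's exact interval-based definition is not given in the abstract; as an illustrative sketch only, the snippet below computes the belief intervals [Bel({x}), Pl({x})] of the singletons for a toy mass function, whose lengths are the imprecision such a non-specificity measure is built on. The frame and mass values are made-up examples.

```python
# Toy body of evidence on a three-element frame of discernment.
frame = ("a", "b", "c")
mass = {frozenset({"a"}): 0.5,
        frozenset({"a", "b"}): 0.3,
        frozenset(frame): 0.2}

def belief(subset, mass):
    """Bel(A): total mass of focal elements contained in A."""
    return sum(m for focal, m in mass.items() if focal <= subset)

def plausibility(subset, mass):
    """Pl(A): total mass of focal elements intersecting A."""
    return sum(m for focal, m in mass.items() if focal & subset)

# Belief intervals [Bel, Pl] for the singletons; their lengths quantify
# the imprecision that interval-based non-specificity measures draw on.
for x in frame:
    s = frozenset({x})
    bel, pl = belief(s, mass), plausibility(s, mass)
    print(f"{x}: [{bel:.2f}, {pl:.2f}]  length = {pl - bel:.2f}")
```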

  8. Evidence for the Fundamental Difference Hypothesis or Not?: Island Constraints Revisited

    Science.gov (United States)

    Belikova, Alyona; White, Lydia

    2009-01-01

    This article examines how changes in linguistic theory affect the debate between the fundamental difference hypothesis and the access-to-Universal Grammar (UG) approach to SLA. With a focus on subjacency (Chomsky, 1973), a principle of UG that places constraints on "wh"-movement and that has frequently been taken as a test case for verifying…

  9. Biomedical engineering fundamentals

    CERN Document Server

    Bronzino, Joseph D

    2014-01-01

    Known as the bible of biomedical engineering, The Biomedical Engineering Handbook, Fourth Edition, sets the standard against which all other references of this nature are measured. As such, it has served as a major resource for both skilled professionals and novices to biomedical engineering.Biomedical Engineering Fundamentals, the first volume of the handbook, presents material from respected scientists with diverse backgrounds in physiological systems, biomechanics, biomaterials, bioelectric phenomena, and neuroengineering. More than three dozen specific topics are examined, including cardia

  10. Optimal Measurements for Simultaneous Quantum Estimation of Multiple Phases.

    Science.gov (United States)

    Pezzè, Luca; Ciampini, Mario A; Spagnolo, Nicolò; Humphreys, Peter C; Datta, Animesh; Walmsley, Ian A; Barbieri, Marco; Sciarrino, Fabio; Smerzi, Augusto

    2017-09-29

    A quantum theory of multiphase estimation is crucial for quantum-enhanced sensing and imaging and may link quantum metrology to more complex quantum computation and communication protocols. In this Letter, we tackle one of the key difficulties of multiphase estimation: obtaining a measurement which saturates the fundamental sensitivity bounds. We derive necessary and sufficient conditions for projective measurements acting on pure states to saturate the ultimate theoretical bound on precision given by the quantum Fisher information matrix. We apply our theory to the specific example of interferometric phase estimation using photon number measurements, a convenient choice in the laboratory. Our results thus introduce concepts and methods relevant to the future theoretical and experimental development of multiparameter estimation.
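
    For pure states the bound referred to here is governed by the quantum Fisher information matrix, F_ij = 4 Re( <d_i psi|d_j psi> - <d_i psi|psi><psi|d_j psi> ). The sketch below evaluates this formula numerically for a toy qutrit probe carrying two phases; the probe state is an assumption for illustration, not the interferometric configuration studied in the Letter.

```python
import numpy as np

def probe_state(phi1, phi2):
    """Toy probe: a single qutrit carrying two phases (illustration only)."""
    return np.array([1.0, np.exp(1j * phi1), np.exp(1j * phi2)]) / np.sqrt(3)

def qfi_matrix(state_fn, phases, eps=1e-6):
    """Pure-state quantum Fisher information matrix via central differences:
    F_ij = 4 Re( <d_i psi|d_j psi> - <d_i psi|psi><psi|d_j psi> )."""
    phases = np.asarray(phases, dtype=float)
    psi = state_fn(*phases)
    derivs = []
    for i in range(len(phases)):
        step = np.zeros_like(phases)
        step[i] = eps
        derivs.append((state_fn(*(phases + step)) - state_fn(*(phases - step))) / (2 * eps))
    n = len(phases)
    F = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            term = np.vdot(derivs[i], derivs[j]) - np.vdot(derivs[i], psi) * np.vdot(psi, derivs[j])
            F[i, j] = 4.0 * term.real
    return F

print(np.round(qfi_matrix(probe_state, [0.3, 0.7]), 4))
# Expected for this toy state: [[ 8/9, -4/9], [-4/9, 8/9]]
```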

  12. Plasmon assisted optical trapping: fundamentals and biomedical applications

    Science.gov (United States)

    Serafetinides, Alexandros A.; Makropoulou, Mersini; Tsigaridas, Georgios N.; Gousetis, Anastasios

    2015-01-01

    The field of optical trapping has grown dramatically due to its implementation in various arenas including physics, biology, medicine and nanotechnology. Certainly, optical tweezers are an invaluable tool for manipulating a variety of particles, such as small dielectric spheres, cells, bacteria, chromosomes and even genes, with highly focused laser beams through a microscope. As the main disadvantage of conventional optical trapping systems is the diffraction limit of the incident light, plasmon-assisted nanotrapping is reported as a suitable technique for trapping sub-wavelength metallic or dielectric particles. In this work, we first report briefly on the basic theory of plasmon excitation, focusing on the interaction of nanoscale metallic structures with laser light. Second, experimental and numerical simulation results are presented, demonstrating enhancement of the trapping efficiency of glass or SiO2 substrates coated with Au and Ag nanostructures, with or without nanoparticles. The optical forces were calculated using the particle escape-velocity calibration method. Finally, representative applications of plasmon-assisted optical trapping are reviewed, from cancer therapeutics to fundamental biology and cell nanosurgery.
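
    As a hedged sketch of the escape-velocity calibration mentioned above: at the escape point the optical force balances the Stokes drag F = 6*pi*eta*r*v, so a measured escape velocity converts directly into a force estimate. The bead radius, viscosity and velocity below are placeholder values, not data from this work.

```python
import math

def stokes_escape_force(radius_m, escape_velocity_m_s, viscosity_pa_s=8.9e-4):
    """Trapping force estimated from the escape-velocity calibration:
    at escape, the optical force balances the Stokes drag
    F = 6 * pi * eta * r * v (low-Reynolds-number bead in water)."""
    return 6.0 * math.pi * viscosity_pa_s * radius_m * escape_velocity_m_s

# Placeholder numbers: a 1 um-radius silica bead escaping at 50 um/s in water.
force_n = stokes_escape_force(radius_m=1.0e-6, escape_velocity_m_s=50e-6)
print(f"Estimated trapping force ~ {force_n * 1e12:.2f} pN")  # ~0.84 pN
```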

  13. Relativistic thermodynamics and kinetic theory, with applications to cosmology

    International Nuclear Information System (INIS)

    Stewart, J.M.

    1973-01-01

    The discussion of relativistic thermodynamics and kinetic theory with applications to cosmology also covers the fundamentals and nonequilibrium relativistic kinetic theory and applications to cosmology and astrophysics. (U.S.)

  14. Irreversible processes kinetic theory

    CERN Document Server

    Brush, Stephen G

    2013-01-01

    Kinetic Theory, Volume 2: Irreversible Processes deals with the kinetic theory of gases and the irreversible processes they undergo. It includes the two papers by James Clerk Maxwell and Ludwig Boltzmann in which the basic equations for transport processes in gases are formulated, together with the first derivation of Boltzmann's ""H-theorem"" and a discussion of this theorem, along with the problem of irreversibility.Comprised of 10 chapters, this volume begins with an introduction to the fundamental nature of heat and of gases, along with Boltzmann's work on the kinetic theory of gases and s

  15. Is Education a Fundamental Right? People's Lay Theories About Intellectual Potential Drive Their Positions on Education.

    Science.gov (United States)

    Savani, Krishna; Rattan, Aneeta; Dweck, Carol S

    2017-09-01

    Does every child have a fundamental right to receive a high-quality education? We propose that people's beliefs about whether "nearly everyone" or "only some people" have high intellectual potential drive their positions on education. Three studies found that the more people believed that nearly everyone has high potential, the more they viewed education as a fundamental human right. Furthermore, people who viewed education as a fundamental right, in turn (a) were more likely to support the institution of free public education, (b) were more concerned upon learning that students in the country were not performing well academically compared with students in peer nations, and (c) were more likely to support redistributing educational funds more equitably across wealthier and poorer school districts. The studies show that people's beliefs about intellectual potential can influence their positions on education, which can affect the future quality of life for countless students.

  16. Error calculations statistics in radioactive measurements

    International Nuclear Information System (INIS)

    Verdera, Silvia

    1994-01-01

    Basic approaches and procedures frequently used in the practice of radioactive measurements are presented. The statistical principles applied are part of Good Radiopharmaceutical Practices and quality assurance. Topics include the concept of error and its classification into systematic and random errors; statistical fundamentals such as probability theory, population distributions (Bernoulli, Poisson, Gauss, t distribution), the χ² test, and error propagation based on analysis of variance. Bibliography; z table, t-test table, Poisson index, χ² table.
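
    As a minimal illustration of the Poisson statistics and error propagation listed above, the snippet below computes a net count rate (gross minus background) and its standard deviation, assuming Poisson counting errors; the counts and counting times are placeholders.

```python
import math

# Poisson counting statistics: the standard deviation of a count N is sqrt(N).
# For a net rate (gross minus background) the variances of the two rates add.
gross_counts, t_gross = 10_000, 600.0   # counts, seconds (placeholders)
bkg_counts,   t_bkg   = 1_200,  600.0

gross_rate = gross_counts / t_gross
bkg_rate   = bkg_counts / t_bkg
net_rate   = gross_rate - bkg_rate

# Error propagation for independent terms: var(net) = var(gross) + var(bkg).
net_rate_sigma = math.sqrt(gross_counts / t_gross**2 + bkg_counts / t_bkg**2)
print(f"net rate = {net_rate:.3f} +/- {net_rate_sigma:.3f} counts/s")
```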

  17. The fundamentals of stellar astrophysics

    International Nuclear Information System (INIS)

    Collins, G.W. II.

    1989-01-01

    A broad overview of theoretical stellar astrophysics is presented in a textbook intended for graduate students. Chapters are devoted to fundamental principles, assumptions, theorems, and polytropes; energy sources and sinks; the flow of energy through the star and the construction of stellar models; the theory of stellar evolution; relativistic stellar structure; the structure of distorted stars; stellar pulsation and oscillation. Also discussed are the flow of radiation through the stellar atmosphere, the solution of the radiative-transfer equation, the environment of the radiation field, the construction of a stellar model atmosphere, the formation and shape of spectral lines, LTE breakdown, illuminated and extended stellar atmospheres, and the transfer of polarized radiation. Diagrams, graphs, and sample problems are provided. 164 refs

  18. Changing Investment in Activities and Interests in Elders' Lives: Theory and Measurement

    Science.gov (United States)

    Adams, Kathryn Betts

    2004-01-01

    Socioemotional selectivity and gerotranscendence, newer theories with roots in the disengagement theory of aging, provided the theoretical framework for a new measure of perceived change in investment in a variety of pursuits. The 30-item Change in Activity and Interest Index (CAII) was given to a sample of 327 outpatients aged 65-94. Items with…

  19. Experiment, theory and the Casimir effect

    International Nuclear Information System (INIS)

    Mostepanenko, V M

    2009-01-01

    Several problems at the interface between the field-theoretical description of the Casimir effect and experiments on measuring the Casimir force are discussed. One of these problems is connected with the definition of the Casimir free energy in ideal metal rectangular boxes satisfying the general physical requirements. It is shown that the consideration of rectangular boxes with a partition (piston) does not negate the previously known results obtained for boxes without a piston. Both sets of results are found to be in mutual agreement. Another problem is related to the use of the proximity force approximation (PFA) for the interpretation of the experimental data and to the search for analytical results beyond the PFA based on the first principles of quantum field theory. Next, we discuss the concepts of experimental precision and of the measure of agreement between experiment and theory. The fundamental difference between these two concepts is clarified. Finally, a recent approach to the thermal Casimir force that takes screening effects into account is applied to real metals. It is shown that this approach is thermodynamically and experimentally inconsistent. The physical reasons for this inconsistency are connected with the violation of thermal equilibrium, which is the basic applicability condition of the Lifshitz theory.

  20. Fred Hoyle: contributions to the theory of galaxy formation

    Science.gov (United States)

    Efstathiou, George

    I review two fundamental contributions that Fred Hoyle made to the theory of galaxy formation. Hoyle was the first to propose that protogalaxies acquired their angular momentum via tidal torques from neighbouring perturbations during a period of gravitational instability. To my knowledge, he was also the first to suggest that the masses of galaxies could be explained by the requirement that primordial gas clouds cool radiatively on a suitable timescale. Tidal torques and cooling arguments play a central role in the modern theory of galaxy formation. It is a measure of Hoyle's breadth and inventiveness that he recognized the importance of these processes at such an early stage in the history of the subject.

  1. ANALYSIS OF PUBLIC COURT-ORDERED-DEBT DISCLOSURE: INFLUENCE OF LEGISLATION AND FUNDAMENTALS OF ACCOUNTING THEORY

    Directory of Open Access Journals (Sweden)

    Lucas Oliveira Gomes Ferreira

    2012-03-01

    The purpose of the present study is to analyze, in the light of accounting theory, the accounting disclosure of judicial payment warrants (precatórios), issued when governmental entities are found liable for pecuniary awards in lawsuits, and to verify whether the current legislation interferes in the accounting treatment of these instruments. In this sense, we performed a documental and literature review of the legal framework and accounting procedures adopted, as well as gathered data from the National Treasury Secretariat Data Collection System (SISTN) for the period 2004-2009 and consulted a study carried out by the Supreme Court (STF) in 2004. The study's justification is based on the perception that more than half of judicial payment warrants are not registered in the public accounts. Consequently, since these warrants represent (i) vested rights of the plaintiffs and (ii) debts of the public entity, the lack of accounting disclosure jeopardizes both the beneficiary, whose right is not reflected in the public accounts, thus casting doubt on the expectation of payment, and government managers and society, who do not have reliable information that would allow effective management. The innovation of this paper consists in discussing the identification of the appropriate moment of the generating event of the underlying debts and in proposing disclosure that takes the risk classification into account. In conclusion, the influence of the current legislation and the failure to observe accounting fundamentals are among the likely factors that have affected the proper accounting of judicial payment warrants within the Brazilian public administration.

  2. String theory for heavy-ion physicists

    International Nuclear Information System (INIS)

    Natsuume, Makoto

    2007-01-01

    In this article, we review the AdS/CFT duality for non-experts. Superstring theory contains objects called D-branes, which are the key to understanding the AdS/CFT duality. What is important about the D-brane is that it has two different descriptions. As I will explain in detail below, the D-brane can be described either by a gauge theory or by a black hole, so the D-brane connects these two entirely different systems. However, the simplest AdS/CFT duality involves only adjoint matter; there is no fundamental matter such as quarks. Thus, an important question to address, if we want to use the AdS/CFT duality for more realistic scenarios, is how one can include fundamental matter. I will explain a simple way to describe fundamental matter; Sugimoto discusses more realistic methods in his article (in this volume). (author)

  3. A game theory-based trust measurement model for social networks.

    Science.gov (United States)

    Wang, Yingjie; Cai, Zhipeng; Yin, Guisheng; Gao, Yang; Tong, Xiangrong; Han, Qilong

    2016-01-01

    In social networks, trust is a complex concept. Participants in online social networks want to share information and experiences with as many reliable users as possible. However, the modeling of trust is complicated and application dependent: it needs to consider interaction history, recommendations, user behaviors and so on. Therefore, modeling trust is an important focus for online social networks. We propose a game theory-based trust measurement model for social networks. The trust degree is calculated from three aspects (service reliability, feedback effectiveness and recommendation credibility) to obtain a more accurate result. In addition, to alleviate the free-riding problem, we propose a game theory-based punishment mechanism for specific trust and global trust, respectively. We prove that the proposed trust measurement model is effective. The free-riding problem can be resolved effectively by adding the proposed punishment mechanism.
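
    The paper's exact aggregation and game-theoretic machinery are not given in the abstract; purely as an illustrative sketch, the snippet below combines the three named aspects into a single trust degree using an assumed weighted average. The weights and input scores are placeholders, not the authors' formulation.

```python
def trust_degree(service_reliability, feedback_effectiveness,
                 recommendation_credibility, weights=(0.5, 0.3, 0.2)):
    """Illustrative aggregation of the three aspects named in the abstract.
    All inputs are assumed to lie in [0, 1]; the weights are placeholders,
    not the weighting scheme derived in the paper."""
    w1, w2, w3 = weights
    return (w1 * service_reliability
            + w2 * feedback_effectiveness
            + w3 * recommendation_credibility)

# Placeholder scores for one participant.
print(round(trust_degree(0.9, 0.7, 0.4), 3))  # 0.74
```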

  4. Applied number theory

    CERN Document Server

    Niederreiter, Harald

    2015-01-01

    This textbook effectively builds a bridge from basic number theory to recent advances in applied number theory. It presents the first unified account of the four major areas of application where number theory plays a fundamental role, namely cryptography, coding theory, quasi-Monte Carlo methods, and pseudorandom number generation, allowing the authors to delineate the manifold links and interrelations between these areas.  Number theory, which Carl-Friedrich Gauss famously dubbed the queen of mathematics, has always been considered a very beautiful field of mathematics, producing lovely results and elegant proofs. While only very few real-life applications were known in the past, today number theory can be found in everyday life: in supermarket bar code scanners, in our cars’ GPS systems, in online banking, etc.  Starting with a brief introductory course on number theory in Chapter 1, which makes the book more accessible for undergraduates, the authors describe the four main application areas in Chapters...

  5. Fundamental data analyses for measurement control

    International Nuclear Information System (INIS)

    Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.

    1989-01-01

    An important aspect of a complete measurement control program for special nuclear materials is the analysis of data from periodic control measurements of known standards. This chapter covers the following topics: basic algorithms, including an introduction and terminology, the standard case (known mean and standard deviation), Shewhart control charts, and a sequential test for bias; modifications for nonstandard cases, including modifications for a changing (decaying) standard value, for deteriorating measurement precision, and for repeated measurements; and maintenance information, including estimation of the historical standard deviation (standard case and changing with time), normality and outliers, and other tests of randomness
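
    A minimal sketch of the "standard case" control chart mentioned above: control measurements of a known standard are flagged when they fall outside k-sigma Shewhart limits. The standard value, historical standard deviation, limit multiplier and data below are placeholder assumptions, and the sequential bias test is not shown.

```python
def shewhart_flags(measurements, standard_value, historical_sigma, k=3.0):
    """Flag control measurements of a known standard that fall outside the
    k-sigma Shewhart limits (the 'standard case': known mean and standard
    deviation). Returns a list of (value, in_control) pairs."""
    lower = standard_value - k * historical_sigma
    upper = standard_value + k * historical_sigma
    return [(x, lower <= x <= upper) for x in measurements]

# Placeholder control data for a standard whose accepted value is 100.0
# with a historical standard deviation of 0.5.
for value, ok in shewhart_flags([100.2, 99.6, 101.9, 100.1], 100.0, 0.5):
    print(f"{value:6.1f}  {'in control' if ok else 'OUT OF CONTROL'}")
```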

  6. Critical Investigation of Jauch's Approach to the Quantum Theory of Measurement

    Science.gov (United States)

    Herbut, Fedor

    1986-08-01

    To make Jauch's approach more realistic, his assumptions are modified in two ways: (1) On the quantum system plus the measuring apparatus (S+MA) after the measuring interaction has ceased, one can actually measure only operators of the form A ⊗ Σ_k b_k Q_k, where A is any Hermitian operator for S, the resolution of the identity Σ_k Q_k = 1 defines MA as a classical system (following von Neumann), and the b_k are real numbers (S and MA are distant). (2) Measurement is defined in the most general way (including, besides first-kind, also second-kind and third-kind or indirect measurements). It is shown that Jauch's basic result, that the microstates (statistical operators) of S+MA before and after the collapse correspond to the same macrostate (belong to the same equivalence class of microstates), remains valid under the above modifications, and that the significance of this result goes beyond measurement theory. On the other hand, it is argued that, taking the orthodox (i.e. uncompromisingly quantum) view of quantum mechanics, it is not the collapse but the Jauch-type macrostates that are spurious in a Jauch-type theory.

  7. Gendered language attitudes: exploring language as a gendered construct using Rasch measurement theory.

    Science.gov (United States)

    Knisely, Kris A; Wind, Stefanie A

    2015-01-01

    Gendered language attitudes (GLAs) are gender-based perceptions of language varieties based on connections between gender-related and linguistic characteristics of individuals, including the perception of language varieties as possessing degrees of masculinity and femininity. This study combines substantive theory about language learning and gender with a model based on Rasch measurement theory to explore the psychometric properties of a new measure of GLAs. Findings suggest that GLA is a unidimensional construct and that the items can be used to describe differences among students in terms of the strength of their GLAs. Implications for research, theory, and practice are discussed. Special emphasis is given to the teaching and learning of languages.
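
    The building block of such an analysis is the Rasch model itself, P(response) = exp(theta - delta) / (1 + exp(theta - delta)); the snippet below is only that textbook formula with placeholder person and item locations, not the GLA instrument or its data.

```python
import math

def rasch_probability(theta: float, delta: float) -> float:
    """Rasch model: probability of endorsing (or answering correctly) an item
    with location delta for a person with latent location theta."""
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

# A person located at theta = 1.0 responding to items of varying location.
for delta in (-1.0, 0.0, 1.0, 2.0):
    print(f"delta = {delta:+.1f}  P = {rasch_probability(1.0, delta):.3f}")
```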

  8. Quantum Measurements using Diamond Spins : From Fundamental Tests to Long-Distance Teleportation

    NARCIS (Netherlands)

    Hanson, R.

    2014-01-01

    Spin qubits in diamond provide an excellent platform both for fundamental tests and for realizing extended quantum networks. Here we present our latest results, including the deterministic teleportation over three meters.

  9. A quasi-one-dimensional theory of sound propagation in lined ducts with mean flow

    Science.gov (United States)

    Dokumaci, Erkan

    2018-04-01

    Sound propagation in ducts with locally-reacting liners has received the attention of many authors proposing two- and three-dimensional solutions of the convected wave equation and of the Pridmore-Brown equation. One-dimensional lined duct models appear to have received less attention. The present paper proposes a quasi-one-dimensional theory for lined uniform ducts with parallel sheared mean flow. The basic assumption of the theory is that the effects of refraction and wall compliance on the fundamental mode remain within ranges in which the acoustic fluctuations are essentially uniform over a duct section. This restricts the model to subsonic low Mach numbers and Helmholtz numbers of less than about unity. The axial propagation constants and the wave transfer matrix of the duct are given by simple explicit expressions and can be applied with no-slip, full-slip or partial slip boundary conditions. The limitations of the theory are discussed and its predictions are compared with the fundamental mode solutions of the convected wave equation, the Pridmore-Brown equation and measurements where available.
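
    The paper's explicit expressions for the lined duct with mean flow are not reproduced in the abstract. As a baseline only, the snippet below builds the classical plane-wave transfer matrix of a uniform, rigid-walled duct segment without flow, which is what a quasi-one-dimensional description reduces to when the liner and the mean flow are switched off; the dimensions and frequency are placeholders.

```python
import numpy as np

def rigid_duct_transfer_matrix(length_m, area_m2, freq_hz, c=343.0, rho=1.21):
    """Plane-wave transfer matrix of a uniform, rigid-walled duct segment with
    no mean flow, relating (pressure, volume velocity) at the two ends:
        [p1, U1]^T = T @ [p2, U2]^T
    This is the textbook baseline, not the lined-duct result of the paper."""
    k = 2.0 * np.pi * freq_hz / c          # acoustic wavenumber
    z0 = rho * c / area_m2                 # characteristic impedance (volume-velocity basis)
    kl = k * length_m
    return np.array([[np.cos(kl), 1j * z0 * np.sin(kl)],
                     [1j * np.sin(kl) / z0, np.cos(kl)]])

# Placeholder segment: 0.5 m long, 10 cm^2 cross-section, evaluated at 200 Hz.
print(np.round(rigid_duct_transfer_matrix(0.5, 1.0e-3, 200.0), 3))
```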

  10. Asymptotic freedom in the theory of the strong interaction. Comment on the nobel prize in physics 2004

    International Nuclear Information System (INIS)

    Zhang Zhaoxi

    2005-01-01

    The 2004 Nobel Prize in Physics was awarded to David J. Gross, Frank Wilczek and H. David Politzer for their decisive contributions to the theory of asymptotic freedom in the strong interaction (a fundamental interaction). The fundamental elements of quantum chromodynamics (QCD) and the theory of the strong interaction are briefly reviewed in their historical context. How asymptotic freedom is achieved is introduced and its physical meaning explained. The latest experimental tests of asymptotic freedom are presented, and it is shown that the theoretical prediction agrees excellently with the experimental measurements. Perturbative QCD, which is based on asymptotic freedom, is outlined. It is pointed out that the theoretical discovery and experimental proof of asymptotic freedom are crucial for QCD to be the correct theory of the strong interaction. Certain frontier research areas of QCD, such as 'color confinement', are mentioned. The discovery and confirmation of asymptotic freedom has indeed deeply affected particle physics, and has led to QCD becoming a central component of the standard model and to the further development of the so-called grand unification theories of interactions. (author)

  11. Value of Fundamental Science

    Science.gov (United States)

    Burov, Alexey

    Fundamental science is a hard, long-term human adventure that has required high devotion and social support, especially significant in our epoch of Mega-science. The measure of this devotion and this support expresses the real value of fundamental science in public opinion. Why does fundamental science have value? What determines its strength and what endangers it? The dominant answer is that the value of science arises out of curiosity and is supported by technological progress. Is this really a good, astute answer? When trying to attract public support, we talk about the ``mystery of the universe''. Why do these words sound so attractive? What is implied by them, and what is incompatible with them? More than two centuries ago, Immanuel Kant asserted an inseparable entanglement between ethics and metaphysics. Thus, we may ask: which metaphysics supports the value of scientific cognition, and which does not? Should we continue to neglect the dependence of the value of pure science on metaphysics? If not, how can this issue be addressed in public outreach? Is the public alienated by one or another message coming from the face of science? What does it mean to be politically correct in this sort of discussion?

  12. The status and future prospects of string theory

    International Nuclear Information System (INIS)

    Gross, D.J.

    1990-01-01

    After a general introduction to the description of the fundamental forces by gauge theories and to the difficulties that arise in attempts to unify these theories with gravity, the reasons for the introduction of string theory are explained. After a description of the construction of a string theory, the string theory of gravity is considered. The problems of string theory are then described. Thereafter, elastic scattering in string theory at energies comparable with the Planck mass is considered. Finally, some prospects for string theory are discussed. (HSI)

  13. Beta particle measurement fundamentals

    International Nuclear Information System (INIS)

    Alvarez, J.L.

    1986-01-01

    The concepts necessary for understanding beta particle behavior are stopping power, range, and scattering. Dose as a consequence of beta particle interaction with tissue can be derived and explained by these concepts. Any calculation of dose, however, assumes or requires detailed knowledge of the beta spectrum at the tissue depth of the calculation. A rudimentary knowledge of the incident spectrum can be of use in estimating dose, interpreting dose-measuring devices and designing protection. The stopping power and range based on the CSDA (continuous-slowing-down approximation) will give a conservative estimate in cases of protection design, as scattering will reduce the range. Estimates of dose may be low because scattering effects were neglected
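
    A common empirical shortcut for the range mentioned above is the Katz-Penfold relation for the maximum beta-particle range in g/cm^2; it is not taken from this record and is quoted here only as a hedged, illustrative estimate.

```python
import math

def katz_penfold_range(energy_mev: float) -> float:
    """Empirical maximum range of beta particles in g/cm^2 (Katz-Penfold),
    valid roughly for 0.01 MeV < E < 2.5 MeV. Dividing by the material
    density gives the range in cm; scattering shortens the practical range."""
    n = 1.265 - 0.0954 * math.log(energy_mev)
    return 0.412 * energy_mev ** n

# Range of a 1 MeV beta particle in water (density ~1 g/cm^3).
r_g_cm2 = katz_penfold_range(1.0)
print(f"range ~ {r_g_cm2:.3f} g/cm^2  (~ {r_g_cm2 / 1.0:.3f} cm in water)")
```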

  14. Variational transition-state theory

    International Nuclear Information System (INIS)

    Truhlar, D.G.; Garrett, B.C.

    1980-01-01

    A general introduction to, and some results from, studies of a procedure called variational transition-state theory are presented. A fundamental assumption of this theory is that the net rate of forward reaction at equilibrium equals the equilibrium flux in the product direction through the transition state, where the transition state is a surface in phase space dividing reactants from products. Classical generalized-transition-state-theory calculations for nine collinear systems are compared to classical trajectory calculations. This new technique should provide useful insight into the successes and failures of the conventional theory and useful quantitative estimates of possible errors in the predictions of conventional transition-state theory. It should also contribute to a more accurate theory, now available for practical calculations of chemical reaction rates and for thermochemical and structural interpretations of rate processes

  15. Probabilistic structure of quantum theory

    International Nuclear Information System (INIS)

    Burzynski, A.

    1989-01-01

    The fundamental ideas of quantum theory are presented. It is shown that the two approaches to quantum theory, Heisenberg's matrix mechanics and Schroedinger's wave mechanics, can be formulated by means of the theory of operators in Hilbert space. Some remarks are made on Hilbert spaces and on dyadic and projection operators. States, probabilities and observables of quantum systems are discussed, and the time evolution of quantum states is analysed. Some remarks on two-component systems and symmetries are given. 21 refs. (M.F.W.)

  16. Matching theory for wireless networks

    CERN Document Server

    Han, Zhu; Saad, Walid

    2017-01-01

    This book provides the fundamental knowledge of classical matching theory problems. It builds a bridge between matching theory and 5G wireless communication resource allocation problems. The potential and challenges of implementing the semi-distributive matching theory framework in wireless resource allocation are analyzed both theoretically and through implementation examples. Academics, researchers, engineers, and others who are interested in efficient distributive wireless resource allocation solutions will find this book to be an exceptional resource.

  17. Fundamental tests of nature with cooled and stored exotic ions

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    The presentation will concentrate on recent applications with exciting results of Penning traps in atomic and nuclear physics with cooled and stored exotic ions. These are high-accuracy mass measurements of short-lived radionuclides, g-factor determinations of the bound-electron in highly-charged, hydrogen-like ions and g-factor measurements of the proton and antiproton. The experiments are dedicated, e.g., to astrophysics studies and to tests of fundamental symmetries in the case of mass measurements on radionuclides, and to the determination of fundamental constants and a CPT test in the case of the g-factor measurements.

  18. Matching theory

    CERN Document Server

    Plummer, MD

    1986-01-01

    This study of matching theory deals with bipartite matching, network flows, and presents fundamental results for the non-bipartite case. It goes on to study elementary bipartite graphs and elementary graphs in general. Further discussed are 2-matchings, general matching problems as linear programs, the Edmonds Matching Algorithm (and other algorithmic approaches), f-factors and vertex packing.

  19. Robust Measurement via A Fused Latent and Graphical Item Response Theory Model.

    Science.gov (United States)

    Chen, Yunxiao; Li, Xiaoou; Liu, Jingchen; Ying, Zhiliang

    2018-03-12

    Item response theory (IRT) plays an important role in psychological and educational measurement. Unlike classical test theory, IRT models aggregate item-level information, yielding more accurate measurements. Most IRT models assume local independence, an assumption not likely to be satisfied in practice, especially when the number of items is large. Results in the literature and simulation studies in this paper reveal that misspecifying the local independence assumption may result in inaccurate measurements and differential item functioning. To provide more robust measurements, we propose an integrated approach by adding a graphical component to a multidimensional IRT model that can offset the effect of unknown local dependence. The new model contains a confirmatory latent variable component, which measures the targeted latent traits, and a graphical component, which captures the local dependence. An efficient proximal algorithm is proposed for the parameter estimation and structure learning of the local dependence. This approach can substantially improve the measurement, given no prior information on the local dependence structure. The model can be applied to measure both a unidimensional latent trait and multidimensional latent traits.
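
    Local independence, the assumption this model is designed to relax, means the joint probability of a response pattern factorizes across items given the latent trait. The sketch below shows that factorization for a basic two-parameter logistic IRT model; the item parameters, trait value and responses are placeholders, and the fused latent/graphical estimation itself is not reproduced.

```python
import math

def irt_2pl(theta, a, b):
    """Two-parameter logistic IRT: P(correct | theta) for an item with
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def pattern_probability(theta, items, responses):
    """Under local independence, the joint probability of a response pattern
    is the product of the item probabilities; this is the assumption the
    fused latent/graphical model relaxes when it is violated."""
    prob = 1.0
    for (a, b), y in zip(items, responses):
        p = irt_2pl(theta, a, b)
        prob *= p if y == 1 else (1.0 - p)
    return prob

items = [(1.2, -0.5), (0.8, 0.0), (1.5, 0.7)]   # (a, b) placeholder values
print(round(pattern_probability(theta=0.3, items=items, responses=[1, 1, 0]), 4))
```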

  20. Is ''general relativity'' necessary for the Einstein gravitation theory

    International Nuclear Information System (INIS)

    Bondi, G.

    1982-01-01

    The main principles of relativity and gravitation theories are analyzed in depth. The boundaries of applicability of these theories and possible ways of modifying and generalizing them are discussed. It is shown that the notion of general relativity does not introduce any post-Newtonian physics - it only deals with coordinate transformations. It is suggested that ''general relativity'' is a physically senseless phrase which can be regarded only as a historical reminder of an interesting philosophical discourse. The paper argues that an appropriate physical substantiation of the Einstein gravitation theory exists which does not involve the physically senseless concept of general relativity, and it promotes the theory's fundamental relations with experiment

  1. Integrable models in 1+1 dimensional quantum field theory

    International Nuclear Information System (INIS)

    Faddeev, Ludvig.

    1982-09-01

    The goal of this lecture is to present a unifying view of the exactly soluble models. Several reasons argue in favor of 1+1 dimensional models: every exact solution of a field-theoretical model can teach us about the ability of quantum field theory to describe spectra and scattering, and some 1+1 dimensional models have physical applications in solid state theory. There are several ways to become acquainted with the methods of exactly soluble models: via classical statistical mechanics, via the Bethe Ansatz, or via the inverse scattering method. The fundamental Poisson bracket relations (FPR) and/or fundamental commutation relations (FCR) play a fundamental role. A general classification of the FPR is given, with promising generalizations to the FCR

  2. Fundamentals of radiological protection

    International Nuclear Information System (INIS)

    Mill, A.J.; Charles, M.W.; Wells, J.

    1978-04-01

    A review is presented of basic radiation physics with particular relevance to radiological protection. The processes leading to the production and absorption of ionising radiation are outlined, together with the important dosimetric quantities and their units of measurement. The review is the first of a series of reports presenting the fundamentals necessary for an understanding of the basis of regulatory criteria such as those recommended by the ICRP. (author)

  3. The utility of quantum field theory

    International Nuclear Information System (INIS)

    Dine, Michael

    2001-01-01

    This talk surveys a broad range of applications of quantum field theory, as well as some recent developments. The stress is on the notion of effective field theories. Topics include implications of neutrino mass and a possible small value of sin(2β), supersymmetric extensions of the standard model, the use of field theory to understand fundamental issues in string theory (the problem of multiple ground states and the question: does string theory predict low energy supersymmetry), and the use of string theory to solve problems in field theory. Also considered are a new type of field theory, and indications from black hole physics and the cosmological constant problem that effective field theories may not completely describe theories of gravity. (author)

  4. Growing up with field theory

    International Nuclear Information System (INIS)

    Vajskopf, V.F.

    1982-01-01

    The article deals with the history of the development of quantum electrodynamics since the publication of P.A.M. Dirac's paper ''The Quantum Theory of the Emission and Absorption of Radiation''. Classical ''pre-Dirac'' electrodynamics, associated with the names of Maxwell, Lorenz and Hertz, is outlined. The work of Bohr and Rosenfeld is shown to clarify the physical meaning of the quantized field and to reveal the existence of uncertainties between the strengths of different fields. The article points to the significance of E. Fermi's article ''Quantum theory of radiation'', which clearly describes the Dirac theory of radiation, the relativistic wave equation and the fundamentals of quantum electrodynamics. Work on eliminating the difficulties related to the existence of states with negative kinetic energy or negative mass is described. The hypothesis of the Dirac filled vacuum led to an understanding of the existence of antiparticles and of two previously unknown fundamental processes: pair production and annihilation. Ways of dealing with the infinite quantities in quantum electrodynamics are considered. Renormalization of the theory overcame the infinities and provided a scheme for calculating any process involving electron interactions with the electromagnetic field to any desired accuracy

  5. The conceptual basis of Quantum Field Theory

    NARCIS (Netherlands)

    Hooft, G. 't

    2005-01-01

    Relativistic Quantum Field Theory is a mathematical scheme to describe the sub-atomic particles and forces. The basic starting point is that the axioms of Special Relativity on the one hand and those of Quantum Mechanics on the other, should be combined into one theory. The fundamental

  6. Annual report'81 Foundation for fundamental research on matter

    International Nuclear Information System (INIS)

    Heijn, J.; Hooren, M.J.H. van

    1982-01-01

    The Dutch Foundation for Fundamental Research on Matter (FOM) has as its aim the stimulation of fundamental scientific research on matter in the Netherlands. It pursues this aim by coordinating existing research projects and by involving its institutes and research groups in the education of young physicists. The research groups are classified into eight so-called research communities: nuclear physics, atomic physics, metals, semiconductors, solid state, thermonuclear research and plasma physics, and theoretical high-energy physics. Besides accounts of the management, financial and personnel affairs, and professional/organizational reports of the aforementioned research communities and corresponding research groups, this annual report presents a number of trend articles, of which two are within INIS scope, entitled respectively: Non-perturbative methods in field theory; Balance between bulk and beam studies in atomic collision research. (H.W.) refs.; figs.; tabs

  7. Gadamer's Hermeneutic Contribution to a Theory of Time ...

    African Journals Online (AJOL)

    denise

    The play of a work of art and a festival provide examples of ways in which aesthetic experience is fundamentally a source of our consciousness of time. Drawing on the themes of Gadamer's aesthetic theory, we can thus delineate the basic themes of Gadamer's theory of time-consciousness: a theory that locates Gadamer ...

  8. Introduction to set theory and topology

    CERN Document Server

    Kuratowski, Kazimierz; Stark, M

    1972-01-01

    Introduction to Set Theory and Topology describes the fundamental concepts of set theory and topology as well as its applicability to analysis, geometry, and other branches of mathematics, including algebra and probability theory. Concepts such as inverse limit, lattice, ideal, filter, commutative diagram, quotient-spaces, completely regular spaces, quasicomponents, and cartesian products of topological spaces are considered. This volume consists of 21 chapters organized into two sections and begins with an introduction to set theory, with emphasis on the propositional calculus and its applica

  9. A Scale Elasticity Measure for Directional Distance Function and its Dual: Theory and DEA Estimation

    OpenAIRE

    Valentin Zelenyuk

    2012-01-01

    In this paper we focus on a scale elasticity measure based on the directional distance function for multi-output, multi-input technologies, explore its fundamental properties and show its equivalence with the input-oriented and output-oriented scale elasticity measures. We also establish a duality relationship between the scale elasticity measure based on the directional distance function and the scale elasticity measure based on the profit function. Finally, we discuss the estimation issues of the scale...

  10. Superstring theory

    International Nuclear Information System (INIS)

    Schwarz, J.H.

    1985-01-01

    Dual string theories, initially developed as phenomenological models of hadrons, now appear more promising as candidates for a unified theory of fundamental interactions. Type I superstring theory (SST I) is a ten-dimensional theory of interacting open and closed strings, with one supersymmetry, that is free from ghosts and tachyons. It requires that an SO(n) or Sp(2n) gauge group be used. A light-cone-gauge string action with space-time supersymmetry automatically incorporates the superstring restrictions and leads to the discovery of type II superstring theory (SST II). SST II is an interacting theory of closed strings only, with two D=10 supersymmetries, that is also free from ghosts and tachyons. By taking six of the spatial dimensions to form a compact space, it becomes possible to reconcile the models with our four-dimensional perception of spacetime and to define low-energy limits in which SST I reduces to N=4, D=4 super Yang-Mills theory and SST II reduces to N=8, D=4 supergravity theory. The superstring theories can be described by a light-cone-gauge action principle based on fields that are functionals of string coordinates. With this formalism any physical quantity should be calculable. There is some evidence that, unlike any conventional field theory, the superstring theories provide perturbatively renormalizable (SST I) or finite (SST II) unifications of gravity with other interactions

  11. Examining Student Ideas about Energy Measurements on Quantum States across Undergraduate and Graduate Levels

    Science.gov (United States)

    Passante, Gina; Emigh, Paul J.; Shaffer, Peter S.

    2015-01-01

    Energy measurements play a fundamental role in the theory of quantum mechanics, yet there is evidence that the underlying concepts are difficult for many students, even after all undergraduate instruction. We present results from an investigation into student ability to determine the possible energies that can be measured for a given wave function…
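
    The textbook rule being probed here is: expand the wave function in energy eigenstates, psi = sum_n c_n |E_n>; the possible measurement outcomes are the eigenvalues E_n with c_n != 0, each obtained with probability |c_n|^2. The snippet below illustrates this rule with made-up eigenvalues and coefficients.

```python
import numpy as np

# A wave function written in the energy eigenbasis: psi = sum_n c_n |E_n>.
# The only energies that can be measured are the E_n with c_n != 0,
# each obtained with probability |c_n|^2 (toy numbers, normalized below).
energies = np.array([1.0, 4.0, 9.0])          # eigenvalues E_n (placeholder units)
coeffs   = np.array([1.0, 0.0, 2.0j])         # expansion coefficients c_n
coeffs   = coeffs / np.linalg.norm(coeffs)    # normalize the state

probs = np.abs(coeffs) ** 2
for E, p in zip(energies, probs):
    if p > 0:
        print(f"E = {E}:  probability {p:.2f}")
print("expectation value <E> =", float(np.dot(probs, energies)))
```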

  12. Measurement Models for Reasoned Action Theory.

    Science.gov (United States)

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-03-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.

  13. Theory of reproducing kernels and applications

    CERN Document Server

    Saitoh, Saburou

    2016-01-01

    This book provides a large extension of the general theory of reproducing kernels published by N. Aronszajn in 1950, with many concrete applications. In Chapter 1, many concrete reproducing kernels are first introduced with detailed information. Chapter 2 presents a general and global theory of reproducing kernels with basic applications in a self-contained way. Many fundamental operations among reproducing kernel Hilbert spaces are dealt with. Chapter 2 is the heart of this book. Chapter 3 is devoted to the Tikhonov regularization using the theory of reproducing kernels with applications to numerical and practical solutions of bounded linear operator equations. In Chapter 4, the numerical real inversion formulas of the Laplace transform are presented by applying the Tikhonov regularization, where the reproducing kernels play a key role in the results. Chapter 5 deals with ordinary differential equations; Chapter 6 includes many concrete results for various fundamental partial differential equations. In Chapt...

  14. Ends, fundamental tones and capacity of minimal submanifolds via extrinsic comparison theory

    DEFF Research Database (Denmark)

    Gimeno, Vicent; Markvorsen, Steen

    2015-01-01

    We study the volume of extrinsic balls and the capacity of extrinsic annuli in minimal submanifolds which are properly immersed with controlled radial sectional curvatures into an ambient manifold with a pole. The key results are concerned with the comparison of those volumes and capacities with the corresponding entities in a rotationally symmetric model manifold. Using the asymptotic behavior of the volumes and capacities we then obtain upper bounds for the number of ends as well as estimates for the fundamental tone of the submanifolds in question....

  15. The Quest for a Fundamental Theory of Physics - Rise and Demise of the Field Paradigm

    NARCIS (Netherlands)

    Holman, M.

    2014-01-01

    Quite remarkably, the two physical theories that describe extremely well physical phenomena on the largest and smallest distance scales in our universe, viz. general relativity and quantum theory, respectively, are radically disparate. Both theories are now almost a century old and have passed with

  16. Linear spaces: history and theory

    OpenAIRE

    Albrecht Beutelspracher

    1990-01-01

    Linear spaces are among the most fundamental geometric and combinatorial structures. In this paper I would like to give an overview of the theory of embedding finite linear spaces in finite projective planes.

  17. Implications Of The Crisis Of Objectivity In Accounting Measurement On The Development Of Finance Theory

    OpenAIRE

    Saratiel Wedzerai Musvoto

    2011-01-01

    Studies in accounting measurement indicate the absence of empirical relational structures that should form the basis for accounting measurement. This suggests a lack of objectivity in accounting information. Landmarks in the development of finance theory indicate that accounting measurement information was used as a basis for its development. This indicates that subjective accounting information is incorporated in finance theory. Consequently, this questions the status of finance as a univer...

  18. Next Steps in Attachment Theory.

    Science.gov (United States)

    Bell, David C

    2012-12-01

    Thanks to the phenomenal success of attachment theory, great progress has been made in understanding child and adult relationships. The success of attachment theory opens the way to new research directions that can extend its successes even further. In particular, more work on the fundamental nature of attachment that respects recent biological research is important, as is concentrated effort on the related caregiving system.

  20. Fundamentals of charged particle transport in gases and condensed matter

    CERN Document Server

    Robson, Robert E; Hildebrandt, Malte

    2018-01-01

    This book offers a comprehensive and cohesive overview of transport processes associated with all kinds of charged particles, including electrons, ions, positrons, and muons, in both gases and condensed matter. The emphasis is on fundamental physics, linking experiment, theory and applications. In particular, the authors discuss:
    - The kinetic theory of gases, from the traditional Boltzmann equation to modern generalizations
    - A complementary approach: Maxwell's equations of change and fluid modeling
    - Calculation of ion-atom scattering cross sections
    - Extension to soft condensed matter and amorphous materials
    - Applications: drift tube experiments, including the Franck-Hertz experiment, modeling plasma processing devices, muon catalysed fusion, positron emission tomography, gaseous radiation detectors
    Straightforward, physically-based arguments are used wherever possible to complement mathematical rigor.