WorldWideScience

Sample records for fundamental measure theory

  1. Fundamental measure theory for hard-sphere mixtures: a review

    International Nuclear Information System (INIS)

    Roth, Roland

    2010-01-01

    Hard-sphere systems are one of the fundamental model systems of statistical physics and represent an important reference system for molecular or colloidal systems with soft repulsive or attractive interactions in addition to hard-core repulsion at short distances. Density functional theory for classical systems, as one of the core theoretical approaches of statistical physics of fluids and solids, has to be able to treat such an important system successfully and accurately. Fundamental measure theory is to date the most successful and most accurate density functional theory for hard-sphere mixtures. Since its introduction, fundamental measure theory has been applied to many problems, tested against computer simulations, and further developed in many respects. The literature on fundamental measure theory is already large and is growing fast. This review aims to provide a starting point for readers new to fundamental measure theory and an overview of important developments. (topical review)
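
    As an illustration of the structure the review surveys, the following is a minimal sketch of a Rosenfeld-type fundamental measure functional in standard, generic notation; it is a schematic reminder, not material taken from the record above.

      % Minimal sketch of a Rosenfeld-type FMT functional (generic notation).
      % The excess free energy is a local function Phi of weighted densities n_alpha:
      \begin{align}
        F_{\mathrm{ex}}[\{\rho_i\}] &= k_{B}T \int \mathrm{d}^{3}r \,\Phi\bigl(\{n_{\alpha}(\mathbf{r})\}\bigr), \\
        n_{\alpha}(\mathbf{r}) &= \sum_{i}\int \mathrm{d}^{3}r'\,\rho_{i}(\mathbf{r}')\,
            \omega_{i}^{(\alpha)}(\mathbf{r}-\mathbf{r}'),
      \end{align}
      % with weight functions built from the geometric (fundamental) measures of sphere i
      % of radius R_i, for example
      \begin{equation}
        \omega_{i}^{(3)}(\mathbf{r}) = \Theta(R_{i}-r), \qquad
        \omega_{i}^{(2)}(\mathbf{r}) = \delta(R_{i}-r), \qquad
        \boldsymbol{\omega}_{i}^{(V2)}(\mathbf{r}) = \frac{\mathbf{r}}{r}\,\delta(R_{i}-r).
      \end{equation}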

  2. Theory of fundamental interactions

    International Nuclear Information System (INIS)

    Pestov, A.B.

    1992-01-01

    In the present article the theory of fundamental interactions is derived in a systematic way from first principles. In the developed theory there is no separation between space-time and internal gauge space. The main equations for the basic fields are derived. It is shown that the theory satisfies the correspondence principle and gives rise to new notions in the domain considered. In particular, the conclusion is made about the existence of particles which are characterized not only by mass, spin, and charge but also by a moment of inertia. These are rotating particles, particles which realize the notion of a rigid body at the microscopic level and provide a key to understanding strong interactions. The main concepts and dynamical laws for these particles are formulated. The basic principles of the theory may be examined experimentally in the not too distant future. 29 refs

  3. Fundamentals of queueing theory

    CERN Document Server

    Gross, Donald; Thompson, James M; Harris, Carl M

    2013-01-01

    Praise for the Third Edition: "This is one of the best books available. Its excellent organizational structure allows quick reference to specific models and its clear presentation . . . solidifies the understanding of the concepts being presented." -IIE Transactions on Operations Engineering. Thoroughly revised and expanded to reflect the latest developments in the field, Fundamentals of Queueing Theory, Fourth Edition continues to present the basic statistical principles that are necessary to analyze the probabilistic nature of queues. Rather than pre

  4. Proposed experiment to test fundamentally binary theories

    Science.gov (United States)

    Kleinmann, Matthias; Vértesi, Tamás; Cabello, Adán

    2017-09-01

    Fundamentally binary theories are nonsignaling theories in which measurements of many outcomes are constructed by selecting from binary measurements. They constitute a sensible alternative to quantum theory and have never been directly falsified by any experiment. Here we show that fundamentally binary theories are experimentally testable with current technology. For that, we identify a feasible Bell-type experiment on pairs of entangled qutrits. In addition, we prove that, for any n, quantum n-ary correlations are not fundamentally (n-1)-ary. For that, we introduce a family of inequalities that hold for fundamentally (n-1)-ary theories but are violated by quantum n-ary correlations.

  5. Fundamental number theory with applications

    CERN Document Server

    Mollin, Richard A

    2008-01-01

    An update of the most accessible introductory number theory text available, Fundamental Number Theory with Applications, Second Edition presents a mathematically rigorous yet easy-to-follow treatment of the fundamentals and applications of the subject. The substantial amount of reorganizing makes this edition clearer and more elementary in its coverage. New to the Second Edition: removal of all advanced material to be even more accessible in scope; new fundamental material, including partition theory, generating functions, and combinatorial number theory; expa

  6. Fundamentals of number theory

    CERN Document Server

    LeVeque, William J

    1996-01-01

    This excellent textbook introduces the basics of number theory, incorporating the language of abstract algebra. A knowledge of such algebraic concepts as group, ring, field, and domain is not assumed, however; all terms are defined and examples are given - making the book self-contained in this respect. The author begins with an introductory chapter on number theory and its early history. Subsequent chapters deal with unique factorization and the GCD, quadratic residues, number-theoretic functions and the distribution of primes, sums of squares, quadratic equations and quadratic fields, diopha

  7. Hydromechanics theory and fundamentals

    CERN Document Server

    Sinaiski, Emmanuil G

    2010-01-01

    Written by an experienced author with a strong background in applications of this field, this monograph provides a comprehensive and detailed account of the theory behind hydromechanics. He includes numerous appendices with mathematical tools, backed by extensive illustrations. The result is a must-have for all those needing to apply the methods in their research, be it in industry or academia.

  8. Fundamental aspects of quantum theory

    International Nuclear Information System (INIS)

    Gorini, V.; Frigerio, A.

    1986-01-01

    This book presents information on the following topics: general problems and crucial experiments; the classical behavior of measuring instruments; quantum interference effect for two atoms radiating a single photon; quantization and stochastic processes; quantum Markov processes driven by Bose noise; chaotic behavior in quantum mechanics; quantum ergodicity and chaos; microscopic and macroscopic levels of description; fundamental properties of the ground state of atoms and molecules; n-level systems interacting with Bosons - semiclassical limits; general aspects of gauge theories; adiabatic phase shifts for neutrons and photons; the spins of cyons and dyons; a round-table discussion on the Aharonov-Bohm effect; gravity in quantum mechanics; the gravitational phase transition; anomalies and their cancellation; a new gauge without any ghost for Yang-Mills theory; and energy density and roughening in the 3-D Ising ferromagnet

  9. Fundamental Principle for Quantum Theory

    OpenAIRE

    Khrennikov, Andrei

    2002-01-01

    We propose a principle, the law of statistical balance for basic physical observables, which singles out quantum statistical theory among all other statistical theories of measurement. It seems that this principle might play in quantum theory a role similar to that of Einstein's relativity principle.

  10. Fundamental principles of quantum theory

    International Nuclear Information System (INIS)

    Bugajski, S.

    1980-01-01

    After introducing general versions of three fundamental quantum postulates - the superposition principle, the uncertainty principle and the complementarity principle - the question of whether the three principles are sufficiently strong to restrict the general Mackey description of quantum systems to the standard Hilbert-space quantum theory is discussed. An example which shows that the answer must be negative is constructed. An abstract version of the projection postulate is introduced and it is demonstrated that it could serve as the missing physical link between the general Mackey description and the standard quantum theory. (author)

  11. Radiometric temperature measurements fundamentals

    CERN Document Server

    Zhang, Zhuomin M; Machin, Graham

    2009-01-01

    This book describes the theory of radiation thermometry, both at a primary level and for a variety of applications, such as in the materials processing industries and remote sensing. It is written for those who will apply radiation thermometry in industrial practice or use radiation thermometers for scientific research; for the radiation thermometry specialist in a national measurement institute; for developers of radiation thermometers who are working to innovate products for instrument manufacturers; and for developers of non-contact thermometry methods to address challenging thermometry problems.

  12. Long-range weight functions in fundamental measure theory of the non-uniform hard-sphere fluid

    International Nuclear Information System (INIS)

    Hansen-Goos, Hendrik

    2016-01-01

    We introduce long-range weight functions to the framework of fundamental measure theory (FMT) of the non-uniform, single-component hard-sphere fluid. While the range of the usual weight functions is equal to the hard-sphere radius R, the modified weight functions have range 3R. Based on the augmented FMT, we calculate the radial distribution function g(r) up to second order in the density within Percus' test particle theory. Consistency of the compressibility and virial routes on this level allows us to determine the free parameter γ of the theory. As a side result, we obtain a value for the fourth virial coefficient B4 which deviates by only 0.01% from the exact result. The augmented FMT is tested for the dense fluid by comparing results for g(r) calculated via the test particle route to existing results from molecular dynamics simulations. The agreement at large distances (r > 6R) is significantly improved when the FMT with long-range weight functions is used. In order to improve agreement close to contact (r = 2R) we construct a free energy which is based on the accurate Carnahan-Starling equation of state, rather than the Percus-Yevick compressibility equation underlying standard FMT. (paper)
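
    The abstract's closing point, replacing the Percus-Yevick compressibility equation of state by the Carnahan-Starling one, can be made concrete with a short numerical comparison. The sketch below uses only the two textbook equations of state; it is illustrative and not code from the paper.

      import numpy as np

      def Z_cs(eta):
          """Carnahan-Starling compressibility factor for hard spheres (packing fraction eta)."""
          return (1 + eta + eta**2 - eta**3) / (1 - eta)**3

      def Z_py_c(eta):
          """Percus-Yevick equation of state obtained via the compressibility route."""
          return (1 + eta + eta**2) / (1 - eta)**3

      for eta in np.linspace(0.05, 0.45, 9):
          print(f"eta={eta:.2f}  Z_CS={Z_cs(eta):.4f}  Z_PY(c)={Z_py_c(eta):.4f}")

      # Coefficient of eta^3 in the virial expansion of Z (related to B4):
      # exact ~ 18.3648, Carnahan-Starling gives 18, PY (compressibility) gives 19.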

  13. Fundamental measure theory for the electric double layer: implications for blue-energy harvesting and water desalination

    International Nuclear Information System (INIS)

    Härtel, Andreas; Janssen, Mathijs; Samin, Sela; Roij, René van

    2015-01-01

    Capacitive mixing (CAPMIX) and capacitive deionization (CDI) are promising candidates for harvesting clean, renewable energy and for the energy-efficient production of potable water, respectively. Both CAPMIX and CDI involve water-immersed porous carbon electrodes (supercapacitors) at voltages of the order of hundreds of millivolts, such that counter-ionic packing is important for the electric double layer (EDL) which forms near the surfaces of these porous materials. Thus, we propose a density functional theory (DFT) to model the EDL, where the White-Bear mark II fundamental measure theory functional is combined with a mean-field Coulombic and a mean spherical approximation-type correction to describe the interplay between dense packing and electrostatics, in good agreement with molecular dynamics simulations. We discuss the concentration-dependent potential rise due to changes in the chemical potential in capacitors in the context of an over-ideal theoretical description and its impact on energy harvesting and water desalination. Compared to less elaborate mean-field models, our DFT calculations reveal a higher work output for blue-energy cycles and a higher energy demand for desalination cycles. (paper)

  14. Beta particle measurement fundamentals

    International Nuclear Information System (INIS)

    Alvarez, J.L.

    1986-01-01

    The necessary concepts for understanding beta particle behavior are stopping power, range, and scattering. Dose as a consequence of beta particle interaction with tissue can be derived and explained by these concepts. Any calculation of dose, however, assumes or requires detailed knowledge of the beta spectrum at the tissue depth of calculation. A rudimentary knowledge of the incident spectrum can be of use in estimating dose, interpreting dose-measuring devices, and designing protection. The stopping power and range based on the CSDA (continuous-slowing-down approximation) will give a conservative estimate in cases of protection design, as scattering will reduce the range. Estimates of dose may be low because scattering effects were neglected.
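
    As a hedged illustration of a CSDA-style range estimate, the sketch below uses the empirical Katz-Penfold range-energy relation (an assumption introduced here for illustration, not part of the abstract) to estimate how far a beta particle of a given maximum energy penetrates soft tissue.

      import math

      def beta_range_katz_penfold(E_MeV):
          """Empirical areal range (g/cm^2) of beta particles, valid roughly for 0.01 <= E <= 2.5 MeV.
          Katz-Penfold relation, used here only as an illustrative stand-in for a CSDA range."""
          n = 1.265 - 0.0954 * math.log(E_MeV)
          return 0.412 * E_MeV**n

      # Example: a ~2.28 MeV maximum-energy beta (illustrative value, e.g. Y-90)
      E = 2.28
      r_areal = beta_range_katz_penfold(E)      # g/cm^2
      r_tissue_cm = r_areal / 1.0               # soft tissue density assumed ~1 g/cm^3
      print(f"E = {E} MeV: range approx {r_areal:.2f} g/cm^2, i.e. about {r_tissue_cm:.2f} cm in tissue")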

  15. Fundamental papers in wavelet theory

    CERN Document Server

    Walnut, David F

    2006-01-01

    This book traces the prehistory and initial development of wavelet theory, a discipline that has had a profound impact on mathematics, physics, and engineering. Interchanges between these fields during the last fifteen years have led to a number of advances in applications such as image compression, turbulence, machine vision, radar, and earthquake prediction. This book contains the seminal papers that presented the ideas from which wavelet theory evolved, as well as those major papers that developed the theory into its current form. These papers originated in a variety of journals from differ

  16. Fundamentals in hadronic atom theory

    CERN Document Server

    Deloff, A

    2003-01-01

    Hadronic atoms provide a unique laboratory for studying hadronic interactions essentially at threshold. This text is the first book-form exposition of hadronic atom theory with emphasis on recent developments, both theoretical and experimental. Since the underlying Hamiltonian is a non-self-adjoint operator, the theory goes beyond traditional quantum mechanics and this book covers topics that are often glossed over in standard texts on nuclear physics. The material contained here is intended for the advanced student and researcher in nuclear, atomic or elementary-particle physics. A good know

  17. Modern measurements fundamentals and applications

    CERN Document Server

    Petri, D; Carbone, P; Catelani, M

    2015-01-01

    This book explores the modern role of measurement science both in the technically most advanced applications and in everyday life, and will help readers gain the necessary skills to specialize their knowledge for a specific field in measurement. Modern Measurements is divided into two parts. Part I (Fundamentals) presents a model of the modern measurement activity and the fundamental bricks already recalled. It starts with a general description that introduces these bricks and the uncertainty concept. The next chapters provide an overview of these bricks and finish (Chapter 7) with a more general and complex model that encompasses both traditional (hard) measurements and (soft) measurements, aimed at quantifying non-physical concepts, such as quality, satisfaction, comfort, etc. Part II (Applications) is aimed at showing how the concepts presented in Part I can be usefully applied to design and implement measurements in some very important and broad fields. The editors cover System Identification (Chapter 8...

  18. Twenty five years of fundamental theory

    International Nuclear Information System (INIS)

    Bell, J.S.

    1980-01-01

    In reviewing the last twenty five years in fundamental physics theory it is stated that there has been no revolution in this field. In the absence of gravitation, Lorentz invariance remains a requirement on fundamental laws. Einstein's theory of gravitation inspires increasing conviction on the astronomical scale. Quantum theory remains the framework for all serious effort in microphysics, and quantum electrodynamics remains the model of a fully articulated microphysical theory, completely successful in its domain. However, a number of ideas have appeared, of great theoretical interest and some phenomenological success, which may well contribute to the next decisive step. Recent work on the following topics is mentioned: gravitational radiation, singularities, black body radiation from black holes, gauge and hidden symmetry in quantum electrodynamics, the renormalization of electromagnetic and weak interaction theory, non-Abelian gauge theories, magnetic monopoles as the most striking example of solitons, and supersymmetry. (UK)

  19. Fundamentals of information theory and coding design

    CERN Document Server

    Togneri, Roberto

    2003-01-01

    In a clear, concise, and modular format, this book introduces the fundamental concepts and mathematics of information and coding theory. The authors emphasize how a code is designed and discuss the main properties and characteristics of different coding algorithms along with strategies for selecting the appropriate codes to meet specific requirements. They provide comprehensive coverage of source and channel coding, address arithmetic, BCH, and Reed-Solomon codes and explore some more advanced topics such as PPM compression and turbo codes. Worked examples and sets of basic and advanced exercises in each chapter reinforce the text's clear explanations of all concepts and methodologies.
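
    To make the idea of "how a code is designed" concrete, here is a minimal sketch of the classic systematic (7,4) Hamming code, encoding four data bits and detecting a single flipped bit via the syndrome; it is an illustrative example, not material from the book.

      import numpy as np

      # Systematic (7,4) Hamming code: generator G = [I4 | P], parity-check H = [P^T | I3].
      G = np.array([[1,0,0,0, 1,1,0],
                    [0,1,0,0, 1,0,1],
                    [0,0,1,0, 0,1,1],
                    [0,0,0,1, 1,1,1]])
      H = np.array([[1,1,0,1, 1,0,0],
                    [1,0,1,1, 0,1,0],
                    [0,1,1,1, 0,0,1]])

      def encode(data_bits):
          """Encode 4 data bits into a 7-bit codeword over GF(2)."""
          return (np.array(data_bits) @ G) % 2

      def syndrome(received):
          """All-zero syndrome means no detectable error; otherwise it locates a single bit flip."""
          return (H @ np.array(received)) % 2

      codeword = encode([1, 0, 1, 1])
      corrupted = codeword.copy()
      corrupted[2] ^= 1                                  # flip one bit
      print("clean syndrome:    ", syndrome(codeword))   # [0 0 0]
      print("corrupted syndrome:", syndrome(corrupted))  # nonzero -> error detected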

  20. The Fundamentals Of Kants Moral Theory

    Directory of Open Access Journals (Sweden)

    Adriana Saraiva Lamounier Rodrigues

    2015-12-01

    Full Text Available The article intends to study the moral thought in the philosophy of Immanuel Kant, considered the first of the philosophers who compose the movement known as German Idealism, especially in the book "Critique of Practical Reason". To achieve this objective the article begins with the traces of moral studies at Kant's time and its fundamental questions, as well as traces of his formation that influenced his writings. Soon after, it analyzes the Kantian thought itself, through the work "The Idea of Justice in Kant" by Joaquim Carlos Salgado, the theoretical framework of this research. It analyzes the postulate of freedom and its relationship with the sollen and the moral law, the species of imperatives, the categorical imperative and equality, and the connections that the moral theory makes for the existence of positive law, which the author considers the greatest pillar of the Idea of justice from the Prussian philosopher's point of view. The methodology used in the research is theoretical, based on the analysis of the theoretical framework and its relationship to other publications concerning the same subject.

  1. SU(2) Gauge Theory with Two Fundamental Flavours

    DEFF Research Database (Denmark)

    Arthur, Rudy; Drach, Vincent; Hansen, Martin

    2016-01-01

    We investigate the continuum spectrum of the SU(2) gauge theory with $N_f=2$ flavours of fermions in the fundamental representation. This model provides a minimal template which is ideal for a wide class of Standard Model extensions featuring novel strong dynamics that range from composite (Goldstone) Higgs theories to several intriguing types of dark matter candidates, such as the SIMPs. We improve our previous lattice analysis [1] by adding more data at light quark masses, at two additional lattice spacings, by determining the lattice cutoff via a Wilson flow measure of the $w_0$ parameter...

  2. The implications of fundamental cause theory for priority setting.

    Science.gov (United States)

    Goldberg, Daniel S

    2014-10-01

    Application of fundamental cause theory to Powers and Faden's model of social justice highlights the ethical superiority of upstream public health interventions. In this article, I assess the ramifications of fundamental cause theory specifically in the context of public health priority setting. Ethically optimal public health policy simultaneously maximizes overall population health and compresses health inequalities. Fundamental cause theory is an important framework for helping to identify which categories of public health interventions are most likely to advance these twin goals.

  3. A Local Approximation of Fundamental Measure Theory Incorporated into Three Dimensional Poisson-Nernst-Planck Equations to Account for Hard Sphere Repulsion Among Ions

    Science.gov (United States)

    Qiao, Yu; Liu, Xuejiao; Chen, Minxin; Lu, Benzhuo

    2016-04-01

    The hard sphere repulsion among ions can be considered in the Poisson-Nernst-Planck (PNP) equations by combining them with fundamental measure theory (FMT). To reduce the nonlocal computational complexity in 3D simulation of biological systems, a local approximation of FMT is derived, which forms a local hard sphere PNP (LHSPNP) model. In the derivation, the excess chemical potential from hard sphere repulsion is obtained with the FMT and has six integration components. For the integrands and weighted densities in each component, Taylor expansions are performed and the lowest order approximations are taken, which result in the final local hard sphere (LHS) excess chemical potential with four components. By plugging the LHS excess chemical potential into the ionic flux expression in the Nernst-Planck equation, the three-dimensional LHSPNP is obtained. Interestingly, it is found that the essential part of the free energy term of the previous size-modified model (Borukhov et al. in Phys Rev Lett 79:435-438, 1997; Kilic et al. in Phys Rev E 75:021502, 2007; Lu and Zhou in Biophys J 100:2475-2485, 2011; Liu and Eisenberg in J Chem Phys 141:22D532, 2014) has a very similar form to one term of the LHS model, but the LHSPNP has additional terms accounting for size effects. The equation of state for a one-component homogeneous fluid is studied for the local hard sphere approximation of FMT and is proved to be exact for the first two virial coefficients, while the previous size-modified model only presents the first virial coefficient accurately. To investigate the effects of the LHS model and the competition among different counterion species, numerical experiments are performed for the traditional PNP model, the LHSPNP model, the previous size-modified PNP (SMPNP) model and Monte Carlo simulation. It is observed that in steady state the LHSPNP results are quite different from the PNP results, but are close to the SMPNP results under a wide range of boundary conditions. Besides, in both
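
    The following schematic equations show where a hard-sphere excess chemical potential enters the PNP framework described above; the notation is generic (with beta = 1/k_B T) and is a sketch, not copied from the paper.

      % Nernst-Planck flux with an excess (hard-sphere) chemical potential term, plus Poisson's equation.
      \begin{align}
        \partial_{t} c_{i} &= -\nabla\cdot\mathbf{J}_{i}, \\
        \mathbf{J}_{i} &= -D_{i}\Bigl[\nabla c_{i}
            + \beta c_{i}\,\nabla\bigl(z_{i}e\,\phi + \mu_{i}^{\mathrm{ex,HS}}\bigr)\Bigr], \\
        -\nabla\cdot\bigl(\epsilon\nabla\phi\bigr) &= \sum_{i} z_{i}e\,c_{i} + \rho_{f}.
      \end{align}
      % Standard PNP corresponds to mu_i^{ex,HS} = 0; the LHSPNP model of the abstract replaces the
      % nonlocal FMT expression for mu_i^{ex,HS} by its local, Taylor-expanded approximation.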

  4. Measurement theory for engineers

    CERN Document Server

    Gertsbakh, Ilya

    2003-01-01

    The emphasis of this textbook is on industrial applications of Statistical Measurement Theory. It deals with the principal issues of measurement theory, is concise and intelligibly written, and to a wide extent self-contained. Difficult theoretical issues are separated from the mainstream presentation. Each topic starts with an informal introduction followed by an example, the rigorous problem formulation, solution method, and a detailed numerical solution. Each chapter concludes with a set of exercises of increasing difficulty, mostly with solutions. The book is meant as a text for graduate students and a reference for researchers and industrial experts specializing in measurement and measurement data analysis for quality control, quality engineering and industrial process improvement using statistical methods. Knowledge of calculus and fundamental probability and statistics is required for the understanding of its contents.

  5. Fundamental structures of M(brane) theory

    International Nuclear Information System (INIS)

    Hoppe, Jens

    2011-01-01

    A dynamical symmetry, as well as special diffeomorphism algebras generalizing the Witt-Virasoro algebra, related to Poincare invariance and crucial with regard to quantization, questions of integrability, and M(atrix) theory, are found to exist in the theory of relativistic extended objects of any dimension.

  6. Fundamental problems of gauge field theory

    International Nuclear Information System (INIS)

    Velo, G.; Wightman, A.S.

    1986-01-01

    As a result of the experimental and theoretical developments of the last two decades, gauge field theory, in one form or another, now provides the standard language for the description of Nature; QCD and the standard model of the electroweak interactions illustrate this point. It is a basic task of mathematical physics to provide a solid foundation for these developments by putting the theory in a physically transparent and mathematically rigorous form. The lecture notes collected in this volume concentrate on the many unsolved problems which arise here, and on the general ideas and methods which have been proposed for their solution. In particular, the use of rigorous renormalization group methods to obtain control over the continuum limit of lattice gauge field theories, the exploration of the extraordinary enigmatic connections between Kac-Moody-Virasoro algebras and string theory, and the systematic use of the theory of local algebras and indefinite metric spaces to classify the charged C* states in gauge field theories are mentioned

  7. Fundamentals of the physical theory of diffraction

    CERN Document Server

    Ufimtsev, Pyotr Ya

    2014-01-01

    A complete presentation of the modern physical theory of diffraction and its applications, by the world's leading authority on the topic. Extensive revisions and additions to the first edition yield a second edition that is 492 pages in length, with 122 figures. New sections examine the nature of polarization coupling, and extend the theory of shadow radiation and reflection to opaque objects. This book features end-of-chapter problems and a solutions manual for university professors and graduate students. MATLAB codes presented in appendices allow for quick numeric calculations of diffracted waves.

  8. Fundamentals of the relativistic theory of gravitation

    International Nuclear Information System (INIS)

    Logunov, A.A.; Mestvirishvili, M.A.

    1986-01-01

    An extended exposition of the relativistic theory of gravitation (RTG) proposed by Logunov, Vlasov, and Mestvirishvili is presented. The RTG was constructed uniquely on the basis of the relativity principle and the geometrization principle by regarding the gravitational field as a physical field in the spirit of Faraday and Maxwell possessing energy, momentum, and spins 2 and 0. In the theory, conservation laws for the energy, momentum, and angular momentum for the matter and gravitational field taken together are strictly satisfied. The theory explains all the existing gravitational experiments. When the evolution of the universe is analyzed, the theory leads to the conclusion that the universe is infinite and flat, and it is predicted to contain a large amount of hidden mass. This missing mass exceeds by almost 40 times the amount of matter currently observed in the universe. The RTG predicts that gravitational collapse, which for a comoving observer occurs after a finite proper time, does not lead to infinite compression of matter but is halted at a certain finite density of the collapsing body. Therefore, according to the RTG there cannot be any objects in nature in which the gravitational contraction of matter to infinite density occurs, i.e., there are no black holes

  9. Field theories without fundamental (gauge) symmetries

    International Nuclear Information System (INIS)

    Nielsen, H.B.

    1983-12-01

    By using the lack of dependence of the form of the kinetic energy for a non-relativistic free particle as an example, it is argued that a physical law with a less extended range of application (the non-relativistic energy-momentum relation) often follows from a more extended one (in this case the relativistic relation) without too much dependence on the details of the latter. This is extended to the idea of random dynamics: no fundamental laws need to be known. Almost any random fundamental model will give the correct main features for the range of physical conditions accessible today (energies less than 1000 GeV) even if it is wrong in detail. This suggests the programme of attempting to 'derive' the various symmetries and other features of physics known today from random models that at least lack the feature to be derived. The achievements in the programme of random dynamics up till now are briefly reviewed. In particular, Lorentz invariance may be understood as a low energy phenomenon. (Auth.)

  10. Fundamentals of machine theory and mechanisms

    CERN Document Server

    Simón Mata, Antonio; Cabrera Carrillo, Juan Antonio; Ezquerro Juanco, Francisco; Guerra Fernández, Antonio Jesús; Nadal Martínez, Fernando; Ortiz Fernández, Antonio

    2016-01-01

    This book covers the basic contents for an introductory course in Mechanism and Machine Theory. The topics dealt with are: kinematic and dynamic analysis of machines, introduction to vibratory behaviour, rotor and piston balance, kinematics of gears, ordinary and planetary gear trains and synthesis of planar mechanisms. A new approach to dimensional synthesis of mechanisms based on turning functions has been added for closed and open path generation using an optimization method based on evolutionary algorithms. The text, developed by a group of experts in kinematics and dynamics of mechanisms at the University of Málaga, Spain, is clear and is supported by more than 350 images. More than 60 outlined and explained problems have been included to clarify the theoretical concepts.

  11. Group theory for chemists fundamental theory and applications

    CERN Document Server

    Molloy, K C

    2010-01-01

    The basics of group theory and its applications to themes such as the analysis of vibrational spectra and molecular orbital theory are essential knowledge for the undergraduate student of inorganic chemistry. The second edition of Group Theory for Chemists uses diagrams and problem-solving to help students test and improve their understanding, including a new section on the application of group theory to electronic spectroscopy.Part one covers the essentials of symmetry and group theory, including symmetry, point groups and representations. Part two deals with the application of group theory t

  12. Random measures, theory and applications

    CERN Document Server

    Kallenberg, Olav

    2017-01-01

    Offering the first comprehensive treatment of the theory of random measures, this book has a very broad scope, ranging from basic properties of Poisson and related processes to the modern theories of convergence, stationarity, Palm measures, conditioning, and compensation. The three large final chapters focus on applications within the areas of stochastic geometry, excursion theory, and branching processes. Although this theory plays a fundamental role in most areas of modern probability, much of it, including the most basic material, has previously been available only in scores of journal articles. The book is primarily directed towards researchers and advanced graduate students in stochastic processes and related areas.

  13. On time variation of fundamental constants in superstring theories

    International Nuclear Information System (INIS)

    Maeda, K.I.

    1988-01-01

    Assuming the action from string theory and taking into account the dynamical freedom of a dilaton and its coupling to the matter fluid, the authors show that fundamental 'constants' in string theories are independent of the 'radius' of the internal space. Since the scalar related to the 'constants' is coupled to the 4-dimensional gravity and matter fluid in the same way as in the Jordan-Brans-Dicke theory with ω = -1, it must be massive and can get a mass easily through some symmetry breaking mechanism (e.g. the SUSY breaking due to a gluino condensation). Consequently, the time variation of fundamental constants is too small to be observed

  14. Fundamental link between system theory and statistical mechanics

    International Nuclear Information System (INIS)

    Atmanspacher, H.; Scheingraber, H.

    1987-01-01

    A fundamental link between system theory and statistical mechanics has been found to be established by the Kolmogorov entropy K. By this quantity the temporal evolution of dynamical systems can be classified into regular, chaotic, and stochastic processes. Since K represents a measure for the internal information creation rate of dynamical systems, it provides an approach to irreversibility. The formal relationship to statistical mechanics is derived by means of an operator formalism originally introduced by Prigogine. For a Liouville operator L and an information operator M̃ acting on a distribution in phase space, it is shown that i[L, M̃] = K I (I being the identity operator). As a first consequence of this equivalence, a relation is obtained between the chaotic correlation time of a system and Prigogine's concept of a finite duration of presence. Finally, the existence of chaos in quantum systems is discussed with respect to the existence of a quantum mechanical time operator

  15. DOE fundamentals handbook: Nuclear physics and reactor theory

    International Nuclear Information System (INIS)

    1993-01-01

    The Nuclear Physics and Reactor Theory Handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of nuclear physics and reactor theory. The handbook includes information on atomic and nuclear physics; neutron characteristics; reactor theory and nuclear parameters; and the theory of reactor operation. This information will provide personnel with a foundation for understanding the scientific principles that are associated with various DOE nuclear facility operations and maintenance

  16. On the fundamental principles of the relativistic theory of gravitation

    International Nuclear Information System (INIS)

    Logunov, A.A.; Mestvirishvili, M.A.

    1990-01-01

    This paper consistently expounds, within the framework of Special Relativity Theory, the fundamental postulates of the Relativistic Theory of Gravitation (RTG), which make it possible to obtain the unique complete system of equations for the gravitational field. Major attention is paid to the analysis of the gauge group and of the causality principle. Some results related to the evolution of the Friedmann Universe, to gravitational collapse, etc., which are consequences of the RTG equations, are also presented. 7 refs

  17. Fundamental Elements and Interactions of Nature: A Classical Unification Theory

    Directory of Open Access Journals (Sweden)

    Tianxi Zhang

    2010-04-01

    Full Text Available A classical unification theory that completely unifies all the fundamental interactions of nature is developed. First, nature is suggested to be composed of the following four fundamental elements: mass, radiation, electric charge, and color charge. All known types of matter or particles are a combination of one or more of the four fundamental elements. Photons are radiation; neutrons have only mass; protons have both mass and electric charge; and quarks contain mass, electric charge, and color charge. The fundamental interactions of nature are interactions among these fundamental elements. Mass and radiation are two forms of real energy. Electric and color charges are considered as two forms of imaginary energy. All the fundamental interactions of nature are therefore unified as a single interaction between complex energies. The interaction between real energies is the gravitational force, which has three types: mass-mass, mass-radiation, and radiation-radiation interactions. Calculating the work done by the mass-radiation interaction on a photon derives the Einsteinian gravitational redshift. Calculating the work done on a photon by the radiation-radiation interaction derives a radiation redshift, which is much smaller than the gravitational redshift. The interaction between imaginary energies is the electromagnetic (between electric charges), weak (between electric and color charges), and strong (between color charges) interactions. In addition, we have four imaginary forces between real and imaginary energies, which are the mass-electric charge, radiation-electric charge, mass-color charge, and radiation-color charge interactions. Among the four fundamental elements, there are ten (six real and four imaginary) fundamental interactions. This classical unification theory deepens our understanding of nature's fundamental elements and interactions, develops a new concept of imaginary energy for electric and color charges, and provides a

  18. Fundamental Elements and Interactions of Nature: A Classical Unification Theory

    Directory of Open Access Journals (Sweden)

    Zhang T. X.

    2010-04-01

    Full Text Available A classical unification theory that completely unifies all the fundamental interactions of nature is developed. First, nature is suggested to be composed of the following four fundamental elements: mass, radiation, electric charge, and color charge. All known types of matter or particles are a combination of one or more of the four fundamental elements. Photons are radiation; neutrons have only mass; protons have both mass and electric charge; and quarks contain mass, electric charge, and color charge. The fundamental interactions of nature are interactions among these fundamental elements. Mass and radiation are two forms of real energy. Electric and color charges are considered as two forms of imaginary energy. All the fundamental interactions of nature are therefore unified as a single interaction between complex energies. The interaction between real energies is the gravitational force, which has three types: mass-mass, mass-radiation, and radiation-radiation interactions. Calculating the work done by the mass-radiation interaction on a photon derives the Einsteinian gravitational redshift. Calculating the work done on a photon by the radiation-radiation interaction derives a radiation redshift, which is much smaller than the gravitational redshift. The interaction between imaginary energies is the electromagnetic (between electric charges), weak (between electric and color charges), and strong (between color charges) interactions. In addition, we have four imaginary forces between real and imaginary energies, which are the mass-electric charge, radiation-electric charge, mass-color charge, and radiation-color charge interactions. Among the four fundamental elements, there are ten (six real and four imaginary) fundamental interactions. This classical unification theory deepens our understanding of nature's fundamental elements and interactions, develops a new concept of imaginary energy for electric and color charges, and provides a

  19. Fundamental theories of waves and particles formulated without classical mass

    Science.gov (United States)

    Fry, J. L.; Musielak, Z. E.

    2010-12-01

    Quantum and classical mechanics are two conceptually and mathematically different theories of physics, and yet they do use the same concept of classical mass that was originally introduced by Newton in his formulation of the laws of dynamics. In this paper, the physical consequences of using the classical mass in both theories are explored, and a novel approach that allows formulating fundamental (Galilean invariant) theories of waves and particles without formally introducing the classical mass is presented. In this new formulation, the theories depend only on one common parameter called 'wave mass', which is deduced from experiments for selected elementary particles and for the classical mass of one kilogram. It is shown that quantum theory with the wave mass is independent of the Planck constant and that higher accuracy of calculations can be attained by such a theory. Natural units in connection with the presented approach are also discussed and justification beyond dimensional analysis is given for the particular choice of such units.

  20. Fundamental U-Theory of Time. Part 1

    Directory of Open Access Journals (Sweden)

    Yuvraj J. Gopaul

    2016-02-01

    Full Text Available The Fundamental U-Theory of Time (Part 1) is an original theory that aims to unravel the mystery of what exactly 'time' is. To date very few explanations, from the branches of physics or cosmology, have succeeded in providing an accurate and comprehensive depiction of time. Most explanations have only managed to provide partial understanding or, at best, glimpses of its true nature. The U-Theory uses 'thought experiments' to uncover the determining characteristics of time. In Part 1 of this theory, the focus is not so much on the mathematics as on the accuracy of the depiction of time. Moreover, it challenges current views in theoretical physics, particularly on the idea of 'time travel'. Notably, it is a theory seeking to present a fresh approach for reviewing Einstein's Theory of Relativity, while unlocking new pathways for upcoming research in the fields of physics and cosmology.

  1. Measure and integration theory

    CERN Document Server

    Burckel, Robert B

    2001-01-01

    This book gives a straightforward introduction to the field as it is nowadays required in many branches of analysis and especially in probability theory. The first three chapters (Measure Theory, Integration Theory, Product Measures) basically follow the clear and approved exposition given in the author's earlier book on "Probability Theory and Measure Theory". Special emphasis is laid on a complete discussion of the transformation of measures and integration with respect to the product measure, convergence theorems, parameter depending integrals, as well as the Radon-Nikodym theorem. The fi

  2. Perspective: Fundamental aspects of time-dependent density functional theory

    Energy Technology Data Exchange (ETDEWEB)

    Maitra, Neepa T. [Department of Physics and Astronomy, Hunter College and the Physics Program at the Graduate Center of the City University of New York, 695 Park Avenue, New York, New York 10065 (United States)

    2016-06-14

    In the thirty-two years since the birth of the foundational theorems, time-dependent density functional theory has had a tremendous impact on calculations of electronic spectra and dynamics in chemistry, biology, solid-state physics, and materials science. Alongside the wide-ranging applications, there has been much progress in understanding fundamental aspects of the functionals and the theory itself. This Perspective looks back to some of these developments, reports on some recent progress and current challenges for functionals, and speculates on future directions to improve the accuracy of approximations used in this relatively young theory.

  3. Measurement theory in quantum mechanics

    International Nuclear Information System (INIS)

    Klein, G.

    1980-01-01

    It is assumed that consciousness, memory and liberty (within the limits of quantum mechanical indeterminism) are fundamental properties of elementary particles. Then, using this assumption, it is shown how measurements and observers may be introduced in a natural way into quantum mechanics. There are no longer fundamental differences between macroscopic and microscopic objects, between classical and quantum objects, between observer and object. Thus, discrepancies and paradoxes have disappeared from conventional quantum mechanics. One consequence of the cumulative memory of the particles is that the sum of negentropy plus information is a constant. Using this theory it is also possible to explain 'paranormal' phenomena and how they differ from 'normal' ones [fr

  4. Geometric theory of fundamental interactions. Foundations of unified physics

    International Nuclear Information System (INIS)

    Pestov, A.B.

    2012-01-01

    We put forward the idea that the regularities of unified physics stand in a simple relation: everything is in the concept of space, and the concept of space is in everything. With this hypothesis as a foundation, a conceptual structure of a unified geometrical theory of fundamental interactions is created and a deductive derivation of its main equations is produced. The formulated theory gives solutions to the actual problems and provides an opportunity to understand the origin and nature of physical fields, local internal symmetry, time, energy, spin, charge, confinement, dark energy and dark matter, thus confirming the existence of new physics in its unity

  5. Explaining crude oil prices using fundamental measures

    International Nuclear Information System (INIS)

    Coleman, Les

    2012-01-01

    Oil is the world's most important commodity, and improving the understanding of drivers of its price is a longstanding research objective. This article analyses real oil prices during 1984–2007 using a monthly dataset of fundamental and market parameters that cover financial markets, global economic growth, demand and supply of oil, and geopolitical measures. The innovation is to incorporate proxies for speculative and terrorist activity and dummies for major industry events, and quantify price impacts of each. New findings are positive links between oil prices and speculative activity, bond yields, an interaction term incorporating OPEC market share and OECD import dependence, and the number of US troops and frequency of terrorist attacks in the Middle East. Shocks also prove significant with a $6–18 per barrel impact on price for several months. Highlights: the article introduces new variables to the study of oil prices; the new variables are terrorist incidents and military activity, and oil futures market size; shocks prove important, affecting prices by $6–18 per barrel for several months; OPEC market influence rises with OECD import dependence.
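
    A minimal sketch of the kind of monthly regression described above is given below. The dataset file and column names (spec_activity, bond_yield, opec_share_x_import_dep, troops_me, attacks_me, shock_dummy) are hypothetical placeholders, not the variables used in the article.

      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical monthly dataset covering 1984-2007; column names are illustrative only.
      df = pd.read_csv("oil_monthly.csv")

      model = smf.ols(
          "real_oil_price ~ spec_activity + bond_yield + opec_share_x_import_dep"
          " + troops_me + attacks_me + shock_dummy",
          data=df,
      ).fit()
      # Each coefficient estimates the per-unit price impact of one driver, holding the others fixed.
      print(model.summary())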

  6. M(atrix) theory: matrix quantum mechanics as a fundamental theory

    International Nuclear Information System (INIS)

    Taylor, Washington

    2001-01-01

    This article reviews the matrix model of M theory. M theory is an 11-dimensional quantum theory of gravity that is believed to underlie all superstring theories. M theory is currently the most plausible candidate for a theory of fundamental physics which reconciles gravity and quantum field theory in a realistic fashion. Evidence for M theory is still only circumstantial -- no complete background-independent formulation of the theory exists as yet. Matrix theory was first developed as a regularized theory of a supersymmetric quantum membrane. More recently, it has appeared in a different guise as the discrete light-cone quantization of M theory in flat space. These two approaches to matrix theory are described in detail and compared. It is shown that matrix theory is a well-defined quantum theory that reduces to a supersymmetric theory of gravity at low energies. Although its fundamental degrees of freedom are essentially pointlike, higher-dimensional fluctuating objects (branes) arise through the non-Abelian structure of the matrix degrees of freedom. The problem of formulating matrix theory in a general space-time background is discussed, and the connections between matrix theory and other related models are reviewed

  7. In search for the unified theory of fundamental interactions

    International Nuclear Information System (INIS)

    Ansel'm, A.A.

    1980-01-01

    The problem of developing a unified theory of fundamental interactions is considered in a popular form. The fundamental interactions include interactions between truly elementary particles (quarks and leptons), which are mediated by strong, weak, electromagnetic and gravitational forces. The unified theory is based on the requirement of 'local symmetry'. The problem of the invariance of strong interaction theory under local isotopic transformations was posed for the first time by Yang and Mills, who introduced fields called compensating fields (they compensate additional terms appearing in the theory's equations during local transformations). Quanta of these fields (gauge bosons) are massless particles with spin equal to one. The bosons should have a mass different from zero in order to be the carriers of real strong and weak interactions. At present there exist two mechanisms by which this contradiction can be overcome. One of these mechanisms is spontaneous symmetry breaking; the other is 'non-escape', or 'captivity' (confinement), of the particles. The main ideas of building a realistic model of strong interactions are briefly presented

  8. Fundamentals of time-dependent density functional theory

    International Nuclear Information System (INIS)

    Marques, Miguel A.L.; Rubio, Angel

    2012-01-01

    There have been many significant advances in time-dependent density functional theory over recent years, both in enlightening the fundamental theoretical basis of the theory, as well as in computational algorithms and applications. This book, as successor to the highly successful volume Time-Dependent Density Functional Theory (Lect. Notes Phys. 706, 2006) brings together for the first time all recent developments in a systematic and coherent way. First, a thorough pedagogical presentation of the fundamental theory is given, clarifying aspects of the original proofs and theorems, as well as presenting fresh developments that extend the theory into new realms such as alternative proofs of the original Runge-Gross theorem, open quantum systems, and dispersion forces to name but a few. Next, all of the basic concepts are introduced sequentially and building in complexity, eventually reaching the level of open problems of interest. Contemporary applications of the theory are discussed, from real-time coupled-electron-ion dynamics, to excited-state dynamics and molecular transport. Last but not least, the authors introduce and review recent advances in computational implementation, including massively parallel architectures and graphical processing units. Special care has been taken in editing this volume as a multi-author textbook, following a coherent line of thought, and making all the relevant connections between chapters and concepts consistent throughout. As such it will prove to be the text of reference in this field, both for beginners as well as expert researchers and lecturers teaching advanced quantum mechanical methods to model complex physical systems, from molecules to nanostructures, from biocomplexes to surfaces, solids and liquids. (orig.)

  9. To the field theory with a fundamental mass

    International Nuclear Information System (INIS)

    Ibadov, R.M.; Kadyshevskij, V.G.

    1986-01-01

    This paper is a continuation of the investigations along the lines of constructing a consistent field theory with fundamental mass M - a hypothetical universal scale in the ultrahigh energy region. Earlier, in the approach developed, the key role was played by the de Sitter momentum space of radius M. In this paper a quantum version of this idea is worked out: p-space is assumed to be a de Sitter one as before; however, the four-momentum p_μ is treated as a quantum mechanical operator in δ/δx_μ only

  10. Fundamentals of functions and measure theory

    CERN Document Server

    Mikhalev, Alexander V; Zakharov, Valeriy K

    2018-01-01

    The series is devoted to the publication of monographs and high-level textbooks in mathematics, mathematical methods and their applications. Apart from covering important areas of current interest, a major aim is to make topics of an interdisciplinary nature accessible to the non-specialist. The works in this series are addressed to advanced students and researchers in mathematics and theoretical physics. In addition, it can serve as a guide for lectures and seminars on a graduate level. The series de Gruyter Studies in Mathematics was founded ca. 30 years ago by the late Professor Heinz Bauer and Professor Peter Gabriel with the aim to establish a series of monographs and textbooks of high standard, written by scholars with an international reputation presenting current fields of research in pure and applied mathematics. While the editorial board of the Studies has changed with the years, the aspirations of the Studies are unchanged. In times of rapid growth of mathematical knowledge carefully written monogr...

  11. Fundamental constants and tests of theory in Rydberg states of hydrogenlike ions.

    Science.gov (United States)

    Jentschura, Ulrich D; Mohr, Peter J; Tan, Joseph N; Wundt, Benedikt J

    2008-04-25

    A comparison of precision frequency measurements to quantum electrodynamics (QED) predictions for Rydberg states of hydrogenlike ions can yield information on values of fundamental constants and test theory. With the results of a calculation of a key QED contribution reported here, the uncertainty in the theory of the energy levels is reduced to a level where such a comparison can yield an improved value of the Rydberg constant.

  12. Fundamental Constants and Tests of Theory in Rydberg States of Hydrogenlike Ions

    International Nuclear Information System (INIS)

    Jentschura, Ulrich D.; Mohr, Peter J.; Tan, Joseph N.; Wundt, Benedikt J.

    2008-01-01

    A comparison of precision frequency measurements to quantum electrodynamics (QED) predictions for Rydberg states of hydrogenlike ions can yield information on values of fundamental constants and test theory. With the results of a calculation of a key QED contribution reported here, the uncertainty in the theory of the energy levels is reduced to a level where such a comparison can yield an improved value of the Rydberg constant
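
    The connection between measured transition frequencies and the Rydberg constant can be seen from the leading-order (Bohr) relation sketched below; the reduced-mass, relativistic and QED corrections that the paper is actually concerned with are omitted here.

      % Leading-order hydrogenlike energy levels and transition frequencies (corrections omitted).
      \begin{equation}
        E_{n} \simeq -\frac{h c R_{\infty} Z^{2}}{n^{2}}, \qquad
        \nu_{n_{1}\to n_{2}} \simeq c R_{\infty} Z^{2}
          \left(\frac{1}{n_{1}^{2}} - \frac{1}{n_{2}^{2}}\right),
      \end{equation}
      % so once the small corrections are under theoretical control, a measured frequency
      % translates into an improved value of the Rydberg constant R_infinity.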

  13. Essentials of measure theory

    CERN Document Server

    Kubrusly, Carlos S

    2015-01-01

    Classical in its approach, this textbook is thoughtfully designed and composed in two parts. Part I is meant for a one-semester beginning graduate course in measure theory, proposing an “abstract” approach to measure and integration, where the classical concrete cases of Lebesgue measure and Lebesgue integral are presented as an important particular case of general theory. Part II of the text is more advanced and is addressed to a more experienced reader. The material is designed to cover another one-semester graduate course subsequent to a first course, dealing with measure and integration in topological spaces. The final section of each chapter in Part I presents problems that are integral to each chapter, the majority of which consist of auxiliary results, extensions of the theory, examples, and counterexamples. Problems which are highly theoretical have accompanying hints. The last section of each chapter of Part II consists of Additional Propositions containing auxiliary and complementary results. Th...

  14. Geometric measure theory

    CERN Document Server

    Waerden, B

    1996-01-01

    From the reviews: "... Federer's timely and beautiful book indeed fills the need for a comprehensive treatise on geometric measure theory, and his detailed exposition leads from the foundations of the theory to the most recent discoveries. ... The author writes with a distinctive style which is both natural and powerfully economical in treating a complicated subject. This book is a major treatise in mathematics and is essential in the working library of the modern analyst." Bulletin of the London Mathematical Society.

  15. Laser measurement technology fundamentals and applications

    CERN Document Server

    Donges, Axel

    2015-01-01

    Laser measurement technology has evolved in recent years in a versatile and revolutionary way. Today, its methods are indispensable for research and development activities as well as for production technology. Every physicist and engineer should therefore gain a working knowledge of laser measurement technology. This book closes the gap left by existing textbooks. It introduces laser measurement technology in all its aspects in a comprehensible presentation. Numerous figures, graphs and tables allow for fast access to the material. In the first part of the book the important physical and optical basics necessary to understand laser measurement technology are described. In the second part technically significant measuring methods are explained and application examples are presented. The target groups of this textbook are students of the natural and engineering sciences as well as working physicists and engineers who are interested in familiarizing themselves with laser measurement technology and its fascinating p

  16. Fundamental data analyses for measurement control

    International Nuclear Information System (INIS)

    Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.

    1989-01-01

    An important aspect of a complete measurement control program for special nuclear materials is the analysis of data from periodic control measurements of known standards. This chapter covers the following topics: basic algorithms, including an introduction and terminology, the standard case (known mean and standard deviation), Shewhart control charts, and a sequential test for bias; modifications for nonstandard cases, including modification for a changing (decaying) standard value, modifications for deteriorating measurement precision, and modifications when repeated measurements are made; and maintenance information, including estimation of the historical standard deviation (standard case), estimation of the historical standard deviation (changing with time), normality and outliers, and other tests of randomness
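
    A minimal sketch of the "standard case" mentioned above (known mean and standard deviation) is shown below: a Shewhart chart flags individual control measurements outside three-sigma limits, and a one-sided CUSUM serves as a simple sequential test for bias. The parameter values (k, h) are common illustrative choices, not values from the chapter.

      import numpy as np

      def shewhart_flags(x, mu0, sigma0, nsigma=3.0):
          """Flag control measurements falling outside mu0 +/- nsigma*sigma0."""
          z = (np.asarray(x) - mu0) / sigma0
          return np.abs(z) > nsigma

      def cusum_bias(x, mu0, sigma0, k=0.5, h=5.0):
          """One-sided CUSUM: accumulate standardized excess over k; alarm when the sum exceeds h."""
          s, flags = 0.0, []
          for xi in x:
              s = max(0.0, s + (xi - mu0) / sigma0 - k)
              flags.append(s > h)
          return flags

      # Example: control measurements of a standard with known mu0 = 10.0, sigma0 = 0.1
      measurements = [10.02, 9.98, 10.05, 10.40, 10.38, 10.41]
      print(shewhart_flags(measurements, mu0=10.0, sigma0=0.1))
      print(cusum_bias(measurements, mu0=10.0, sigma0=0.1))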

  17. Fundamental tests and measures of the structure of matter at short distances

    International Nuclear Information System (INIS)

    Brodsky, S.J.

    1981-07-01

    Recent progress in gauge field theories has led to a new perspective on the structure of matter and basic interactions at short distances. It is clear that at very high energies quantum electrodynamics, together with the weak and strong interactions, is part of a unified theory with new fundamental constants, new symmetries, and new conservation laws. A non-technical introduction to these topics is given, with emphasis on fundamental tests and measurements. 21 references

  18. Fundamental developments for quantitative acoustic emission measurements

    International Nuclear Information System (INIS)

    Breckenridge, F.R.; Eitzen, D.G.; Clough, R.B.; Fuller, E.R.; Hsu, N.N.; Simmons, J.A.

    1981-10-01

    This report describes a research program supported jointly by the Electric Power Research Institute and the National Bureau of Standards. The intent of this report is to present an in-depth description of the research and results of this program for the specialist; additional details are contained in the referenced papers resulting from this program. The work under Phase 1 of the EPRI/NBS AE program has focused on: improved test standardization through the development of a calibration capability for AE sensors; improved sensor concepts and techniques for field and laboratory calibration; an improved basis for understanding and predicting AE behavior through the development of a mathematical framework for AE (transfer function formalism), through specific theoretical solutions to AE generation, transmission and inversion problems, and through the successful application of these theories to actual events in glass; an improved basis for assessing defect significance through the development of improved signal processing and inversion methods and through experimental results from AE in pressure vessel steels; and the implementation of experiments to establish the feasibility of using causal methods, based on theoretical mechanics, to obtain source information in structural steels.

  19. Fundamental data analyses for measurement control

    International Nuclear Information System (INIS)

    Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.

    1987-02-01

    A set of measurement control data analyses was selected for use by analysts responsible for maintaining the measurement quality of nuclear materials accounting instrumentation. The analyses consist of control charts for bias and precision and statistical tests used as analytic supplements to the control charts. They provide the desired detection sensitivity and yet can be interpreted locally, quickly, and easily. The control charts provide for visual inspection of data and enable an alert reviewer to spot problems, possibly before statistical tests detect them. The statistical tests are useful for automating the detection of departures from the controlled state or from the underlying assumptions (such as normality). 8 refs., 3 figs., 5 tabs

  20. Fundamental and applied measurements in ICP MS

    International Nuclear Information System (INIS)

    Carter, Julian Robert

    2002-01-01

    Fundamental and applied aspects of ICP-MS have been investigated to gain an increased understanding of the technique and improve on its analytical capabilities. Dissociation temperatures of polyatomic ions were calculated using a double-focusing sector instrument, to obtain more reliable mass spectral data with controlled vapour introduction via a Dreschel bottle to allow accurate calculation of the ingredients in the plasma. The equilibrium temperature for the plasma, operated at 1280 W, calculated using CO⁺ and C₂⁺ as the thermometric probes was ca. 5800 - 7400 K, while using ArO⁺ and ArC⁺ as the thermometric probes the calculated temperature was ca. 2000 - 7000 K. Calculated dissociation temperatures were used to elucidate the site of formation of these ions. Results confirmed that strongly bound ions such as CO⁺ and C₂⁺ were formed in the plasma, whereas weakly bound ions such as ArO⁺ and ArC⁺ were formed in the interface region, as indicated by the gross deviation of the calculated temperatures from those expected for a system in thermal equilibrium. The use of helium gas in a hexapole collision cell attenuated the signals of ArH⁺, Ar⁺, ArO⁺, ArC⁺, ArCl⁺ and Ar₂⁺, allowing improved determination of ³⁹K⁺, ⁴⁰Ca⁺, ⁵⁶Fe⁺, ⁵²Cr⁺, ⁷⁵As⁺ and ⁸⁰Se⁺ in standard solutions. The use of the hexapole collision cell also resulted in an enhancement of analyte signals due to the thermalisation of the ion beam. The ion kinetic energy of ions sampled from the plasma and those sampled from the skimmer cone were determined using a modified lens stack to assess the significance for memory effects of material deposited on the skimmer cone. The most probable kinetic energy of Be⁺ ions sampled from the skimmer cone was found to be 2.4 eV, which was considerably lower than the most probable kinetic energy of Be⁺ ions sampled from the plasma, which was found to be 9.5 eV. The low kinetic energy of the ions deposited on the skimmer cone means they will only

  1. Fundamental and applied measurements in ICP MS

    Energy Technology Data Exchange (ETDEWEB)

    Carter, Julian Robert

    2002-07-01

    Fundamental and applied aspects of ICP-MS have been investigated to gain an increased understanding of the technique and improve on its analytical capabilities. Dissociation temperatures of polyatomic ions were calculated using a double-focusing sector instrument, to obtain more reliable mass spectral data with controlled vapour introduction via a Dreschel bottle to allow accurate calculation of the ingredients in the plasma. The equilibrium temperature for the plasma, operated at 1280 W, calculated using CO{sup +} and C{sub 2}{sup +} as the thermometric probes was ca. 5800 - 7400 K, while using ArO{sup +} and ArC{sup +} as the thermometric probes the calculated temperature was ca. 2000 - 7000 K. Calculated dissociation temperatures were used to elucidate the site of formation of these ions. Results confirmed that strongly bound ions such as CO{sup +} and C{sub 2}{sup +} were formed in the plasma whereas weakly bound ions such as ArO{sup +} and ArC{sup +} were formed in the interface region due to gross deviation of the calculated temperatures from those expected for a system in thermal equilibrium. The use of helium gas in a hexapole collision cell attenuated the signals of ArH{sup +}, Ar{sup +}, ArO{sup +}, ArC{sup +}, ArCl{sup +} and Ar{sub 2}{sup +}, allowing improved determination of {sup 39}K{sup +}, {sup 40}Ca{sup +}, {sup 56}Fe{sup +}, {sup 52}Cr{sup +}, {sup 75}As{sup +} and {sup 80}Se{sup +} in standard solutions. The use of the hexapole collision cell also resulted in an enhancement of analyte signals due to the thermalisation of the ion beam. The ion kinetic energy of ions sampled from the plasma and those sampled from the skimmer cone were determined using a modified lens stack to assess the significance for memory effects of material deposited on the skimmer cone. The most probable kinetic energy of Be{sup +} ions sampled from the skimmer cone was found to be 2.4 eV, which was considerably lower than the most probable kinetic energy of Be{sup +} ions

  2. Fundamental problem in the relativistic approach to atomic structure theory

    International Nuclear Information System (INIS)

    Kagawa, Takashi

    1987-01-01

    It is known that the relativistic atomic structure theory contains a serious fundamental problem, the so-called Brown-Ravenhall (BR) problem or variational collapse. This problem arises from the fact that the energy spectrum of the relativistic Hamiltonian for many-electron systems is not bounded from below, because the negative-energy solutions as well as the positive-energy ones are obtained from the relativistic equation. This report outlines two methods to avoid the BR problem in relativistic calculations, namely the projection operator method and the general variation method. The former method is described first. The use of a modified Hamiltonian containing a projection operator which projects onto the positive-energy solutions of the relativistic wave equation has been proposed to remove the BR difficulty. The problem in the use of the projection operator method is that the projection operator for the system cannot be determined uniquely. The final part of this report outlines the general variation method. This method can be applied to any system, such as relativistic ones whose Hamiltonian is not bounded from below. (Nogami, K.)

  3. Five fundamental constraints on theories of the origins of music.

    Science.gov (United States)

    Merker, Bjorn; Morley, Iain; Zuidema, Willem

    2015-03-19

    The diverse forms and functions of human music place obstacles in the way of an evolutionary reconstruction of its origins. In the absence of any obvious homologues of human music among our closest primate relatives, theorizing about its origins, in order to make progress, needs constraints from the nature of music, the capacities it engages, and the contexts in which it occurs. Here we propose and examine five fundamental constraints that bear on theories of how music and some of its features may have originated. First, cultural transmission, bringing the formal powers of cultural as contrasted with Darwinian evolution to bear on its contents. Second, generativity, i.e. the fact that music generates infinite pattern diversity by finite means. Third, vocal production learning, without which there can be no human singing. Fourth, entrainment with perfect synchrony, without which there is neither rhythmic ensemble music nor rhythmic dancing to music. And fifth, the universal propensity of humans to gather occasionally to sing and dance together in a group, which suggests a motivational basis endemic to our biology. We end by considering the evolutionary context within which these constraints had to be met in the genesis of human musicality. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  4. Measurement Errors and Uncertainties Theory and Practice

    CERN Document Server

    Rabinovich, Semyon G

    2006-01-01

    Measurement Errors and Uncertainties addresses the most important problems that physicists and engineers encounter when estimating errors and uncertainty. Building from the fundamentals of measurement theory, the author develops the theory of accuracy of measurements and offers a wealth of practical recommendations and examples of applications. This new edition covers a wide range of subjects, including: - Basic concepts of metrology - Measuring instruments characterization, standardization and calibration - Estimation of errors and uncertainty of single and multiple measurements - Modern probability-based methods of estimating measurement uncertainty With this new edition, the author completes the development of the new theory of indirect measurements. This theory provides more accurate and efficient methods for processing indirect measurement data. It eliminates the need to calculate the correlation coefficient - a stumbling block in measurement data processing - and offers for the first time a way to obtain...

  5. Infusing fundamental cause theory with features of Pierre Bourdieu's theory of symbolic power.

    Science.gov (United States)

    Veenstra, Gerry

    2018-02-01

    The theory of fundamental causes is one of the more influential attempts to provide a theoretical infrastructure for the strong associations between indicators of socioeconomic status (education, income, occupation) and health. It maintains that people of higher socioeconomic status have greater access to flexible resources such as money, knowledge, prestige, power, and beneficial social connections that they can use to reduce their risks of morbidity and mortality and minimize the consequences of disease once it occurs. However, several key aspects of the theory remain underspecified, compromising its ability to provide truly compelling explanations for socioeconomic health inequalities. In particular, socioeconomic status is an assembly of indicators that do not necessarily cohere in a straightforward way, the flexible resources that disproportionately accrue to higher status people are not clearly defined, and the distinction between socioeconomic status and resources is ambiguous. I attempt to address these definitional issues by infusing fundamental cause theory with features of a well-known theory of socioeconomic stratification in the sociological literature: Pierre Bourdieu's theory of symbolic power.

  6. Fundamental characteristics of the QFP measured by the dc SQUID

    International Nuclear Information System (INIS)

    Shimizu, N.; Harada, Y.; Miyamoto, N.; Hosoya, M.; Goto, E.

    1989-01-01

    This paper describes the fundamental characteristics of the Quantum Flux Parametron (QFP) measured by a new method in which the output signals of the QFP are detected with a dc SQUID. The dc SQUID linearly and continuously converts the output current of the QFP to voltage, allowing the output signal of the QFP to be measured as the voltage of the dc SQUID. Thus, the fundamental characteristics of the QFP have been experimentally confirmed in detail

  7. Fundamentals of Acoustics. Psychoacoustics and Hearing. Acoustical Measurements

    Science.gov (United States)

    Begault, Durand R.; Ahumada, Al (Technical Monitor)

    1997-01-01

    These are 3 chapters that will appear in a book titled "Building Acoustical Design", edited by Charles Salter. They are designed to introduce the reader to fundamental concepts of acoustics, particularly as they relate to the built environment. "Fundamentals of Acoustics" reviews basic concepts of sound waveform frequency, pressure, and phase. "Psychoacoustics and Hearing" discusses the human interpretation of sound pressure as loudness, particularly as a function of frequency. "Acoustic Measurements" gives a simple overview of the time and frequency weightings for sound pressure measurements that are used in acoustical work.
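
    As a rough illustration of the frequency weightings mentioned above, the sketch below computes the A-weighting correction (in dB) commonly applied to sound pressure measurements. The parameterization follows the usual IEC 61672-style formula; the code is illustrative and not drawn from the chapters themselves.

      import math

      # A-weighting in dB for a pure tone of frequency f (Hz), using the
      # common IEC 61672-style parameterization. Illustrative sketch only.
      def a_weighting_db(f):
          f2 = f * f
          ra = (12194.0 ** 2 * f2 ** 2) / (
              (f2 + 20.6 ** 2)
              * math.sqrt((f2 + 107.7 ** 2) * (f2 + 737.9 ** 2))
              * (f2 + 12194.0 ** 2)
          )
          return 20.0 * math.log10(ra) + 2.00  # normalized so A(1000 Hz) is ~0 dB

      # Example: weighting at 100 Hz, 1 kHz and 10 kHz
      for freq in (100.0, 1000.0, 10000.0):
          print(freq, round(a_weighting_db(freq), 1))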

  8. The Kadomtsev-Petviashvili equations and fundamental string theory

    International Nuclear Information System (INIS)

    Gilbert, G.

    1988-01-01

    In this paper the infinite sequence of non-linear partial differential equations known as the Kadomtsev-Petviashvili equations is described in simple terms and possible applications to a fundamental description of interacting strings are addressed. Lines of research likely to prove useful in formulating a description of non-perturbative string configurations are indicated. (orig.)
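
    For orientation, the lowest member of the sequence is the Kadomtsev-Petviashvili equation itself, written here in a standard textbook form (not quoted from the paper): $$\partial_x\left(u_t + 6\,u\,u_x + u_{xxx}\right) + 3\,\sigma^2\,u_{yy} = 0, \qquad \sigma^2 = \pm 1,$$ where $\sigma^2=-1$ gives the KP-I equation and $\sigma^2=+1$ the KP-II equation; the higher equations of the hierarchy follow from the same underlying Lax structure.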

  9. Macroscopic Fundamental Diagram for pedestrian networks : Theory and applications

    NARCIS (Netherlands)

    Hoogendoorn, S.P.; Daamen, W.; Knoop, V.L.; Steenbakkers, Jeroen; Sarvi, Majid

    2017-01-01

    The Macroscopic Fundamental Diagram (MFD) has proven to be a powerful concept in understanding and managing vehicular network dynamics, both from a theoretical angle and from a more application-oriented perspective. In this contribution, we explore the existence and the characteristics of the

  10. Relativistic quantum chemistry the fundamental theory of molecular science

    CERN Document Server

    Reiher, Markus

    2014-01-01

    Einstein proposed his theory of special relativity in 1905. For a long time it was believed that this theory has no significant impact on chemistry. This view changed in the 1970s when it was realized that (nonrelativistic) Schrödinger quantum mechanics yields results on molecular properties that depart significantly from experimental results. Especially when heavy elements are involved, these quantitative deviations can be so large that qualitative chemical reasoning and understanding is affected. To grasp this, the appropriate many-electron theory has rapidly evolved. Nowadays relativist

  11. Non-nucleon degrees of freedom in nuclei and ABC plan for developing fundamental nuclear theories

    International Nuclear Information System (INIS)

    Zhang Qiren

    1996-01-01

    We emphasize that to develop a fundamental nuclear theory one has to consider various non-nucleon degrees of freedom in nuclei and to make the theory relativistic. A three-step ABC Plan for this purpose is proposed. The A plan is to reform the relativistic hadron field theory by taking the finite baryon size into account. We call finite-size baryons "atoms", in contrast with point particles. The fundamental nuclear theory in this form is therefore a quantum atom dynamics (QAD). The B plan is to reform the bag model for hadrons by making it into quantum bag dynamics (QBD). This is a model fundamental nuclear theory on the quark level. The fundamental nuclear theory should eventually be developed on the basis of quantum chromodynamics (QCD). This is the C Plan

  12. Hybrid Fundamental Solution Based Finite Element Method: Theory and Applications

    Directory of Open Access Journals (Sweden)

    Changyong Cao

    2015-01-01

    Full Text Available An overview of the development of the hybrid fundamental solution based finite element method (HFS-FEM) and its application to engineering problems is presented in this paper. The framework and formulations of HFS-FEM for the potential problem, plane elasticity, three-dimensional elasticity, thermoelasticity, anisotropic elasticity, and plane piezoelectricity are presented. In this method, two independent assumed fields (the intraelement field and the auxiliary frame field) are employed. The formulations for all cases are derived from the modified variational functionals and the fundamental solutions to a given problem. Generation of elemental stiffness equations from the modified variational principle is also described. Typical numerical examples are given to demonstrate the validity and performance of the HFS-FEM. Finally, a brief summary of the approach is provided and future trends in this field are identified.

  13. Elementary Concepts and Fundamental Laws of the Theory of Heat

    Science.gov (United States)

    de Oliveira, Mário J.

    2018-06-01

    The elementary concepts and fundamental laws concerning the science of heat are examined from the point of view of its development with special attention to its theoretical structure. The development is divided into four periods, each one characterized by the concept that was attributed to heat. The transition from one to the next period was marked by the emergence of new concepts and new laws, and by singular events. We point out that thermodynamics, as it emerged, is founded on the elementary concepts of temperature and adiabatic wall, and on the fundamental laws: Mayer-Joule principle, or law of conservation of energy; Carnot principle, which leads to the definition of entropy; and the Clausius principle, or law of increase in entropy.

  14. Elementary Concepts and Fundamental Laws of the Theory of Heat

    Science.gov (United States)

    de Oliveira, Mário J.

    2018-03-01

    The elementary concepts and fundamental laws concerning the science of heat are examined from the point of view of its development with special attention to its theoretical structure. The development is divided into four periods, each one characterized by the concept that was attributed to heat. The transition from one to the next period was marked by the emergence of new concepts and new laws, and by singular events. We point out that thermodynamics, as it emerged, is founded on the elementary concepts of temperature and adiabatic wall, and on the fundamental laws: Mayer-Joule principle, or law of conservation of energy; Carnot principle, which leads to the definition of entropy; and the Clausius principle, or law of increase in entropy.

  15. Atom Interferometry for Fundamental Physics and Gravity Measurements in Space

    Science.gov (United States)

    Kohel, James M.

    2012-01-01

    Laser-cooled atoms are used as free-fall test masses. The gravitational acceleration on the atoms is measured by atom-wave interferometry. The fundamental concept behind atom interferometry is the quantum mechanical particle-wave duality. One can exploit the wave-like nature of atoms to construct an atom interferometer based on matter waves, analogous to a laser interferometer.
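
    As a back-of-the-envelope illustration (not taken from the abstract), the sketch below evaluates the standard Mach-Zehnder atom-interferometer phase shift, delta_phi = k_eff * g * T**2, for rubidium-style Raman beams; the wavelength and pulse separation time are illustrative assumptions.

      import math

      # Illustrative estimate of the gravitational phase shift in a light-pulse
      # Mach-Zehnder atom interferometer: delta_phi = k_eff * g * T**2.
      wavelength = 780e-9                      # m, assumed Rb D2 Raman wavelength
      k_eff = 2 * (2 * math.pi / wavelength)   # rad/m, counter-propagating Raman beams
      g = 9.81                                 # m/s^2, local gravitational acceleration
      T = 0.1                                  # s, assumed pulse separation time

      delta_phi = k_eff * g * T ** 2
      print(f"k_eff = {k_eff:.3e} rad/m, phase shift = {delta_phi:.3e} rad")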

  16. Accurate Q value measurements for fundamental physics studies at JYFLTRAP

    Energy Technology Data Exchange (ETDEWEB)

    Eronen, T., E-mail: tommi.o.eronen@jyu.fi; Kolhinen, V. S. [University of Jyvaeskylae (Finland); Collaboration: JYFLTRAP collaboration

    2011-07-15

    We have measured several Q values at JYFLTRAP for superallowed {beta} decays that contribute to testing the Standard Model and for candidate nuclei that could be used in the search for neutrinoless double-{beta} decay. These results play important roles in fundamental physics research whose scope extends beyond the Standard Model.

  17. Fundamentals of wireless sensor networks theory and practice

    CERN Document Server

    Dargie, Waltenegus

    2010-01-01

    In this book, the authors describe the fundamental concepts and practical aspects of wireless sensor networks. The book provides a comprehensive view to this rapidly evolving field, including its many novel applications, ranging from protecting civil infrastructure to pervasive health monitoring. Using detailed examples and illustrations, this book provides an inside track on the current state of the technology. The book is divided into three parts. In Part I, several node architectures, applications and operating systems are discussed. In Part II, the basic architectural frameworks, including

  18. A theory of the coherent fundamental plasma emission in Tokamaks

    International Nuclear Information System (INIS)

    Alves, M.V.; Chian, A.C.-L.

    1987-01-01

    A theoretical model of coherent radiation near the fundamental plasma frequency in tokamaks is proposed. It is shown that, in the presence of runaway electrons, the beam-generated Langmuir waves (L) can be parametrically converted into electromagnetic waves (T) through ponderomotive coupling to ion acoustic waves (S). Two types of pumps are considered: travelling wave pump and standing wave pump. Expressions are derived for the excitation conditions and the growth rates of electromagnetic decay instabilities (L → T + S), electromagnetic fusion instabilities (L + S → T) and electromagnetic oscillating two-stream instabilities (L → T ± S*, where S* is a purely growing mode). (author) [pt

  19. A theory of the coherent fundamental plasma emission in Tokamaks

    International Nuclear Information System (INIS)

    Alves, M.V.; Chian, A.C.-L.

    1987-07-01

    A theoretical model of coherent radiation near the fundamental plasma frequency in Tokamaks is proposed. It is shown that, in the presence of runaway electrons, the beam-generated Langmuir waves (L) can be parametrically converted into electromagnetic waves (T) through ponderomotive coupling to ion acoustic waves (S). Two types of pumps are considered: travelling wave pump and standing wave pump. Expressions are derived for the excitation conditions and the growth rates of electromagnetic decay instabilities (L → T + S), electromagnetic fusion instabilities (L + S → T) and electromagnetic oscillating two-stream instabilities (L → T ± S*, where S* is a purely growing mode). (author) [pt

  20. Decompositional equivalence: A fundamental symmetry underlying quantum theory

    OpenAIRE

    Fields, Chris

    2014-01-01

    Decompositional equivalence is the principle that there is no preferred decomposition of the universe into subsystems. It is shown here, by using simple thought experiments, that quantum theory follows from decompositional equivalence together with Landauer's principle. This demonstration raises within physics a question previously left to psychology: how do human - or any - observers agree about what constitutes a "system of interest"?

  1. Fundamental course of measuring. Pt. 2. 4. enlarged ed.

    International Nuclear Information System (INIS)

    Merz, L.

    1975-01-01

    This fundamental course on the electrical measurement of non-electrical parameters aims to present current knowledge of the basic measuring methods in simple language and illustrative form. The present Part II deals especially with measuring methods in heat and process engineering in the industrial field. Following the introduction in Part A, the techniques of electrical probes are mainly described, and it is shown which mechanical probes cannot yet be replaced by electrical ones. Part C describes the techniques of measuring transducers. (ORU) [de

  2. Is signal detection theory fundamentally flawed? A response to Balakrishnan (1998a, 1998b, 1999).

    Science.gov (United States)

    Treisman, Michel

    2002-12-01

    For nearly 50 years, signal detection theory (SDT; Green & Swets, 1966; Macmillan & Creelman, 1991) has been of central importance in the development of psychophysics and other areas of psychology. The theory has recently been challenged by Balakrishnan (1998b), who argues that, within SDT, an alternative index is "better justified" than d' and who claims to show (1998a, 1999) that SDT is fundamentally flawed and should be rejected. His evidence is based on new nonparametric measures that he has introduced and applied to experimental data. He believes his results show that basic assumptions of SDT are not supported; in particular, that payoff and probability manipulations do not affect the position of the decision criterion. In view of the importance of SDT in psychology, these claims deserve careful examination. They are critically reviewed here. It appears that it is Balakrishnan's arguments that fail, and not SDT.

  3. Modeling, Measurements, and Fundamental Database Development for Nonequilibrium Hypersonic Aerothermodynamics

    Science.gov (United States)

    Bose, Deepak

    2012-01-01

    The design of entry vehicles requires predictions of the aerothermal environment during the hypersonic phase of their flight trajectories. These predictions are made using computational fluid dynamics (CFD) codes that often rely on physics and chemistry models of nonequilibrium processes. The primary processes of interest are gas phase chemistry, internal energy relaxation, electronic excitation, nonequilibrium emission and absorption of radiation, and gas-surface interaction leading to surface recession and catalytic recombination. NASA's Hypersonics Project is advancing the state of the art in modeling of nonequilibrium phenomena by making detailed spectroscopic measurements in shock tubes and arcjets, using ab-initio quantum mechanical techniques to develop fundamental chemistry and spectroscopic databases, making fundamental measurements of finite-rate gas-surface interactions, and implementing detailed mechanisms in state-of-the-art CFD codes. The development of new models is based on validation with relevant experiments. We will present the latest developments and a roadmap for the technical areas mentioned above

  4. All the fundamental massless bosonic fields in superstring theory

    International Nuclear Information System (INIS)

    Manoukian, E.B.

    2012-01-01

    A systematic analysis of all the massless bosonic fields in superstring theory is carried out. Emphasis is put on the derivation of their propagators, their polarization aspects and the investigation of their underlying constraints as well as their number of degrees of freedom. The treatment is given in the presence of external sources, in the celebrated Coulomb gauge, ensuring the positivity of the formalism - a result which is also established in the process. The challenge here is the investigation involved in the self-dual fourth rank anti-symmetric tensor field. No constraints are imposed on the external sources so that their components may be varied independently, thus the complete expressions of the propagators may be obtained. As emphasized in our earlier work, the latter condition is an important one in dynamical theories with constraints, giving rise to modifications such as Faddeev-Popov factors. The analysis is carried out in 10 dimensions, not only because of the consistency requirement of the superstrings, but also in order to take into account the self-duality character of the fourth rank anti-symmetric tensor field, as spelled out in the paper. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  5. Tales of the quantum understanding physics' most fundamental theory

    CERN Document Server

    Hobson, Art

    2017-01-01

    Everybody has heard that we live in a world made of atoms. But far more fundamentally, we live in a universe made of quanta. Many things are not made of atoms: light, radio waves, electric current, magnetic fields, Earth's gravitational field, not to mention exotica such as neutron stars, black holes, dark energy, and dark matter. But everything, including atoms, is made of highly unified or "coherent" bundles of energy called "quanta" that (like everything else) obey certain rules. In the case of the quantum, these rules are called "quantum physics." This is a book about quanta and their unexpected, some would say peculiar, behavior--tales, if you will, of the quantum. The quantum has developed the reputation of being capricious, bewildering, even impossible to understand. The peculiar habits of quanta are certainly not what we would have expected to find at the foundation of physical reality, but these habits are not necessarily bewildering and not at all impossible or paradoxical. This book explains those h...

  6. Complex analysis fundamentals of the classical theory of functions

    CERN Document Server

    Stalker, John

    1998-01-01

    This clear, concise introduction to the classical theory of one complex variable is based on the premise that "anything worth doing is worth doing with interesting examples." The content is driven by techniques and examples rather than definitions and theorems. This self-contained monograph is an excellent resource for a self-study guide and should appeal to a broad audience. The only prerequisite is a standard calculus course. The first chapter deals with a beautiful presentation of special functions. . . . The third chapter covers elliptic and modular functions. . . in much more detail, and from a different point of view, than one can find in standard introductory books. . . . For [the] subjects that are omitted, the author has suggested some excellent references for the reader who wants to go through these topics. The book is read easily and with great interest. It can be recommended to both students as a textbook and to mathematicians and physicists as a useful reference. ---Mathematical Reviews Mainly or...

  7. A fundamental study of ''contribution'' transport theory and channel theory applications

    International Nuclear Information System (INIS)

    Williams, M.L.

    1992-01-01

    The objective of this three-year study is to develop a technique called ''channel theory'' that can be used in interpreting particle transport analyses such as are frequently required in radiation shielding design and assessment. Channel theory is a technique used to provide insight into the mechanisms by which particles emitted from a source are transported through a complex system and register a response on some detector. It is based on the behavior of a pseudo particle called a ''contributon,'' which is the response carrier through space and energy channels that connect the source and detector. ''Contributons'' are those particles among all the ones contained in the system which will eventually contribute some amount of response to the detector. The specific goals of this project are to provide a more fundamental theoretical understanding of the method, and to develop computer programs to apply the techniques to practical problems encountered in radiation transport analysis. The overall project can be divided into three components to meet these objectives: (a) Theoretical Development, (b) Code Development, and (c) Sample Applications. During the present third year of this study, an application of contributon theory to the analysis of radiation heating in a nuclear rocket has been completed, and a paper on the assessment of radiation damage response of an LWR pressure vessel and analysis of radiation propagation through space and energy channels in air at the Hiroshima weapon burst was accepted for publication. A major effort was devoted to developing a new ''Contributon Monte Carlo'' method, which can improve the efficiency of Monte Carlo calculations of radiation transport by tracking only contributons. The theoretical basis for Contributon Monte Carlo has been completed, and the implementation and testing of the technique is presently being performed

  8. Open and closed string worldsheets from free large N gauge theories with adjoint and fundamental matter

    International Nuclear Information System (INIS)

    Yaakov, Itamar

    2006-01-01

    We extend Gopakumar's prescription for constructing closed string worldsheets from free field theory diagrams with adjoint matter to open and closed string worldsheets arising from free field theories with fundamental matter. We describe the extension of the gluing mechanism and the electrical circuit analogy to fundamental matter. We discuss the generalization of the existence and uniqueness theorem of Strebel differentials to open Riemann surfaces. Two examples of correlators containing fundamental matter are computed, and the resulting worldsheet OPEs are obtained. Generic properties of Gopakumar's construction are discussed

  9. Theory of Effectiveness Measurement

    National Research Council Canada - National Science Library

    Bullock, Richard K

    2006-01-01

    Effectiveness measures provide decision makers feedback on the impact of deliberate actions and affect critical issues such as allocation of scarce resources, as well as whether to maintain or change existing strategy...

  10. Ultracold atoms for precision measurement of fundamental physical quantities

    CERN Multimedia

    CERN. Geneva

    2003-01-01

    Cooling and trapping of neutral atoms has been one of the most active fields of research in physics in recent years. Several methods were demonstrated to reach temperatures as low as a few nanokelvin allowing, for example, the investigation of quantum degenerate gases. The ability to control the quantum degrees of freedom of atoms opens the way to applications for precision measurement of fundamental physical quantities. Experiments in progress, planned or being considered using new quantum devices based on ultracold atoms, namely atom interferometers and atomic clocks, will be discussed.

  11. Towards the Fundamental Quantum Limit of Linear Measurements of Classical Signals.

    Science.gov (United States)

    Miao, Haixing; Adhikari, Rana X; Ma, Yiqiu; Pang, Belinda; Chen, Yanbei

    2017-08-04

    The quantum Cramér-Rao bound (QCRB) sets a fundamental limit for the measurement of classical signals with detectors operating in the quantum regime. Using linear-response theory and the Heisenberg uncertainty relation, we derive a general condition for achieving such a fundamental limit. When applied to classical displacement measurements with a test mass, this condition leads to an explicit connection between the QCRB and the standard quantum limit that arises from a tradeoff between the measurement imprecision and quantum backaction; the QCRB can be viewed as an outcome of a quantum nondemolition measurement with the backaction evaded. Additionally, we show that the test mass is more a resource for improving measurement sensitivity than a victim of the quantum backaction, which suggests a new approach to enhancing the sensitivity of a broad class of sensors. We illustrate these points with laser interferometric gravitational-wave detectors.
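
    For context, the free-mass standard quantum limit referred to above is the standard textbook result (not quoted from the paper) $$ S_x^{\rm SQL}(\Omega) = \frac{2\hbar}{m\,\Omega^2}, $$ the displacement noise spectral density arising from the tradeoff between measurement imprecision and quantum backaction on a test mass of mass $m$ at measurement frequency $\Omega$; the QCRB discussed in the abstract bounds what remains achievable once the backaction is evaded.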

  12. An Ultraviolet Chiral Theory of the Top for the Fundamental Composite (Goldstone) Higgs

    DEFF Research Database (Denmark)

    Cacciapaglia, Giacomo; Sannino, Francesco

    2016-01-01

    We introduce a scalar-less anomaly free chiral gauge theory that serves as natural ultraviolet completion of models of fundamental composite (Goldstone) Higgs dynamics. The new theory is able to generate the top mass and furthermore features a built-in protection mechanism that naturally suppresses the bottom mass. At low energies the theory predicts new fractionally charged fermions, and a number of four-fermion operators that, besides being relevant for the generation of the top mass, also lead to an intriguing phenomenology for the new states predicted by the theory.

  13. An ultraviolet chiral theory of the top for the fundamental composite (Goldstone) Higgs

    Energy Technology Data Exchange (ETDEWEB)

    Cacciapaglia, Giacomo, E-mail: g.cacciapaglia@ipnl.in2p3.fr [Univ Lyon, Université Lyon 1, CNRS/IN2P3, IPNL, 4 rue Enrico Fermi, F-69622 Villeurbanne Cedex (France); Sannino, Francesco, E-mail: sannino@cp3.dias.sdu.dk [CP3-Origins and the Danish IAS, University of Southern Denmark, Campusvej 55, DK-5230 Odense M (Denmark)

    2016-04-10

    We introduce a scalar-less anomaly free chiral gauge theory that serves as natural ultraviolet completion of models of fundamental composite (Goldstone) Higgs dynamics. The new theory is able to generate the top mass and furthermore features a built-in protection mechanism that naturally suppresses the bottom mass. At low energies the theory predicts new fractionally charged fermions, and a number of four-fermion operators that, besides being relevant for the generation of the top mass, also lead to an intriguing phenomenology for the new states predicted by the theory.

  14. An Analysis of Fundamental Mode Surface Wave Amplitude Measurements

    Science.gov (United States)

    Schardong, L.; Ferreira, A. M.; van Heijst, H. J.; Ritsema, J.

    2014-12-01

    Seismic tomography is a powerful tool to decipher the Earth's interior structure at various scales. Traveltimes of seismic waves are widely used to build velocity models, whereas amplitudes are still only seldomly accounted for. This mainly results from our limited ability to separate the various physical effects responsible for observed amplitude variations, such as focussing/defocussing, scattering and source effects. We present new measurements from 50 global earthquakes of fundamental-mode Rayleigh and Love wave amplitude anomalies measured in the period range 35-275 seconds using two different schemes: (i) a standard time-domain amplitude power ratio technique; and (ii) a mode-branch stripping scheme. For minor-arc data, we observe amplitude anomalies with respect to PREM in the range of 0-4, for which the two measurement techniques show a very good overall agreement. We present here a statistical analysis and comparison of these datasets, as well as comparisons with theoretical calculations for a variety of 3-D Earth models. We assess the geographical coherency of the measurements, and investigate the impact of source, path and receiver effects on surface wave amplitudes, as well as their variations with frequency in a wider range than previously studied.
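
    As a schematic of the first measurement approach mentioned above, the sketch below forms a time-domain amplitude ratio by comparing the RMS amplitude of an observed surface-wave window with that of a synthetic seismogram. The windowing, alignment and function names are illustrative assumptions, not the authors' code.

      import numpy as np

      # Schematic time-domain amplitude (power) ratio for a surface-wave window.
      # obs and syn are assumed to be equally sampled, aligned seismogram windows.
      def amplitude_ratio(obs, syn):
          """Return the observed/synthetic RMS amplitude ratio in the window."""
          rms_obs = np.sqrt(np.mean(np.asarray(obs) ** 2))
          rms_syn = np.sqrt(np.mean(np.asarray(syn) ** 2))
          return rms_obs / rms_syn

      # Example with toy waveforms: the observed trace has 1.3 times the synthetic amplitude
      t = np.linspace(0.0, 200.0, 2001)
      syn = np.sin(2 * np.pi * t / 50.0)
      obs = 1.3 * syn
      print(round(amplitude_ratio(obs, syn), 2))   # -> 1.3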

  15. Atomic spectroscopy and highly accurate measurement: determination of fundamental constants

    International Nuclear Information System (INIS)

    Schwob, C.

    2006-12-01

    This document reviews the theoretical and experimental achievements of the author concerning highly accurate atomic spectroscopy applied to the determination of fundamental constants. A purely optical frequency measurement of the 2S-12D two-photon transitions in atomic hydrogen and deuterium has been performed. The experimental setup is described as well as the data analysis. Optimized values for the Rydberg constant and Lamb shifts have been deduced (R = 109737.31568516(84) cm⁻¹). An experiment devoted to the determination of the fine structure constant with an aimed relative uncertainty of 10⁻⁹ began in 1999. This experiment is based on the fact that Bloch oscillations in a frequency-chirped optical lattice are a powerful tool to coherently transfer many photon momenta to the atoms. We have used this method to measure accurately the ratio h/m(Rb). The measured value of the fine structure constant is α⁻¹ = 137.03599884(91) with a relative uncertainty of 6.7×10⁻⁹. The future and perspectives of this experiment are presented. This document, presented before an academic board, will allow its author to manage research work and particularly to tutor thesis students. (A.C.)

  16. Scattering lengths in SU(2) gauge theory with two fundamental fermions

    DEFF Research Database (Denmark)

    Arthur, R.; Drach, V.; Hansen, Martin Rasmus Lundquist

    2014-01-01

    We investigate non-perturbatively the scattering properties of Goldstone bosons in an SU(2) gauge theory with two Wilson fermions in the fundamental representation. Such a theory can be used to build extensions of the Standard Model that unify Technicolor and pseudo-Goldstone composite Higgs models... the expected chiral symmetry breaking pattern. We then discuss how to compute them on the lattice and give preliminary results using finite-size methods....

  17. The quantum theory of measurement

    CERN Document Server

    Busch, Paul; Mittelstaedt, Peter

    1996-01-01

    The amazing accuracy in verifying quantum effects experimentally has recently renewed interest in quantum mechanical measurement theory. In this book the authors give within the Hilbert space formulation of quantum mechanics a systematic exposition of the quantum theory of measurement. Their approach includes the concepts of unsharp objectification and of nonunitary transformations needed for a unifying description of various detailed investigations. The book addresses advanced students and researchers in physics and philosophy of science. In this second edition Chaps. II-IV have been substantially rewritten. In particular, an insolubility theorem for the objectification problem has been formulated in full generality, which includes unsharp object observables and unsharp pointers.

  18. Measurement and probability a probabilistic theory of measurement with applications

    CERN Document Server

    Rossi, Giovanni Battista

    2014-01-01

    Measurement plays a fundamental role both in physical and behavioral sciences, as well as in engineering and technology: it is the link between abstract models and empirical reality and is a privileged method of gathering information from the real world. Is it possible to develop a single theory of measurement for the various domains of science and technology in which measurement is involved? This book takes the challenge by addressing the following main issues: What is the meaning of measurement? How do we measure? What can be measured? A theoretical framework that could truly be shared by scientists in different fields, ranging from physics and engineering to psychology is developed. The future in fact will require greater collaboration between science and technology and between different sciences. Measurement, which played a key role in the birth of modern science, can act as an essential interdisciplinary tool and language for this new scenario. A sound theoretical basis for addressing key problems in mea...

  19. Non-additive measures theory and applications

    CERN Document Server

    Narukawa, Yasuo; Sugeno, Michio; 9th International Conference on Modeling Decisions for Artificial Intelligence (MDAI 2012)

    2014-01-01

    This book provides a comprehensive and timely report in the area of non-additive measures and integrals. It is based on a panel session on fuzzy measures, fuzzy integrals and aggregation operators held during the 9th International Conference on Modeling Decisions for Artificial Intelligence (MDAI 2012) in Girona, Spain, November 21-23, 2012. The book complements the MDAI 2012 proceedings book, published in Lecture Notes in Computer Science (LNCS) in 2012. The individual chapters, written by key researchers in the field, cover fundamental concepts and important definitions (e.g. the Sugeno integral, definition of entropy for non-additive measures) as well as some important applications (e.g. to economics and game theory) of non-additive measures and integrals. The book addresses students, researchers and practitioners working at the forefront of their field.

  20. Toward the fundamental theory of nuclear matter physics: The microscopic theory of nuclear collective dynamics

    International Nuclear Information System (INIS)

    Sakata, F.; Marumori, T.; Hashimoto, Y.; Tsukuma, H.; Yamamoto, Y.; Terasaki, J.; Iwasawa, Y.; Itabashi, H.

    1992-01-01

    Since the research field of nuclear physics is expanding rapidly, it is becoming more imperative to develop the microscopic theory of nuclear matter physics which provides us with a unified understanding of diverse phenomena exhibited by nuclei. The establishment of various stable mean-fields in nuclei allows us to develop the microscopic theory of nuclear collective dynamics within the mean-field approximation. The classical-level theory of nuclear collective dynamics is developed by exploiting the symplectic structure of the time-dependent Hartree-Fock (TDHF) manifold. The importance of exploring the single-particle dynamics, e.g. the level-crossing dynamics in connection with the classical order-to-chaos transition mechanism, is pointed out. Since the classical-level theory is directly related to the full quantum mechanical boson expansion theory via the symplectic structure of the TDHF manifold, the quantum theory of nuclear collective dynamics is developed at the dictation of what is developed in the classical-level theory. The quantum theory thus formulated enables us to introduce the quantum integrability and quantum chaoticity for individual eigenstates. The inter-relationship between the classical-level and quantum theories of nuclear collective dynamics might play a decisive role in developing the quantum theory of many-body problems. (orig.)

  1. Measurement of attenuation coefficients of the fundamental and second harmonic waves in water

    Science.gov (United States)

    Zhang, Shuzeng; Jeong, Hyunjo; Cho, Sungjong; Li, Xiongbing

    2016-02-01

    Attenuation corrections in nonlinear acoustics play an important role in the study of nonlinear fluids, biomedical imaging, and solid material characterization. The measurement of attenuation coefficients in a nonlinear regime is not easy because the coefficients depend on the source pressure and require accurate diffraction corrections. In this work, the attenuation coefficients of the fundamental and second harmonic waves, which arise from absorption in water, are measured in nonlinear ultrasonic experiments. Based on the quasilinear theory of the KZK equation, the nonlinear sound field equations are derived and the diffraction correction terms are extracted. The measured sound pressure amplitudes are first adjusted for diffraction corrections in order to reduce the impact of diffraction on the measurement of attenuation coefficients. The attenuation coefficients of the fundamental and second harmonics are then calculated precisely from a nonlinear least-squares curve-fitting process applied to the experimental data. The results show that attenuation coefficients in a nonlinear condition depend on both frequency and source pressure, quite unlike the linear regime. At relatively low drive pressures, the attenuation coefficients increase linearly with frequency. However, they exhibit nonlinear growth at high drive pressures. As the diffraction corrections are obtained from the quasilinear theory, it is important to use an appropriate source pressure for accurate attenuation measurements.
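
    As a minimal sketch of the curve-fitting step described above (illustrative only; the exponential model and names are assumptions, not the authors' code), an attenuation coefficient can be extracted by a nonlinear least-squares fit of diffraction-corrected amplitudes to p(z) = p0 * exp(-alpha * z):

      import numpy as np
      from scipy.optimize import curve_fit

      # Fit diffraction-corrected pressure amplitudes p(z) to p0 * exp(-alpha * z)
      # to estimate the attenuation coefficient alpha (Np/m). Illustrative sketch.
      def decay_model(z, p0, alpha):
          return p0 * np.exp(-alpha * z)

      # Toy data: axial distances (m) and corrected amplitudes with alpha = 2.5 Np/m
      z = np.linspace(0.02, 0.20, 10)
      true_p0, true_alpha = 1.0, 2.5
      amplitudes = decay_model(z, true_p0, true_alpha) * (1 + 0.01 * np.random.randn(z.size))

      (p0_fit, alpha_fit), _ = curve_fit(decay_model, z, amplitudes, p0=(1.0, 1.0))
      print(f"fitted alpha = {alpha_fit:.2f} Np/m")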

  2. Measurements of Fundamental Fluid Physics of SNF Storage Canisters

    Energy Technology Data Exchange (ETDEWEB)

    Condie, Keith Glenn; Mc Creery, Glenn Ernest; McEligot, Donald Marinus

    2001-09-01

    With the University of Idaho, Ohio State University and Clarksean Associates, this research program has the long-term goal to develop reliable predictive techniques for the energy, mass and momentum transfer plus chemical reactions in drying / passivation (surface oxidation) operations in the transfer and storage of spent nuclear fuel (SNF) from wet to dry storage. Such techniques are needed to assist in design of future transfer and storage systems, prediction of the performance of existing and proposed systems and safety (re)evaluation of systems as necessary at later dates. Many fuel element geometries and configurations are accommodated in the storage of spent nuclear fuel. Consequently, there is no one generic fuel element / assembly, storage basket or canister and, therefore, no single generic fuel storage configuration. One can, however, identify generic flow phenomena or processes which may be present during drying or passivation in SNF canisters. The objective of the INEEL tasks was to obtain fundamental measurements of these flow processes in appropriate parameter ranges.

  3. BOOK REVIEW: Evaluating the Measurement Uncertainty: Fundamentals and practical guidance

    Science.gov (United States)

    Lira, Ignacio

    2003-08-01

    on to treat evaluation of expanded uncertainty, joint treatment of several measurands, least-squares adjustment, curve fitting and more. Chapter 6 is devoted to Bayesian inference. Perhaps one can say that Evaluating the Measurement Uncertainty caters to a wider reader-base than the GUM; however, a mathematical or statistical background is still advantageous. Also, this is not a book with a library of worked overall uncertainty evaluations for various measurements; the feel of the book is rather theoretical. The novice will still have some work to do—but this is a good place to start. I think this book is a fitting companion to the GUM because the text complements the GUM, from fundamental principles to more sophisticated measurement situations, and moreover includes intelligent discussion regarding intent and interpretation. Evaluating the Measurement Uncertainty is detailed, and I think most metrologists will really enjoy the detail and care put into this book. Jennifer Decker
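
    As a small worked illustration of the kind of evaluation such texts cover (a generic sketch, not drawn from the book), the GUM law of propagation of uncertainty for an uncorrelated model y = f(x1, ..., xn) combines standard uncertainties as u_c^2(y) = sum_i (df/dx_i)^2 u^2(x_i), and an expanded uncertainty is U = k * u_c with coverage factor k (commonly k = 2):

      import math

      # Propagation of uncertainty for P = V**2 / R with uncorrelated inputs.
      # Partial derivatives: dP/dV = 2V/R, dP/dR = -V**2/R**2. Illustrative values.
      V, u_V = 10.0, 0.05      # volts and standard uncertainty
      R, u_R = 100.0, 0.2      # ohms and standard uncertainty

      P = V ** 2 / R
      u_c = math.sqrt((2 * V / R * u_V) ** 2 + (V ** 2 / R ** 2 * u_R) ** 2)
      U = 2 * u_c              # expanded uncertainty, coverage factor k = 2

      print(f"P = {P:.3f} W, u_c = {u_c:.4f} W, U (k=2) = {U:.4f} W")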

  4. Applied Physics of Carbon Nanotubes Fundamentals of Theory, Optics and Transport Devices

    CERN Document Server

    Rotkin, Slava V

    2005-01-01

    The book describes the state-of-the-art in fundamental, applied and device physics of nanotubes, including fabrication, manipulation and characterization for device applications; optics of nanotubes; transport and electromechanical devices and fundamentals of theory for applications. This information is critical to the field of nanoscience since nanotubes have the potential to become a very significant electronic material for decades to come. The book will benefit all readers interested in the application of nanotubes, either in their theoretical foundations or in newly developed characterization tools that may enable practical device fabrication.

  5. Polarization of electron-positron vacuum by strong magnetic field in theory with fundamental mass

    International Nuclear Information System (INIS)

    Kadyshevskij, V.G.; ); Rodionov, V.N.

    2003-01-01

    The exact Lagrangian function of an intense constant magnetic field, replacing the Heisenberg-Euler Lagrangian of traditional quantum electrodynamics, is calculated within the framework of the theory with a fundamental mass in the one-loop approximation. It is established that the obtained generalization of the Lagrangian function is substantial for arbitrary values of the magnetic field. In the weak-field limit the calculated Lagrangian coincides with the known Heisenberg-Euler formula. In extremely strong fields the dependence of the Lagrangian on the field disappears completely, and in this region it tends to a threshold value determined by the ratio of the fundamental and lepton masses [ru

  6. Two-colour QCD at finite fundamental quark-number density and related theories

    International Nuclear Information System (INIS)

    Hands, S.J.; Kogut, J.B.; Morrison, S.E.; Sinclair, D.K.

    2001-01-01

    We are simulating SU(2) Yang-Mills theory with four flavours of dynamical quarks in the fundamental representation of SU(2) 'colour' at finite chemical potential, μ for quark number, as a model for QCD at finite baryon number density. In particular we observe that for μ large enough this theory undergoes a phase transition to a state with a diquark condensate which breaks quark-number symmetry. In this phase we examine the spectrum of light scalar and pseudoscalar bosons and see evidence for the Goldstone boson associated with this spontaneous symmetry breaking. This theory is closely related to QCD at finite chemical potential for isospin, a theory which we are now studying for SU(3) colour

  7. Two-colour QCD at finite fundamental quark-number density and related theories

    International Nuclear Information System (INIS)

    Hands, S. J.; Kogut, J. B.; Morrison, S. E.; Sinclair, D. K.

    2000-01-01

    We are simulating SU(2) Yang-Mills theory with four flavours of dynamical quarks in the fundamental representation of SU(2) colour at finite chemical potential, μ for quark number, as a model for QCD at finite baryon number density. In particular we observe that for μ large enough this theory undergoes a phase transition to a state with a diquark condensate which breaks quark-number symmetry. In this phase we examine the spectrum of light scalar and pseudoscalar bosons and see evidence for the Goldstone boson associated with this spontaneous symmetry breaking. This theory is closely related to QCD at finite chemical potential for isospin, a theory which we are now studying for SU(3) colour

  8. $SU(2)$ gauge theory with two fundamental flavours: scalar and pseudoscalar spectrum

    CERN Document Server

    Arthur, Rudy; Hietanen, Ari; Pica, Claudio; Sannino, Francesco

    2016-01-01

    We investigate the scalar and pseudoscalar spectrum of the $SU(2)$ gauge theory with $N_f=2$ flavours of fermions in the fundamental representation using non perturbative lattice simulations. We provide first benchmark estimates of the mass of the lightest $0(0^{+})$ ($\sigma$), $0(0^{-})$ ($\eta'$) and $1(0^+)$ ($a_0$) states, including estimates of the relevant disconnected contributions. We find $m_{a_0}/F_{\rm{PS}}= 16.7(4.9)$, $m_\sigma/F_{\rm{PS}}=19.2(10.8)$ and $m_{\eta'}/F_{\rm{PS}} = 12.8(4.7)$. These values for the masses of light scalar states provide crucial information for composite extensions of the Standard Model from the unified Fundamental Composite Higgs-Technicolor theory \cite{Cacciapaglia:2014uja} to models of composite dark matter.

  9. Positivism and Constitutional Post-Positivism: A Debate on Breast Theory of Fundamental Rights

    Directory of Open Access Journals (Sweden)

    Matheus Felipe de Castro

    2016-05-01

    Full Text Available This article, based on the theoretical framework of the philosophy of praxis, discusses the strained relations between power and justice in the enforcement of fundamental rights, drawing a comparison between the theoretical concepts of Hans Kelsen and Robert Alexy. The thoughts of these two authors are compared, emphasizing the central role that power plays in the legal conception of the former as opposed to the theory of justice that animates the legal conceptions of the latter. We discuss how the tension that appears in the theoretical confrontation of the two authors is actually a moment of the real, yet constitutes a dialectical interaction which must be observed and deciphered in the concrete application of the law. The article concludes by seeking to separate what is real from what is ideological in this debate, aiming to deepen the discussion of fundamental rights as the core of the modern structural theory of law.

  10. Introduction to probability and measure theories

    International Nuclear Information System (INIS)

    Partasarati, K.

    1983-01-01

    Chapters on probability and measure theories are presented. The Borel mappings of spaces with measure into each other and into separable metric spaces are studied. The Kolmogorov theorem on the extension of probabilities is derived from the theorem on the extension of measures to projective limits of spaces with measure. The integration theory is developed, and measures on products of spaces are studied. The theory of conditional mathematical expectations via projections in Hilbert space is presented. In conclusion, the theory of weak convergence of measures, elements of the theory of characteristic functions, and the theory of invariant and quasi-invariant measures on groups and homogeneous spaces are given
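
    To illustrate the projection construction mentioned above (a standard fact, not quoted from the book): for a square-integrable random variable $X$ on $(\Omega,\mathcal{F},P)$ and a sub-$\sigma$-algebra $\mathcal{G}\subset\mathcal{F}$, the conditional expectation is the orthogonal projection $$ E[X\mid\mathcal{G}] = \mathrm{proj}_{L^2(\Omega,\mathcal{G},P)}\,X, \qquad E\big[(X - E[X\mid\mathcal{G}])\,Z\big] = 0 \quad \text{for all } Z\in L^2(\Omega,\mathcal{G},P), $$ and the definition extends to integrable $X$ by density.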

  11. Rho meson decay width in SU(2) gauge theories with 2 fundamental flavours

    CERN Document Server

    Janowski, Tadeusz; Pica, Claudio

    2016-01-01

    SU(2) gauge theories with two quark flavours in the fundamental representation are among the most promising theories of composite dynamics describing the electroweak sector. Three out of five Goldstone bosons in these models become the longitudinal components of the W and Z bosons, giving them mass. As in QCD, we expect a spectrum of excitations which appear as resonances in vector boson scattering, in particular the vector resonance corresponding to the rho meson in QCD. In this talk I will present the preliminary results of the first calculation of the rho-meson decay width in this theory, which is analogous to the rho to two-pion decay calculation in QCD. The results presented were calculated in a moving frame with total momentum (0,0,1) on two ensembles. Future plans include using 3 moving frames on a larger set of ensembles to extract the resonance parameters more reliably and also to take the chiral and continuum limits.

  12. Theory of precision electroweak measurements

    International Nuclear Information System (INIS)

    Peskin, M.E.

    1990-03-01

    In these lectures, I will review the theoretical concepts needed to understand the goals and implications of experiments in this new era of weak interactions. I will explain how to compute the most important order-α radiative corrections to weak interaction processes and discuss the physical implications of these correction terms. I hope that this discussion will be useful to those --- experimentalists and theorists --- who will try to interpret the new data that we will soon receive. This paper is organized as follows: I will review the structure of the standard weak interaction model at zeroth order. I will discuss the measurement of the Z^0 boson mass in e^+ e^- annihilation. This measurement is affected by radiative corrections to the form of the Z^0 resonance, and so I will review the theory of the resonance line shape. I will briefly review the modifications of the properties of the Z^0 which would be produced by additional neutral gauge bosons. I will review the theory of the renormalization of weak interaction parameters such as sin^2 θ_W, concentrating especially on the contributions of the top quark and other heavy, undiscovered particles.
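
    For context on the resonance line shape mentioned above, a commonly used schematic parametrization of the Z^0 peak (a textbook Breit-Wigner form with an s-dependent width; not necessarily the exact expression developed in these lectures) is

    $$
    \sigma_{f\bar{f}}(s) \;\simeq\; \sigma^{0}_{f\bar{f}}\,
    \frac{s\,\Gamma_Z^{2}}{\left(s-M_Z^{2}\right)^{2}+s^{2}\Gamma_Z^{2}/M_Z^{2}},
    \qquad
    \sigma^{0}_{f\bar{f}} \;=\; \frac{12\pi}{M_Z^{2}}\,
    \frac{\Gamma_{ee}\,\Gamma_{f\bar{f}}}{\Gamma_Z^{2}} .
    $$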

  13. The theory of confidence-building measures

    International Nuclear Information System (INIS)

    Darilek, R.E.

    1992-01-01

    This paper discusses the theory of Confidence-Building Measures (CBMs) in two ways. First, it employs a top-down, deductively oriented approach to explain CBM theory in terms of the arms control goals and objectives to be achieved, the types of measures to be employed, and the problems or limitations likely to be encountered when applying CBMs to conventional or nuclear forces. The paper as a whole asks how various types of CBMs might function during a political-military escalation from peacetime to a crisis and beyond (i.e. including conflict), as well as how they might operate in a de-escalatory environment. In pursuit of these overarching issues, the second section of the paper raises a fundamental but complicating question: how might the next all-out war actually come about - by unpremeditated escalation resulting from misunderstanding or miscalculation, or by premeditation resulting in a surprise attack? The second section of the paper addresses this question, explores its various implications for CBMs, and suggests the potential contribution of different types of CBMs toward successful resolution of the issues involved.

  14. A fundamental special-relativistic theory valid for all real-valued speeds

    Directory of Open Access Journals (Sweden)

    Vedprakash Sewjathan

    1984-01-01

    Full Text Available This paper constitutes a fundamental rederivation of special relativity based on the c-invariance postulate but independent of the assumption $ds'^2 = \pm ds^2$ (Einstein [1], Kittel et al [2], Recami [3]), the equivalence principle, homogeneity of space-time, isotropy of space, group properties and linearity of space-time transformations, or the coincidence of the origins of inertial space-time frames. The mathematical formalism is simpler than Einstein's [4] and Recami's [3]. Whilst Einstein's subluminal and Recami's superluminal theories are rederived in this paper by further assuming the equivalence principle and "mathematical inverses" [4,3], this paper derives (independently of these assumptions), with physico-mathematical motivation, an alternative singularity-free special-relativistic theory which replaces Einstein's factor $[1/(1-V^2/c^2)]^{1/2}$ and Recami's extended-relativistic factor $[1/(V^2/c^2-1)]^{1/2}$ by $[(1-(V^2/c^2)^n)/(1-V^2/c^2)]^{1/2}$, where n equals the value of $(m(V)/m_0)^2$ as $|V| \to c$. In this theory both Newton's and Einstein's subluminal theories are experimentally valid on account of negligible terms. This theory implies that non-zero rest mass luxons will not be detected as ordinary non-zero rest mass bradyons because of spatial collapse, and that non-zero rest mass tachyons are undetectable because they exist in another cosmos, resulting in a supercosmos of matter, with the possibility of infinitely many such supercosmoses, all moving forward in time. Furthermore this theory is not based on any assumption giving rise to the twin paradox controversy. The paper concludes with a discussion of the implications of this theory for general relativity.

  15. Boolean Approach to Dichotomic Quantum Measurement Theories

    Energy Technology Data Exchange (ETDEWEB)

    Nagata, K. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)]; Nakamura, T. [Keio University, Yokohama (Japan)]; Batle, J. [Universitat de les Illes Balears, Balearic Islands (Spain)]; Abdalla, S. [King Abdulaziz University Jeddah, Jeddah (Saudi Arabia)]; Farouk, A. [Al-Zahra College for Women, Muscat (Egypt)]

    2017-02-15

    Recently, a new measurement theory based on truth values was proposed by Nagata and Nakamura [Int. J. Theor. Phys. 55, 3616 (2016)], that is, a theory where the results of measurements are either 0 or 1. The standard measurement theory accepts a hidden variable model for a single Pauli observable. Hence, we can introduce a classical probability space for the measurement theory in this particular case. Additionally, we discuss in the present contribution the fact that projective measurement theories (the results of which are either +1 or −1) imply the Bell, Kochen, and Specker (BKS) paradox for a single Pauli observable. To justify our assertion, we present the BKS theorem in almost all the two-dimensional states by using a projective measurement theory. As an example, we present the BKS theorem in two dimensions with white noise. Our discussion provides new insight into the quantum measurement problem by using this measurement theory based on the truth values.

  16. Justification for measurement equation: a fundamental issue in theoretical metrology

    Directory of Open Access Journals (Sweden)

    Aleksander V. Prokopov

    2013-11-01

    Full Text Available A review and a critical analysis of the specialized literature on justification for the measurement equation and an estimation of a methodical error (uncertainty) of the measurement result are presented in the paper, and some prospects for solving of the issue are discussed herein.

  17. Justification for measurement equation: a fundamental issue in theoretical metrology

    OpenAIRE

    Aleksander V. Prokopov

    2013-01-01

    A review and a critical analysis of the specialized literature on justification for the measurement equation and an estimation of a methodical error (uncertainty) of the measurement result are presented in the paper, and some prospects for solving of the issue are discussed herein.

  18. Commencement measurements giving fundamental surface tension determinations in tensiometry

    International Nuclear Information System (INIS)

    Carbery, D; Morrin, D; O'Rourke, B; McMillan, N D; O'Neill, M; Riedel, S; Pringuet, P; Smith, S R P

    2011-01-01

    This study provides experimental testing of a ray-tracing model of the tensiotrace that explores the measurement potential of a well-defined optical position in the tensiotrace signal known as the 'commencement'. This point is defined as the first measurable optical coupling in the fiber drophead between source and collector fibers for light injected inside a growing drop. The tensiotrace ray-tracing model is briefly introduced. Empirical relationships of commencement measures from a wide-ranging study are presented. A number of conclusions can be drawn from the successful linking of computer predictions to these experimental relationships.

  19. Minimalist Program and its fundamental improvements in syntactic theory: evidence from Agreement Asymmetry in Standard Arabic

    Directory of Open Access Journals (Sweden)

    Nasser Al-Horais

    2012-11-01

    Full Text Available The Minimalist Program is a major line of inquiry that has been developing inside Generative Grammar since the early nineties, when it was proposed by Chomsky (1993, 1995). At that time, Chomsky (1998: 5) presented the Minimalist Program as a program, not as a theory, but today the Minimalist Program lays out a very specific view of the basis of syntactic grammar that, when compared to other formalisms, is often taken to look very much like a theory. The prime concern of this paper, however, is to provide a comprehensive and accessible introduction to the art of the minimalist approach to the theory of grammar. In this regard, this paper discusses some new ideas articulated recently by Chomsky, which have led to several fundamental improvements in syntactic theory, such as changing the function of movement and the Extended Projection Principle (EPP) feature, or proposing new theories such as Phases and Feature Inheritance. In order to evidence the significance of these fundamental improvements, the current paper provides a minimalist analysis to account for agreement and word-order asymmetry in Standard Arabic. This fresh minimalist account meets the challenges (to the basic tenets of syntactic theory occurred

  20. Measurement of endotoxin. I. Fundamental studies on radioimmunoassay of endotoxin

    Energy Technology Data Exchange (ETDEWEB)

    Kimura, H [Okayama Univ. (Japan). School of Medicine

    1976-08-01

    A method for estimating endotoxin by radioimmunoassay was recently introduced. The present paper describes improvements in the speed and sensitivity of this endotoxin measurement. Antigen was purified from E. coli O111:B4(B) lipopolysaccharide by centrifugation and dialysis. Purified anti-endotoxin antibody was prepared from immunized rabbit serum. A radioimmunoassay system was established with the antigen and antibody. Dextran-coated charcoal was used to separate the antibody-bound antigen from free antigen. Experimental studies were also performed on possible factors related to the antigen-antibody reaction. Accurate measurements on quantities as low as 100 pg/ml (10 ng/ml in the plasma) were performed by the dextran-coated charcoal method, and the reaction time was reduced to 2 hr at 4°C. This new method does not require strict sterilization or aseptic handling, and therefore is quite practical for quantitative measurements of endotoxin.
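
    The record does not specify the calibration model used to convert bound-tracer fractions into concentrations; a common choice for immunoassay standard curves is a four-parameter logistic fit. The sketch below uses entirely hypothetical calibration points and a hypothetical helper estimate_concentration to illustrate that step (Python with NumPy/SciPy assumed).

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, top, bottom, ec50, hill):
    """Four-parameter logistic: fraction of tracer bound vs. endotoxin concentration."""
    return bottom + (top - bottom) / (1.0 + (conc / ec50) ** hill)

# Hypothetical standard curve: concentration in pg/ml, fraction of tracer bound
conc = np.array([50.0, 100.0, 250.0, 500.0, 1000.0, 2500.0])
bound = np.array([0.92, 0.85, 0.68, 0.52, 0.35, 0.18])

params, _ = curve_fit(four_pl, conc, bound, p0=[1.0, 0.0, 500.0, 1.0])

def estimate_concentration(b, top, bottom, ec50, hill):
    """Invert the fitted 4PL curve to read an unknown sample off the standard curve."""
    return ec50 * ((top - b) / (b - bottom)) ** (1.0 / hill)

print(estimate_concentration(0.60, *params))  # estimated pg/ml for a sample with 60% binding
```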

  1. Impact of Neutrino Oscillation Measurements on Theory

    International Nuclear Information System (INIS)

    Murayama, Hitoshi

    2003-01-01

    Neutrino oscillation data had been a big surprise to theorists, and indeed they have ongoing impact on theory. I review what the impact has been, and what measurements will have critical impact on theory in the future.

  2. Fundamental Characteristics For Building Dynamics Obtained From Microtremor Measurements.

    Science.gov (United States)

    Enomoto, T.; Abeki, N.; Kuramochi, D.; Lanuza, A.; Gonzalez, J.; Schmitz, M.; Navarro, M.

    We have been performing international joint research investigations on seismic disaster mitigation in Metro Manila between the Philippines and Japan since 1994, in Caracas between Venezuela and Japan since 1996, and also in Almeria and Granada between Spain and Japan since 1996. We have made microtremor measurements at reinforced concrete (RC) buildings in these cities and evaluated the dynamical characteristics of the RC buildings, namely the natural period and damping factor. Some discussion of the accuracy of microtremor measurement is necessary when evaluating dynamical characteristics, because the phenomena involve only small amplitudes. However, the microtremor measurement method is a simple, low-cost and realistic way to observe and investigate the actual dynamical behaviour and to obtain useful information in many countries. In these international joint research works, the main objective was to obtain useful information on building dynamical characteristics for seismic disaster mitigation. We therefore observed microtremors at the top floor of several kinds of buildings with different conditions, for example location, building type, dimensional scale and number of stories. We then evaluated the natural period and responded damping factor of the buildings as statistical tendencies depending on the building conditions. In this paper we mainly present the regression relationship between the natural period of RC buildings and the number of stories in the Philippines and Venezuela, respectively, and we also summarize the relationship between the natural period of RC buildings and the damping factor, considering the surrounding soil condition. We consider these relations to be reasonable and reliable results in the small-amplitude range evaluated from microtremors.
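
    The regression described above is not given explicitly in the record; a minimal sketch of how such a fit is typically done (hypothetical data, assuming the common one-parameter form T = a*N and a half-power-bandwidth damping estimate) is:

```python
import numpy as np

# Hypothetical microtremor results: number of stories N and measured natural period T (s)
stories = np.array([3, 5, 8, 10, 12, 15, 20], dtype=float)
period_s = np.array([0.18, 0.29, 0.45, 0.55, 0.68, 0.82, 1.05])

# Least-squares fit of the common one-parameter model T = a * N (no intercept)
a = np.sum(stories * period_s) / np.sum(stories ** 2)
print(f"T ~ {a:.3f} * N  (seconds per story)")

def damping_half_power(f_peak, f_lo, f_hi):
    """Damping ratio from the half-power (1/sqrt(2)-amplitude) bandwidth of a spectral peak."""
    return (f_hi - f_lo) / (2.0 * f_peak)
```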

  3. The First Fundamental Theorem of Invariant Theory for the Orthosymplectic Supergroup

    Science.gov (United States)

    Lehrer, G. I.; Zhang, R. B.

    2017-01-01

    We give an elementary and explicit proof of the first fundamental theorem of invariant theory for the orthosymplectic supergroup by generalising the geometric method of Atiyah, Bott and Patodi to the supergroup context. We use methods from super-algebraic geometry to convert invariants of the orthosymplectic supergroup into invariants of the corresponding general linear supergroup on a different space. In this way, super Schur-Weyl-Brauer duality is established between the orthosymplectic supergroup of superdimension (m|2n) and the Brauer algebra with parameter m - 2n. The result may be interpreted either in terms of the group scheme OSp(V) over C, where V is a finite dimensional superspace, or as a statement about the orthosymplectic Lie supergroup over the infinite dimensional Grassmann algebra Λ. We take the latter point of view here, and also state a corresponding theorem for the orthosymplectic Lie superalgebra, which involves an extra invariant generator, the super-Pfaffian.

  4. Fundamentals of gamma-ray measurements and radiometric analyses

    International Nuclear Information System (INIS)

    Hochel, R.C.

    1990-01-01

    There are four primary modes of radioactive decay. All can be measured using various types of detectors and are the basis of many analytical techniques and much of what we know about the nucleus and its structure. Alpha particle emission occurs mostly in heavy nuclei of atomic number, Z, greater than 82 like Po, Ra, Th, and U, etc. Beta particles are simply electrons. They are emitted from the nucleus with a distribution of energies ranging from 0--3 MeV. Gamma-rays are photons with energies ranging from a few keV to 10 MeV or more. They usually follow alpha or beta decay, and depending on their energy, can have considerable range in matter. Neutrons are emitted in fission processes and also from a few of the highly excited fission product nuclei. Fission neutrons typically have energies of 1--2 MeV. Like gamma-rays, they have long ranges. The energies involved in nuclear decay processes are much higher than anything encountered in, say, chemical reactions. They are at the very top of the electromagnetic spectrum -- about a million times more energetic than visible light. As a result, these particles always produce ionization, either directly or indirectly, as they pass through matter. It is this ionization which is the basis of all radiation detectors

  5. Evaluating fundamentals of care: The development of a unit-level quality measurement and improvement programme.

    Science.gov (United States)

    Parr, Jenny M; Bell, Jeanette; Koziol-McLain, Jane

    2018-06-01

    The project aimed to develop a unit-level quality measurement and improvement programme using evidence-based fundamentals of care. Feedback from patients, families, whānau, staff and audit data in 2014 indicated variability in the delivery of fundamental aspects of care such as monitoring, nutrition, pain management and environmental cleanliness at a New Zealand District Health Board. A general inductive approach was used to explore the fundamentals of care and design a measurement and improvement programme, the Patient and Whānau Centred Care Standards (PWCCS), focused on fundamental care. Five phases were used to explore the evidence, and design and test a measurement and improvement framework. Nine identified fundamental elements of care were used to define expected standards of care and develop and test a measurement and improvement framework. Four six-monthly peer reviews have been undertaken since June 2015. Charge Nurse Managers used results to identify quality improvements. Significant improvement was demonstrated overall, in six of the 27 units, in seven of the nine standards and three of the four measures. In all, 89% (n = 24) of units improved their overall result. The PWCCS measurement and improvement framework makes visible nursing fundamentals of care in line with continuous quality improvement to increase quality of care. Delivering fundamentals of care is described by nurses as getting 'back to basics'. Patient and family feedback supports the centrality of fundamentals of care to their hospital experience. Implementing a unit-level fundamentals of care quality measurement and improvement programme clarifies expected standards of care, highlights the contribution of fundamentals of care to quality and provides a mechanism for ongoing improvements. © 2018 John Wiley & Sons Ltd.

  6. Discrete time interval measurement system: fundamentals, resolution and errors in the measurement of angular vibrations

    International Nuclear Information System (INIS)

    Gómez de León, F C; Meroño Pérez, P A

    2010-01-01

    The traditional method for measuring the velocity and the angular vibration in the shaft of rotating machines using incremental encoders is based on counting the pulses at given time intervals. This method is generically called the time interval measurement system (TIMS). A variant of this method that we have developed in this work consists of measuring the corresponding time of each pulse from the encoder and sampling the signal by means of an A/D converter as if it were an analog signal, that is to say, in discrete time. For this reason, we have denominated this method as the discrete time interval measurement system (DTIMS). This measurement system provides a substantial improvement in the precision and frequency resolution compared with the traditional method of counting pulses. In addition, this method permits modification of the width of some pulses in order to obtain a mark-phase on every lap. This paper explains the theoretical fundamentals of the DTIMS and its application for measuring the angular vibrations of rotating machines. It also displays the required relationship between the sampling rate of the signal, the number of pulses of the encoder and the rotating velocity in order to obtain the required resolution and to delimit the methodological errors in the measurement
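
    As a concrete illustration of the DTIMS idea of treating each pulse arrival time as a sample, the sketch below (a minimal, assumption-laden example, not the authors' implementation) converts encoder pulse timestamps into instantaneous angular-velocity samples that can then be Fourier-analysed for angular vibration.

```python
import numpy as np

def angular_velocity_from_pulse_times(pulse_times_s, pulses_per_rev):
    """Instantaneous angular velocity (rad/s) from the arrival time of each encoder pulse.

    Each pulse advances the shaft by a fixed angle 2*pi/pulses_per_rev; dividing that
    angle by the measured time between consecutive pulses gives one velocity sample
    per pulse interval, together with the mid-point time of that interval.
    """
    pulse_times_s = np.asarray(pulse_times_s, dtype=float)
    delta_t = np.diff(pulse_times_s)               # time between consecutive pulses
    delta_theta = 2.0 * np.pi / pulses_per_rev     # angle swept per pulse
    omega = delta_theta / delta_t                  # rad/s
    t_mid = 0.5 * (pulse_times_s[:-1] + pulse_times_s[1:])
    return t_mid, omega

# Hypothetical usage: a 1024-line encoder on a shaft turning at a steady 25 rev/s
times = np.cumsum(np.full(2048, 1.0 / (25.0 * 1024)))
t_mid, omega = angular_velocity_from_pulse_times(times, pulses_per_rev=1024)
print(omega.mean() / (2.0 * np.pi))  # ~25 rev/s
```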

  7. Quantum theory of measurements as quantum decision theory

    International Nuclear Information System (INIS)

    Yukalov, V I; Sornette, D

    2015-01-01

    Theory of quantum measurements is often classified as decision theory. An event in decision theory corresponds to the measurement of an observable. This analogy looks clear for operationally testable simple events. However, the situation is essentially more complicated in the case of composite events. The most difficult point is the relation between decisions under uncertainty and measurements under uncertainty. We suggest a unified language for describing the processes of quantum decision making and quantum measurements. The notion of quantum measurements under uncertainty is introduced. We show that the correct mathematical foundation for the theory of measurements under uncertainty, as well as for quantum decision theory dealing with uncertain events, requires the use of positive operator-valued measure that is a generalization of projection-valued measure. The latter is appropriate for operationally testable events, while the former is necessary for characterizing operationally uncertain events. In both decision making and quantum measurements, one has to distinguish composite nonentangled events from composite entangled events. Quantum probability can be essentially different from classical probability only for entangled events. The necessary condition for the appearance of an interference term in the quantum probability is the occurrence of entangled prospects and the existence of an entangled strategic state of a decision maker or of an entangled statistical state of a measuring device
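
    The distinction drawn above between projection-valued measures (operationally testable events) and positive operator-valued measures (operationally uncertain events) can be made concrete with a small numerical check; the sketch below is illustrative only and assumes nothing beyond the standard quantum formalism.

```python
import numpy as np

def is_povm(elements, tol=1e-10):
    """Check that a set of operators is a POVM: positive semidefinite and summing to the identity."""
    dim = elements[0].shape[0]
    complete = np.allclose(sum(elements), np.eye(dim), atol=tol)
    positive = all(np.linalg.eigvalsh(0.5 * (E + E.conj().T)).min() > -tol for E in elements)
    return complete and positive

def outcome_probabilities(rho, elements):
    """Born-rule probabilities p_k = Tr(rho E_k) for a state rho and measurement elements E_k."""
    return [float(np.real(np.trace(rho @ E))) for E in elements]

# Projection-valued measure for a qubit observable (an operationally testable event) ...
P0 = np.array([[1, 0], [0, 0]], dtype=complex)
P1 = np.array([[0, 0], [0, 1]], dtype=complex)

# ... and an unsharp POVM obtained by mixing it with white noise (not projective)
eta = 0.8
E0 = eta * P0 + (1.0 - eta) * 0.5 * np.eye(2)
E1 = eta * P1 + (1.0 - eta) * 0.5 * np.eye(2)

rho = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # the |+> state
print(is_povm([P0, P1]), is_povm([E0, E1]))   # True True
print(outcome_probabilities(rho, [E0, E1]))   # [0.5, 0.5]
```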

  8. Interpreting doubly special relativity as a modified theory of measurement

    International Nuclear Information System (INIS)

    Liberati, Stefano; Sonego, Sebastiano; Visser, Matt

    2005-01-01

    In this article we develop a physical interpretation for the deformed (doubly) special relativity theories (DSRs), based on a modification of the theory of measurement in special relativity. We suggest that it is useful to regard the DSRs as reflecting the manner in which quantum gravity effects induce Planck-suppressed distortions in the measurement of the 'true' energy and momentum. This interpretation provides a framework for the DSRs that is manifestly consistent, nontrivial, and in principle falsifiable. However, it does so at the cost of demoting such theories from the level of fundamental physics to the level of phenomenological models - models that should in principle be derivable from whatever theory of quantum gravity one ultimately chooses to adopt

  9. String theory and fundamental interactions. Gabriele Veneziano and theoretical physics - Historical and contemporary perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Gasperini, M. [Bari Univ. (Italy). Dipt. di Fisica]; Maharana, J. (eds.) [Institute of Physics, Orissa (India)]

    2008-07-01

    This volume, dedicated to Prof. Gabriele Veneziano on the occasion of his retirement from CERN, starts as a broad historico-scientific study on the work on string theory and nonperturbative QCD that has been pioneered by Prof. Veneziano in the late 60s and early 70s. It goes on to examine the many ramifications this and similar early work has spawned over the past decades and the reader will find state-of-the-art tutorial reviews on string cosmology, string dualities and symmetries, and much more. The book includes a concise updated scientific biography of, and an interview with, Prof. Veneziano, in which he relates his personal views about the present and future of fundamental physics. This is followed by the commented draft of an unpublished paper of 1973 of his, anticipating interesting results which were rediscovered and published more than a decade later. Overall, this volume is a vast and unique canvas where the re-examination of older and the presentation of newer results and insights are skillfully mixed with personal recollections of the contributing authors, most of them involved in the early days of string and quantum field theory, about Prof. Veneziano and the many interrelated topics considered. (orig.)

  10. Derivation of binding energies on the basis of fundamental nuclear theory

    International Nuclear Information System (INIS)

    Kouki, Tuomo.

    1975-10-01

    An attempt to assess the degree of consistency between the underlying ideas of two different approaches to nuclear energy relations is described. The fundamental approach in the form of density dependent Hartree-Fock theory, as well as the method of renormalizing shell model energies, have both met with fair success. Whereas the former method is based on nuclear matter theory, the latter's central idea is to combine shell structure with an average liquid drop behaviour. The shell smoothing procedure employed there has been subject to intense theoretical study. Little attention has been paid to the liquid drop aspect of the method. It is proposed to derive the liquid drop mass formula by means of a model force fitted to results of some nuclear matter calculations. Moreover, the force is tested by applying it to finite nuclei. Because of this, the present work could also be regarded as an attempt to find a very direct way of relating nuclear matter properties to those of finite nuclei. As the results in this respect are worse than expected, we conclude with a discussion of possible directions of improvement. (author)

  11. String theory and fundamental interactions. Gabriele Veneziano and theoretical physics - Historical and contemporary perspectives

    International Nuclear Information System (INIS)

    Gasperini, M.

    2008-01-01

    This volume, dedicated to Prof. Gabriele Veneziano on the occasion of his retirement from CERN, starts as a broad historico-scientific study on the work on string theory and nonperturbative QCD that has been pioneered by Prof. Veneziano in the late 60s and early 70s. It goes on to examine the many ramifications this and similar early work has spawned over the past decades and the reader will find state-of-the-art tutorial reviews on string cosmology, string dualities and symmetries, and much more. The book includes a concise updated scientific biography of, and an interview with, Prof. Veneziano, in which he relates his personal views about the present and future of fundamental physics. This is followed by the commented draft of an unpublished paper of 1973 of his, anticipating interesting results which were rediscovered and published more than a decade later. Overall, this volume is a vast and unique canvas where the re-examination of older and the presentation of newer results and insights are skillfully mixed with personal recollections of the contributing authors, most of them involved in the early days of string and quantum field theory, about Prof. Veneziano and the many interrelated topics considered. (orig.)

  12. ANALYSIS OF PUBLIC COURT-ORDERED-DEBT DISCLOSURE: INFLUENCE OF LEGISLATION AND FUNDAMENTALS OF ACCOUNTING THEORY

    Directory of Open Access Journals (Sweden)

    Lucas Oliveira Gomes Ferreira

    2012-03-01

    Full Text Available The purpose of the present study is to analyze the accounting disclosure of judicial payment warrants (precatórios, issued when governmental entities are found liable for pecuniary awards in lawsuits) according to accounting theory, and to verify whether the current legislation interferes in the accounting treatment of these instruments. In this sense, we performed a documental and literature review about the legal framework and accounting procedures adopted, as well gathered data from the National Treasury Secretariat Data Collection System (SISTN) in the period 2004-2009 and consulted a study carried out by the Supreme Court (STF) in 2004. The study's justification is based on the perception that more than half of judicial payment warrants are not registered in the public accounts. Consequently, whereas these warrants represent (i) vested rights of the plaintiffs and (ii) debts of the public entity, the lack of accounting disclosure jeopardizes both the beneficiary, whose right is not reflected in the public accounts, thus casting doubt on the expectation to receive payment, and government managers and society, who do not have reliable information that allows effective management. The innovation of this paper consists of discussing identification of the appropriate moment of the generating event of the underlying debts and the proposal of disclosure considering the risk classification. In conclusion, the influence of the current legislation and the failure to observe accounting fundamentals are among the likely factors that have affected the proper accounting of judicial payment warrants within the Brazilian public administration.

  13. Fundamental Theories and Key Technologies for Smart and Optimal Manufacturing in the Process Industry

    Directory of Open Access Journals (Sweden)

    Feng Qian

    2017-04-01

    Full Text Available Given the significant requirements for transforming and promoting the process industry, we present the major limitations of current petrochemical enterprises, including limitations in decision-making, production operation, efficiency and security, information integration, and so forth. To promote a vision of the process industry with efficient, green, and smart production, modern information technology should be utilized throughout the entire optimization process for production, management, and marketing. To focus on smart equipment in manufacturing processes, as well as on the adaptive intelligent optimization of the manufacturing process, operating mode, and supply chain management, we put forward several key scientific problems in engineering in a demand-driven and application-oriented manner, namely: ① intelligent sensing and integration of all process information, including production and management information; ② collaborative decision-making in the supply chain, industry chain, and value chain, driven by knowledge; ③ cooperative control and optimization of plant-wide production processes via human-cyber-physical interaction; and ④ life-cycle assessments for safety and environmental footprint monitoring, in addition to tracing analysis and risk control. In order to solve these limitations and core scientific problems, we further present fundamental theories and key technologies for smart and optimal manufacturing in the process industry. Although this paper discusses the process industry in China, the conclusions in this paper can be extended to the process industry around the world.

  14. Quantum measure and integration theory

    International Nuclear Information System (INIS)

    Gudder, Stan

    2009-01-01

    This article begins with a review of quantum measure spaces. Quantum forms and indefinite inner-product spaces are then discussed. The main part of the paper introduces a quantum integral and derives some of its properties. The quantum integral's form for simple functions is characterized and it is shown that the quantum integral generalizes the Lebesgue integral. A bounded, monotone convergence theorem for quantum integrals is obtained and it is shown that a Radon-Nikodym-type theorem does not hold for quantum measures. As an example, a quantum-Lebesgue integral on the real line is considered.
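
    Gudder's quantum integral itself requires the quantum-measure machinery developed in the paper, but the classical object it generalizes, the Lebesgue integral of a simple function, is easy to sketch on a finite measure space (a toy illustration, not part of the paper):

```python
from collections import defaultdict

def lebesgue_integral_simple(values, measure):
    """Classical Lebesgue integral of a simple function f = sum_i a_i * 1_{A_i}.

    `values` maps each point of a finite sample space to the value of f there and
    `measure` maps each point to its measure; points are grouped by value (the
    level sets A_i) and the integral is sum_i a_i * mu(A_i).
    """
    level_set_measure = defaultdict(float)
    for point, a in values.items():
        level_set_measure[a] += measure[point]
    return sum(a * mu for a, mu in level_set_measure.items())

# Toy example on a four-point space: integral = 2.0*0.4 + 5.0*0.2 + 0.0*0.4 = 1.8
values = {"w1": 2.0, "w2": 2.0, "w3": 5.0, "w4": 0.0}
measure = {"w1": 0.1, "w2": 0.3, "w3": 0.2, "w4": 0.4}
print(lebesgue_integral_simple(values, measure))
```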

  15. Experimental measurements of competition between fundamental and second harmonic emission in a quasi-optical gyrotron

    International Nuclear Information System (INIS)

    Alberti, S.; Pedrozzi, M.; Tran, M.Q.; Hogge, J.P.; Tran, T.M.; Muggli, P.; Joedicke, B.; Mathews, H.G.

    1990-04-01

    A quasi-optical gyrotron (QOG) designed for operation at the fundamental (Ω_ce ≅ 100 GHz) exhibits simultaneous emission at Ω_ce and 2Ω_ce (second harmonic). For a beam current of 4 A, 20% of the total RF power is emitted at the second harmonic. The experimental measurements show that the excitation of the second harmonic is only possible when the fundamental is present. The frequency of the second harmonic is locked by the frequency of the fundamental. Experimental evidence shows that when the second harmonic is not excited, total efficiency is enhanced. (author) 6 refs., 5 figs., 1 tab

  16. Quantum decision theory as quantum theory of measurement

    International Nuclear Information System (INIS)

    Yukalov, V.I.; Sornette, D.

    2008-01-01

    We present a general theory of quantum information processing devices, that can be applied to human decision makers, to atomic multimode registers, or to molecular high-spin registers. Our quantum decision theory is a generalization of the quantum theory of measurement, endowed with an action ring, a prospect lattice and a probability operator measure. The algebra of probability operators plays the role of the algebra of local observables. Because of the composite nature of prospects and of the entangling properties of the probability operators, quantum interference terms appear, which make actions noncommutative and the prospect probabilities nonadditive. The theory provides the basis for explaining a variety of paradoxes typical of the application of classical utility theory to real human decision making. The principal advantage of our approach is that it is formulated as a self-consistent mathematical theory, which allows us to explain not just one effect but actually all known paradoxes in human decision making. Being general, the approach can serve as a tool for characterizing quantum information processing by means of atomic, molecular, and condensed-matter systems

  17. Accurate Estimation of Low Fundamental Frequencies from Real-Valued Measurements

    DEFF Research Database (Denmark)

    Christensen, Mads Græsbøll

    2013-01-01

    In this paper, the difficult problem of estimating low fundamental frequencies from real-valued measurements is addressed. The methods commonly employed do not take the phenomena encountered in this scenario into account and thus fail to deliver accurate estimates. The reason for this is that they employ asymptotic approximations that are violated when the harmonics are not well-separated in frequency, something that happens when the observed signal is real-valued and the fundamental frequency is low. To mitigate this, we analyze the problem and present some exact fundamental frequency estimators
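
    The exact estimators proposed in the paper are not reproduced in this record; for orientation, the standard approximate (asymptotic) approach that the paper improves upon, harmonic summation over a periodogram, can be sketched as follows (hypothetical signal, NumPy assumed). As the abstract notes, this kind of estimator degrades precisely when the signal is real-valued and the fundamental frequency is low.

```python
import numpy as np

def harmonic_summation_f0(x, fs, f0_grid, num_harmonics):
    """Approximate fundamental-frequency estimate by harmonic summation.

    For each candidate f0 the periodogram power at its first `num_harmonics`
    multiples is summed; the candidate with the largest sum is returned.
    """
    n = len(x)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    scores = []
    for f0 in f0_grid:
        idx = [int(np.argmin(np.abs(freqs - k * f0))) for k in range(1, num_harmonics + 1)]
        scores.append(power[idx].sum())
    return f0_grid[int(np.argmax(scores))]

# Hypothetical usage: a 100 Hz tone with three harmonics sampled at 8 kHz
fs = 8000.0
t = np.arange(4096) / fs
x = (np.sin(2 * np.pi * 100 * t)
     + 0.5 * np.sin(2 * np.pi * 200 * t)
     + 0.25 * np.sin(2 * np.pi * 300 * t))
print(harmonic_summation_f0(x, fs, np.arange(60.0, 400.0, 1.0), num_harmonics=3))  # ~100 Hz
```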

  18. Foams theory, measurements, and applications

    CERN Document Server

    Khan, Saad A

    1996-01-01

    This volume discusses the physics and physical processes of foam and foaming. It delineates various measurement techniques for characterizing foams and foam properties as well as the chemistry and application of foams. The use of foams in the textile industry, personal care products, enhanced oil recovery, firefighting and mineral floatation are highlighted, and the connection between the microstructure and physical properties of foam are detailed. Coverage includes nonaqueous foams and silicone antifoams, and more.

  19. The inductively coupled plasma as a source for the measurement of fundamental spectroscopic constants

    International Nuclear Information System (INIS)

    Farnsworth, P.B.

    1993-01-01

    Inductively coupled plasmas (ICPs) are stable, robust sources for the generation of spectra from neutral and singly ionized atoms. They are used extensively for analytical spectrometry, but have seen limited use for the measurement of fundamental spectroscopic constants. Several properties of the ICP affect its suitability for such fundamental measurements. They include: spatial structure, spectral background, noise characteristics, electron densities and temperatures, and the state of equilibrium in the plasma. These properties are particularly sensitive to the means by which foreign atoms are introduced into the plasma. With some departures from the operating procedures normally used in analytical measurements, the ICP promises to be a useful source for the measurement of fundamental atomic constants. (orig.)

  20. Immersed in media telepresence theory, measurement & technology

    CERN Document Server

    Lombard, Matthew; Freeman, Jonathan; IJsselsteijn, Wijnand; Schaevitz, Rachel J

    2015-01-01

    Highlights key research currently being undertaken within the field of telepresence, providing the most detailed account of the field to date, advancing our understanding of a fundamental property of all media - the illusion of presence; the sense of "being there" inside a virtual environment, with actual or virtual others. This collection has been put together by leading international scholars from America, Europe, and Asia. Together, they describe the state-of-the-art in presence theory, research and technology design for an advanced academic audience. Immersed in Media provides research t

  1. Introduction to Measure Theory and Integration

    CERN Document Server

    Ambrosio, Luigi; Mennucci, Andrea

    2011-01-01

    This textbook collects the notes for an introductory course in measure theory and integration. The course was taught by the authors to undergraduate students of the Scuola Normale Superiore, in the years 2000-2011. The goal of the course was to present, in a quick but rigorous way, the modern point of view on measure theory and integration, putting Lebesgue's Euclidean space theory into a more general context and presenting the basic applications to Fourier series, calculus and real analysis. The text can also pave the way to more advanced courses in probability, stochastic processes or geomet

  2. Theory of Bessel Functions of High Rank - I: Fundamental Bessel Functions

    OpenAIRE

    Qi, Zhi

    2014-01-01

    In this article we introduce a new category of special functions called fundamental Bessel functions arising from the Voronoi summation formula for $\mathrm{GL}_n (\mathbb{R})$. The fundamental Bessel functions of rank one and two are the oscillatory exponential functions $e^{\pm i x}$ and the classical Bessel functions respectively. The main implements and subjects of our study of fundamental Bessel functions are their formal integral representations and Bessel equations.

  3. Measure theory and fine properties of functions

    CERN Document Server

    Evans, Lawrence Craig

    2015-01-01

    Measure Theory and Fine Properties of Functions, Revised Edition provides a detailed examination of the central assertions of measure theory in n-dimensional Euclidean space. The book emphasizes the roles of Hausdorff measure and capacity in characterizing the fine properties of sets and functions. Topics covered include a quick review of abstract measure theory, theorems and differentiation in ℝn, Hausdorff measures, area and coarea formulas for Lipschitz mappings and related change-of-variable formulas, and Sobolev functions as well as functions of bounded variation.The text provides complete proofs of many key results omitted from other books, including Besicovitch's covering theorem, Rademacher's theorem (on the differentiability a.e. of Lipschitz functions), area and coarea formulas, the precise structure of Sobolev and BV functions, the precise structure of sets of finite perimeter, and Aleksandrov's theorem (on the twice differentiability a.e. of convex functions).This revised edition includes countl...

  4. Measuring uncertainty within the theory of evidence

    CERN Document Server

    Salicone, Simona

    2018-01-01

    This monograph considers the evaluation and expression of measurement uncertainty within the mathematical framework of the Theory of Evidence. With a new perspective on the metrology science, the text paves the way for innovative applications in a wide range of areas. Building on Simona Salicone’s Measurement Uncertainty: An Approach via the Mathematical Theory of Evidence, the material covers further developments of the Random Fuzzy Variable (RFV) approach to uncertainty and provides a more robust mathematical and metrological background to the combination of measurement results that leads to a more effective RFV combination method. While the first part of the book introduces measurement uncertainty, the Theory of Evidence, and fuzzy sets, the following parts bring together these concepts and derive an effective methodology for the evaluation and expression of measurement uncertainty. A supplementary downloadable program allows the readers to interact with the proposed approach by generating and combining ...

  5. Quantum measurement and algebraic quantum field theories

    International Nuclear Information System (INIS)

    DeFacio, B.

    1976-01-01

    It is shown that the physics and semantics of quantum measurement provide a natural interpretation of the weak neighborhoods of the states on observable algebras without invoking any ideas of ''a reading error'' or ''a measured range.'' Then the state preparation process in quantum measurement theory is shown to give the normal (or locally normal) states on the observable algebra. Some remarks are made concerning the physical implications of normal state for systems with an infinite number of degrees of freedom, including questions on open and closed algebraic theories

  6. Is education a fundamental right? People's lay theories about intellectual potential drive their positions on education

    OpenAIRE

    Savani, K; Rattan, A; Dweck, C S

    2017-01-01

    Does every child have a fundamental right to receive a high quality education? We propose that people’s beliefs about whether “nearly everyone” or “only some people” have high intellectual potential drive their positions on education. Three studies found that the more people believed that nearly everyone has high potential, the more they viewed education as a fundamental human right. Further, people who viewed education as a fundamental right, in turn, (1) were more likely to support the inst...

  7. Measurement theory and the Schroedinger equation

    International Nuclear Information System (INIS)

    Schwarz, A.S.; Tyupkin, Yu.S.

    1987-01-01

    The paper is an analysis of the measuring process in quantum mechanics based on the Schroedinger equation. The arguments employed use an assumption reflecting, to some extent, the statistical properties of the vacuum. A description is given of the cases in which different incoherent superpositions of pure states in quantum mechanics are physically equivalent. The fundamental difference between quantum and classical mechanics as explained by the existence of unobservable variables is discussed. (U.K.)

  8. The Quest for a Fundamental Theory of Physics - Rise and Demise of the Field Paradigm

    NARCIS (Netherlands)

    Holman, M.

    2014-01-01

    Quite remarkably, the two physical theories that describe extremely well physical phenomena on the largest and smallest distance scales in our universe, viz. general relativity and quantum theory, respectively, are radically disparate. Both theories are now almost a century old and have passed with

  9. Contiguity and quantum theory of measurement

    Energy Technology Data Exchange (ETDEWEB)

    Green, H.S. [Adelaide Univ., SA (Australia). Dept. of Mathematical Physics]|[Adelaide Univ., SA (Australia). Dept. of Physics

    1995-12-31

    This paper presents a comprehensive treatment of the problem of measurement in microscopic physics, consistent with the indeterministic Copenhagen interpretation of quantum mechanics and information theory. It is pointed out that there are serious difficulties in reconciling the deterministic interpretations of quantum mechanics, based on the concepts of a universal wave function or hidden variables, with the principle of contiguity. Quantum mechanics is reformulated entirely in terms of observables, represented by matrices, including the statistical matrix, and the utility of information theory is illustrated by a discussion of the EPR paradox. The principle of contiguity is satisfied by all conserved quantities. A theory of the operation of macroscopic measuring devices is given in the interaction representation, and the attenuation of the indeterminacy of a microscopic observable in the process of measurement is related to observable changes of entropy. 28 refs.

  10. Contiguity and quantum theory of measurement

    International Nuclear Information System (INIS)

    Green, H.S.; Adelaide Univ., SA

    1995-01-01

    This paper presents a comprehensive treatment of the problem of measurement in microscopic physics, consistent with the indeterministic Copenhagen interpretation of quantum mechanics and information theory. It is pointed out that there are serious difficulties in reconciling the deterministic interpretations of quantum mechanics, based on the concepts of a universal wave function or hidden variables, with the principle of contiguity. Quantum mechanics is reformulated entirely in terms of observables, represented by matrices, including the statistical matrix, and the utility of information theory is illustrated by a discussion of the EPR paradox. The principle of contiguity is satisfied by all conserved quantities. A theory of the operation of macroscopic measuring devices is given in the interaction representation, and the attenuation of the indeterminacy of a microscopic observable in the process of measurement is related to observable changes of entropy. 28 refs

  11. Ocean Ambient Noise Measurement and Theory

    CERN Document Server

    Carey, William M

    2011-01-01

    This book develops the theory of ocean ambient noise mechanisms and measurements, and also describes general noise characteristics and computational methods.  It concisely summarizes the vast ambient noise literature using theory combined with key representative results.  The air-sea boundary interaction zone is described in terms of non-dimensional variables requisite for future experiments.  Noise field coherency, rare directional measurements, and unique basin scale computations and methods are presented.  The use of satellite measurements in these basin scale models is demonstrated.  Finally, this book provides a series of appendices giving in-depth mathematical treatments.  With its complete and careful discussions of both theory and experimental results, this book will be of the greatest interest to graduate students and active researchers working in fields related to ambient noise in the ocean.

  12. Ontic structural realism and quantum field theory: Are there intrinsic properties at the most fundamental level of reality?

    Science.gov (United States)

    Berghofer, Philipp

    2018-05-01

    Ontic structural realism refers to the novel, exciting, and widely discussed basic idea that the structure of physical reality is genuinely relational. In its radical form, the doctrine claims that there are, in fact, no objects but only structure, i.e., relations. More moderate approaches state that objects have only relational but no intrinsic properties. In its most moderate and most tenable form, ontic structural realism assumes that at the most fundamental level of physical reality there are only relational properties. This means that the most fundamental objects only possess relational but no non-reducible intrinsic properties. The present paper will argue that our currently best physics refutes even this most moderate form of ontic structural realism. More precisely, I will claim that 1) according to quantum field theory, the most fundamental objects of matter are quantum fields and not particles, and show that 2) according to the Standard Model, quantum fields have intrinsic non-relational properties.

  13. Wilson loops in superconformal Chern-Simons theory and fundamental strings in Anti-de Sitter supergravity dual

    International Nuclear Information System (INIS)

    Rey, Soo-Jong; Suyama, Takao; Yamaguchi, Satoshi

    2009-01-01

    We study Wilson loop operators in three-dimensional, N = 6 superconformal Chern-Simons theory dual to IIA superstring theory on AdS_4 x CP^3. Novelty of Wilson loop operators in this theory is that, for a given contour, there are two linear combinations of Wilson loop transforming oppositely under time-reversal transformation. We show that one combination is holographically dual to IIA fundamental string, while orthogonal combination is set to zero. We gather supporting evidences from detailed comparative study of generalized time-reversal transformations in both D2-brane worldvolume and ABJM theories. We then classify supersymmetric Wilson loops and find at most 1/6 supersymmetry. We next study Wilson loop expectation value in planar perturbation theory. For circular Wilson loop, we find features remarkably parallel to circular Wilson loop in N = 4 super Yang-Mills theory in four dimensions. First, all odd loop diagrams vanish identically and even loops contribute nontrivial contributions. Second, quantum corrected gauge and scalar propagators take the same form as those of N = 4 super Yang-Mills theory. Combining these results, we propose that expectation value of circular Wilson loop is given by Wilson loop expectation value in pure Chern-Simons theory times zero-dimensional Gaussian matrix model whose variance is specified by an interpolating function of 't Hooft coupling. We suggest the function interpolates smoothly between weak and strong coupling regime, offering new test ground of the AdS/CFT correspondence.

  14. Field algebras in quantum theory with indefinite metric. III. Spectrum of modular operator and Tomita's fundamental theorem

    International Nuclear Information System (INIS)

    Dadashyan, K.Yu.; Khoruzhii, S.S.

    1987-01-01

    The construction of a modular theory for weakly closed J-involutive algebras of bounded operators on Pontryagin spaces is continued. The spectrum of the modular operator Δ of such an algebra is investigated, the existence of a strongly continuous J-unitary group is established and, under the condition that the spectrum lies in the right half-plane, Tomita's fundamental theorem is proved

  15. Is Education a Fundamental Right? People's Lay Theories About Intellectual Potential Drive Their Positions on Education.

    Science.gov (United States)

    Savani, Krishna; Rattan, Aneeta; Dweck, Carol S

    2017-09-01

    Does every child have a fundamental right to receive a high-quality education? We propose that people's beliefs about whether "nearly everyone" or "only some people" have high intellectual potential drive their positions on education. Three studies found that the more people believed that nearly everyone has high potential, the more they viewed education as a fundamental human right. Furthermore, people who viewed education as a fundamental right, in turn (a) were more likely to support the institution of free public education, (b) were more concerned upon learning that students in the country were not performing well academically compared with students in peer nations, and (c) were more likely to support redistributing educational funds more equitably across wealthier and poorer school districts. The studies show that people's beliefs about intellectual potential can influence their positions on education, which can affect the future quality of life for countless students.

  16. Examining Teacher Grades Using Rasch Measurement Theory

    Science.gov (United States)

    Randall, Jennifer; Engelhard, George, Jr.

    2009-01-01

    In this study, we present an approach to questionnaire design within educational research based on Guttman's mapping sentences and Many-Facet Rasch Measurement Theory. We designed a 54-item questionnaire using Guttman's mapping sentences to examine the grading practices of teachers. Each item in the questionnaire represented a unique student…
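
    The record does not give the model equations; for orientation, the dichotomous form of the many-facet Rasch model used in this kind of analysis treats the log-odds of a correct or endorsed response as ability minus item difficulty minus rater severity. A minimal sketch with hypothetical parameter values:

```python
import numpy as np

def mfrm_probability(person_ability, item_difficulty, rater_severity):
    """Probability of success under a dichotomous many-facet Rasch model.

    All parameters are on the same logit scale; additional facets (e.g. task)
    would simply add further terms to the linear predictor.
    """
    logit = person_ability - item_difficulty - rater_severity
    return 1.0 / (1.0 + np.exp(-logit))

# A lenient versus a severe rater judging the same person on the same item
print(mfrm_probability(1.0, 0.5, -0.5))  # lenient rater -> higher expected score
print(mfrm_probability(1.0, 0.5, 0.8))   # severe rater  -> lower expected score
```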

  17. Geometric Measure Theory and Minimal Surfaces

    CERN Document Server

    Bombieri, Enrico

    2011-01-01

    W.K. ALLARD: On the first variation of area and generalized mean curvature.- F.J. ALMGREN Jr.: Geometric measure theory and elliptic variational problems.- E. GIUSTI: Minimal surfaces with obstacles.- J. GUCKENHEIMER: Singularities in soap-bubble-like and soap-film-like surfaces.- D. KINDERLEHRER: The analyticity of the coincidence set in variational inequalities.- M. MIRANDA: Boundaries of Caccioppoli sets in the calculus of variations.- L. PICCININI: De Giorgi's measure and thin obstacles.

  18. Fundamentals of legal argumentation : A survey of theories on the justification of legal decisions

    NARCIS (Netherlands)

    Feteris, E.T.

    2017-01-01

    This book is an updated and revised edition of Fundamentals of Legal Argumentation published in 1999. It discusses new developments that have taken place in the past 15 years in research of legal argumentation, legal justification and legal interpretation, as well as the implications of these new

  19. Fundamental Flaws In The Derivation Of Stevens' Law For Taste Within Norwich's Entropy Theory of Perception

    International Nuclear Information System (INIS)

    Nizami, Lance

    2010-01-01

    Norwich's Entropy Theory of Perception (1975-present) is a general theory of perception, based on Shannon's Information Theory. Among many bold claims, the Entropy Theory presents a truly astounding result: that Stevens' Law with an Index of 1, an empirical power relation of direct proportionality between perceived taste intensity and stimulus concentration, arises from theory alone. Norwich's theorizing starts with several extraordinary hypotheses. First, 'multiple, parallel receptor-neuron units' without collaterals 'carry essentially the same message to the brain', i.e. the rate-level curves are identical. Second, sensation is proportional to firing rate. Third, firing rate is proportional to the taste receptor's 'resolvable uncertainty'. Fourth, the 'resolvable uncertainty' is obtained from Shannon's Information Theory. Finally, 'resolvable uncertainty' also depends upon the microscopic thermodynamic density fluctuation of the tasted solute. Norwich proves that density fluctuation is density variance, which is proportional to solute concentration, all based on the theory of fluctuations in fluid composition from Tolman's classic physics text, 'The Principles of Statistical Mechanics'. Altogether, according to Norwich, perceived taste intensity is theoretically proportional to solute concentration. Such a universal rule for taste, one that is independent of solute identity, personal physiological differences, and psychophysical task, is truly remarkable and is well-deserving of scrutiny. Norwich's crucial step was the derivation of density variance. That step was meticulously reconstructed here. It transpires that the appropriate fluctuation is Tolman's mean-square fractional density fluctuation, not density variance as used by Norwich. Tolman's algebra yields a 'Stevens Index' of -1 rather than 1. As 'Stevens Index' empirically always exceeds zero, the Index of -1 suggests that it is risky to infer psychophysical laws of sensory response from information theory
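
    Schematically, the algebra at issue can be summarized as follows (a sketch under ideal-solution assumptions, not a reproduction of either author's full derivation): Stevens' law and the two candidate fluctuation measures scale with solute concentration C as

    $$
    \psi = k\,\phi^{\,n}, \qquad
    \operatorname{Var}(N) \;\propto\; \langle N\rangle \;\propto\; C, \qquad
    \left\langle\left(\frac{\Delta N}{N}\right)^{2}\right\rangle \;\propto\; \frac{1}{\langle N\rangle} \;\propto\; \frac{1}{C},
    $$

    so a response built on the density variance yields an index of +1, while one built on Tolman's mean-square fractional density fluctuation yields an index of -1, which is the discrepancy the abstract describes.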

  20. Two-ion theory of energy coupling in ATP synthesis rectifies a fundamental flaw in the governing equations of the chemiosmotic theory.

    Science.gov (United States)

    Nath, Sunil

    2017-11-01

    The vital coupled processes of oxidative phosphorylation and photosynthetic phosphorylation synthesize molecules of adenosine-5'-triphosphate (ATP), the universal biological energy currency, and sustain all life on our planet. The chemiosmotic theory of energy coupling in oxidative and photophosphorylation was proposed by Mitchell >50 years ago. It has had a contentious history, with part of the accumulated body of experimental evidence supporting it, and part of it in conflict with the theory. Although the theory was strongly criticized by many prominent scientists, the controversy has never been resolved. Here, the mathematical steps of Mitchell's original derivation leading to the principal equation of the chemiosmotic theory are scrutinized, and a fundamental flaw in them has been identified. Surprisingly, this flaw had not been detected earlier. Discovery of such a defect negates, or at least considerably weakens, the theoretical foundations on which the chemiosmotic theory is based. Ad hoc or simplistic ways to remedy this defect are shown to be scientifically unproductive and sterile. A novel two-ion theory of biological energy coupling salvages the situation by rectifying the fundamental flaw in the chemiosmotic theory, and the governing equations of the new theory have been shown to accurately quantify and predict extensive recent experimental data on ATP synthesis by F1FO-ATP synthase without using adjustable parameters. Some major biological implications arising from the new thinking are discussed. The principles of energy transduction and coupling proposed in the new paradigm are shown to be of a very general and universal nature. It is concluded that the timely availability after a 25-year research struggle of Nath's torsional mechanism of energy transduction and ATP synthesis is a rational alternative that has the power to solve the problems arising from the past, and also meet present and future challenges in this important interdisciplinary field.
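
    For context only (the textbook form of Mitchell's relation that the paper criticizes; the corrected two-ion equations of the new theory are not reproduced here), the proton-motive force and the usual coupling condition for synthesis of one ATP by translocation of n protons are commonly written as

    $$
    \Delta p \;=\; \Delta\psi \;-\; \frac{2.303\,RT}{F}\,\Delta\mathrm{pH},
    \qquad
    n\,F\,\Delta p \;\ge\; \Delta G_{\mathrm{ATP\ synthesis}} .
    $$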

  1. Historical-systematic fundaments of the Trinitarian theory of the liturgical event

    Directory of Open Access Journals (Sweden)

    Robert Woźniak

    2011-12-01

    Full Text Available The object of the present research is to develop some fundamental traces of the Trinitarian understanding of the Christian liturgy. The article attempts to point out the fundamental coordinates of a Trinitarian comprehension of the liturgy from the historical perspective. In order to do this, it traces the links between the first formulations of Trinitarian faith and the early development of the Christian liturgy. The argument starts with consideration of some new biblical approaches to the phenomena of early Christian cult seen in its theological (Christological and Trinitarian) constellation (Bauckham, Hurtado). After this preliminary biblical-theological inquiry, some fundamental patristic texts are taken into account. The last stage of the investigation is a presentation of the Second Vatican Council’s account of the theology of liturgy, which proves itself to be openly Trinitarian.

  2. Theories and measures of elder abuse.

    Science.gov (United States)

    Abolfathi Momtaz, Yadollah; Hamid, Tengku Aizan; Ibrahim, Rahimah

    2013-09-01

    Elder abuse is a pervasive phenomenon around the world with devastating effects on the victims. Although it is not a new phenomenon, interest in examining elder abuse is relatively new. This paper aims to provide an overview of the aetiological theories and measures of elder abuse. The paper briefly reviews theories to explain causes of elder abuse and then discusses the most commonly used measures of elder abuse. Based on the reviewed theories, it can be concluded that elder abuse is a multifactorial problem that may affect elderly people from different backgrounds and involve a wide variety of potential perpetrators, including caregivers, adult children, and partners. The review of existing measurement instruments notes that many different screening and assessment instruments have been developed to identify elders who are at risk for or are victims of abuse. However, there is a real need for more measurements of elder abuse, as the current instruments are limited in scope. © 2013 The Authors. Psychogeriatrics © 2013 Japanese Psychogeriatric Society.

  3. FUNDAMENTALS OF THE THEORY OF VENTILLATION PROCESSES IN THE STEAM TURBINES TPP

    Directory of Open Access Journals (Sweden)

    V. M. Neuimin

    2015-01-01

    Full Text Available  The article proposes a theoretical framework for the ventilation processes that arise in the stages of TPP steam turbines during operating regimes with small volumetric flow rates in the low-pressure cylinder. The framework includes new physico-mathematical models for estimating the ventilating capacity losses and the ventilation heating of the steam and of the air-gas channel of the turbine, together with a search for and investigation of the factors causing increased moment loads on the blade wheels of the final stages, which are likely to lead to destruction of the rotating blades. The paper also reports practical results of applying this framework. The author obtains a new mathematical relation for high-accuracy assessment of the ventilating capacity losses that accounts for the full range of parameters defining the level of these losses (it is established that the Coriolis force contributes twice as much to the ventilating capacity losses as the centrifugal force). Seven simple formulae derived from it allow immediate evaluation of the ventilation losses of an individual stage (with the rotating blades of the final stage not untwisting under rotation, with the rotating blades of the final and intermediate stages untwisting under rotation, and for the turbine as a whole when evacuated of steam), including evaluation from the readings of the standard instruments located at the connectors of the exhaust part of the low-pressure cylinder. As the cornerstone of the new system for evaluating ventilation heating, the author takes two experimentally established facts: the ventilating capacity losses are practically constant at negligible volumetric flow rates of the working steam, and the symmetrical ventilating flows in the blade channel mix completely before splitting up at the periphery. This makes it possible to estimate the complete enthalpy increment of the steam discharged from a stage relative to the enthalpy of the steam being

  4. Geometric measure theory a beginner's guide

    CERN Document Server

    Morgan, Frank

    1995-01-01

    Geometric measure theory is the mathematical framework for the study of crystal growth, clusters of soap bubbles, and similar structures involving minimization of energy. Morgan emphasizes geometry over proofs and technicalities, and includes a bibliography and abundant illustrations and examples. This Second Edition features a new chapter on soap bubbles as well as updated sections addressing volume constraints, surfaces in manifolds, free boundaries, and Besicovitch constant results. The text will introduce newcomers to the field and appeal to mathematicians working in the field.

  5. Measurement Models for Reasoned Action Theory

    OpenAIRE

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-01-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are ...

  6. Fundamentals of the fuzzy logic-based generalized theory of decisions

    CERN Document Server

    Aliev, Rafik Aziz

    2013-01-01

    Every day decision making and decision making in complex human-centric systems are characterized by imperfect decision-relevant information. Main drawback of the existing decision theories is namely incapability to deal with imperfect information and modeling vague preferences. Actually, a paradigm of non-numerical probabilities in decision making has a long history and arose also in Keynes’s analysis of uncertainty. There is a need for further generalization – a move to decision theories with perception-based imperfect information described in NL. The languages of new decision models for human-centric systems should be not languages based on binary logic but human-centric computational schemes able to operate on NL-described information. Development of new theories is now possible due to an increased computational power of information processing systems which allows for computations with imperfect information, particularly, imprecise and partially true information, which are much more complex than comput...

  7. Fundamentals and approximations of multilevel resonance theory for reactor physics applications

    International Nuclear Information System (INIS)

    Moore, M.S.

    1980-01-01

    The formal theory of nuclear reactions leads to any of a number of alternative representations for describing resonance behavior. None of these is satisfactory for applications, and, depending on the problem to be addressed, approximate expressions are used. The specializations and approximations found to be most useful by evaluators are derived from R-matrix theory and are discussed from the viewpoint of convenience in numerical calculations. Finally, we illustrate the application of the theory by reviewing a particular example: the spin-separated neutron-induced cross sections of 235U in the resolved and unresolved resonance regions and the use of these results in the U.S. evaluated nuclear data file ENDF/B. (author)
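    As a concrete example of the kind of approximate representation derived from R-matrix theory that the abstract refers to, the single-level Breit-Wigner form for a neutron capture resonance can be written as below; it is given here for orientation only, and the evaluation work described may rely on multilevel generalizations:

        \[ \sigma_{n\gamma}(E) \;=\; \frac{\pi}{k^{2}}\, g_J\,
           \frac{\Gamma_n \Gamma_\gamma}{(E - E_r)^{2} + (\Gamma/2)^{2}}, \]

    where k is the neutron wave number, g_J the spin statistical factor, E_r the resonance energy, and Γ_n, Γ_γ and Γ the neutron, radiative and total widths.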

  8. The basics of information security understanding the fundamentals of InfoSec in theory and practice

    CERN Document Server

    Andress, Jason

    2014-01-01

    As part of the Syngress Basics series, The Basics of Information Security provides you with fundamental knowledge of information security in both theoretical and practical aspects. Author Jason Andress gives you the basic knowledge needed to understand the key concepts of confidentiality, integrity, and availability, and then dives into practical applications of these ideas in the areas of operational, physical, network, application, and operating system security. The Basics of Information Security gives you clear, non-technical explanations of how infosec works and how to apply these princi

  9. Ends, fundamental tones and capacity of minimal submanifolds via extrinsic comparison theory

    DEFF Research Database (Denmark)

    Gimeno, Vicent; Markvorsen, Steen

    2015-01-01

    We study the volume of extrinsic balls and the capacity of extrinsic annuli in minimal submanifolds which are properly immersed with controlled radial sectional curvatures into an ambient manifold with a pole. The key results are concerned with the comparison of those volumes and capacities with the corresponding entities in a rotationally symmetric model manifold. Using the asymptotic behavior of the volumes and capacities we then obtain upper bounds for the number of ends as well as estimates for the fundamental tone of the submanifolds in question.

  10. Education and liberty: the habitus formation as a fundamental critical element to the theory of emancipation

    Directory of Open Access Journals (Sweden)

    Adreana Dulcina Platt

    2017-11-01

    Full Text Available The study evaluates the cumulative path traveled by the subject from the original state of nature (hominization) to the condition of a ‘human being’ made of instrumental and cognitive complexity (humanization). Education becomes the axis of the passage from hominization to humanization, through the incorporation of the habitus of human nature. Drawing on historical materialist theory, we describe habitus as the exercise of practices repeated and incorporated into the formation of the subject, thus becoming a ‘second nature’. Through the theory of emancipation described in Kant and in Horkheimer and Adorno, we examine the liberty of subjects achieved by the incorporation of practices belonging primarily to the world of reason (ethics and aesthetics) or by negative educational action.

  11. Views of a devil's advocate -- Fundamental challenges to effective field theory treatments of nuclear physics

    International Nuclear Information System (INIS)

    Cohen, T.D.

    1998-04-01

    The physics goals of the effective field theory program for nuclear phenomena are outlined. It is pointed out that there are multiple schemes for implementing EFT and it is presently not clear if any of these schemes is viable. Most of the applications of effective field theory ideas have been on nucleon-nucleon scattering. It is argued that this is little more than curve fitting and that other quantities need to be calculated to test the ideas. It is shown that EFT methods work well for certain bound state properties of the deuteron electric form factor. However, it is also shown that this success depends sensitively on the fact that the majority of the probability of the deuteron's wave function is beyond the range of the potential. This circumstance is special to the deuteron suggesting that it will be very difficult to achieve the same kinds of success for tightly bound nuclei

  12. Applications of measure theory to statistics

    CERN Document Server

    Pantsulaia, Gogi

    2016-01-01

    This book aims to put strong reasonable mathematical senses in notions of objectivity and subjectivity for consistent estimations in a Polish group by using the concept of Haar null sets in the corresponding group. This new approach – naturally dividing the class of all consistent estimates of an unknown parameter in a Polish group into disjoint classes of subjective and objective estimates – helps the reader to clarify some conjectures arising in the criticism of null hypothesis significance testing. The book also acquaints readers with the theory of infinite-dimensional Monte Carlo integration recently developed for estimation of the value of infinite-dimensional Riemann integrals over infinite-dimensional rectangles. The book is addressed both to graduate students and to researchers active in the fields of analysis, measure theory, and mathematical statistics.

  13. How the “Extended Mind” Thesis Helps to Solve a Fundamental Dilemma of Literacy Theory

    Directory of Open Access Journals (Sweden)

    Marcin Trybulec

    2013-09-01

    Full Text Available Abstract The aim of the paper is to outline the basic theoretical reasons for applying the concept of extended mind to literacy theory. In order to explicate theoretical difficulties faced by literacy theory, one needs to take into account the debate regarding technological determinism. Using this debate as an example, one can observe two basic strategies applied to defining the concept of a medium. On the one hand, there exists a strategy to formulate a narrow definition of a medium in terms of material artifacts. On the other hand, one can observe a strategy which relies on creating a wide definition of a medium. According to this definition, a medium is understood as a social institution which denotes a particular way of behaving and thinking. The paper aims at justifying the hypothesis that both interpretational strategies employed in defining a medium and the related concept of technology are inaccurate in certain important respects. If the argumentation presented here proves correct, literacy theory will be faced with a serious dilemma: a choice between two equally unsuitable definitions of technology/media. However, the theoretical dilemma pertaining to the question of media can be elucidated by appealing to the conceptual framework of the extended mind hypothesis.

  14. Quantum theory from a nonlinear perspective Riccati equations in fundamental physics

    CERN Document Server

    Schuch, Dieter

    2018-01-01

    This book provides a unique survey displaying the power of Riccati equations to describe reversible and irreversible processes in physics and, in particular, quantum physics. Quantum mechanics is supposedly linear, invariant under time-reversal, conserving energy and, in contrast to classical theories, essentially based on the use of complex quantities. However, on a macroscopic level, processes apparently obey nonlinear irreversible evolution equations and dissipate energy. The Riccati equation, a nonlinear equation that can be linearized, has the potential to link these two worlds when applied to complex quantities. The nonlinearity can provide information about the phase-amplitude correlations of the complex quantities that cannot be obtained from the linearized form. As revealed in this wide ranging treatment, Riccati equations can also be found in many diverse fields of physics from Bose-Einstein-condensates to cosmology. The book will appeal to graduate students and theoretical physicists interested in ...

  15. A measurement theory of illusory conjunctions.

    Science.gov (United States)

    Prinzmetal, William; Ivry, Richard B; Beck, Diane; Shimizu, Naomi

    2002-04-01

    Illusory conjunctions refer to the incorrect perceptual combination of correctly perceived features, such as color and shape. Research on the phenomenon has been hampered by the lack of a measurement theory that accounts for guessing features, as well as the incorrect combination of correctly perceived features. Recently, several investigators have suggested using multinomial models as a tool for measuring feature integration. The authors examined the adequacy of these models in 2 experiments by testing whether model parameters reflect changes in stimulus factors. In a third experiment, confidence ratings were used as a tool for testing the model. Multinomial models accurately reflected both variations in stimulus factors and observers' trial-by-trial confidence ratings.

  16. Testing the Standard Model and Fundamental Symmetries in Nuclear Physics with Lattice QCD and Effective Field Theory

    Energy Technology Data Exchange (ETDEWEB)

    Walker-Loud, Andre [College of William and Mary, Williamsburg, VA (United States)

    2016-10-14

    The research supported by this grant is aimed at probing the limits of the Standard Model through precision low-energy nuclear physics. The work of the PI (AWL) and additional personnel is to provide theory input needed for a number of potentially high-impact experiments, notably, hadronic parity violation, Dark Matter direct detection and searches for permanent electric dipole moments (EDMs) in nucleons and nuclei. In all these examples, a quantitative understanding of low-energy nuclear physics from the fundamental theory of strong interactions, Quantum Chromo-Dynamics (QCD), is necessary to interpret the experimental results. The main theoretical tools used and developed in this work are the numerical solution to QCD known as lattice QCD (LQCD) and Effective Field Theory (EFT). This grant is supporting a new research program for the PI, and as such, needed to be developed from the ground up. Therefore, the first fiscal year of this grant, 08/01/2014-07/31/2015, has been spent predominantly establishing this new research effort. Very good progress has been made, although, at this time, there are not many publications to show for the effort. After one year, the PI accepted a job at Lawrence Berkeley National Laboratory, so this final report covers just a single year of five years of the grant.

  17. Quantum gravity in the sky: interplay between fundamental theory and observations

    International Nuclear Information System (INIS)

    Ashtekar, Abhay; Gupt, Brajesh

    2017-01-01

    Observational missions have provided us with a reliable model of the evolution of the universe starting from the last scattering surface all the way to future infinity. Furthermore given a specific model of inflation, using quantum field theory on curved space-times this history can be pushed back in time to the epoch when space-time curvature was some 10^62 times that at the horizon of a solar mass black hole! However, to extend the history further back to the Planck regime requires input from quantum gravity. An important aspect of this input is the choice of the background quantum geometry and of the Heisenberg state of cosmological perturbations thereon, motivated by Planck scale physics. This paper introduces first steps in that direction. Specifically we propose two principles that link quantum geometry and Heisenberg uncertainties in the Planck epoch with late time physics and explore in detail the observational consequences of the initial conditions they select. We find that the predicted temperature–temperature (T–T) correlations for scalar modes are indistinguishable from standard inflation at small angular scales even though the initial conditions are now set in the deep Planck regime. However, there is a specific power suppression at large angular scales. As a result, the predicted spectrum provides a better fit to the PLANCK mission data than standard inflation, where the initial conditions are set in the general relativity regime. Thus, our proposal brings out a deep interplay between the ultraviolet and the infrared. Finally, the proposal also leads to specific predictions for power suppression at large angular scales also for the (T–E and E–E) correlations involving electric polarization. The PLANCK team is expected to release this data in the coming year. (paper)

  18. Quantum gravity in the sky: interplay between fundamental theory and observations

    Science.gov (United States)

    Ashtekar, Abhay; Gupt, Brajesh

    2017-01-01

    Observational missions have provided us with a reliable model of the evolution of the universe starting from the last scattering surface all the way to future infinity. Furthermore given a specific model of inflation, using quantum field theory on curved space-times this history can be pushed back in time to the epoch when space-time curvature was some 10^62 times that at the horizon of a solar mass black hole! However, to extend the history further back to the Planck regime requires input from quantum gravity. An important aspect of this input is the choice of the background quantum geometry and of the Heisenberg state of cosmological perturbations thereon, motivated by Planck scale physics. This paper introduces first steps in that direction. Specifically we propose two principles that link quantum geometry and Heisenberg uncertainties in the Planck epoch with late time physics and explore in detail the observational consequences of the initial conditions they select. We find that the predicted temperature-temperature (T-T) correlations for scalar modes are indistinguishable from standard inflation at small angular scales even though the initial conditions are now set in the deep Planck regime. However, there is a specific power suppression at large angular scales. As a result, the predicted spectrum provides a better fit to the PLANCK mission data than standard inflation, where the initial conditions are set in the general relativity regime. Thus, our proposal brings out a deep interplay between the ultraviolet and the infrared. Finally, the proposal also leads to specific predictions for power suppression at large angular scales also for the (T-E and E-E) correlations involving electric polarization. The PLANCK team is expected to release this data in the coming year.

  19. Light Scattering Tests of Fundamental Theories of Transport Properties in the Critical Region

    Science.gov (United States)

    Gammon, R. W.; Moldover, M. R.

    1985-01-01

    The objective of this program is to measure the decay rates of critical density fluctuations in a simple fluid (xenon) very near its liquid-vapor critical point using laser light scattering and photon correlation spectroscopy. Such experiments have been severely limited on Earth by the presence of gravity, which causes large density gradients in the sample when the compressibility diverges approaching the critical point. The goal is to measure decay rates deep in the critical region where the scaled wavevector is of the order of 1000. This will require loading the sample to within 0.01% of the critical density and taking data as close as 3 microKelvin to the critical temperature (Tc = 289.72 K). Other technical problems have to be addressed, such as multiple scattering and the effect of wetting layers. The ability to avoid multiple scattering by using a thin sample (100 microns) was demonstrated, as well as a temperature history which can avoid wetting layers, satisfactory temperature control and measurement, and accurate sample loading. Thus the questions of experimental art are solved, leaving the important engineering tasks of mounting the experiment to maintain alignment during flight and automating the state-of-the-art temperature bridges for microcomputer control of the experiment.

  20. Measurement Models for Reasoned Action Theory.

    Science.gov (United States)

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-03-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.

  1. Measurement sum theory and application - Application to low level measurements

    International Nuclear Information System (INIS)

    Puydarrieux, S.; Bruel, V.; Rivier, C.; Crozet, M.; Vivier, A.; Manificat, G.; Thaurel, B.; Mokili, M.; Philippot, B.; Bohaud, E.

    2015-09-01

    In laboratories, most of the Total Sum methods implemented today use substitution or censoring methods for non-significant or negative values, and thus create biases which can sometimes be quite large. They are usually positive, and generate, for example, 'administrative' (i.e. virtual) becquerel (Bq) counts or quantities of materials, thus artificially falsifying the records kept by the laboratories under regulatory requirements (environment release records, waste records, etc.). This document suggests a methodology which will enable the user to avoid such biases. It is based on the following two fundamental rules: - The Total Sum of measurement values must be established based on all the individual measurement values, even those considered non-significant, including the negative values. Any modification of these values, under the pretext that they are not significant, will inevitably lead to biases in the accumulated result and falsify the evaluation of its uncertainty. - In Total Sum operations, the decision thresholds are arrived at in a similar way to the approach used for uncertainties. The document deals with four essential aspects of the notion of 'measurement Total Sums': - The expression of results and associated uncertainties close to Decision Thresholds, and Detection or Quantification Limits, - The Total Sum of these measurements: sum or mean, - The calculation of the uncertainties associated with the Total Sums, - Result presentation (particularly when preparing balance sheets or reports, etc.) Several case studies arising from different situations are used to illustrate the methodology: environmental monitoring reports, release reports, and chemical impurity Total Sums for the qualification of a finished product. The special case of material balances, in which the measurements are usually all significant and correlated (the covariance term cannot then be ignored) will be the subject of a future second document. This
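    A minimal numerical sketch (Python, illustrative only and not taken from the document) of the bias that the first rule above is meant to prevent: if every individual result is kept, including negative ones, the total of many zero-activity samples stays near zero, whereas replacing non-significant results by zero pushes the total upward.

        import random
        from math import sqrt

        random.seed(1)
        n_samples = 1000                  # individual measurements entering the total
        sigma = 1.0                       # assumed standard uncertainty of each net result (Bq)
        threshold = 1.645 * sigma         # assumed one-sided decision threshold

        raw, censored = [], []
        for _ in range(n_samples):
            x = random.gauss(0.0, sigma)  # simulated net result of a blank sample; may be negative
            raw.append(x)
            censored.append(x if x > threshold else 0.0)  # common biasing practice

        print("total, all values kept :", round(sum(raw), 1), "Bq",
              "+/-", round(sigma * sqrt(n_samples), 1))
        print("total, censored values :", round(sum(censored), 1), "Bq (biased upward)")

    With all values kept, the total is consistent with zero within its combined uncertainty; the censored total is not.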

  2. Educacion Fundamental Integral #2: Teoria y Aplicacion en el Caso de ACPO. (Fundamental Integral Education #2: Theory and Application in the Case of ACPO.)

    Science.gov (United States)

    Alarcon, Hernando Bernal

    Educacion Fundamental Integral (EFI) is an educational process which aims to help Colombia's rural population to improve their living conditions. EFI adapts to the concrete circumstances of the person in his own environment. Objectives of EFI are to make the rural people: responsible for the work necessary for their own development; work together;…

  3. Measurement Theory and the Foundations of Utilitarianism

    OpenAIRE

    John a. Weymark

    2004-01-01

    Harsanyi used expected utility theory to provide two axiomatizations of weighted utilitarian rules. Sen (and later, Weymark) has argued that Harsanyi has not, in fact, axiomatized utilitarianism because he has misapplied expected utility theory. Specifically, Sen and Weymark have argued that von Neumann-Morgenstern expected utility theory is an ordinal theory and, therefore, any increasing transform of a von Neumann-Morgenstern utility function is a satisfactory representation of a preference...

  4. Evaluating Child Coping Competence: Theory and Measurement

    Science.gov (United States)

    Moreland, Angela D.; Dumas, Jean E.

    2008-01-01

    Much of the research on children's coping styles is based on a downward extension of adult coping theories. In a departure from this approach, coping competence theory seeks to account for children's ability to cope with daily challenges on the basis of developmental research. The theory, which states that challenges call for distinct coping…

  5. Fundamental length

    International Nuclear Information System (INIS)

    Pradhan, T.

    1975-01-01

    The concept of fundamental length was first put forward by Heisenberg for purely dimensional reasons. From a study of the observed masses of the elementary particles known at that time, it is surmised that this length should be of the order of magnitude l ≈ 10^-13 cm. It was Heisenberg's belief that introduction of such a fundamental length would eliminate the divergence difficulties from relativistic quantum field theory by cutting off the high energy regions of the 'proper fields'. Since the divergence difficulties arise primarily due to an infinite number of degrees of freedom, one simple remedy would be the introduction of a principle that limits these degrees of freedom by removing the effectiveness of the waves with a frequency exceeding a certain limit without destroying the relativistic invariance of the theory. The principle can be stated as follows: It is in principle impossible to invent an experiment of any kind that will permit a distinction between the positions of two particles at rest, the distance between which is below a certain limit. A more elegant way of introducing fundamental length into quantum theory is through commutation relations between two position operators. In quantum field theory such as quantum electrodynamics, it can be introduced through the commutation relation between two interpolating photon fields (vector potentials). (K.B.)
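    One modern, widely used illustration of how a fundamental length can be encoded in commutation relations, given here as a generic example and not as the specific construction with interpolating photon fields mentioned above, is the deformed position-momentum commutator and the minimal length it implies:

        \[ [\hat{x}, \hat{p}] = i\hbar\,\bigl(1 + \beta\,\hat{p}^{2}\bigr)
           \;\;\Longrightarrow\;\;
           \Delta x \,\ge\, \frac{\hbar}{2}\left(\frac{1}{\Delta p} + \beta\,\Delta p\right)
           \,\ge\, \hbar\sqrt{\beta}, \]

    so that no state can have a position uncertainty below Δx_min = ħ√β, which plays the role of the fundamental length.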

  6. The issue of phases in quantum measurement theory

    International Nuclear Information System (INIS)

    Pati, Arun Kumar

    1999-01-01

    The issue of phases is always very subtle in the quantum world and many of the curious phenomena are due to the existence of the phase of the quantum mechanical wave function. We investigate the issue of phases in quantum measurement theory and predict a new effect of fundamental importance. We say that a quantum system undergoes quantum Zeno dynamics when the unitary evolution of a quantum system is interrupted by a sequence of measurements. In particular, we investigate the effect of repeated measurements on the geometric phase and show that the quantum Zeno dynamics can inhibit its development under a large number of measurement pulses. It is interesting to see that neither the total phase nor the dynamical phase goes to zero under a large number of measurements. We call this new effect the 'quantum Zeno Phase effect', in analogy to the quantum Zeno effect where the repeated measurements inhibit the transition probability. This 'quantum Zeno Phase effect' can be proved within von Neumann's collapse mechanism as well as using a continuous measurement model. So the effect is really independent of any particular measurement model considered. Since the geometric phase attributes a memory to a quantum system, our results also prove that the path-dependent memory of a system can be erased by a sequence of measurements. The quantum Zeno Phase effect provides a way to control and manipulate the phase of a wave function in an interference set-up. Finally, we stress that the quantum Zeno Phase effect can be tested using neutron, photon and atom interference experiments with the presently available technology. (Author)
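    For orientation, the ordinary quantum Zeno effect invoked for comparison can be summarized in one line; a minimal sketch, assuming an initial state |ψ⟩ evolving under a Hamiltonian H for a total time T interrupted by N equally spaced projective measurements onto |ψ⟩:

        \[ P_{\mathrm{survival}}(N) \;\approx\;
           \left[\, 1 - \frac{(\Delta H)^{2}\,T^{2}}{\hbar^{2}\,N^{2}} \right]^{N}
           \;\xrightarrow[\;N\to\infty\;]{}\; 1,
           \qquad (\Delta H)^{2} = \langle\psi|H^{2}|\psi\rangle - \langle\psi|H|\psi\rangle^{2}, \]

    i.e. frequent measurements freeze the transition probability; the 'quantum Zeno Phase effect' described above is the analogous statement for the geometric phase rather than for the survival probability.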

  7. ACADEMIC TRAINING: Low Energy Experiments that Measure Fundamental Constants and Test Basic Symmetries

    CERN Multimedia

    Françoise Benz

    2002-01-01

    17, 18, 19, 21 June LECTURE SERIES from 11.00 to 12.00 hrs - Auditorium, bldg. 500 Low Energy Experiments that Measure Fundamental Constants and Test Basic Symmetries by G. GABRIELSE / Professor of Physics and Chair of the Harvard Physics Department, Spokesperson for the ATRAP Collaboration Lecture 1: Particle Traps: the World's Tiniest Accelerators A single elementary particle, or a single ion, can be confined in a tiny accelerator called a particle trap. A single electron was held this way for more than ten months, and antiprotons for months. Mass spectroscopy of exquisite precision is possible with such systems. CERN's TRAP Collaboration thereby compared the charge-to-mass ratios of the antiproton and proton to a precision of 90 parts per trillion, by far the most stringent CPT test done with a baryon system. The important ratio of the masses of the electron and proton has been similarly measured, as have a variety of ion masses, and the neutron mass is most accurately known from such measurements. An i...

  8. Philosophy of mathematics set theory, measuring theories, and nominalism

    CERN Document Server

    Preyer, Gerhard

    2008-01-01

    One main interest of philosophy is to become clear about the assumptions, premisses and inconsistencies of our thoughts and theories. And even for a formal language like mathematics it is controversial whether consistency is achievable or necessary, as the articles in the first part of the publication show. Also the role of formal derivations, the role of the concept of apriority, and the intuitions of mathematical principles and properties need to be discussed. The second part is a contribution on nominalistic and platonistic views in mathematics, like the "indispensability argument" of W. v. O.

  9. Speech task effects on acoustic measure of fundamental frequency in Cantonese-speaking children.

    Science.gov (United States)

    Ma, Estella P-M; Lam, Nina L-N

    2015-12-01

    Speaking fundamental frequency (F0) is a voice measure frequently used to document changes in vocal performance over time. Knowing the intra-subject variability of speaking F0 has implications on its clinical usefulness. The present study examined the speaking F0 elicited from three speech tasks in Cantonese-speaking children. The study also compared the variability of speaking F0 elicited from different speech tasks. Fifty-six vocally healthy Cantonese-speaking children (31 boys and 25 girls) aged between 7.0 and 10.11 years participated. For each child, speaking F0 was elicited using speech tasks at three linguistic levels (sustained vowel /a/ prolongation, reading aloud a sentence and passage). Two types of variability, within-session (trial-to-trial) and across-session (test-retest) variability, were compared across speech tasks. Significant differences in mean speaking F0 values were found between speech tasks. Mean speaking F0 value elicited from sustained vowel phonations was significantly higher than those elicited from the connected speech tasks. The variability of speaking F0 was higher in sustained vowel prolongation than that in connected speech. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  10. Detailed examination of 'standard elementary particle theories' based on measurement with Tristan

    International Nuclear Information System (INIS)

    Kamae, Tsuneyoshi

    1989-01-01

    The report discusses possible approaches to detailed analysis of 'standard elementary particle theories' on the basis of measurements made with Tristan. The first section of the report addresses the major elementary particles involved in the 'standard theories'. The nature of the gauge particles, leptons, quarks and the Higgs particle is briefly outlined. The Higgs particle and the top quark have not been discovered, though the Higgs particle is essential in the Weinberg-Salam theory. Another important issue in this field is the cause of the violation of CP symmetry. The second section deals with problems which arise in universalizing the concept of the 'standard theories'. What is required to solve these problems includes the discovery of supersymmetric particles, the discovery of conflicts with the 'standard theories', and accurate determination of the fundamental constants used in the 'standard theories' by various different methods. The third and fourth sections address the Weinberg-Salam theory and quantum chromodynamics (QCD). There are four essential parameters for the 'standard theories', three of which are associated with the W-S theory. The mass of the W and Z bosons measured in proton-antiproton collision experiments is compared with that determined by applying the W-S theory to electron-positron experiments. For QCD, it is essential to determine the lambda constant. (N.K.)

  11. Quantum Measurements using Diamond Spins : From Fundamental Tests to Long-Distance Teleportation

    NARCIS (Netherlands)

    Hanson, R.

    2014-01-01

    Spin qubits in diamond provide an excellent platform both for fundamental tests and for realizing extended quantum networks . Here we present our latest results, including the deterministic teleportation over three meters.

  12. A relativistic theory for continuous measurement of quantum fields

    International Nuclear Information System (INIS)

    Diosi, L.

    1990-04-01

    A formal theory for the continuous measurement of relativistic quantum fields is proposed. The corresponding scattering equations were derived. The proposed formalism reduces to known equations in the Markovian case. Two recent models for spontaneous quantum state reduction have been recovered in the framework of this theory. A possible example of the relativistic continuous measurement has been outlined in standard Quantum Electrodynamics. The continuous measurement theory possesses an alternative formulation in terms of interacting quantum and stochastic fields. (author) 23 refs

  13. Fundamental ecology is fundamental.

    Science.gov (United States)

    Courchamp, Franck; Dunne, Jennifer A; Le Maho, Yvon; May, Robert M; Thébaud, Christophe; Hochberg, Michael E

    2015-01-01

    The primary reasons for conducting fundamental research are satisfying curiosity, acquiring knowledge, and achieving understanding. Here we develop why we believe it is essential to promote basic ecological research, despite increased impetus for ecologists to conduct and present their research in the light of potential applications. This includes the understanding of our environment, for intellectual, economical, social, and political reasons, and as a major source of innovation. We contend that we should focus less on short-term, objective-driven research and more on creativity and exploratory analyses, quantitatively estimate the benefits of fundamental research for society, and better explain the nature and importance of fundamental ecology to students, politicians, decision makers, and the general public. Our perspective and underlying arguments should also apply to evolutionary biology and to many of the other biological and physical sciences. Copyright © 2014 The Authors. Published by Elsevier Ltd.. All rights reserved.

  14. Electromagnetic and quantum measurements a bitemporal neoclassical theory

    CERN Document Server

    Wessel-Berg, Tore

    2001-01-01

    It is a pleasure to write a foreword for Professor Tore Wessel-Berg's book, "Electromagnetic and Quantum Measurements: A Bitemporal Neoclassical Theory." This book appeals to me for several reasons. The most important is that, in this book, Wessel-Berg breaks from the pack. The distinguished astrophysicist Thomas Gold has written about the pressures on scientists to move in tight formation, to avoid having their legs nipped by the sheepdogs of science. This book demonstrates that Wessel-Berg is willing to take that risk. I confess that I do not sufficiently understand this book to be able to either agree or disagree with its thesis. Nevertheless, Wessel-Berg makes very cogent arguments for setting out on his journey. The basic equations of physics are indeed time-reversible. Our experience, that leads us to the concept of an "arrow of time," is derived from macroscopic phenomena, not from fundamental microscopic phenomena. For this reason, it makes very good sense to explore the consequences of treating mi...

  15. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    Science.gov (United States)

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

    Conceptually and methodologically distinct models exist for assessing quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently-rated attachment narrative representations and peer nominations. Results indicated that Attachment theory-based and Social Learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms.

  16. The New Unified Theory of ATP Synthesis/Hydrolysis and Muscle Contraction, Its Manifold Fundamental Consequences and Mechanistic Implications and Its Applications in Health and Disease

    Directory of Open Access Journals (Sweden)

    Sunil Nath

    2008-09-01

    Full Text Available Complete details of the thermodynamics and molecular mechanisms of ATP synthesis/hydrolysis and muscle contraction are offered from the standpoint of the torsional mechanism of energy transduction and ATP synthesis and the rotation-uncoiling-tilt (RUT) energy storage mechanism of muscle contraction. The manifold fundamental consequences and mechanistic implications of the unified theory for oxidative phosphorylation and muscle contraction are explained. The consistency of current mechanisms of ATP synthesis and muscle contraction with experiment is assessed, and the novel insights of the unified theory are shown to take us beyond the binding change mechanism, the chemiosmotic theory and the lever arm model. It is shown from first principles how previous theories of ATP synthesis and muscle contraction violate both the first and second laws of thermodynamics, necessitating their revision. It is concluded that the new paradigm, ten years after making its first appearance, is now perfectly poised to replace the older theories. Finally, applications of the unified theory in cell life and cell death are outlined and prospects for future research are explored. While it is impossible to cover each and every specific aspect of the above, an attempt has been made here to address all the pertinent details and what is presented should be sufficient to convince the reader of the novelty, originality, breakthrough nature and power of the unified theory, its manifold fundamental consequences and mechanistic implications, and its applications in health and disease.

  17. Fundamental frequency and voice perturbation measures in smokers and non-smokers: An acoustic and perceptual study

    Science.gov (United States)

    Freeman, Allison

    This research examined the fundamental frequency and perturbation (jitter % and shimmer %) measures in young adult (20-30 year-old) and middle-aged adult (40-55 year-old) smokers and non-smokers; there were 36 smokers and 36 non-smokers. Acoustic analysis was carried out utilizing one task: production of sustained /a/. These voice samples were analyzed utilizing Multi-Dimensional Voice Program (MDVP) software, which provided values for fundamental frequency, jitter %, and shimmer %. These values were analyzed for trends regarding smoking status, age, and gender. Statistical significance was found regarding the fundamental frequency, jitter %, and shimmer % for smokers as compared to non-smokers; smokers were found to have significantly lower fundamental frequency values, and significantly higher jitter % and shimmer % values. Statistical significance was not found regarding fundamental frequency, jitter %, and shimmer % for age group comparisons. With regard to gender, statistical significance was found regarding fundamental frequency; females were found to have statistically higher fundamental frequencies as compared to males. However, the relationships between gender and jitter % and shimmer % lacked statistical significance. These results indicate that smoking negatively affects voice quality. This study also examined the ability of untrained listeners to identify smokers and non-smokers based on their voices. Results of this voice perception task suggest that listeners are not accurately able to identify smokers and non-smokers, as statistical significance was not reached. However, despite a lack of significance, trends in data suggest that listeners are able to utilize voice quality to identify smokers and non-smokers.

  18. MEASUREMENT OF FINANCIAL REPORTING QUALITY BASED ON IFRS CONCEPTUAL FRAMEWORK’S FUNDAMENTAL QUALITATIVE CHARACTERISTICS

    OpenAIRE

    Alexios KYTHREOTIS

    2014-01-01

    The IASB creates the standards and the conceptual framework in an attempt to create higher quality financial statements. Through this article, the extent to which this objective has been achieved is examined. An important characteristic of this research is the fact that the quality of financial statements is examined in light of the Conceptual Framework. Specifically, the two fundamental qualitative characteristics - relevance and faithful representation (reliability) - set by the IAS Committ...

  19. Quantum theory of successive projective measurements

    International Nuclear Information System (INIS)

    Johansen, Lars M.

    2007-01-01

    We show that a quantum state may be represented as the sum of a joint probability and a complex quantum modification term. The joint probability and the modification term can both be observed in successive projective measurements. The complex modification term is a measure of measurement disturbance. A selective phase rotation is needed to obtain the imaginary part. This leads to a complex quasiprobability: The Kirkwood distribution. We show that the Kirkwood distribution contains full information about the state if the two observables are maximal and complementary. The Kirkwood distribution gives another picture of state reduction. In a nonselective measurement, the modification term vanishes. A selective measurement leads to a quantum state as a non-negative conditional probability. We demonstrate the special significance of the Schwinger basis

  20. Transit time measures in scattering theory

    International Nuclear Information System (INIS)

    MacMillan, L.W.; Osborn, T.A.

    1980-01-01

    This paper studies the time evolution of state vectors that are the solutions of the time-dependent Schroedinger equation, characterized by a Hamiltonian h. We employ trace-theorem methods to prove that the transit time of state vectors through a finite space region, Σ, may be used to construct a family in the energy variable, epsilon, of unique, positive, trace-class operators. The matrix elements of these operators give the transit time of any vector through Σ. It is proved that the trace of these operators, for a fixed energy epsilon, provides a function which simultaneously gives the sum of all orbital transit times through region Σ and represents the state density of all vectors that have support on Σ and energy epsilon. We use the transit-time operators to recover the usual theory of time delay for single-channel scattering systems. In the process we extend the known results on time delay to include scattering by fixed impurities in a periodic medium.
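    A minimal sketch of the quantity involved, assuming the transit (sojourn) time meant here is the standard one, namely the total time the evolving state spends in the region Σ:

        \[ \tau_{\Sigma}(\psi) \;=\; \int_{-\infty}^{\infty} \mathrm{d}t\,
           \langle \psi(t)\,|\,\chi_{\Sigma}\,|\,\psi(t) \rangle
           \;=\; \int_{-\infty}^{\infty} \mathrm{d}t \int_{\Sigma} \mathrm{d}^{3}r\,
           |\psi(\mathbf{r},t)|^{2},
           \qquad \psi(t) = e^{-iht/\hbar}\psi, \]

    where χ_Σ is multiplication by the characteristic function of Σ; the energy-indexed family of trace-class operators described above is then obtained by resolving this time integral over the spectrum of h.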

  1. On the theory of SODAR measurement techniques

    DEFF Research Database (Denmark)

    Antoniou, I.; Ejsing Jørgensen, Hans; Bradley, S.

    2003-01-01

    The need for alternative means to measure the wind speed for wind energy purposes has increased with the increase of the size of wind turbines. The cost and the technical difficulties of performing wind speed measurements have also increased with the size of the wind turbines, since it is demanded... the objective has been to present and achieve the following: An accurate theoretic model that describes all the relevant aspects of the interaction of the sound beam with the atmosphere in the level of detail needed for wind energy applications. Understanding of the dependence of SODAR performance on hard

  2. Measuring Theory of Mind in Adults with Autism Spectrum Disorder

    Science.gov (United States)

    Brewer, Neil; Young, Robyn L.; Barnett, Emily

    2017-01-01

    Deficits in Theory of Mind (ToM)--the ability to interpret others' beliefs, intentions and emotions--undermine the ability of individuals with Autism Spectrum Disorder (ASD) to interact in socially normative ways. This study provides psychometric data for the Adult-Theory of Mind (A-ToM) measure using video-scenarios based in part on Happé's…

  3. On music performance, theories, measurement and diversity

    NARCIS (Netherlands)

    Timmers, R.; Honing, H.J.

    2002-01-01

    Measurement of musical performances is of interest to studies in musicology, music psychology and music performance practice, but in general it has not been considered the main issue: when analyzing Western classical music, these disciplines usually focus on the score rather than the performance.

  4. Understanding unmet need: history, theory, and measurement.

    Science.gov (United States)

    Bradley, Sarah E K; Casterline, John B

    2014-06-01

    During the past two decades, estimates of unmet need have become an influential measure for assessing population policies and programs. This article recounts the evolution of the concept of unmet need, describes how demographic survey data have been used to generate estimates of its prevalence, and tests the sensitivity of these estimates to various assumptions in the unmet need algorithm. The algorithm uses a complex set of assumptions to identify women: who are sexually active, who are infecund, whose most recent pregnancy was unwanted, who wish to postpone their next birth, and who are postpartum amenorrheic. The sensitivity tests suggest that defensible alternative criteria for identifying four out of five of these subgroups of women would increase the estimated prevalence of unmet need. The exception is identification of married women who are sexually active; more accurate measurement of this subgroup would reduce the estimated prevalence of unmet need in most settings. © 2013 The Population Council, Inc.

  5. Determination of the detection limit and decision threshold for ionizing radiation measurements. Part 2: Fundamentals and application to counting measurements with the influence of sample treatment

    International Nuclear Information System (INIS)

    2000-01-01

    This part of ISO 11929 addresses the field of ionizing radiation measurements in which events (in particular pulses) on samples are counted after treating them (e.g. aliquotation, solution, enrichment, separation). It considers, besides the random character of radioactive decay and of pulse counting, all other influences arising from sample treatment (e.g. weighing, enrichment, calibration or the instability of the test setup). ISO 11929 consists of the following parts, under the general title Determination of the detection limit and decision threshold for ionizing radiation measurements: Part 1: Fundamentals and application to counting measurements without the influence of sample treatment; Part 2: Fundamentals and application to counting measurements with the influence of sample treatment; Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment; Part 4: Fundamentals and application to measurements by use of linear scale analogue ratemeters, without the influence of sample treatment. This part of ISO 11929 was prepared in parallel with other International Standards prepared by WG 2 (now WG 17): ISO 11932:1996, Activity measurements of solid materials considered for recycling, re-use or disposal as non-radioactive waste, and ISO 11929-1, ISO 11929-3 and ISO 11929-4 and is, consequently, complementary to these documents.
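    A minimal numerical sketch (Python) of the decision threshold and detection limit for the simplest gross/background counting measurement, using the widely quoted Gaussian approximation with equal quantiles k; it is given for orientation only and does not reproduce the exact ISO 11929 formulation, and in particular none of the sample-treatment influences that this part of the standard addresses are included.

        from math import sqrt

        # assumed example data (illustrative only)
        n0, t0 = 400, 3600.0   # background counts and background counting time (s)
        tg = 3600.0            # gross (sample) counting time (s)
        w = 1.0                # assumed calibration factor, activity per unit net count rate
        k = 1.645              # quantile for alpha = beta = 5 %

        r0 = n0 / t0                            # background count rate
        u0 = w * sqrt(r0 / tg + n0 / t0**2)     # uncertainty of the measurand if its true value is zero
        y_star = k * u0                         # decision threshold y*
        y_hash = 2.0 * y_star + k**2 * w / tg   # detection limit y# (closed form for equal quantiles)

        print(f"decision threshold y* = {y_star:.4f}")
        print(f"detection limit    y# = {y_hash:.4f}")

    The closed form for y# follows from solving y# = y* + k·u(y#) with u(y)^2 = w·y/t_g + u(0)^2, i.e. counting statistics of the sample itself added to the background uncertainty.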

  6. Fundamentals of differential beamforming

    CERN Document Server

    Benesty, Jacob; Pan, Chao

    2016-01-01

    This book provides a systematic study of the fundamental theory and methods of beamforming with differential microphone arrays (DMAs), or differential beamforming in short. It begins with a brief overview of differential beamforming and some popularly used DMA beampatterns such as the dipole, cardioid, hypercardioid, and supercardioid, before providing essential background knowledge on orthogonal functions and orthogonal polynomials, which form the basis of differential beamforming. From a physical perspective, a DMA of a given order is defined as an array that measures the differential acoustic pressure field of that order; such an array has a beampattern in the form of a polynomial whose degree is equal to the DMA order. Therefore, the fundamental and core problem of differential beamforming boils down to the design of beampatterns with orthogonal polynomials. But certain constraints also have to be considered so that the resulting beamformer does not seriously amplify the sensors’ self noise and the mism...
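    As a small illustration of the polynomial beampatterns described above: an Nth-order DMA has a beampattern of the form B(θ) = Σ_{n=0}^{N} a_n cos^n(θ), and the familiar first-order patterns correspond to particular coefficient choices. A minimal sketch (Python); the numerical coefficients quoted in the comments are the commonly used textbook values and are assumptions here, not taken from the book.

        import math

        # first-order differential beampattern: B(theta) = a + (1 - a) * cos(theta)
        patterns = {
            "dipole": 0.0,
            "cardioid": 0.5,
            "hypercardioid": 0.25,   # commonly quoted value maximizing the directivity factor
            "supercardioid": 0.366,  # approximate value maximizing the front-to-back ratio
        }

        def beampattern(a, theta_deg):
            theta = math.radians(theta_deg)
            return a + (1.0 - a) * math.cos(theta)

        for name, a in patterns.items():
            front = beampattern(a, 0.0)
            back = abs(beampattern(a, 180.0))
            print(f"{name:14s}  B(0) = {front:.2f}   |B(180)| = {back:.2f}")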

  7. The measurement theory of radioactivity in building materials

    International Nuclear Information System (INIS)

    Qu Jinhui; Wang Renbo; Zhang Xiongjie; Tan Hai; Zhu Zhipu; Man Zaigang

    2010-01-01

    Radioactivity in building materials is the main source of the natural radiation dose that the individual receives, and it has caused serious concern across all sectors of society. The paper comprehensively introduces the measurement theory of radioactivity in building materials, covering the measurement principle of natural radioactivity, the design of the shielding facility, the choice of measurement time, sample preparation and spectrum analysis. (authors)

  8. Relationships between fundamental movement skills and objectively measured physical activity in preschool children.

    Science.gov (United States)

    Cliff, Dylan P; Okely, Anthony D; Smith, Leif M; McKeen, Kim

    2009-11-01

    Gender differences in cross-sectional relationships between fundamental movement skill (FMS) subdomains (locomotor skills, object-control skills) and physical activity were examined in preschool children. Forty-six 3- to 5-year-olds (25 boys) had their FMS video assessed (Test of Gross Motor Development II) and their physical activity objectively monitored (Actigraph 7164 accelerometers). Among boys, object-control skills were associated with physical activity and explained 16.9% (p = .024) and 13.7% (p = .049) of the variance in percent of time in moderate-to-vigorous physical activity (MVPA) and total physical activity, respectively, after controlling for age, SES and z-BMI. Locomotor skills were inversely associated with physical activity among girls, and explained 19.2% (p = .023) of the variance in percent of time in MVPA after controlling for confounders. Gender and FMS subdomain may influence the relationship between FMS and physical activity in preschool children.

  9. Team synergies in sport: Theory and measures

    Directory of Open Access Journals (Sweden)

    Duarte Araújo

    2016-09-01

    Full Text Available Individual players act as a coherent unit during team sports performance, forming a team synergy. A synergy is a collective property of a task-specific organization of individuals, such that the degrees of freedom of each individual in the system are coupled, enabling the degrees of freedom of different individuals to co-regulate each other. Here, we present an explanation for the emergence of such collective behaviors, indicating how these can be assessed and understood through the measurement of key system properties that exist beyond the contribution of each individual. These include: (i) dimensional compression, a process resulting in independent degrees of freedom becoming coupled so that the synergy has fewer degrees of freedom than the set of components from which it arises; (ii) reciprocal compensation, whereby if one element does not produce its function, other elements display changes in their contributions so that task goals are still attained; (iii) interpersonal linkages, the specific contribution of each element to a group task; and (iv) degeneracy, structurally different components performing a similar, but not necessarily identical, function with respect to context. A primary goal of our analysis is to highlight the principles and tools required to understand coherent and dynamic team behaviors, as well as the performance conditions that make such team synergies possible, through perceptual attunement to shared affordances in individual performers. A key conclusion is that teams can be trained to perceive how to use and share specific affordances, explaining how individuals' behaviours self-organize into a group synergy. Ecological dynamics explanations of team behaviors can move beyond mere ratification of sport performance, providing a comprehensive conceptual framework to guide the implementation of diagnostic measures by sport scientists, sport psychologists and performance analysts.

  10. Aligning the Measurement of Microbial Diversity with Macroecological Theory

    Directory of Open Access Journals (Sweden)

    James C. Stegen

    2016-09-01

    Full Text Available The number of microbial operational taxonomic units (OTUs) within a community is akin to species richness within plant/animal (‘macrobial’) systems. A large literature documents OTU richness patterns, drawing comparisons to macrobial theory. There is, however, an unrecognized fundamental disconnect between OTU richness and macrobial theory: OTU richness is commonly estimated on a per-individual basis, while macrobial richness is estimated per-area. Furthermore, the range or extent of sampled environmental conditions can strongly influence a study’s outcomes and conclusions, but this is not commonly addressed when studying OTU richness. Here we (i) propose a new sampling approach that estimates OTU richness per-mass of soil, which results in strong support for species energy theory, (ii) use data reduction to show how support for niche conservatism emerges when sampling across a restricted range of environmental conditions, and (iii) show how additional insights into drivers of OTU richness can be generated by combining different sampling methods while simultaneously considering patterns that emerge by restricting the range of environmental conditions. We propose that a more rigorous connection between microbial ecology and macrobial theory can be facilitated by exploring how changes in OTU richness units and environmental extent influence outcomes of data analysis. While fundamental differences between microbial and macrobial systems persist (e.g., species concepts), we suggest that closer attention to units and scale provide tangible and immediate improvements to our understanding of the processes governing OTU richness and how those processes relate to drivers of macrobial species richness.

  11. The perturbation theory in the fundamental mode. Its application to the analysis of neutronic experiments involving small amounts of materials in fast neutron multiplying media

    International Nuclear Information System (INIS)

    Remsak, Stanislav.

    1975-01-01

    The first-order perturbation theory formalism is developed, first in its simplest form, diffusion theory in the fundamental mode, and then in the more complex form of transport theory in the fundamental mode. A comparison shows the effect of the angular correlation between the fine structures of the flux and its adjoint function, the difference in the treatment of neutron leakage phenomena, and the existence of new terms in the perturbation formula, entailing a reactivity representation in the diffusion theory that is not quite exact. Problems in using the formalism developed are considered: application of the multigroup formalism, transients of the flux and its adjoint function, validity of the first-order approximation, etc. A detailed analysis allows the formulation of a criterion specifying the validity range. Transients occurring in the reference medium are also treated. A set of numerical tests for determining a method of eliminating transient effects is presented. Some differential experiments are then discussed: sodium blowdown in enriched uranium or plutonium cores, experiments utilizing some structural materials (iron and oxygen), and plutonium sample oscillations. The Cadarache version II program was systematically used, but the analysis of the plutonium sample oscillation experiments in Ermine required the Cadarache version III program

  12. Fundamental safety principles. Safety fundamentals

    International Nuclear Information System (INIS)

    2006-01-01

    This publication states the fundamental safety objective and ten associated safety principles, and briefly describes their intent and purpose. The fundamental safety objective - to protect people and the environment from harmful effects of ionizing radiation - applies to all circumstances that give rise to radiation risks. The safety principles are applicable, as relevant, throughout the entire lifetime of all facilities and activities - existing and new - utilized for peaceful purposes, and to protective actions to reduce existing radiation risks. They provide the basis for requirements and measures for the protection of people and the environment against radiation risks and for the safety of facilities and activities that give rise to radiation risks, including, in particular, nuclear installations and uses of radiation and radioactive sources, the transport of radioactive material and the management of radioactive waste

  13. Fundamental safety principles. Safety fundamentals

    International Nuclear Information System (INIS)

    2007-01-01

    This publication states the fundamental safety objective and ten associated safety principles, and briefly describes their intent and purpose. The fundamental safety objective - to protect people and the environment from harmful effects of ionizing radiation - applies to all circumstances that give rise to radiation risks. The safety principles are applicable, as relevant, throughout the entire lifetime of all facilities and activities - existing and new - utilized for peaceful purposes, and to protective actions to reduce existing radiation risks. They provide the basis for requirements and measures for the protection of people and the environment against radiation risks and for the safety of facilities and activities that give rise to radiation risks, including, in particular, nuclear installations and uses of radiation and radioactive sources, the transport of radioactive material and the management of radioactive waste

  14. Fundamentals of overlay measurement and inspection using scanning electron-microscope

    Science.gov (United States)

    Kato, T.; Okagawa, Y.; Inoue, O.; Arai, K.; Yamaguchi, S.

    2013-04-01

    The scanning electron microscope (SEM) has been successfully applied to CD measurement as a promising tool for qualifying and controlling the quality of semiconductor devices in the in-line manufacturing process since 1985. Furthermore, SEM is proposed for in-die overlay monitoring in local areas that are too small to be measured by optical overlay measurement tools, now that the overlay control limit is becoming stringent and shows non-negligible dependence on device pattern layout, in-die location, and singular locations at the wafer edge, etc. In this paper, we propose a new overlay measurement and inspection system that makes effective use of in-line SEM images, in consideration of the trade-off between measurement uncertainty and measurement pattern density under each SEM condition. In parallel, we make it clear that the best hybrid overlay metrology comes from considering each tool's technology portfolio.

  15. Measurement of temperament and character in mood disorders: a model of fundamental states as personality types.

    Science.gov (United States)

    Cloninger, C R; Bayon, C; Svrakic, D M

    1998-10-01

    Personality assessment may allow reliable measurement of risk of mood disorders. A group of adults (804) representative of the general population were assessed by questionnaire. Personality types were measured by the Temperament and Character Inventory (TCI). Specific TCI configurations define personality types that can be described as hyperthymic, cyclothymic, irritable, and depressive. Each type had a unique profile of emotions, suicide attempts, and hospitalization. TCI traits are associated with mood disorders. Different ways of measuring Kraepelinean subtypes may disagree. Whether differences in personality cause psychopathology, or vice versa, remains uncertain. Personality profiles help in assessing suicidality and planning treatment.

  16. Theory and Application of an Economic Performance Measure of Risk

    NARCIS (Netherlands)

    C. Niu (Cuizhen); X. Guo (Xu); M.J. McAleer (Michael); W.-K. Wong (Wing-Keung)

    2017-01-01

    Homm and Pigorsch (2012a) use the Aumann and Serrano index to develop a new economic performance measure (EPM), which is well known to have advantages over other measures. In this paper, we extend the theory by constructing a one-sample confidence interval of EPM, and construct

  17. Theory of “Weak Value" and Quantum Mechanical Measurements

    OpenAIRE

    Shikano, Yutaka

    2012-01-01

    Comment: to be published from "Measurements in Quantum Mechanics", edited by M. R. Pahlavani (InTech, 2012) Chapter 4 page 75. Yutaka Shikano (2012). ISBN: 978-953-51-0058-4 Available from: http://www.intechopen.com/articles/show/title/theory-of-weak-value-and-quantum-mechanical-measurement

  18. Measurement Invariance: A Foundational Principle for Quantitative Theory Building

    Science.gov (United States)

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    This article describes why measurement invariance is a critical issue to quantitative theory building within the field of human resource development. Readers will learn what measurement invariance is and how to test for its presence using techniques that are accessible to applied researchers. Using data from a LibQUAL+[TM] study of user…

  19. Practical application of the theory of errors in measurement

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    This chapter addresses the practical application of the theory of errors in measurement. The topics of the chapter include fixing on a maximum desired error, selecting a maximum error, the procedure for limiting the error, utilizing a standard procedure, setting specifications for a standard procedure, and selecting the number of measurements to be made

  20. Hydrogen-Helium Mixtures: Fundamental Measurements, Neutral Droplet Buoyancy, Evaporation, and Boiling

    Data.gov (United States)

    National Aeronautics and Space Administration — Research groups at the Marshall Space Flight Center (MSFC) and the Kennedy Space Center (KSC) have contacted our laboratory in need of experimental measurements for...

  1. The relationship between theory of mind and relational frame theory : convergence of perspective-taking measures

    OpenAIRE

    Hendriks, A.; Barnes-Holmes, Yvonne; McEnteggart, Ciara; de Mey, H.; Witteman, C.; Janssen, G.; Egger, J.

    2016-01-01

    Objective: Perspective-taking difficulties have been demonstrated in autism and schizophrenia spectrum disorders, among other clinical presentations, and are traditionally examined from a Theory of Mind (ToM) point of view. Relational Frame Theory (RFT) offers a behavioural and contextual interpretation of perspective-taking, proposing that this ability can be studied in more detail by examining specific perspective-taking relations. To implement relational perspective-taking measures in clin...

  2. Fundamental course of measuring. II. The electrical measuring of non-electrical parameters. Grundkurs der Messtechnik. T. 2. Das elektrische Messen nichtelektrischer Groessen

    Energy Technology Data Exchange (ETDEWEB)

    Merz, L [Technische Univ. Muenchen (F.R. Germany). Lehrstuhl und Lab. fuer Steuerungs- und Regelungstechnik

    1975-01-01

    The fundamental course on the electrical measurement of non-electrical parameters aims to present current knowledge of the basic measuring methods in simple language and illustrative form. The present Part II deals especially with measuring methods in heat and process engineering in the industrial field. Following the introduction in Part A, the techniques of electrical probes are mainly described, and it is shown which mechanical probes cannot yet be replaced by electrical ones. Part C describes the techniques of measuring transducers.

  3. The Physics of Ultrabroadband Frequency Comb Generation and Optimized Combs for Measurements in Fundamental Physics

    Science.gov (United States)

    2016-07-02

    order phase-matched cascaded frequency generation, high harmonic generation, fine structure constant measurements, carrier-envelope phase stabilization, ultrafast ... MHz repetition rate are generated from a picosecond fiber laser (Pritel FFL-500) before amplification in an erbium-doped fiber amplifier (EDFA). The ... width from 1 to 36 nm with central wavelength tunable over 1527–1550 nm. The pump pulses were combined with the seed and injected into 9.5 m of Ge-doped

  4. Fundamental Attributes of the Theory of Consumption in the Work of Jean Baudrillard, Pierre Bourdieu, and George Ritzer

    Directory of Open Access Journals (Sweden)

    Sanja Stanić

    2016-07-01

    Full Text Available The paper presents elements of the theories of J. Baudrillard, P. Bourdieu, and G. Ritzer that are relevant for the context of the development of social thought on the phenomenon of consumption. Although the three authors held different perspectives on consumption, they shared the notion of an increasingly important role of consumption in the time and society in which they acted. In their work, consumption was analysed as a salient determinant of social life. The theories of those authors are presented here as dominant within the given periods of the development of social theory on consumption. Baudrillard recognised consumption as a new and important issue, and his criticism of the consumer society was far ahead of the time in which he wrote. In his view, consumer goods are signs and consumption is a type of language. In analogy with Marx's concept of means of production, Baudrillard proposed the concept of means of consumption as consumer sites that are a synthesis of abundance and calculation. Bourdieu constructed a class theory of consumption based on cultural practices. He considered consumer behaviour to be an expression of class position. Class is determined by its position in the system of differences or distinctions from other classes based on cultural practices, objects, and taste. In Ritzer's work, consumption becomes a powerful driving force of contemporary society, with the ultimate purpose of profit-making. He explained changes in contemporary society by the process of McDonaldization that has been increasingly spreading to various areas of social life. Transformations in the structures and interpersonal relationships in contemporary society were explained by Ritzer as a change from the old to the new means of consumption. The concluding part of the paper provides an overview of the three authors' theories in the context of consumer society, emphasising their contribution to the body of theoretical analyses of the phenomenon of

  5. A criticism to the fundamental principles of physics: The problem of the quantum measurement (I)

    International Nuclear Information System (INIS)

    Mormontoy Cardenas, Oscar; Marquez Jacome, Mateo

    2008-01-01

    The collapse of the wave packet model, due to extremely fast fluctuations of the quantum field, leads to interpreting the phase speed of the harmonic waves that compose the packet as the speed of the flux of time. If one considers that the harmonic waves keep different phases, the wave packet scatters almost instantly and, as a consequence, the energy of the quantum system could be measured with absolute exactitude at a given time. These results suggest that time would be a superforce that would ultimately determine the events of the universe and be responsible for the intrinsic pulsations observable in physical systems. (author)

  6. Low-energy experiments that measure fundamental constants and test basic symmetries

    CERN Multimedia

    CERN. Geneva. Audiovisual Unit

    2002-01-01

    Cold Antihydrogen: Are We There? Cold antihydrogen offers the possibility to precisely compare the structure of antihydrogen and hydrogen atoms, using the well-developed tools of laser spectroscopy with antihydrogen atoms cold enough to be trapped in the minimum of a magnetic field gradient. Progress made at CERN's new Antiproton Decelerator will be discussed, along with goals and aspirations, such as measuring the anti-Rydberg constant. ATRAP has observed and studied the interaction of low energy antiprotons and positrons for more than a year, and ATHENA hopes to soon make antiprotons and positrons interact as well.

  7. Passage from the fundamental tensor g_μν of the gravitation theory to the field structure g_μν of the unified theories

    International Nuclear Information System (INIS)

    Rao, J.R.; Tiwari, R.N.

    1974-01-01

    A theorem on obtaining exact solutions for a particular field structure from those of vacuum field equations of general theory as well as from some simpler solutions of unified theories is derived. With the help of this result the most general solution for the particular field structure is developed from the already known simpler solutions. The physical implications of this theorem in relation to some of the parallel work of other authors is discussed. (author)

  8. Metallurgical physics. Applications of microplasticity measurements to the fundamental study of deformation mechanisms

    International Nuclear Information System (INIS)

    Gouzou, J.; D'Haeyer, R.

    1977-01-01

    This work has resulted in a new method for treating plastic phenomena under combined stresses. The method describes any plastic deformation as a combination of shears in the six planes at 45° to the principal stresses, and it results in a satisfactory description of the macroscopic properties. A new tensile machine was built for microplasticity measurements under very low stresses. This machine includes a piston pump, driven by a synchronous electric motor, which ensures a perfectly linear stress increase, and it is equipped with a new extensometer whose sensitivity reaches 10^-8. Tests were performed on four steels, including two high-strength steels, and on pure iron. These tests revealed the existence of a microplastic component which comes into action at stresses much lower than those required for dislocation movements, and which is probably due to kink displacements. Tests were also performed on four ferritic alloys with various silicon and manganese contents. The linear microstrains were measured at various temperatures and for various rates of stress increase, with and without interstitial elements

  9. Measurements of the fundamental thermodynamic parameters of Li/BCX and Li/SOCl2 cells

    Science.gov (United States)

    Kalu, E. E.; White, R. E.; Darcy, E. C.

    1992-01-01

    Two experimental techniques - equilibrium or reversible cell discharge and measurement of open circuit potential as a function of temperature - are used to determine the thermodynamic data needed to estimate the heat generation characteristics of Li/BCX and Li/SOCl2 cells. The results obtained showed that the reversible cell potential, the temperature dependence of the reversible cell potential, and the thermoneutral potential of the BCX cell were 3.74 V, -0.857 +/- 0.198 mV/K, and 3.994 +/- 0.0603 V, respectively. The respective values obtained for the Li/SOCl2 cell were 3.67 V, -0.776 +/- 0.255 mV/K, and 3.893 +/- 0.0776 V. The difference between the thermoneutral potentials of the Li/BCX and Li/SOCl2 cells is attributable to the difference in their electroactive components.
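
    The reported numbers are mutually consistent with the standard electrochemical relation between the reversible potential, its temperature coefficient, and the thermoneutral potential, E_tn = E_rev - T*(dE_rev/dT). The Python check below is a minimal sketch of that relation only; the measurement temperature of 298.15 K is an assumption, so small deviations from the reported values are expected.

      # Hedged sketch: E_tn = -dH/(nF) = E_rev - T*(dE/dT), evaluated at an assumed 298.15 K
      def thermoneutral_potential(e_rev_volts, dE_dT_volts_per_K, temperature_K=298.15):
          return e_rev_volts - temperature_K * dE_dT_volts_per_K

      # Li/BCX cell: E_rev = 3.74 V, dE/dT = -0.857 mV/K
      print(thermoneutral_potential(3.74, -0.857e-3))   # ~3.996 V (reported 3.994 V)

      # Li/SOCl2 cell: E_rev = 3.67 V, dE/dT = -0.776 mV/K
      print(thermoneutral_potential(3.67, -0.776e-3))   # ~3.901 V (reported 3.893 V)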

  10. Views of a devil's advocate -- Fundamental challenges to effective field theory treatments of nuclear physics

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, T.D.

    1998-04-01

    The physics goals of the effective field theory program for nuclear phenomena are outlined. It is pointed out that there are multiple schemes for implementing EFT and it is presently not clear if any of these schemes is viable. Most of the applications of effective field theory ideas have been on nucleon-nucleon scattering. It is argued that this is little more than curve fitting and that other quantities need to be calculated to test the ideas. It is shown that EFT methods work well for certain bound state properties of the deuteron electric form factor. However, it is also shown that this success depends sensitively on the fact that the majority of the probability of the deuteron's wave function is beyond the range of the potential. This circumstance is special to the deuteron suggesting that it will be very difficult to achieve the same kinds of success for tightly bound nuclei.

  11. Nonperturbative theory of weak pre- and post-selected measurements

    Energy Technology Data Exchange (ETDEWEB)

    Kofman, Abraham G., E-mail: kofmana@gmail.com; Ashhab, Sahel; Nori, Franco

    2012-11-01

    This paper starts with a brief review of the topic of strong and weak pre- and post-selected (PPS) quantum measurements, as well as weak values, and afterwards presents original work. In particular, we develop a nonperturbative theory of weak PPS measurements of an arbitrary system with an arbitrary meter, for arbitrary initial states of the system and the meter. New and simple analytical formulas are obtained for the average and the distribution of the meter pointer variable. These formulas hold to all orders in the weak value. In the case of a mixed preselected state, in addition to the standard weak value, an associated weak value is required to describe weak PPS measurements. In the linear regime, the theory provides the generalized Aharonov–Albert–Vaidman formula. Moreover, we reveal two new regimes of weak PPS measurements: the strongly-nonlinear regime and the inverted region (the regime with a very large weak value), where the system-dependent contribution to the pointer deflection decreases with increasing the measurement strength. The optimal conditions for weak PPS measurements are obtained in the strongly-nonlinear regime, where the magnitude of the average pointer deflection is equal or close to the maximum. This maximum is independent of the measurement strength, being typically of the order of the pointer uncertainty. In the optimal regime, the small parameter of the theory is comparable to the overlap of the pre- and post-selected states. We show that the amplification coefficient in the weak PPS measurements is generally a product of two qualitatively different factors. The effects of the free system and meter Hamiltonians are discussed. We also estimate the size of the ensemble required for a measurement and identify optimal and efficient meters for weak measurements. Exact solutions are obtained for a certain class of the measured observables. These solutions are used for numerical calculations, the results of which agree with the theory
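
    For readers unfamiliar with the quantity that the linear regime of this theory generalizes, the sketch below evaluates the standard weak value A_w = <psi_f|A|psi_i>/<psi_f|psi_i> for a single qubit. The observable and the pre- and post-selected states are illustrative choices, not taken from the paper.

      import numpy as np

      # Hedged sketch of the textbook weak value for a qubit; the states below are
      # assumptions chosen so that the pre/post-selection overlap is small.
      sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)      # observable A
      psi_i = np.array([1, 1], dtype=complex) / np.sqrt(2)      # preselected state
      theta = 0.1                                               # post-selection angle
      psi_f = np.array([np.cos(np.pi/4 + theta),
                        -np.sin(np.pi/4 + theta)], dtype=complex)

      overlap = np.vdot(psi_f, psi_i)                           # <psi_f|psi_i>
      weak_value = np.vdot(psi_f, sigma_z @ psi_i) / overlap
      print(weak_value)   # large in magnitude (~ -10) because the overlap is small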

  12. Nonperturbative theory of weak pre- and post-selected measurements

    International Nuclear Information System (INIS)

    Kofman, Abraham G.; Ashhab, Sahel; Nori, Franco

    2012-01-01

    This paper starts with a brief review of the topic of strong and weak pre- and post-selected (PPS) quantum measurements, as well as weak values, and afterwards presents original work. In particular, we develop a nonperturbative theory of weak PPS measurements of an arbitrary system with an arbitrary meter, for arbitrary initial states of the system and the meter. New and simple analytical formulas are obtained for the average and the distribution of the meter pointer variable. These formulas hold to all orders in the weak value. In the case of a mixed preselected state, in addition to the standard weak value, an associated weak value is required to describe weak PPS measurements. In the linear regime, the theory provides the generalized Aharonov–Albert–Vaidman formula. Moreover, we reveal two new regimes of weak PPS measurements: the strongly-nonlinear regime and the inverted region (the regime with a very large weak value), where the system-dependent contribution to the pointer deflection decreases with increasing the measurement strength. The optimal conditions for weak PPS measurements are obtained in the strongly-nonlinear regime, where the magnitude of the average pointer deflection is equal or close to the maximum. This maximum is independent of the measurement strength, being typically of the order of the pointer uncertainty. In the optimal regime, the small parameter of the theory is comparable to the overlap of the pre- and post-selected states. We show that the amplification coefficient in the weak PPS measurements is generally a product of two qualitatively different factors. The effects of the free system and meter Hamiltonians are discussed. We also estimate the size of the ensemble required for a measurement and identify optimal and efficient meters for weak measurements. Exact solutions are obtained for a certain class of the measured observables. These solutions are used for numerical calculations, the results of which agree with the theory

  13. The theory and measurement of partial discharge transients

    DEFF Research Database (Denmark)

    Pedersen, Aage; Crichton, George C; McAllister, Iain Wilson

    1991-01-01

    A theoretical approach to partial discharge transients is presented. This approach is based on the relationship between the charge induced on the measurement electrode and the charges created in the interelectrode volume during partial discharge activity. The primary sources for these induced charges are ... The application of the theory to electrode systems of practical interest is illustrated. A discussion of the salient features and practical aspects of the theory is included ...

  14. On the Interpretation of Measurement Within the Quantum Theory

    Science.gov (United States)

    Cooper, Leon N.; Van Vechten, Deborah

    1969-01-01

    An interpretation of the process of measurement is proposed which can be placed wholly within the quantum theory. The entire system, including the apparatus and even the mind of the observer, can be considered to develop according to the Schrodinger equation. (RR)

  15. Readability of Orthopaedic Patient-reported Outcome Measures: Is There a Fundamental Failure to Communicate?

    Science.gov (United States)

    Perez, Jorge L; Mosher, Zachary A; Watson, Shawna L; Sheppard, Evan D; Brabston, Eugene W; McGwin, Gerald; Ponce, Brent A

    2017-08-01

    Patient-reported outcome measures (PROMs) are increasingly used to quantify patients' perceptions of functional ability. The American Medical Association and the NIH suggest patient materials be written at or below the 6th and 8th grade reading levels, respectively, yet one recent study asserts that few PROMs comply with these recommendations, and suggests that the majority of PROMs are written at too high a reading level for self-administered patient use. Notably, that study was limited by its use of only one readability algorithm, although there is no commonly accepted, standard readability algorithm for healthcare-related materials. Our study, using multiple readability equations and giving equal weight to each, aims to yield a broader, all-encompassing estimate of readability, thereby offering a more accurate assessment of the readability of orthopaedic PROMs. (1) What proportion of orthopaedic-related PROMs and orthopaedic-related portions of the NIH Patient Reported Outcomes Measurement Information System (PROMIS®) are written at or below the 6th and 8th grade levels? (2) Is there a correlation between the number of questions in the PROM and reading level? (3) Using systematic edits based on guidelines from the Centers for Medicare and Medicaid Services, what proportion of PROMs achieved American Medical Association and NIH-recommended reading levels? Eighty-six (86) independent orthopaedic and general wellness PROMs, drawn from commonly referenced orthopaedic websites and prior studies, were chosen for analysis. Additionally, owing to their increasing use in orthopaedics, four relevant short forms and 11 adult physical health question banks from the PROMIS® were included for analysis. All documents were analyzed for reading grade level using 19 unique readability algorithms. Descriptive statistics were performed using SPSS Version 22.0. The majority of the independent PROMs (64 of 86; 74%) were written at or below the 6th grade level, with 81 of 86

  16. Quantum dissipative systems from theory of continuous measurements

    International Nuclear Information System (INIS)

    Mensky, Michael B.; Stenholm, Stig

    2003-01-01

    We apply the restricted-path-integral (RPI) theory of non-minimally disturbing continuous measurements for correct description of frictional Brownian motion. The resulting master equation is automatically of the Lindblad form, so that the difficulties typical of other approaches do not exist. In the special case of harmonic oscillator the known familiar master equation describing its frictionally driven Brownian motion is obtained. A thermal reservoir as a measuring environment is considered

  17. Performance Measurement, Expectancy and Agency Theory: An Experimental Study

    OpenAIRE

    Randolph Sloof; Mirjam van Praag

    2007-01-01

    Theoretical analyses of (optimal) performance measures are typically performed within the realm of the linear agency model. An important implication of this model is that, for a given compensation scheme, the agent's optimal effort choice is unrelated to the amount of noise in the performance measure. In contrast, expectancy theory as developed by psychologists predicts that effort levels are increasing in the signal-to-noise ratio. We conduct a real effort laboratory experiment to assess the...

  18. ŽAMPA’S SYSTEMS THEORY: A COMPREHENSIVE THEORY OF MEASUREMENT IN DYNAMIC SYSTEMS

    Directory of Open Access Journals (Sweden)

    Renata Rychtáriková

    2018-04-01

    Full Text Available The article outlines, in memoriam, Prof. Pavel Žampa's concepts of system theory, which enable us to devise a measurement in dynamic systems independently of the particular system behaviour. From the point of view of Žampa's theory, terms like system time, system attributes, system link, system element, input, output, sub-systems, and state variables are defined. In the Conclusions, Žampa's theory is discussed together with other mathematical approaches to qualitative dynamics known since the 19th century. In the Appendices, we present applications of Žampa's technical approach to the measurement of complex dynamical (chemical and biological) systems at the Institute of Complex Systems, University of South Bohemia in České Budějovice.

  19. Development of a New Fundamental Measuring Technique for the Accurate Measurement of Gas Flowrates by Means of Laser Doppler Anemometry

    Science.gov (United States)

    Dopheide, D.; Taux, G.; Krey, E.-A.

    1990-01-01

    In the Physikalisch-Technische Bundesanstalt (PTB), a research test facility for the accurate measurement of gas (volume and mass) flowrates has been set up in the last few years on the basis of a laser Doppler anemometer (LDA) with a view to directly measuring gas flowrates with a relative uncertainty of only 0.1%. To achieve this, it was necessary to develop laser Doppler anemometry into a precision measuring technique and to carry out detailed investigations on stationary low-turbulence nozzle flow. The process-computer controlled test facility covers the flowrate range from 100 to 4000 m3/h (~0.03 - 1.0 m3/s), any flowrate being measured directly, immediately and without staggered arrangement of several flow meters. After the development was completed, several turbine-type gas meters were calibrated and international comparisons carried out. The article surveys the most significant aspects of the work and provides an outlook on future developments with regard to the miniaturization of optical flow and flowrate sensors for industrial applications.
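
    As an illustration of the measurement principle (not the PTB facility's exact procedure), a dual-beam LDA converts the measured Doppler burst frequency into a local velocity via v = f_D * lambda / (2 * sin(theta/2)), and a volume flowrate follows from integrating the velocity profile over the nozzle cross-section. The wavelength, beam angle, nozzle radius, and profile shape below are assumed example values.

      import numpy as np

      LAMBDA_LASER = 514.5e-9      # m, laser wavelength (assumed)
      THETA = np.deg2rad(10.0)     # full angle between the two beams (assumed)

      def velocity_from_doppler(f_doppler_hz):
          # Dual-beam (fringe-mode) LDA relation: v = f_D * lambda / (2 sin(theta/2))
          return f_doppler_hz * LAMBDA_LASER / (2.0 * np.sin(THETA / 2.0))

      def volume_flowrate(radii_m, velocities_m_s):
          # Integrate v(r) over an axisymmetric cross-section: Q = int 2*pi*r*v(r) dr
          return np.trapz(2.0 * np.pi * radii_m * velocities_m_s, radii_m)

      r = np.linspace(0.0, 0.05, 51)                              # nozzle radius 50 mm (assumed)
      v = velocity_from_doppler(3.0e6) * (1.0 - (r / 0.05) ** 8)  # flat-ish example profile
      print(volume_flowrate(r, v), "m^3/s")                       # ~0.06 m^3/s, within 100-4000 m^3/h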

  20. Review of the fundamental theories behind small angle X-ray scattering, molecular dynamics simulations, and relevant integrated application

    Directory of Open Access Journals (Sweden)

    Lauren Boldon

    2015-02-01

    Full Text Available In this paper, the fundamental concepts and equations necessary for performing small angle X-ray scattering (SAXS) experiments, molecular dynamics (MD) simulations, and MD-SAXS analyses were reviewed. Furthermore, several key biological and non-biological applications for SAXS, MD, and MD-SAXS are presented in this review; however, this article does not cover all possible applications. SAXS is an experimental technique used for the analysis of a wide variety of biological and non-biological structures. SAXS utilizes spherical averaging to produce one- or two-dimensional intensity profiles, from which structural data may be extracted. MD simulation is a computer simulation technique that is used to model complex biological and non-biological systems at the atomic level. MD simulations apply classical Newtonian mechanics’ equations of motion to perform force calculations and to predict the theoretical physical properties of the system. This review presents several applications that highlight the ability of both SAXS and MD to study protein folding and function in addition to non-biological applications, such as the study of mechanical, electrical, and structural properties of non-biological nanoparticles. Lastly, the potential benefits of combining SAXS and MD simulations for the study of both biological and non-biological systems are demonstrated through the presentation of several examples that combine the two techniques.

  1. Review of the fundamental theories behind small angle X-ray scattering, molecular dynamics simulations, and relevant integrated application.

    Science.gov (United States)

    Boldon, Lauren; Laliberte, Fallon; Liu, Li

    2015-01-01

    In this paper, the fundamental concepts and equations necessary for performing small angle X-ray scattering (SAXS) experiments, molecular dynamics (MD) simulations, and MD-SAXS analyses were reviewed. Furthermore, several key biological and non-biological applications for SAXS, MD, and MD-SAXS are presented in this review; however, this article does not cover all possible applications. SAXS is an experimental technique used for the analysis of a wide variety of biological and non-biological structures. SAXS utilizes spherical averaging to produce one- or two-dimensional intensity profiles, from which structural data may be extracted. MD simulation is a computer simulation technique that is used to model complex biological and non-biological systems at the atomic level. MD simulations apply classical Newtonian mechanics' equations of motion to perform force calculations and to predict the theoretical physical properties of the system. This review presents several applications that highlight the ability of both SAXS and MD to study protein folding and function in addition to non-biological applications, such as the study of mechanical, electrical, and structural properties of non-biological nanoparticles. Lastly, the potential benefits of combining SAXS and MD simulations for the study of both biological and non-biological systems are demonstrated through the presentation of several examples that combine the two techniques.
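
    As one concrete example of extracting structural data from a 1D SAXS intensity profile (the reviews above do not prescribe this particular analysis), the Guinier approximation I(q) ≈ I(0) exp(-q^2 Rg^2 / 3) gives the radius of gyration Rg from the slope of ln I versus q^2 at low q. A minimal sketch with synthetic data:

      import numpy as np

      def guinier_radius(q, intensity, q_rg_max=1.3):
          """Fit ln I = ln I0 - (Rg^2/3) q^2 over the low-q (Guinier) region."""
          q = np.asarray(q); intensity = np.asarray(intensity)
          rg, intercept = 0.0, 0.0
          for _ in range(3):   # iterate so the fit window satisfies q*Rg <= q_rg_max
              mask = (q * rg <= q_rg_max) if rg > 0 else np.ones_like(q, dtype=bool)
              slope, intercept = np.polyfit(q[mask] ** 2, np.log(intensity[mask]), 1)
              rg = np.sqrt(-3.0 * slope)
          return rg, np.exp(intercept)

      # Synthetic scattering curve for a particle with Rg = 2.5 nm and I(0) = 100:
      q = np.linspace(0.05, 1.0, 100)                 # 1/nm
      I = 100.0 * np.exp(-(q * 2.5) ** 2 / 3.0)
      print(guinier_radius(q, I))                     # ~ (2.5, 100.0)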

  2. Fundamental physics in particle traps

    International Nuclear Information System (INIS)

    Quint, Wolfgang; Vogel, Manuel

    2014-01-01

    The individual topics are covered by leading experts in the respective fields of research, providing readers with the present theory and experiments in this field; the book is a useful reference for researchers. This volume provides detailed insight into the field of precision spectroscopy and fundamental physics with particles confined in traps. It comprises experiments with electrons and positrons, protons and antiprotons, antimatter and highly charged ions, together with the corresponding theoretical background. Such investigations represent stringent tests of quantum electrodynamics and the Standard Model, antiparticle and antimatter research, and tests of fundamental symmetries, constants, and their possible variations with time and space. They are key to various aspects of metrology such as mass measurements and time standards, and they promise further developments in quantum information processing. The reader obtains a valuable source of information suited for beginners and experts with an interest in fundamental studies using particle traps.

  3. Assessment of Student Performance for Course Examination Using Rasch Measurement Model: A Case Study of Information Technology Fundamentals Course

    Directory of Open Access Journals (Sweden)

    Amir Mohamed Talib

    2018-01-01

    Full Text Available This paper describes a measurement model that is used to measure student performance in the final examination of the Information Technology (IT) Fundamentals (IT280) course in the Information Technology (IT) Department, College of Computer & Information Sciences (CCIS), Al-Imam Mohammad Ibn Saud Islamic University (IMSIU). The assessment model is developed based on students' mark entries of final exam results for the second-year IT students, which are compiled and tabulated for evaluation using the Rasch Measurement Model, and it can be used to measure the students' performance towards the final examination of the course. A study on 150 second-year students (male = 52; female = 98) was conducted to measure students' knowledge and understanding of the IT280 course according to the three levels of Bloom's Taxonomy. The results concluded that students can be categorized as poor (10%), moderate (42%), good (18%), and successful (24%) in achieving Level 3 of Bloom's Taxonomy. This study shows that the students' performance on the set of IT280 final exam questions was comparatively good. The results generated from this study can be used to guide the determination of appropriate improvements to the teaching method and the quality of the questions prepared.
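
    For reference, the dichotomous Rasch model underlying such an analysis gives the probability of a correct response as P = exp(theta - b) / (1 + exp(theta - b)), where theta is the student's ability and b the item difficulty, both on the same logit scale. The sketch below uses illustrative difficulties, not the calibrated IT280 values.

      import math

      def rasch_probability(theta, difficulty):
          # P(correct) = exp(theta - b) / (1 + exp(theta - b))
          return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

      def expected_score(theta, difficulties):
          """Expected raw exam score for a student of ability theta (in logits)."""
          return sum(rasch_probability(theta, b) for b in difficulties)

      item_difficulties = [-1.0, -0.5, 0.0, 0.5, 1.0, 1.5]   # logits (assumed)
      for theta in (-1.0, 0.0, 1.0):
          print(theta, round(expected_score(theta, item_difficulties), 2))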

  4. Scoring and Classifying Examinees Using Measurement Decision Theory

    Directory of Open Access Journals (Sweden)

    Lawrence M. Rudner

    2009-04-01

    Full Text Available This paper describes and evaluates the use of measurement decision theory (MDT) to classify examinees based on their item response patterns. The model has a simple framework that starts with the conditional probabilities of examinees in each category or mastery state responding correctly to each item. The presented evaluation investigates: (1) the classification accuracy of tests scored using decision theory; (2) the effectiveness of different sequential testing procedures; and (3) the number of items needed to make a classification. A large percentage of examinees can be classified accurately with very few items using decision theory. A Java Applet for self-instruction and software for generating, calibrating and scoring MDT data are provided.
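
    The classification step described above can be sketched directly: given the conditional probability that an examinee in each mastery state answers each item correctly, a response pattern is scored by computing the posterior probability of each state and taking the maximum. The item probabilities and priors below are illustrative assumptions, not values from the paper.

      import numpy as np

      P_CORRECT = np.array([        # rows: items; columns: states (non-master, master)
          [0.30, 0.80],
          [0.25, 0.75],
          [0.40, 0.90],
          [0.35, 0.85],
      ])
      PRIORS = np.array([0.5, 0.5])

      def classify(responses):
          """responses: 1 = correct, 0 = incorrect, one entry per item."""
          likelihood = np.ones(P_CORRECT.shape[1])
          for item, r in enumerate(responses):
              likelihood *= P_CORRECT[item] if r == 1 else (1.0 - P_CORRECT[item])
          posterior = likelihood * PRIORS
          return posterior / posterior.sum()      # argmax gives the assigned state

      print(classify([1, 1, 0, 1]))               # posterior over (non-master, master)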

  5. The calculus of variations on jet bundles as a universal approach for a variational formulation of fundamental physical theories

    Directory of Open Access Journals (Sweden)

    Musilová Jana

    2016-12-01

    Full Text Available As widely accepted, and justified by the historical development of physics, the background for the standard formulation of the postulates of physical theories leading to equations of motion, or even the form of the equations of motion themselves, comes from empirical experience. Equations of motion are then a starting point for obtaining specific conservation laws, as, for example, the well-known conservation laws of momenta and mechanical energy in mechanics. On the other hand, there are numerous examples of physical laws or equations of motion which can be obtained from a certain variational principle as Euler-Lagrange equations and their solutions, meaning that the "true trajectories" of the physical systems represent stationary points of the corresponding functionals.
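
    In local coordinates, the stationarity condition referred to above takes the familiar first-order form (a standard statement, quoted here for orientation rather than taken from the paper):

      \delta \int_{t_1}^{t_2} L(t, q^\sigma, \dot q^\sigma)\, dt = 0
      \quad \Longleftrightarrow \quad
      \frac{\partial L}{\partial q^\sigma}
        - \frac{d}{dt} \frac{\partial L}{\partial \dot q^\sigma} = 0 .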

  6. Prediction and measurement of the reflection of the fundamental anti-symmetric Lamb wave from cracks and notches

    International Nuclear Information System (INIS)

    Lowe, M.J.S.; Cawley, P.; Kao, J-Y; Diligent, O.

    2000-01-01

    The interaction of the fundamental antisymmetric Lamb wave (a0) with cracks and with notches of different depth and width has been investigated both experimentally and by finite element analysis. Excellent agreement between the predictions and the measurements has been obtained. It has been shown that the reflection coefficient is a function of both the notch width to wavelength and notch depth to wavelength ratios. Both the relationship between the reflection coefficient and notch depth, and the frequency dependence of the reflection coefficient, are very different for the a0 mode compared to the s0 mode which was studied earlier. Physical insight into the reasons for the different behavior is given by examination of the stress fields and opening displacements at the crack or notch

  7. Measure-valued differentiation for finite products of measures : theory and applications

    NARCIS (Netherlands)

    Leahu, H.

    2008-01-01

    In this dissertation we perform a comprehensive analysis of measure-valued differentiation, in which weak differentiation of parameter-dependent probability measures plays a central role. We develop a theory of weak differentiation of measures and show that classical concepts such as differential

  8. Fundamental limits of measurement in telecommunications: Experimental and modeling studies in a test optical network on proposal for the reform of telecommunication quantitations

    International Nuclear Information System (INIS)

    Egan, James; McMillan, Normal; Denieffe, David

    2011-01-01

    Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the area of chemical metrology for the field of telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits of measurement. These are the critical level, the detection limit and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures and discussion is given into the advantages of using these fundamental quantitations over existing methods.

  9. Fundamental limits of measurement in telecommunications: Experimental and modeling studies in a test optical network on proposal for the reform of telecommunication quantitations

    Energy Technology Data Exchange (ETDEWEB)

    Egan, James; McMillan, Normal; Denieffe, David, E-mail: eganj@itcarlow.ie [IT Carlow (Ireland)

    2011-08-17

    Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the area of chemical metrology for the field of telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits of measurement. These are the critical level, the detection limit and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures and discussion is given into the advantages of using these fundamental quantitations over existing methods.
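
    In their simplest chemical-metrology form (normally distributed noise, known blank standard deviation sigma0, alpha = beta = 0.05), Currie's three limits can be written down directly. The paper's adaptation of these quantitations to optical-network measurements may differ in detail, so the constants below are the conventional ones and the noise figure is an assumed example.

      Z_95 = 1.645   # one-sided 95% quantile of the standard normal distribution

      def currie_limits(sigma0):
          critical_level      = Z_95 * sigma0         # decision threshold: "signal present"
          detection_limit     = 2.0 * Z_95 * sigma0   # true signal detected with 95% power
          determination_limit = 10.0 * sigma0         # ~10% relative standard deviation
          return critical_level, detection_limit, determination_limit

      # e.g. a received-power measurement with sigma0 = 0.2 dB of noise (assumed):
      print(currie_limits(0.2))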

  10. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    Energy Technology Data Exchange (ETDEWEB)

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

    The objectives of this project are to advance the fundamental understanding of the hydrodynamics by systematically investigating the effect of design and operating variables, to evaluate the reported dimensionless groups as scaling factors, and to establish a reliable scale-up methodology for the TRISO fuel particle spouted bed coaters based on hydrodynamic similarity via advanced measurement and computational techniques. An additional objective is to develop an on-line non-invasive measurement technique based on gamma ray densitometry (i.e., Nuclear Gauge Densitometry) that can be installed and used for coater process monitoring to ensure proper performance and operation and to facilitate the developed scale-up methodology. To achieve the objectives set for the project, the work will use optical probes and gamma ray computed tomography (CT) (for the measurements of solids/voidage holdup cross-sectional distribution and radial profiles along the bed height, spouted diameter, and fountain height) and radioactive particle tracking (RPT) (for the measurements of the 3D solids flow field, velocity, turbulent parameters, circulation time, solids Lagrangian trajectories, and many other spouted-bed-related hydrodynamic parameters). In addition, gas dynamic measurement techniques and pressure transducers will be utilized to complement the obtained information. The measurements obtained by these techniques will be used as benchmark data to evaluate and validate the computational fluid dynamic (CFD) models (two-fluid model or discrete particle model) and their closures. The validated CFD models and closures will be used to facilitate the developed methodology for scale-up, design and hydrodynamic similarity. Successful execution of this work and the proposed tasks will advance the fundamental understanding of the coater flow field and quantify it for proper and safe design, scale-up, and performance. Such achievements will overcome the barriers to AGR applications and will help assure that the US maintains

  11. Inertial rotation measurement with atomic spins: From angular momentum conservation to quantum phase theory

    Science.gov (United States)

    Zhang, C.; Yuan, H.; Tang, Z.; Quan, W.; Fang, J. C.

    2016-12-01

    Rotation measurement in an inertial frame is an important technology for modern advanced navigation systems and fundamental physics research. Inertial rotation measurement with atomic spin has demonstrated potential in both high-precision applications and small-volume low-cost devices. After rapid development in the last few decades, atomic spin gyroscopes are considered a promising competitor to current conventional gyroscopes—from rate-grade to strategic-grade applications. Although it has been more than a century since the discovery of the relationship between atomic spin and mechanical rotation by Einstein [Naturwissenschaften, 3(19) (1915)], research on the coupling between spin and rotation is still a focus point. The semi-classical Larmor precession model is usually adopted to describe atomic spin gyroscope measurement principles. More recently, the geometric phase theory has provided a different view of the rotation measurement mechanism via atomic spin. The theory has been used to describe a gyroscope based on the nuclear spin ensembles in diamond. A comprehensive understanding of inertial rotation measurement principles based on atomic spin would be helpful for future applications. This work reviews different atomic spin gyroscopes and their rotation measurement principles with a historical overlook. In addition, the spin-rotation coupling mechanism in the context of the quantum phase theory is presented. The geometric phase is assumed to be the origin of the measurable rotation signal from atomic spins. In conclusion, with a complete understanding of inertial rotation measurements using atomic spin and advances in techniques, wide application of high-performance atomic spin gyroscopes is expected in the near future.
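
    A minimal version of the semi-classical Larmor-precession picture mentioned above: in a frame rotating at rate Omega about the field axis, the observed precession frequency of a spin species is shifted, omega_obs = gamma*B - Omega, so comparing two co-located species with different gyromagnetic ratios separates the rotation from the magnetic field. The gyromagnetic ratios below are rough magnitudes and the sign conventions are simplified; this is an illustration, not a working co-magnetometer model.

      import math

      GAMMA_A = 2 * math.pi * 11.8   # rad/s per microtesla (order of 129Xe, magnitude only)
      GAMMA_B = 2 * math.pi * 3.5    # rad/s per microtesla (order of 131Xe, magnitude only)

      def rotation_rate(omega_a, omega_b):
          """Solve omega_a = GAMMA_A*B - Omega and omega_b = GAMMA_B*B - Omega for Omega."""
          B = (omega_a - omega_b) / (GAMMA_A - GAMMA_B)
          return GAMMA_A * B - omega_a

      # Example: field of 1.0 uT and a true rotation of 1e-4 rad/s
      B_true, Omega_true = 1.0, 1e-4
      print(rotation_rate(GAMMA_A * B_true - Omega_true,
                          GAMMA_B * B_true - Omega_true))   # recovers ~1e-4 rad/s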

  12. Batteries. Fundamentals and theory, present state of the art of technology and trends of developments. 5. ed.; Batterien. Grundlagen und Theorie, aktueller technischer Stand und Entwicklungstendenzen

    Energy Technology Data Exchange (ETDEWEB)

    Kiehne, H.A.; Berndt, D.; Fischer, W.; Franke, H.; Koenig, W.; Koethe, H.K.; Preuss, P.; Sassmannshausen, G.; Stahl, U.C.; Wehrle, E.; Will, G.; Willmes, H.

    2003-07-01

    This volume gives a comprehensive survey of the present state of the electrochemical power storage with special consideration of their technical characteristics of application. The volume is structured as follows: 1) Electrochemical energy storage, general fundamentals; 2) Batteries for electric-powered industrial trucks; 3) Energy supply concepts for driverless industrial trucks; 4) Batteries for electric-powered road vehicles; 5) Battery-fed electric drive from the user's point of view (=charging, maintenance); 6) Safety standards for stationary batteries and battery systems; 7) Batteries for stationary power supplies; 8) Battery operation from the user's point of view; 9) Starter batteries of vehicles; 10) High-energy batteries (e.g. Zn/Br{sub 2}-, Na/S-, Li/FeS-cells, fuel cells); 11) Solar-electric power supply with batteries; 12) Charging methods and charging technique; 13) Technology of battery chargers and current transformer, monitoring methods; 14) Standards and regulations for batteries and battery systems.

  13. Batteries. Fundamentals and theory, present state of the art of technology and trends of development. 4. compl. rev. ed.; Batterien. Grundlagen und Theorie, aktueller technischer Stand und Entwicklungstendenzen

    Energy Technology Data Exchange (ETDEWEB)

    Kiehne, H.A.; Berndt, D.; Fischer, W. [and others

    2000-07-01

    This volume gives a comprehensive survey of the present state of the electrochemical power storage with special consideration of their technical characteristics of application. The volume is structured as follows: 1) Electrochemical energy storage, general fundamentals; 2) Batteries for electric-powered industrial trucks; 3) Energy supply concepts for driverless industrial trucks; 4) Batteries for electric-powered road vehicles; 5) Battery-fed electric drive from the user's point of view (=charging, maintenance); 6) Safety standards for stationary batteries and battery systems; 7) Batteries for stationary power supplies; 8) Battery operation from the user's point of view; 9) Starter batteries of vehicles; 10) High-energy batteries (e.g. Zn/Br{sub 2}-, Na/S-, Li/FeS-cells, fuel cells); 11) Solar-electric power supply with batteries; 12) Charging methods and charging technique; 13) Technology of battery chargers and current transformer, monitoring methods; 14) Standards and regulations for batteries and battery systems.

  14. Real analysis measure theory, integration, and Hilbert spaces

    CERN Document Server

    Stein, Elias M

    2005-01-01

    Real Analysis is the third volume in the Princeton Lectures in Analysis, a series of four textbooks that aim to present, in an integrated manner, the core areas of analysis. Here the focus is on the development of measure and integration theory, differentiation and integration, Hilbert spaces, and Hausdorff measure and fractals. This book reflects the objective of the series as a whole: to make plain the organic unity that exists between the various parts of the subject, and to illustrate the wide applicability of ideas of analysis to other fields of mathematics and science. After

  15. DOE Fundamentals Handbook: Electrical Science, Volume 1

    International Nuclear Information System (INIS)

    1992-06-01

    The Electrical Science Fundamentals Handbook was developed to assist nuclear facility operating contractors provide operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of electrical theory, terminology, and application. The handbook includes information on alternating current (AC) and direct current (DC) theory, circuits, motors, and generators; AC power and reactive components; batteries; AC and DC voltage regulators; transformers; and electrical test instruments and measuring devices. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility electrical equipment

  16. DOE Fundamentals Handbook: Electrical Science, Volume 3

    International Nuclear Information System (INIS)

    1992-06-01

    The Electrical Science Fundamentals Handbook was developed to assist nuclear facility operating contractors provide operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of electrical theory, terminology, and application. The handbook includes information on alternating current (AC) and direct current (DC) theory, circuits, motors and generators; AC power and reactive components; batteries; AC and DC voltage regulators; transformers; and electrical test instruments and measuring devices. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility electrical equipment

  17. DOE Fundamentals Handbook: Electrical Science, Volume 4

    International Nuclear Information System (INIS)

    1992-06-01

    The Electrical Science Fundamentals Handbook was developed to assist nuclear facility operating contractors provide operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of electrical theory, terminology, and application. The handbook includes information on alternating current (AC) and direct current (DC) theory, circuits, motors, and generators; AC power and reactive components; batteries; AC and DC voltage regulators; transformers; and electrical test instruments and measuring devices. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility electrical equipment

  18. Educational measurement for applied researchers theory into practice

    CERN Document Server

    Wu, Margaret; Jen, Tsung-Hau

    2016-01-01

    This book is a valuable read for a diverse group of researchers and practitioners who analyze assessment data and construct test instruments. It focuses on the use of classical test theory (CTT) and item response theory (IRT), which are often required in the fields of psychology (e.g. for measuring psychological traits), health (e.g. for measuring the severity of disorders), and education (e.g. for measuring student performance), and makes these analytical tools accessible to a broader audience. Having taught assessment subjects to students from diverse backgrounds for a number of years, the three authors have a wealth of experience in presenting educational measurement topics, in-depth concepts and applications in an accessible format. As such, the book addresses the needs of readers who use CTT and IRT in their work but do not necessarily have an extensive mathematical background. The book also sheds light on common misconceptions in applying measurement models, and presents an integrated approach to differ...

  19. GRAPH THEORY APPROACH TO QUANTIFY UNCERTAINTY OF PERFORMANCE MEASURES

    Directory of Open Access Journals (Sweden)

    Sérgio D. Sousa

    2015-03-01

    Full Text Available In this work, the performance measurement process is studied to quantify the uncertainty induced in the resulting performance measure (PM). To that end, the causes of uncertainty are identified, analysing the activities undertaken in the three following stages of the performance measurement process: design and implementation, data collection and record, and determination and analysis. A quantitative methodology based on graph theory and on the sources of uncertainty of the performance measurement process is used to calculate an uncertainty index to evaluate the level of uncertainty of a given PM or (key) performance indicator. An application example is presented. The quantification of PM uncertainty could contribute to better represent the risk associated with a given decision and also to improve the PM to increase its precision and reliability.

  20. Density measurement using gamma radiation - theory and application

    International Nuclear Information System (INIS)

    Springer, E.K.

    1979-01-01

    There are still widespread uncertainties about the use and safety of gamma radiation in industry. This paper describes, using the example of radiometric density measurement, the theory of gamma radiation. The differences and advantages of both types of detectors, the ionization chamber and the scintillation counter, are discussed. The degree of accuracy which can be expected from the radiometric density meter is defined, and the inter-relationship between source strength, measuring range, and measuring length (normally the pipe diameter) in relation to the required measuring accuracy is explained in detail. The use of radioactive material requires the permission of the Atomic Energy Board. The formalities involved in receiving a user's licence and the implementation of safety standards set by the local authorities are discussed in depth
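
    The relation such a density meter exploits is simple narrow-beam attenuation: the count rate falls off as I = I0 * exp(-mu_m * rho * d) across a pipe of inner diameter d, so the density follows from the measured and the empty-pipe count rates. The mass attenuation coefficient and geometry below are assumed example values (roughly water-like media at Cs-137 energies), not figures from the paper.

      import math

      MU_M = 0.0086   # m^2/kg, mass attenuation coefficient (assumed)
      D = 0.10        # m, pipe inner diameter / measuring length (assumed)

      def density_from_counts(count_rate, count_rate_empty):
          """Density in kg/m^3 from the attenuated and empty-pipe count rates."""
          return math.log(count_rate_empty / count_rate) / (MU_M * D)

      print(density_from_counts(8470.0, 20000.0))   # ~999 kg/m^3 for these example rates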

  1. Fundamentals of structural dynamics

    CERN Document Server

    Craig, Roy R

    2006-01-01

    From theory and fundamentals to the latest advances in computational and experimental modal analysis, this is the definitive, updated reference on structural dynamics. This edition updates Professor Craig's classic introduction to structural dynamics, which has been an invaluable resource for practicing engineers and a textbook for undergraduate and graduate courses in vibrations and/or structural dynamics. Along with comprehensive coverage of structural dynamics fundamentals, finite-element-based computational methods, and dynamic testing methods, this Second Edition includes new and e

  2. A short course on measure and probability theories

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre

    2004-02-01

    This brief Introduction to Measure Theory, and its applications to Probabilities, corresponds to the lecture notes of a seminar series given at Sandia National Laboratories in Livermore, during the spring of 2003. The goal of these seminars was to provide a minimal background to Computational Combustion scientists interested in using more advanced stochastic concepts and methods, e.g., in the context of uncertainty quantification. Indeed, most mechanical engineering curricula do not provide students with formal training in the field of probability, and even less in measure theory. However, stochastic methods have been used more and more extensively in the past decade, and have provided more successful computational tools. Scientists at the Combustion Research Facility of Sandia National Laboratories have been using computational stochastic methods for years. Addressing more and more complex applications, and facing difficult problems that arose in applications showed the need for a better understanding of theoretical foundations. This is why the seminar series was launched, and these notes summarize most of the concepts which have been discussed. The goal of the seminars was to bring a group of mechanical engineers and computational combustion scientists to a full understanding of N. WIENER'S polynomial chaos theory. Therefore, these lecture notes are built along those lines, and are not intended to be exhaustive. In particular, the author welcomes any comments or criticisms.
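
    For orientation, the polynomial chaos expansion that these seminars build toward can be written in its standard textbook form (general background, not a formula quoted from the notes themselves): a random quantity u depending on basic random variables ξ is expanded in polynomials Ψ_k that are orthogonal with respect to the density ρ of ξ,

```latex
u(\xi) = \sum_{k=0}^{P} u_k\,\Psi_k(\xi), \qquad
u_k = \frac{\langle u\,\Psi_k\rangle}{\langle \Psi_k^{2}\rangle}, \qquad
\langle \Psi_j\,\Psi_k\rangle = \int \Psi_j(\xi)\,\Psi_k(\xi)\,\rho(\xi)\,\mathrm{d}\xi = 0 \quad (j \neq k).
```

    In Wiener's original Gaussian setting the Ψ_k are Hermite polynomials.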

  3. Fundamentals of ergonomic exoskeleton robots

    NARCIS (Netherlands)

    Schiele, A.

    2008-01-01

    This thesis is the first to provide the fundamentals of ergonomic exoskeleton design. The fundamental theory as well as technology necessary to analyze and develop ergonomic wearable robots interacting with humans is established and validated by experiments and prototypes. The fundamentals are (1) a

  4. Interpreting Measures of Fundamental Movement Skills and Their Relationship with Health-Related Physical Activity and Self-Concept

    Science.gov (United States)

    Jarvis, Stuart; Williams, Morgan; Rainer, Paul; Jones, Eleri Sian; Saunders, John; Mullen, Richard

    2018-01-01

    The aims of this study were to determine proficiency levels of fundamental movement skills using cluster analysis in a cohort of U.K. primary school children; and to further examine the relationships between fundamental movement skills proficiency and other key aspects of health-related physical activity behavior. Participants were 553 primary…

  5. Understanding modern physics by symmetry. A new approach to the fundamental theories; Durch Symmetrie die moderne Physik verstehen. Ein neuer Zugang zu den fundamentalen Theorien

    Energy Technology Data Exchange (ETDEWEB)

    Schwichtenberg, Jakob

    2017-09-01

    The following topics are dealt with: Special relativity theory, theory of Lie groups, the Lagrange formalism for field theories, quantum operators, quantum wave equations, the theory of interactions, quantum mechanics, quantum field theory, classical mechanics, electrodynamics. (HSI)

  6. Fundamentals of statistics

    CERN Document Server

    Mulholland, Henry

    1968-01-01

    Fundamentals of Statistics covers topics on the introduction, fundamentals, and science of statistics. The book discusses the collection, organization and representation of numerical data; elementary probability; the binomial and Poisson distributions; and the measures of central tendency. The text describes measures of dispersion for measuring the spread of a distribution; continuous distributions for measuring on a continuous scale; the properties and use of normal distribution; and tests involving the normal or student's 't' distributions. The use of control charts for sample means; the ranges

  7. Quantum Measurement Theory in Gravitational-Wave Detectors

    Directory of Open Access Journals (Sweden)

    Stefan L. Danilishin

    2012-04-01

    The fast progress in improving the sensitivity of gravitational-wave detectors, which we all have witnessed in recent years, has propelled the scientific community to the point at which quantum behavior of such immense measurement devices as kilometer-long interferometers starts to matter. The time when their sensitivity will be mainly limited by the quantum noise of light is around the corner, and finding ways to reduce it will become a necessity. Therefore, the primary goal we pursued in this review was to familiarize a broad spectrum of readers with the theory of quantum measurements in the very form it finds application in the area of gravitational-wave detection. We focus on how quantum noise arises in gravitational-wave interferometers and what limitations it imposes on the achievable sensitivity. We start from the very basic concepts and gradually advance to the general linear quantum measurement theory and its application to the calculation of quantum noise in the contemporary and planned interferometric detectors of gravitational radiation of the first and second generation. Special attention is paid to the concept of the Standard Quantum Limit and the methods of its surmounting.
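
    For readers unfamiliar with the Standard Quantum Limit mentioned here, its textbook form for the displacement of a free test mass of mass m probed at frequency Ω is (standard background, not taken from the review itself)

```latex
S_x^{\mathrm{SQL}}(\Omega) = \frac{2\hbar}{m\,\Omega^{2}} ,
```

    the displacement spectral density at which the shot-noise and radiation-pressure contributions of a conventional interferometric readout are balanced.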

  8. Quantum Measurement Theory in Gravitational-Wave Detectors.

    Science.gov (United States)

    Danilishin, Stefan L; Khalili, Farid Ya

    2012-01-01

    The fast progress in improving the sensitivity of gravitational-wave detectors, which we all have witnessed in recent years, has propelled the scientific community to the point at which quantum behavior of such immense measurement devices as kilometer-long interferometers starts to matter. The time when their sensitivity will be mainly limited by the quantum noise of light is around the corner, and finding ways to reduce it will become a necessity. Therefore, the primary goal we pursued in this review was to familiarize a broad spectrum of readers with the theory of quantum measurements in the very form it finds application in the area of gravitational-wave detection. We focus on how quantum noise arises in gravitational-wave interferometers and what limitations it imposes on the achievable sensitivity. We start from the very basic concepts and gradually advance to the general linear quantum measurement theory and its application to the calculation of quantum noise in the contemporary and planned interferometric detectors of gravitational radiation of the first and second generation. Special attention is paid to the concept of the Standard Quantum Limit and the methods of its surmounting.

  9. Fundamental supply of skin blood flow in the Chinese Han population: Measurements by a full-field laser perfusion imager.

    Science.gov (United States)

    Fei, W; Xu, S; Ma, J; Zhai, W; Cheng, S; Chang, Y; Wang, X; Gao, J; Tang, H; Yang, S; Zhang, X

    2018-05-08

    Skin blood flow is believed to be linked with many diseases and shows significant heterogeneity. There are several papers on basal cutaneous microcirculation perfusion in different races, while data for the Chinese population are lacking. The aim was to establish a database of the absolute fundamental supply of skin blood flow in the Chinese Han population. With a full-field laser perfusion imager (FLPI), the skin blood flow can be quantified. Cutaneous perfusion values were determined in 17 selected skin areas in 406 healthy participants aged between 20 and 80 years (mean 35.05 ± 11.33). Essential parameters such as weight and height were also measured and values of BMI were calculated. The perfusion values are reported in Arbitrary Perfusion Units (APU). The highest cutaneous perfusion value was found on the eyelid (931.20 ± 242.59 in males and 967.83 ± 225.49 in females), and the pretibial region had the lowest value (89.09 ± 30.28 in males and 85.08 ± 33.59 in females). The values were higher in men than in women on the back of the fingertips, nose, forehead, cheek, neck and earlobe (P < .05). Perfusion values on the stretch and flexion sides of the forearm were negatively correlated with age (P = .01 and P = 4.88 × 10⁻³, respectively) in males. The abdomen was negatively correlated with BMI in both genders (P = .02). Skin blood flow values vary with skin region. There is a tendency to measure higher perfusion values in men than in women, and the values are otherwise largely independent of age and BMI. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  10. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil

    2016-12-14

    This paper considers the problem of selecting a set of $k$ measurements from $n$ available sensor observations. The selected measurements should minimize a certain error function assessing the error in estimating a certain $m$-dimensional parameter vector. The exhaustive search inspecting each of the $\binom{n}{k}$ possible choices would require a very high computational complexity and as such is not practical for large $n$ and $k$. Alternative methods with low complexity have recently been investigated but their main drawbacks are that 1) they require perfect knowledge of the measurement matrix and 2) they need to be applied at the pace of change of the measurement matrix. To overcome these issues, we consider the asymptotic regime in which $k$, $n$ and $m$ grow large at the same pace. Tools from random matrix theory are then used to approximate in closed form the most important error measures that are commonly used. The asymptotic approximations are then leveraged to properly select $k$ measurements exhibiting low values for the asymptotic error measures. Two heuristic algorithms are proposed: the first one merely consists in applying the convex optimization artifice to the asymptotic error measure. The second algorithm is a low-complexity greedy algorithm that attempts to look for a sufficiently good solution for the original minimization problem. The greedy algorithm can be applied to both the exact and the asymptotic error measures and can thus be implemented in blind and channel-aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also presented and support the efficiency of the proposed blind methods in reaching the performances of channel-aware algorithms.
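
    The paper's own algorithms rely on asymptotic random-matrix approximations that are not reproduced in this record. Purely as an illustration of the kind of greedy measurement selection being discussed, the sketch below picks k of n candidate rows of an assumed measurement matrix H so as to greedily reduce an A-optimality-style error measure, trace((H_S^T H_S)^{-1}); all names, the ridge term, and the choice of error measure are illustrative assumptions, not the authors' method.

```python
import numpy as np

def greedy_select(H, k, ridge=1e-6):
    """Greedily pick k rows of the n x m measurement matrix H that reduce
    an A-optimality style error measure, trace((H_S^T H_S)^{-1})."""
    n, m = H.shape
    selected, remaining = [], list(range(n))
    for _ in range(k):
        best_row, best_cost = None, np.inf
        for r in remaining:
            Hs = H[selected + [r]]
            # Ridge term keeps the inverse defined before m rows are chosen.
            cost = np.trace(np.linalg.inv(Hs.T @ Hs + ridge * np.eye(m)))
            if cost < best_cost:
                best_row, best_cost = r, cost
        selected.append(best_row)
        remaining.remove(best_row)
    return selected

rng = np.random.default_rng(0)
H = rng.standard_normal((40, 5))   # 40 candidate measurements, 5 parameters
print(greedy_select(H, k=8))
```

    A channel-aware variant would use the true H as above, while a blind variant would replace the exact cost by an asymptotic approximation of the error measure, in the spirit described in the abstract.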

  11. Interpretation of active neutron measurements by the heterogeneous theory

    International Nuclear Information System (INIS)

    Birkhoff, G.; Depraz, J.; Descieux, J.P.

    1979-01-01

    In this paper are presented results from a study on the application of the heterogeneous method for the interpretation of active neutron measurements. The considered apparatus consists of a cylindrical lead pile, which is provided with two axial channels: a central channel incorporates an antimony-beryllium photoneutron source and an eccentric channel serves for the insertion of the sample to be assayed for fissionable material content. The mathematical model of this apparatus is the heterogeneous group diffusion theory. Sample and source channels are described by multigroup monopolar and dipolar sources and sinks. Monopolar sources take account of neutron production within an energy group and in-scatter from upper groups. Monopolar sinks represent neutron removal by absorption within an energy group and out-scatter to lower groups. Dipole sources describe radial streaming of neutrons across the sample channel. Multigroup diffusion theory is applied throughout the lead pile. The strengths of the monopolar and dipolar sources and sinks are determined by linear extrapolation distances of azimuthal mean and first-harmonic flux values at the channels' surface. In an experiment we may measure the neutrons leaking out of the lead pile and linear extrapolation distances at the channels' surface. Such information is utilized for interpretation in terms of fission neutron source strength and mean neutron flux values in the sample. In this paper we summarize the theoretical work in progress.

  12. Reality, measurement and locality in Quantum Field Theory

    International Nuclear Information System (INIS)

    Tommasini, Daniele

    2002-01-01

    It is currently believed that the local causality of Quantum Field Theory (QFT) is destroyed by the measurement process. This belief is also based on the Einstein-Podolsky-Rosen (EPR) paradox and on the so-called Bell's theorem, that are thought to prove the existence of a mysterious, instantaneous action between distant measurements. However, I have shown recently that the EPR argument is removed, in an interpretation-independent way, by taking into account the fact that the Standard Model of Particle Physics prevents the production of entangled states with a definite number of particles. This result is used here to argue in favor of a statistical interpretation of QFT and to show that it allows for a full reconciliation with locality and causality. Within such an interpretation, as Ballentine and Jarret pointed out long ago, Bell's theorem does not demonstrate any nonlocality. (author)

  13. Measuring and modeling salience with the theory of visual attention.

    Science.gov (United States)

    Krüger, Alexander; Tünnermann, Jan; Scharlau, Ingrid

    2017-08-01

    For almost three decades, the theory of visual attention (TVA) has been successful in mathematically describing and explaining a wide variety of phenomena in visual selection and recognition with high quantitative precision. Interestingly, the influence of feature contrast on attention has been included in TVA only recently, although it has been extensively studied outside the TVA framework. The present approach further develops this extension of TVA's scope by measuring and modeling salience. An empirical measure of salience is achieved by linking different (orientation and luminance) contrasts to a TVA parameter. In the modeling part, the function relating feature contrasts to salience is described mathematically and tested against alternatives by Bayesian model comparison. This model comparison reveals that the power function is an appropriate model of salience growth in the dimensions of orientation and luminance contrast. Furthermore, if contrasts from the two dimensions are combined, salience adds up additively.
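
    To make the modelling claim concrete, the functional form favoured by the model comparison can be sketched as follows (the symbols and parameters are illustrative; the authors' exact parameterisation is not reproduced here):

```latex
s(c) = a\,c^{\,b}, \qquad
s_{\mathrm{combined}} = s_{\mathrm{orientation}}(c_{o}) + s_{\mathrm{luminance}}(c_{\ell}) ,
```

    i.e. a power function of feature contrast c in each dimension, with salience from the two dimensions combining additively when both contrasts are present.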

  14. Radiology fundamentals

    CERN Document Server

    Singh, Harjit

    2011-01-01

    ""Radiology Fundamentals"" is a concise introduction to the dynamic field of radiology for medical students, non-radiology house staff, physician assistants, nurse practitioners, radiology assistants, and other allied health professionals. The goal of the book is to provide readers with general examples and brief discussions of basic radiographic principles and to serve as a curriculum guide, supplementing a radiology education and providing a solid foundation for further learning. Introductory chapters provide readers with the fundamental scientific concepts underlying the medical use of imag

  15. Fundamentals of plasma physics

    CERN Document Server

    Bittencourt, J A

    1986-01-01

    A general introduction designed to present a comprehensive, logical and unified treatment of the fundamentals of plasma physics based on statistical kinetic theory. Its clarity and completeness make it suitable for self-learning and self-paced courses. Problems are included.

  16. Measuring Memory Reactivation With Functional MRI: Implications for Psychological Theory.

    Science.gov (United States)

    Levy, Benjamin J; Wagner, Anthony D

    2013-01-01

    Environmental cues often remind us of earlier experiences by triggering the reactivation of memories of events past. Recent evidence suggests that memory reactivation can be observed using functional MRI and that distributed pattern analyses can even provide evidence of reactivation on individual trials. The ability to measure memory reactivation offers unique and powerful leverage on theoretical issues of long-standing interest in cognitive psychology, providing a means to address questions that have proven difficult to answer with behavioral data alone. In this article, we consider three instances. First, reactivation measures can indicate whether memory-based inferences (i.e., generalization) arise through the encoding of integrated cross-event representations or through the flexible expression of separable event memories. Second, online measures of memory reactivation may inform theories of forgetting by providing information about when competing memories are reactivated during competitive retrieval situations. Finally, neural reactivation may provide a window onto the role of replay in memory consolidation. The ability to track memory reactivation, including at the individual trial level, provides unique leverage that is not afforded by behavioral measures and thus promises to shed light on such varied topics as generalization, integration, forgetting, and consolidation. © The Author(s) 2013.

  17. Reheating-volume measure in the string theory landscape

    International Nuclear Information System (INIS)

    Winitzki, Sergei

    2008-01-01

    I recently proposed the 'reheating-volume' (RV) prescription as a possible solution to the measure problem in 'multiverse' cosmology. The goal of this work is to extend the RV measure to scenarios involving bubble nucleation, such as the string theory landscape. In the spirit of the RV prescription, I propose to calculate the distribution of observable quantities in a landscape that is conditioned in probability to nucleate a finite total number of bubbles to the future of an initial bubble. A general formula for the relative number of bubbles of different types can be derived. I show that the RV measure is well defined and independent of the choice of the initial bubble type, as long as that type supports further bubble nucleation. Applying the RV measure to a generic landscape, I find that the abundance of Boltzmann brains is always negligibly small compared with the abundance of ordinary observers in the bubbles of the same type. As an illustration, I present explicit results for a toy landscape containing four vacuum states, and for landscapes with a single high-energy vacuum and a large number of low-energy vacua.

  18. Fundamentals of ultrasonic phased arrays

    CERN Document Server

    Schmerr, Lester W

    2014-01-01

    This book describes in detail the physical and mathematical foundations of ultrasonic phased array measurements. The book uses linear systems theory to develop a comprehensive model of the signals and images that can be formed with phased arrays. Engineers working in the field of ultrasonic nondestructive evaluation (NDE) will find in this approach a wealth of information on how to design, optimize and interpret ultrasonic inspections with phased arrays. The fundamentals and models described in the book will also be of significant interest to other fields, including the medical ultrasound and

  19. Individual differences in fundamental social motives.

    Science.gov (United States)

    Neel, Rebecca; Kenrick, Douglas T; White, Andrew Edward; Neuberg, Steven L

    2016-06-01

    Motivation has long been recognized as an important component of how people both differ from, and are similar to, each other. The current research applies the biologically grounded fundamental social motives framework, which assumes that human motivational systems are functionally shaped to manage the major costs and benefits of social life, to understand individual differences in social motives. Using the Fundamental Social Motives Inventory, we explore the relations among the different fundamental social motives of Self-Protection, Disease Avoidance, Affiliation, Status, Mate Seeking, Mate Retention, and Kin Care; the relationships of the fundamental social motives to other individual difference and personality measures including the Big Five personality traits; the extent to which fundamental social motives are linked to recent life experiences; and the extent to which life history variables (e.g., age, sex, childhood environment) predict individual differences in the fundamental social motives. Results suggest that the fundamental social motives are a powerful lens through which to examine individual differences: They are grounded in theory, have explanatory value beyond that of the Big Five personality traits, and vary meaningfully with a number of life history variables. A fundamental social motives approach provides a generative framework for considering the meaning and implications of individual differences in social motivation. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  20. Quantifying and handling errors in instrumental measurements using the measurement error theory

    DEFF Research Database (Denmark)

    Andersen, Charlotte Møller; Bro, R.; Brockhoff, P.B.

    2003-01-01

    This is a new way of using measurement error theory. Reliability ratios illustrate that the models for the two fish species are influenced differently by the error. However, the error seems to influence the predictions of the two reference measures in the same way. The effect of using replicated x-measurements ... A new general formula is given for how to correct the least squares regression coefficient when a different number of replicated x-measurements is used for prediction than for calibration. It is shown that the correction should be applied when the number of replicates in prediction is less than...
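
    The paper's new general formula is not reproduced in this record. For context, the classical result that it generalises is the attenuation of the least squares slope by a reliability ratio: if each x-measurement is the average of K replicates with error variance σ_ε², then

```latex
\lambda_K = \frac{\sigma_x^{2}}{\sigma_x^{2} + \sigma_\varepsilon^{2}/K},
\qquad
\mathbb{E}\!\left[\hat{\beta}\right] \approx \lambda_K\,\beta ,
```

    so a correction is needed whenever calibration and prediction use different numbers of replicates (this is the standard attenuation relation, not the authors' new formula).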

  1. Relativities of fundamentality

    Science.gov (United States)

    McKenzie, Kerry

    2017-08-01

    S-dualities have been held to have radical implications for our metaphysics of fundamentality. In particular, it has been claimed that they make the fundamentality status of a physical object theory-relative in an important new way. But what physicists have had to say on the issue has not been clear or consistent, and in particular seems to be ambiguous between whether S-dualities demand an anti-realist interpretation of fundamentality talk or merely a revised realism. This paper is an attempt to bring some clarity to the matter. After showing that even antecedently familiar fundamentality claims are true only relative to a raft of metaphysical, physical, and mathematical assumptions, I argue that the relativity of fundamentality inherent in S-duality nevertheless represents something new, and that part of the reason for this is that it has both realist and anti-realist implications for fundamentality talk. I close by discussing the broader significance that S-dualities have for structuralist metaphysics and for fundamentality metaphysics more generally.

  2. The Yang-Mills gradient flow and SU(3) gauge theory with 12 massless fundamental fermions in a colour-twisted box

    CERN Document Server

    Lin, C -J David; Ramos, Alberto

    2015-01-01

    We perform the step-scaling investigation of the running coupling constant, using the gradient-flow scheme, in SU(3) gauge theory with twelve massless fermions in the fundamental representation. The Wilson plaquette gauge action and massless unimproved staggered fermions are used in the simulations. Our lattice data are prepared at high accuracy, such that the statistical error for the renormalised coupling, g_GF, is at the subpercentage level. To investigate the reliability of the continuum extrapolation, we employ two different lattice discretisations to obtain g_GF. For our simulation setting, the corresponding gauge-field averaging radius in the gradient flow has to be almost half of the lattice size, in order to have this extrapolation under control. We can determine the renormalisation group evolution of the coupling up to g^2_GF ~ 6, before the onset of the bulk phase structure. In this infrared regime, the running of the coupling is significantly slower than the two-loop perturbative prediction, altho...

  3. Cancer: Towards a general theory of the target: All successful cancer therapies, actual or potential, are reducible to either (or both) of two fundamental strategies.

    Science.gov (United States)

    Vincent, Mark D

    2017-09-01

    General theories (GT) are reductionist explications of apparently independent facts. Here, in reviewing the literature, I develop a GT to simplify the cluttered landscape of cancer therapy targets by revealing they cluster parsimoniously according to only a few underlying principles. The first principle is that targets can be only exploited by either or both of two fundamentally different approaches: causality-inhibition, and 'acausal' recognition of some marker or signature. Nonetheless, each approach must achieve both of two separate goals, efficacy (reduction in cancer burden) and selectivity (sparing of normal cells); if the mechanisms are known, this provides a definition of rational treatment. The second principle is target fragmentation, whereby the target may perform up to three categoric functions (cytoreduction, modulation, cytoprotection), potentially mediated by physically different target molecules, even on different cell types, or circulating freely. This GT remains incomplete until the minimal requirements for cure, or alternatively, proof that cure is impossible, become predictable. © 2017 The Authors. BioEssays Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  4. Fundamental Astronomy

    CERN Document Server

    Karttunen, Hannu; Oja, Heikki; Poutanen, Markku; Donner, Karl Johan

    2007-01-01

    Fundamental Astronomy gives a well-balanced and comprehensive introduction to the topics of classical and modern astronomy. While emphasizing both the astronomical concepts and the underlying physical principles, the text provides a sound basis for more profound studies in the astronomical sciences. The fifth edition of this successful undergraduate textbook has been extensively modernized and extended in the parts dealing with the Milky Way, extragalactic astronomy and cosmology as well as with extrasolar planets and the solar system (as a consequence of recent results from satellite missions and the new definition by the International Astronomical Union of planets, dwarf planets and small solar-system bodies). Furthermore a new chapter on astrobiology has been added. Long considered a standard text for physical science majors, Fundamental Astronomy is also an excellent reference and entrée for dedicated amateur astronomers.

  5. Relational description of the measurement process in quantum field theory

    International Nuclear Information System (INIS)

    Gambini, Rodolfo; Porto, Rafael A.

    2002-01-01

    We have recently introduced a realistic, covariant, interpretation for the reduction process in relativistic quantum mechanics. The basic problem for a covariant description is the dependence of the states on the frame within which collapse takes place. A suitable use of the causal structure of the devices involved in the measurement process allowed us to introduce a covariant notion for the collapse of quantum states. However, a fully consistent description in the relativistic domain requires the extension of the interpretation to quantum fields. The extension is far from straightforward. Besides the obvious difficulty of dealing with the infinite degrees of freedom of the field theory, one has to analyse the restrictions imposed by causality concerning the allowed operations in a measurement process. In this paper we address these issues. We shall show that, in the case of partial causally connected measurements, our description allows us to include a wider class of causal operations than the one resulting from the standard way of computing conditional probabilities. This alternative description could be experimentally tested. A verification of this proposal would give stronger support to the realistic interpretations of the states in quantum mechanics. (author)

  6. Fundamental study on a thin-film AE sensor for measurement of behavior of a multi-pad contact slider

    NARCIS (Netherlands)

    Imai, S.; Burger, G.J.; Lammerink, Theodorus S.J.; Fluitman, J.H.J.

    To study the fundamental dynamic characteristics of a multi-pad slider for contact recording, we developed a thin-film piezoelectric acoustic emission array sensor on an Si-suspension with an array pattern similar to that of contact pads. Experiments showed that the sensitivity of the sensor is

  7. Fundamental Movement Skill Proficiency and Body Composition Measured by Dual Energy X-Ray Absorptiometry in Eight-Year-Old Children

    Science.gov (United States)

    Slotte, Sari; Sääkslahti, Arja; Metsämuuronen, Jari; Rintala, Pauli

    2015-01-01

    Objective: The main aim was to examine the association between fundamental movement skills (FMS) and objectively measured body composition using dual energy X-ray absorptiometry (DXA). Methods: A study of 304 eight-year-old children in Finland. FMS were assessed with the "Test of gross motor development," 2nd ed. Total body fat…

  8. One-Group Perturbation Theory Applied to Measurements with Void

    International Nuclear Information System (INIS)

    Persson, Rolf

    1966-09-01

    Formulas suitable for evaluating progressive as well as single rod substitution measurements are derived by means of one-group perturbation theory. The diffusion coefficient may depend on direction and position. By using the buckling concept one can derive expressions which are quite simple and the perturbed flux can be taken into account in a comparatively simple way. By using an unconventional definition of cells a transition region is introduced quite logically. Experiments with voids around metal rods, diam. 3.05 cm, have been analysed. The agreement between extrapolated and directly measured buckling values is excellent, the buckling difference between lattices with water-filled and voided shrouds being 0.263 ± 0.015 m⁻² and 0.267 ± 0.005 m⁻², respectively. From single-rod experiments, differences between diffusion coefficients are determined to be δD_r/D = 0.083 ± 0.004 and δD_z/D = 0.120 ± 0.018. With air-filled shrouds there is consequently anisotropy in the neutron diffusion and we have (D_z/D_r)_air = 1.034 ± 0.020.
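
    As a consistency check on the quoted numbers (assuming both relative differences refer to the same isotropic reference coefficient D of the water-filled case), the anisotropy ratio follows directly from the single-rod results:

```latex
\left(\frac{D_z}{D_r}\right)_{\mathrm{air}}
= \frac{1 + \delta D_z/D}{1 + \delta D_r/D}
= \frac{1.120}{1.083} \approx 1.034 ,
```

    in agreement with the value 1.034 ± 0.020 quoted above.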

  9. One-Group Perturbation Theory Applied to Measurements with Void

    Energy Technology Data Exchange (ETDEWEB)

    Persson, Rolf

    1966-09-15

    Formulas suitable for evaluating progressive as well as single rod substitution measurements are derived by means of one-group perturbation theory. The diffusion coefficient may depend on direction and position. By using the buckling concept one can derive expressions which are quite simple and the perturbed flux can be taken into account in a comparatively simple way. By using an unconventional definition of cells a transition region is introduced quite logically. Experiments with voids around metal rods, diam. 3.05 cm, have been analysed. The agreement between extrapolated and directly measured buckling values is excellent, the buckling difference between lattices with water-filled and voided shrouds being 0.263 ± 0.015 m⁻² and 0.267 ± 0.005 m⁻², respectively. From single-rod experiments, differences between diffusion coefficients are determined to be δD_r/D = 0.083 ± 0.004 and δD_z/D = 0.120 ± 0.018. With air-filled shrouds there is consequently anisotropy in the neutron diffusion and we have (D_z/D_r)_air = 1.034 ± 0.020.

  10. Fuel ion rotation measurement and its implications on H-mode theories

    International Nuclear Information System (INIS)

    Kim, J.; Burrell, K.H.; Gohil, P.; Groebner, R.J.; Hinton, F.L.; Kim, Y.B.; Seraydarian, R.; Mandl, W.

    1993-10-01

    Poloidal and toroidal rotation of the fuel ions (He²⁺) and the impurity ions (C⁶⁺ and B⁵⁺) in H-mode helium plasmas have been investigated in the DIII-D tokamak by means of charge exchange recombination spectroscopy, resulting in the discovery that the fuel ion poloidal rotation is in the ion diamagnetic drift direction while the impurity ion rotation is in the electron diamagnetic drift direction. The radial electric field obtained from radial force balance analysis of the measured pressure gradients and rotation velocities is shown to be the same regardless of which ion species is used and therefore is a more fundamental parameter than the rotation flows in studying H-mode phenomena. It is shown that the three contributions to the radial electric field (diamagnetic, poloidal rotation, and toroidal rotation terms) are comparable and consequently the poloidal flow does not solely represent the E × B flow. In the high-shear edge region, the density scale length is comparable to the ion poloidal gyroradius, and thus neoclassical theory is not valid there. In view of this new discovery that the fuel and impurity ions rotate in opposite sense, L-H transition theories based on the poloidal rotation may require improvement.
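
    The radial force balance referred to above is the standard lowest-order relation for any ion species i, quoted here for context in its textbook form rather than from the paper:

```latex
E_r = \frac{1}{Z_i e\, n_i}\,\frac{\partial p_i}{\partial r}
      \;-\; v_{\theta i} B_\phi \;+\; v_{\phi i} B_\theta ,
```

    whose three terms are the diamagnetic, poloidal-rotation and toroidal-rotation contributions mentioned in the abstract; evaluated with the measured profiles of either the fuel or the impurity ions it must yield the same E_r.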

  11. Marketing fundamentals.

    Science.gov (United States)

    Redmond, W H

    2001-01-01

    This chapter outlines current marketing practice from a managerial perspective. The role of marketing within an organization is discussed in relation to efficiency and adaptation to changing environments. Fundamental terms and concepts are presented in an applied context. The implementation of marketing plans is organized around the four P's of marketing: product (or service), promotion (including advertising), place of delivery, and pricing. These are the tools with which marketers seek to better serve their clients and form the basis for competing with other organizations. Basic concepts of strategic relationship management are outlined. Lastly, alternate viewpoints on the role of advertising in healthcare markets are examined.

  12. Large-scale symmetry-adapted perturbation theory computations via density fitting and Laplace transformation techniques: investigating the fundamental forces of DNA-intercalator interactions.

    Science.gov (United States)

    Hohenstein, Edward G; Parrish, Robert M; Sherrill, C David; Turney, Justin M; Schaefer, Henry F

    2011-11-07

    Symmetry-adapted perturbation theory (SAPT) provides a means of probing the fundamental nature of intermolecular interactions. Low orders of SAPT (here, SAPT0) are especially attractive since they provide qualitative (sometimes quantitative) results while remaining tractable for large systems. The application of density fitting and Laplace transformation techniques to SAPT0 can significantly reduce the expense associated with these computations and make even larger systems accessible. We present new factorizations of the SAPT0 equations with density-fitted two-electron integrals and the first application of Laplace transformations of energy denominators to SAPT. The improved scalability of the DF-SAPT0 implementation allows it to be applied to systems with more than 200 atoms and 2800 basis functions. The Laplace-transformed energy denominators are compared to analogous partial Cholesky decompositions of the energy denominator tensor. Application of our new DF-SAPT0 program to the intercalation of DNA by proflavine has allowed us to determine the nature of the proflavine-DNA interaction. Overall, the proflavine-DNA interaction contains important contributions from both electrostatics and dispersion. The energetics of the intercalator interaction are dominated by the stacking interactions (two-thirds of the total), but contain important contributions from the intercalator-backbone interactions. It is hypothesized that the geometry of the complex will be determined by the interactions of the intercalator with the backbone, because by shifting toward one side of the backbone, the intercalator can form two long hydrogen-bonding-type interactions. The long-range interactions between the intercalator and the next-nearest base pairs appear to be negligible, justifying the use of truncated DNA models in computational studies of intercalation interaction energies.
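
    The Laplace-transformation technique referred to here rests on a standard identity for (positive) orbital-energy denominators, sketched below for context; the quadrature points and weights are assumptions of the sketch, not values from the paper:

```latex
\frac{1}{\varepsilon_a + \varepsilon_b - \varepsilon_i - \varepsilon_j}
= \int_0^{\infty} e^{-(\varepsilon_a + \varepsilon_b - \varepsilon_i - \varepsilon_j)\,t}\,\mathrm{d}t
\approx \sum_{\alpha=1}^{n_\alpha} w_\alpha\,
  e^{-(\varepsilon_a + \varepsilon_b - \varepsilon_i - \varepsilon_j)\,t_\alpha} ,
```

    which factorises the denominator into products of exponentials of individual orbital energies and thereby decouples the summation indices in the dispersion-type terms.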

  13. Some fundamental questions concerning the kinetic theory of electrons in molecular gases and the e-H2 vibrational cross section controversy

    Science.gov (United States)

    Robson, R. E.; White, R. D.; Morrison, Michael A.

    2003-10-01

    We commence a fundamental re-examination of the kinetic theory of charged particle swarms in molecular gases, focusing on collisional excitation of molecular rotational and ro-vibrational states by electrons. Modern day analysis of electron swarms has been based upon the kinetic equation of Wang-Chang et al, which simply treats all processes as scalar energy excitations, and ignores angular momentum conservation and the vector dynamics associated with rotational excitation. It is pointed out that there is no alternative, more exact kinetic equation readily available for electrons which enables one to directly ascertain the degree of error introduced by this approximation. Thus in this preliminary study, we approach the problem indirectly, from the standpoint of the neutral molecules, using the Waldmann-Snider quantum kinetic equation, and insist that an electron-molecule collision must look the same from the perspective of both electron and molecule. We give a formula for quantitatively assessing the importance of scalar versus vectorial treatments of rotational excitation by looking at the post-collisional 'echo' produced by an electron swarm as it passes through the gas. It is then pointed out that in order to remedy any deficiency, it will be necessary to introduce a kinetic collisional operator non-local in space to properly account for angular momentum conservation, as has long been established in the literature. This is a major exercise and given the preliminary nature of this study, we consider the inclusion of such effects from a formal point of view only. In particular we show how non-local effects lead to a spatially dependent 'source' term in the equation of continuity, and hence to corrections for both drift velocity and diffusion coefficients. The magnitude of these corrections has yet to be established.

  14. Some fundamental questions concerning the kinetic theory of electrons in molecular gases and the e-H2 vibrational cross section controversy

    International Nuclear Information System (INIS)

    Robson, R E; White, R D; Morrison, Michael A

    2003-01-01

    We commence a fundamental re-examination of the kinetic theory of charged particle swarms in molecular gases, focusing on collisional excitation of molecular rotational and ro-vibrational states by electrons. Modern day analysis of electron swarms has been based upon the kinetic equation of Wang-Chang et al, which simply treats all processes as scalar energy excitations, and ignores angular momentum conservation and the vector dynamics associated with rotational excitation. It is pointed out that there is no alternative, more exact kinetic equation readily available for electrons which enables one to directly ascertain the degree of error introduced by this approximation. Thus in this preliminary study, we approach the problem indirectly, from the standpoint of the neutral molecules, using the Waldmann-Snider quantum kinetic equation, and insist that an electron-molecule collision must look the same from the perspective of both electron and molecule. We give a formula for quantitatively assessing the importance of scalar versus vectorial treatments of rotational excitation by looking at the post-collisional 'echo' produced by an electron swarm as it passes through the gas. It is then pointed out that in order to remedy any deficiency, it will be necessary to introduce a kinetic collisional operator non-local in space to properly account for angular momentum conservation, as has long been established in the literature. This is a major exercise and given the preliminary nature of this study, we consider the inclusion of such effects from a formal point of view only. In particular we show how non-local effects lead to a spatially dependent 'source' term in the equation of continuity, and hence to corrections for both drift velocity and diffusion coefficients. The magnitude of these corrections has yet to be established

  15. DOE Fundamentals Handbook: Electrical Science, Volume 2

    International Nuclear Information System (INIS)

    1992-06-01

    The Electrical Science Fundamentals Handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of electrical theory, terminology, and application. The handbook includes information on alternating current (AC) and direct current (DC) theory, circuits, motors, and generators; AC power and reactive components; batteries; AC and DC voltage regulators; transformers; and electrical test instruments and measuring devices. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility electrical equipment.

  16. Measurement of pitch in speech : an implementation of Goldstein's theory of pitch perception

    NARCIS (Netherlands)

    Duifhuis, H.; Willems, L.F.; Sluyter, R.J.

    1982-01-01

    Recent developments in hearing theory have resulted in the rather general acceptance of the idea that the perception of pitch of complex sounds is the result of the psychological pattern recognition process. The pitch is supposedly mediated by the fundamental of the harmonic spectrum which fits the

  17. Fundamental Processes in Plasmas. Final report

    International Nuclear Information System (INIS)

    O'Neil, Thomas M.; Driscoll, C. Fred

    2009-01-01

    This research focuses on fundamental processes in plasmas, and emphasizes problems for which precise experimental tests of theory can be obtained. Experiments are performed on non-neutral plasmas, utilizing three electron traps and one ion trap with a broad range of operating regimes and diagnostics. Theory is focused on fundamental plasma and fluid processes underlying collisional transport and fluid turbulence, using both analytic techniques and medium-scale numerical simulations. The simplicity of these systems allows a depth of understanding and a precision of comparison between theory and experiment which is rarely possible for neutral plasmas in complex geometry. The recent work has focused on three areas in basic plasma physics. First, experiments and theory have probed fundamental characteristics of plasma waves: from the low-amplitude thermal regime, to inviscid damping and fluid echoes, to cold fluid waves in cryogenic ion plasmas. Second, the wide-ranging effects of dissipative separatrices have been studied experimentally and theoretically, finding novel wave damping and coupling effects and important plasma transport effects. Finally, correlated systems have been investigated experimentally and theoretically: UCSD experiments have now measured the Salpeter correlation enhancement, and theory work has characterized the 'guiding center atoms' of antihydrogen created at CERN.

  18. Fundamentals of ergonomic exoskeleton robots

    OpenAIRE

    Schiele, A.

    2008-01-01

    This thesis is the first to provide the fundamentals of ergonomic exoskeleton design. The fundamental theory as well as technology necessary to analyze and develop ergonomic wearable robots interacting with humans is established and validated by experiments and prototypes. The fundamentals are (1) a new theoretical framework for analyzing physical human robot interaction (pHRI) with exoskeletons, and (2) a clear set of design rules of how to build wearable, portable exoskeletons to easily and...

  19. Generalised perturbation theory and source of information through chemical measurements

    International Nuclear Information System (INIS)

    Lelek, V.; Marek, T.

    2001-01-01

    It is important to make all analyses and collect all information from the operation of the new facility (which the transmutation demonstration unit will surely be) to be sure that the operation corresponds to the forecast or to correct the equations of the facility. The behaviour of the molten salt reactor, and in particular its system of measurement, is very different from that of the solid fuel reactor. Key information for the long-time kinetics could be the nearly on-line knowledge of the fuel composition. In this work it is shown how to include it into the control and how to use such data for the correction of neutron cross-sections for the higher actinides or other characteristics. Also the problem of safety - the change of a boundary problem to an initial problem - is mentioned. The problem is transformed into generalised perturbation theory, in which the adjoint function is obtained through the solution of equations whose right-hand side has the form of a source. Such an approach should be a theoretical basis for the calculation of the sensitivity coefficients. (authors)

  20. Fundamentals of turbomachines

    CERN Document Server

    Dick, Erik

    2015-01-01

    This book explores the working principles of all kinds of turbomachines. The same theoretical framework is used to analyse the different machine types. Fundamentals are first presented and theoretical concepts are then elaborated for particular machine types, starting with the simplest ones.For each machine type, the author strikes a balance between building basic understanding and exploring knowledge of practical aspects. Readers are invited through challenging exercises to consider how the theory applies to particular cases and how it can be generalised.   The book is primarily meant as a course book. It teaches fundamentals and explores applications. It will appeal to senior undergraduate and graduate students in mechanical engineering and to professional engineers seeking to understand the operation of turbomachines. Readers will gain a fundamental understanding of turbomachines. They will also be able to make a reasoned choice of turbomachine for a particular application and to understand its operation...

  1. Arguing against fundamentality

    Science.gov (United States)

    McKenzie, Kerry

    This paper aims to open up discussion on the relationship between fundamentality and naturalism, and in particular on the question of whether fundamentality may be denied on naturalistic grounds. A historico-inductive argument for an anti-fundamentalist conclusion, prominent within the contemporary metaphysical literature, is examined; finding it wanting, an alternative 'internal' strategy is proposed. By means of an example from the history of modern physics - namely S-matrix theory - it is demonstrated that (1) this strategy can generate similar (though not identical) anti-fundamentalist conclusions on more defensible naturalistic grounds, and (2) that fundamentality questions can be empirical questions. Some implications and limitations of the proposed approach are discussed.

  2. Pedagogical Review of Quantum Measurement Theory with an Emphasis on Weak Measurements

    Directory of Open Access Journals (Sweden)

    Bengt E. Y. Svensson

    2013-05-01

    The quantum theory of measurement has been with us since quantum mechanics was invented. It has recently been invigorated, partly due to the increasing interest in quantum information science. In this partly pedagogical review I attempt to give a self-contained overview of non-relativistic quantum theory of measurement expressed in density matrix formalism. I will not dwell on the applications in quantum information theory; it is well covered by several books in that field. The focus is instead on applications to the theory of weak measurement, as developed by Aharonov and collaborators. Their development of weak measurement combined with what they call post-selection - judiciously choosing not only the initial state of a system (pre-selection) but also its final state - has received much attention recently. Not least has it opened up new, fruitful experimental vistas, like novel approaches to amplification. But the approach has also attached to it some air of mystery. I will attempt to demystify it by showing that (almost) all results can be derived in a straightforward way from conventional quantum mechanics. Among other things, I develop the formalism not only to first order but also to second order in the weak interaction responsible for the measurement. I apply it to the so-called Leggett-Garg inequalities, also known as Bell inequalities in time. I also give an outline, even if rough, of some of the ingenious experiments that the work by Aharonov and collaborators has inspired. As an application of weak measurement, not related to the approach by Aharonov and collaborators, the formalism also allows me to derive the master equation for the density matrix of an open system in interaction with an environment. An issue that remains in the weak measurement plus post-selection approach is the interpretation of the so-called weak value of an observable. Is it a bona fide property of the system considered? I have no definite answer to this

  3. Fundamental superstrings as holograms

    International Nuclear Information System (INIS)

    Dabholkar, A.; Murthy, S.

    2007-06-01

    The worldsheet of a macroscopic fundamental superstring in the Green-Schwarz light-cone gauge is viewed as a possible boundary hologram of the near-horizon region of a small black string. For toroidally compactified strings, the hologram has global symmetries of AdS_3 × S^{d-1} × T^{8-d} (d = 3, ..., 8), only some of which extend to local conformal symmetries. We construct the bulk string theory in detail for the particular case of d = 3. The symmetries of the hologram are correctly reproduced from this exact worldsheet description in the bulk. Moreover, the central charge of the boundary Virasoro algebra obtained from the bulk agrees with the Wald entropy of the associated small black holes. This construction provides an exact CFT description of the near-horizon region of small black holes both in Type-II and heterotic string theory arising from multiply wound fundamental superstrings. (author)

  4. Fundamental superstrings as holograms

    International Nuclear Information System (INIS)

    Dabholkar, Atish; Murthy, Sameer

    2008-01-01

    The worldsheet of a macroscopic fundamental superstring in the Green-Schwarz light-cone gauge is viewed as a possible boundary hologram of the near-horizon region of a small black string. For toroidally compactified strings, the hologram has global symmetries of AdS_3 × S^{d-1} × T^{8-d} (d = 3, ..., 8), only some of which extend to local conformal symmetries. We construct the bulk string theory in detail for the particular case of d = 3. The symmetries of the hologram are correctly reproduced from this exact worldsheet description in the bulk. Moreover, the central charge of the boundary Virasoro algebra obtained from the bulk agrees with the Wald entropy of the associated small black holes. This construction provides an exact CFT description of the near-horizon region of small black holes both in Type-II and heterotic string theory arising from multiply wound fundamental superstrings.

  5. Construct Validity of Measures of Becker's Side Bet Theory.

    Science.gov (United States)

    Shore, Lynn M.; Tetrick, Lois E.; Shore, Ted H.; Barksdale, Kevin

    2000-01-01

    Becker's side bet theory (remaining in a job because of perceived costs of leaving) was tested using data from 327 working business students. Three factors were most consistent with the theory: bureaucratic organization, nonwork-related concerns, and adjustment to social position. Attachment to the organization was significantly linked to tangible…

  6. Historical Views of Invariance: Evidence from the Measurement Theories of Thorndike, Thurstone, and Rasch.

    Science.gov (United States)

    Engelhard, George, Jr.

    1992-01-01

    A historical perspective is provided of the concept of invariance in measurement theory, describing sample-invariant item calibration and item-invariant measurement of individuals. Invariance as a key measurement concept is illustrated through the measurement theories of E. L. Thorndike, L. L. Thurstone, and G. Rasch. (SLD)

  7. Fundamentals of photonics

    CERN Document Server

    Saleh, Bahaa E A

    2007-01-01

    Now in a new full-color edition, Fundamentals of Photonics, Second Edition is a self-contained and up-to-date introductory-level textbook that thoroughly surveys this rapidly expanding area of engineering and applied physics. Featuring a logical blend of theory and applications, coverage includes detailed accounts of the primary theories of light, including ray optics, wave optics, electromagnetic optics, and photon optics, as well as the interaction of photons and atoms, and semiconductor optics. Presented at increasing levels of complexity, preliminary sections build toward more advan

  8. Radiological fundamentals for decision making on public radiation protection measures in case of accident caused radionuclide release

    International Nuclear Information System (INIS)

    Genkel, Simone

    2009-01-01

    Following the adoption of the revised recommendations of the German SSK (radiation protection commission) on emergency management, the radiological fundamentals dating from 1990 were revised. The corrected dose benchmarks for children and juveniles for the intake of iodine tablets were included, and in the chapter on radiation protection for field and rescue personnel of the fire brigade and police the new regulations of the radiation protection ordinance were added. The volume comprises two parts: guidelines for emergency planning in the vicinity of nuclear facilities, and a guideline on public information in nuclear emergency situations.

  9. The relationship between Theory of Mind and Relational Frame Theory: Convergence of perspective-taking measures

    NARCIS (Netherlands)

    Hendriks, A.L.; Barnes-Holmes, Y.; McEnteggart, C.; Mey, H.R.A. De; Witteman, C.L.M.; Janssen, G.T.L.; Egger, J.I.M.

    2016-01-01

    Objective: Perspective-taking difficulties have been demonstrated in autism and schizophrenia spectrum disorders, among other clinical presentations, and are traditionally examined from a Theory of Mind (ToM) point of view. Relational Frame Theory (RFT) offers a behavioural and contextual

  10. Using classical test theory, item response theory, and Rasch measurement theory to evaluate patient-reported outcome measures: a comparison of worked examples.

    Science.gov (United States)

    Petrillo, Jennifer; Cano, Stefan J; McLeod, Lori D; Coon, Cheryl D

    2015-01-01

    To provide comparisons and a worked example of item- and scale-level evaluations based on three psychometric methods used in patient-reported outcome development - classical test theory (CTT), item response theory (IRT), and Rasch measurement theory (RMT) - in an analysis of the National Eye Institute Visual Functioning Questionnaire (VFQ-25). Baseline VFQ-25 data from 240 participants with diabetic macular edema from a randomized, double-masked, multicenter clinical trial were used to evaluate the VFQ at the total score level. CTT, RMT, and IRT evaluations were conducted, and results were assessed in a head-to-head comparison. Results were similar across the three methods, with IRT and RMT providing more detailed diagnostic information on how to improve the scale. CTT led to the identification of two problematic items that threaten the validity of the overall scale score, sets of redundant items, and skewed response categories. IRT and RMT additionally identified poor fit for one item, many locally dependent items, poor targeting, and disordering of over half the response categories. Selection of a psychometric approach depends on many factors. Researchers should justify their evaluation method and consider the intended audience. If the instrument is being developed for descriptive purposes and on a restricted budget, a cursory examination of the CTT-based psychometric properties may be all that is possible. In a high-stakes situation, such as the development of a patient-reported outcome instrument for consideration in pharmaceutical labeling, however, a thorough psychometric evaluation including IRT or RMT should be considered, with final item-level decisions made on the basis of both quantitative and qualitative results. Copyright © 2015. Published by Elsevier Inc.
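
    As a minimal illustration of the item-level statistics that a CTT evaluation of this kind rests on (illustrative code on synthetic dichotomous data, not the VFQ-25 analysis itself; all names and numbers are assumptions):

```python
import numpy as np

def ctt_summary(X):
    """Basic classical-test-theory item statistics for an item matrix X
    (rows = respondents, columns = items): corrected item-total
    correlations and Cronbach's alpha."""
    n_items = X.shape[1]
    total = X.sum(axis=1)
    item_total = []
    for j in range(n_items):
        rest = total - X[:, j]                  # total score excluding item j
        item_total.append(np.corrcoef(X[:, j], rest)[0, 1])
    item_var = X.var(axis=0, ddof=1).sum()
    total_var = total.var(ddof=1)
    alpha = (n_items / (n_items - 1)) * (1 - item_var / total_var)
    return np.array(item_total), alpha

rng = np.random.default_rng(1)
ability = rng.normal(size=(240, 1))                            # 240 simulated respondents
X = (ability + rng.normal(size=(240, 10)) > 0).astype(float)   # 10 dichotomous items
r_it, alpha = ctt_summary(X)
print(np.round(r_it, 2), round(alpha, 2))
```

    IRT and RMT analyses would instead fit a latent-trait model to the same matrix and inspect item fit, targeting, and category ordering, which is where the additional diagnostics mentioned in the abstract come from.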

  11. Mass anomalous dimension in SU(2) with six fundamental fermions

    DEFF Research Database (Denmark)

    Bursa, Francis; Del Debbio, Luigi; Keegan, Liam

    2010-01-01

    We simulate SU(2) gauge theory with six massless fundamental Dirac fermions. We measure the running of the coupling and the mass in the Schroedinger Functional scheme. We observe very slow running of the coupling constant. We measure the mass anomalous dimension gamma, and find it is between 0.13...

  12. Dosimetric fundamentals

    International Nuclear Information System (INIS)

    Nahum, A.E.

    2004-01-01

    This text covers some important concepts in radiation dosimetry with an emphasis on cavity theory, i.e. the theoretical evaluation of D_med/D_det, for two important classes of detector, 'large' and Bragg-Gray. Monte Carlo simulation continues to play a major role in evaluating this expression through its ability to compute the fluence spectra of electrons and photons as a function of their position in a medium. The key results in the paper can be summarised thus: - Fluence Φ = dN/da = Σds/dV and is a scalar quantity. - Kerma K = dE_tr/dm, i.e. kinetic energy (k.e.) transferred per unit mass; collision kerma K_c excludes charged-particle k.e. converted to bremsstrahlung. - Kerma and fluence are related by K_med = Φ E (μ_tr/ρ)_med for photons of energy E; for collision kerma, K_c, the mass energy-absorption coefficient μ_en replaces μ_tr. - D_med = (K_c)_med under conditions of charged particle equilibrium (CPE), for a medium med irradiated by photons. - For a fluence Φ of charged particles, e.g. electrons, in medium med, the absorbed dose D_med = Φ (S_col/ρ)_med provided there is δ-ray equilibrium. - For large detectors under photon irradiation (i.e. in which there is CPE, as the electron ranges are much smaller than the detector size), D_med/D_det is given by (μ_en/ρ)_med/(μ_en/ρ)_det, which is evaluated over the photon spectrum at the detector position: e.g. TLDs (e.g. LiF) in kV X-ray beams are 'large'. - For 'small' or Bragg-Gray detectors under photon or electron irradiation (electron ranges much larger than the detector dimensions), D_med/D_det is given by (S_col/ρ)_med/(S_col/ρ)_det, the (mass) stopping-power ratio, usually written s_med,det: e.g. (air-filled) ionisation chambers behave as Bragg-Gray detectors in megavoltage photon and electron beams, but not in kV X-ray beams. - Bragg-Gray theory was extended by Spencer and Attix to take into account the finite range of δ-rays. - General cavity theory provides an approximate treatment of detectors which are neither 'large' nor 'small'.
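
    The 'large detector' relation above lends itself to a small worked example. The Python sketch below averages hypothetical mass energy-absorption coefficients over an assumed photon energy-fluence spectrum to form D_med/D_det; the energies, weights and coefficient values are placeholders, not tabulated data.

    ```python
    # Minimal sketch (illustrative only): the 'large detector' cavity relation
    # D_med / D_det = <(mu_en/rho)_med> / <(mu_en/rho)_det>, with the averages taken
    # over the photon energy-fluence spectrum at the detector position.
    # Spectrum and coefficient values below are hypothetical placeholders.
    import numpy as np

    E = np.array([0.03, 0.05, 0.08, 0.10])                  # photon energies (MeV)
    psi = np.array([0.2, 0.4, 0.3, 0.1])                    # relative energy fluence per bin
    mu_en_rho_med = np.array([0.15, 0.04, 0.026, 0.025])    # medium, cm^2/g (placeholders)
    mu_en_rho_det = np.array([0.12, 0.035, 0.024, 0.023])   # detector, cm^2/g (placeholders)

    ratio = (psi * mu_en_rho_med).sum() / (psi * mu_en_rho_det).sum()
    print(f"D_med / D_det ≈ {ratio:.3f}")
    ```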

  13. Fundamentals of Counting Statistics in Digital PCR: I Just Measured Two Target Copies-What Does It Mean?

    Science.gov (United States)

    Tzonev, Svilen

    2018-01-01

    Current commercially available digital PCR (dPCR) systems and assays are capable of detecting individual target molecules with considerable reliability. As tests are developed and validated for use on clinical samples, the need to understand and develop robust statistical analysis routines increases. This chapter covers the fundamental processes and limitations of detecting and reporting on single-molecule detection. We cover the basics of quantification of targets and sources of imprecision. We describe the basic test concepts: sensitivity, specificity, limit of blank, limit of detection, and limit of quantification in the context of dPCR. We provide basic guidelines on how to determine these, how to choose and interpret the operating point, and what factors may influence overall test performance in practice.
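
    The core quantification step described here is Poisson statistics on the fraction of positive partitions. The Python sketch below illustrates it for the title's scenario of two positive partitions; the partition count and partition volume are hypothetical, and the calculation is a generic dPCR estimate rather than any vendor's algorithm.

    ```python
    # Minimal sketch (assumed inputs, not from the chapter): Poisson quantification in dPCR.
    # With N analysed partitions of volume v and k positives, the mean copies per
    # partition is lambda = -ln(1 - k/N), and the concentration is lambda / v.
    import math

    N, k = 20000, 2          # partitions analysed, positive partitions ("I just measured two copies")
    v = 0.85e-3              # partition volume in microlitres (hypothetical)

    p = k / N
    lam = -math.log(1.0 - p)                 # mean target copies per partition
    conc = lam / v                           # copies per microlitre
    total_copies = lam * N                   # estimated copies in the analysed volume
    print(f"lambda = {lam:.2e} copies/partition, ~{total_copies:.1f} copies total, {conc:.2f} copies/µL")
    ```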

  14. Ethics fundamentals.

    Science.gov (United States)

    Chambers, David W

    2011-01-01

    Ethics is about studying the right and the good; morality is about acting as one should. Although there are differences among what is legal, charitable, professional, ethical, and moral, these desirable characteristics tend to cluster and are treasured in dentistry. The traditional approach to professionalism in dentistry is based on a theory of biomedical ethics advanced 30 years ago. Known as the principles approach, general ideals such as respect for autonomy, nonmaleficence, beneficence, justice, and veracity, are offered as guides. Growth in professionalism consists in learning to interpret the application of these principles as one's peers do. Moral behavior is conceived as a continuous cycle of sensitivity to situations requiring moral response, moral reasoning, the moral courage to take action when necessary, and integration of habits of moral behavior into one's character. This essay is the first of two papers that provide the backbone for the IDEA Project of the College--an online, multiformat, interactive "textbook" of ethics for the profession.

  15. Fundamental concepts in Particle Physics course

    CERN Multimedia

    CERN. Geneva HR-RFA

    2006-01-01

    The course will provide an introduction to some of the basic theoretical techniques used to describe the fundamental particles and their interactions. Of central importance to our understanding of these forces are the underlying symmetries of nature and I will review the nature of these symmetries and how they are used to build a predictive theory. I discuss how the combination of quantum mechanics and relativity leads to the quantum field theory (QFT) description of the states of matter and their interactions. The Feynman rules used to determine the QFT predictions for experimentally measurable processes are derived and applied to the calculation of decay widths and cross sections.

  16. Fundamental limits of radio interferometers: calibration and source parameter estimation

    OpenAIRE

    Trott, Cathryn M.; Wayth, Randall B.; Tingay, Steven J.

    2012-01-01

    We use information theory to derive fundamental limits on the capacity to calibrate next-generation radio interferometers, and measure parameters of point sources for instrument calibration, point source subtraction, and data deconvolution. We demonstrate the implications of these fundamental limits, with particular reference to estimation of the 21cm Epoch of Reionization power spectrum with next-generation low-frequency instruments (e.g., the Murchison Widefield Array -- MWA, Precision Arra...

  17. Fundamental measure theory for the electric double layer : implications for blue-energy harvesting and water desalination

    NARCIS (Netherlands)

    Hartel, Andreas; Janssen, Mathijs; Samin, Sela; van Roij, Rene

    2015-01-01

    Capacitive mixing (CAPMIX) and capacitive deionization (CDI) are promising candidates for harvesting clean, renewable energy and for the energy efficient production of potable water, respectively. Both CAPMIX and CDI involve water-immersed porous carbon (supercapacitors) electrodes at voltages of

  18. Another argument against fundamental scalars

    International Nuclear Information System (INIS)

    Joglekar, S.D.

    1990-01-01

    An argument, perhaps not as strong, is presented which is based on the inclusion of the interaction with external gravity in a theory describing the strong, electromagnetic and weak interactions. The argument is related to the basis of the common belief that favours a renormalizable action over a non-renormalizable action as a candidate for a fundamental theory. (author). 12 refs

  19. Fundamental study on the characteristics of a radiophotoluminescence glass dosemeter with no energy compensation filter for measuring patient entrance doses in cardiac interventional procedures

    International Nuclear Information System (INIS)

    Kato, Mamoru; Chida, Koichi; Moritake, Takashi; Koguchi, Yasuhiro; Sato, Tadaya; Kadowaki, Ken; Oosaka, Hajime; Tosa, Tetsuo

    2014-01-01

    Cardiac interventional procedures have been increasing year by year. However, radiation skin injuries are still being reported. There is a need to measure the patient entrance skin dose (ESD), but an accurate dose measurement method has not been established. To measure the ESD, an array of radiophotoluminescence dosemeters (RPLDs) provides an accurate measurement of the actual ESD directly at the points where the dosemeters are placed. The purpose of this study was to examine the characteristics of RPLDs for measuring the ESD. As a result, X-ray-permeable RPLDs (with no tin filter) did not interfere with the percutaneous coronary intervention procedure. The RPLD also had good fundamental performance characteristics. Although the RPLD had a slight energy dependence, it showed excellent dose and dose-rate linearity, and good angular dependence. In conclusion, by calibrating for the energy dependence, RPLDs are useful dosemeters for measuring the ESD in cardiac intervention. (authors)

  20. Wellness: A Review of Theory and Measurement for Counselors

    Science.gov (United States)

    Roscoe, Lauren J.

    2009-01-01

    Wellness is considered the paradigm of counseling and development (J. E. Myers, 1991, 1992). However, researchers have failed to agree on a definition or on the dimensional structure of wellness. Furthermore, existing quantitative wellness instruments are inadequate for capturing the complexity of wellness. The author reviews wellness theory and…

  1. Loss Aversion under Prospect Theory: a Parameter-Free Measurement

    NARCIS (Netherlands)

    H. Bleichrodt (Han); M. Abdellaoui (Mohammed); C. Paraschiv (Corina)

    2007-01-01

    A growing body of qualitative evidence shows that loss aversion, a phenomenon formalized in prospect theory, can explain a variety of field and experimental data. Quantifications of loss aversion are, however, hindered by the absence of a general preference-based method to elicit the

  2. A critical analysis of the quantum theory of measurement

    International Nuclear Information System (INIS)

    Fer, F.

    1984-01-01

    Keeping strictly in the positivist and probabilistic, hence hilbertian frame of Quantum Mechanics, the author tries to ascertain whether or not Quantum Mechanics, starting from its axioms, reaches the aim of any physical theory, that is, comparison with experiment. The answer is: no, as long as it keeps close to the existing axiomatics, and also to accurate mathematics. (Auth.)

  3. Post-modern portfolio theory supports diversification in an investment portfolio to measure investment's performance

    OpenAIRE

    Rasiah, Devinaga

    2012-01-01

    This study looks at Post-Modern Portfolio Theory, which maintains greater diversification in an investment portfolio by using the alpha and the beta coefficient to measure investment performance. Post-Modern Portfolio Theory recognizes that investment risk should be tied to each investor's goals, and that outcomes above this goal do not represent economic or financial risk. Post-Modern Portfolio Theory's downside measure generated a noticeable distinction between downside and upside volatil...
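
    A minimal sketch of the downside-risk measure that distinguishes Post-Modern Portfolio Theory from mean-variance analysis is given below; the return series and the minimum acceptable return are invented for illustration and are not from the study.

    ```python
    # Minimal sketch (hypothetical data): PMPT-style downside deviation and Sortino ratio.
    # Only returns that fall below the investor's minimum acceptable return (MAR) count as risk.
    import numpy as np

    returns = np.array([0.04, -0.02, 0.07, 0.01, -0.05, 0.03])   # periodic portfolio returns (invented)
    mar = 0.02                                                   # minimum acceptable return (invented)

    shortfall = np.minimum(returns - mar, 0.0)                   # upside deviations are ignored
    downside_dev = np.sqrt(np.mean(shortfall ** 2))
    sortino = (returns.mean() - mar) / downside_dev
    print(f"downside deviation = {downside_dev:.4f}, Sortino ratio = {sortino:.2f}")
    ```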

  4. STEP and fundamental physics

    Science.gov (United States)

    Overduin, James; Everitt, Francis; Worden, Paul; Mester, John

    2012-09-01

    The Satellite Test of the Equivalence Principle (STEP) will advance experimental limits on violations of Einstein's equivalence principle from their present sensitivity of two parts in 10^13 to one part in 10^18 through multiple comparison of the motions of four pairs of test masses of different compositions in a drag-free earth-orbiting satellite. We describe the experiment, its current status and its potential implications for fundamental physics. Equivalence is at the heart of general relativity, our governing theory of gravity, and violations are expected in most attempts to unify this theory with the other fundamental interactions of physics, as well as in many theoretical explanations for the phenomenon of dark energy in cosmology. Detection of such a violation would be equivalent to the discovery of a new force of nature. A null result would be almost as profound, pushing upper limits on any coupling between standard-model fields and the new light degrees of freedom generically predicted by these theories down to unnaturally small levels.

  5. STEP and fundamental physics

    International Nuclear Information System (INIS)

    Overduin, James; Everitt, Francis; Worden, Paul; Mester, John

    2012-01-01

    The Satellite Test of the Equivalence Principle (STEP) will advance experimental limits on violations of Einstein's equivalence principle from their present sensitivity of two parts in 10^13 to one part in 10^18 through multiple comparison of the motions of four pairs of test masses of different compositions in a drag-free earth-orbiting satellite. We describe the experiment, its current status and its potential implications for fundamental physics. Equivalence is at the heart of general relativity, our governing theory of gravity, and violations are expected in most attempts to unify this theory with the other fundamental interactions of physics, as well as in many theoretical explanations for the phenomenon of dark energy in cosmology. Detection of such a violation would be equivalent to the discovery of a new force of nature. A null result would be almost as profound, pushing upper limits on any coupling between standard-model fields and the new light degrees of freedom generically predicted by these theories down to unnaturally small levels. (paper)

  6. Quivers, words and fundamentals

    International Nuclear Information System (INIS)

    Mattioli, Paolo; Ramgoolam, Sanjaye

    2015-01-01

    A systematic study of holomorphic gauge invariant operators in general N=1 quiver gauge theories, with unitary gauge groups and bifundamental matter fields, was recently presented in http://dx.doi.org/10.1007/JHEP04(2013)094. For large ranks a simple counting formula in terms of an infinite product was given. We extend this study to quiver gauge theories with fundamental matter fields, deriving an infinite product form for the refined counting in these cases. The infinite products are found to be obtained from substitutions in a simple building block expressed in terms of the weighted adjacency matrix of the quiver. In the case without fundamentals, it is a determinant which itself is found to have a counting interpretation in terms of words formed from partially commuting letters associated with simple closed loops in the quiver. This is a new relation between counting problems in gauge theory and the Cartier-Foata monoid. For finite ranks of the unitary gauge groups, the refined counting is given in terms of expressions involving Littlewood-Richardson coefficients.

  7. Confidence Measurement in the Light of Signal Detection Theory

    Directory of Open Access Journals (Sweden)

    Sébastien eMassoni

    2014-12-01

    We compare three alternative methods for eliciting retrospective confidence in the context of a simple perceptual task: the Simple Confidence Rating (a direct report on a numerical scale), the Quadratic Scoring Rule (a post-wagering procedure) and the Matching Probability (a generalization of the no-loss gambling method). We systematically compare the results obtained with these three rules to the theoretical confidence levels that can be inferred from performance in the perceptual task using Signal Detection Theory. We find that the Matching Probability provides better results in that respect. We conclude that Matching Probability is particularly well suited for studies of confidence that use Signal Detection Theory as a theoretical framework.
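
    For readers unfamiliar with the Signal Detection Theory benchmark used here, the Python sketch below computes the standard sensitivity and criterion indices from hypothetical hit and false-alarm counts; it is a generic SDT calculation, not the authors' analysis code.

    ```python
    # Minimal sketch (assumed counts): Signal Detection Theory indices for a yes/no
    # perceptual task, from which theoretical confidence levels can be derived.
    from scipy.stats import norm

    hits, misses = 78, 22                      # hypothetical response counts on signal trials
    false_alarms, correct_rejections = 30, 70  # hypothetical counts on noise trials

    H = hits / (hits + misses)
    FA = false_alarms / (false_alarms + correct_rejections)
    d_prime = norm.ppf(H) - norm.ppf(FA)            # sensitivity
    criterion = -0.5 * (norm.ppf(H) + norm.ppf(FA)) # response criterion c
    print(f"d' = {d_prime:.2f}, criterion c = {criterion:.2f}")
    ```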

  8. The theory, practice, and measurement of Music Therapy

    DEFF Research Database (Denmark)

    Moore, Kimberly Sena; Hanson-Abromeit, Deanna; Magee, Wendy L.

    2013-01-01

    Music therapy is a clinical healthcare discipline that draws its evidence base from music neuroscience and psychology to improve the health and well-being of individuals from varied clinical populations. Working with individuals across the lifespan, evidence-based therapeutic methods are developed ... from an understanding of music perception and cognition. Given the diversity of practice, there are several key challenges for the discipline. One is developing a theory-based clinical and research approach. This supports a deeper understanding of the complex music stimulus and therapeutic interactions ... of interest. This symposium will bring together some of the latest research from the discipline of music therapy relating to the clinical needs of complex neurological and psychiatric populations. The papers offer diverse perspectives reflecting interdisciplinary influences on the theory and practice of music...

  9. Fermion families and vacuum in the two measures theory

    International Nuclear Information System (INIS)

    Guendelman, E.; Kaganovich, A.

    2005-01-01

    We present an alternative gravity and matter fields theory where the consistency condition of the equations of motion yields a strong correlation between states of 'primordial' fermion fields and the local value of the scalar fields' (dilaton and Higgs) energy density. The same 'primordial' fermion field at different densities can be either in states of regular fermionic matter or in states presumably corresponding to dark fermionic matter. In the regime of fermion densities typical of normal particle physics, each of the primordial fermions splits into three generations identified with regular fermions. When restricting ourselves to the first two fermion generations, the theory reproduces general relativity and regular particle theory. As the fermion energy density becomes comparable with the vacuum energy density, the theory allows a new type of states. Such a Cosmo-Low Energy Physics (CLEP) state is studied in the framework of a model where a FRW universe is filled with a homogeneous scalar field and uniformly distributed nonrelativistic neutrinos. Neutrinos in the CLEP state are drawn into the cosmological expansion by dynamically changing their own parameters. Some of the features of the CLEP state in the late-time universe: the neutrino mass increases as α^(3/2) (α = α(t) is the scale factor); its energy density scales as a sort of dark energy and approaches a constant as α→∞; this cold dark matter possesses negative pressure and its equation of state approaches that of the cosmological constant as α→∞; the total energy density of such a universe is less than it would be in a universe free of fermionic matter at all. The latter means that nonrelativistic neutrinos are able to produce expanding bubbles of the CLEP state playing the role of a true 'cosmological vacuum' surrounded by a 'regular' vacuum. (authors)

  10. Atmospheric Gas Tracers in Groundwater: Theory, Sampling. Measurement and Interpretation

    International Nuclear Information System (INIS)

    Bayari, C.S.

    2002-01-01

    Some of the atmospheric gases possess features that are sought in an environmental tracer of hydrogeologic interest. Among these, chlorofluorocarbons, sulfur hexafluoride, carbon tetrachloride, methyl chloroform, krypton-85, etc. have found increasing use in groundwater age-dating studies during the last ten years. This paper explains the theory of their use as tracers and discusses the major concerns related to their sampling and analysis. Factors affecting their applicability and the approach to interpreting tracer gas data are briefly outlined.

  11. Microwave engineering concepts and fundamentals

    CERN Document Server

    Khan, Ahmad Shahid

    2014-01-01

    Detailing the active and passive aspects of microwaves, Microwave Engineering: Concepts and Fundamentals covers everything from wave propagation to reflection and refraction, guided waves, and transmission lines, providing a comprehensive understanding of the underlying principles at the core of microwave engineering. This encyclopedic text not only encompasses nearly all facets of microwave engineering, but also gives all topics—including microwave generation, measurement, and processing—equal emphasis. Packed with illustrations to aid in comprehension, the book: •Describes the mathematical theory of waveguides and ferrite devices, devoting an entire chapter to the Smith chart and its applications •Discusses different types of microwave components, antennas, tubes, transistors, diodes, and parametric devices •Examines various attributes of cavity resonators, semiconductor and RF/microwave devices, and microwave integrated circuits •Addresses scattering parameters and their properties, as well a...

  12. Asymmetrical effects of mesophyll conductance on fundamental photosynthetic parameters and their relationships estimated from leaf gas exchange measurements

    Science.gov (United States)

    Most previous analyses of leaf gas exchange measurements assumed an infinite value of mesophyll conductance (gm) and thus equaled CO2 partial pressures in the substomatal cavity and chloroplast. Yet an increasing number of studies have recognized that gm is finite and there is a drawdown of CO2 part...

  13. Fundamental concepts of mathematics

    CERN Document Server

    Goodstein, R L

    Fundamental Concepts of Mathematics, 2nd Edition provides an account of some basic concepts in modern mathematics. The book is primarily intended for mathematics teachers and lay people who want to improve their skills in mathematics. The concepts and problems presented in the book include the determination of which integral polynomials have integral solutions; sentence logic and informal set theory; and why four colors are enough to color a map. Unlike the first edition, the second edition provides detailed solutions to exercises contained in the text. Mathematics teachers and people

  14. Fundamentals of attosecond optics

    CERN Document Server

    Chang, Zenghu

    2011-01-01

    Attosecond optical pulse generation, along with the related process of high-order harmonic generation, is redefining ultrafast physics and chemistry. A practical understanding of attosecond optics requires significant background information and foundational theory to make full use of these cutting-edge lasers and advance the technology toward the next generation of ultrafast lasers. Fundamentals of Attosecond Optics provides the first focused introduction to the field. The author presents the underlying concepts and techniques required to enter the field, as well as recent research advances th

  15. A Scale Elasticity Measure for Directional Distance Function and its Dual: Theory and DEA Estimation

    OpenAIRE

    Valentin Zelenyuk

    2012-01-01

    In this paper we focus on a scale elasticity measure based on the directional distance function for multi-output-multi-input technologies, explore its fundamental properties and show its equivalence with the input-oriented and output-oriented scale elasticity measures. We also establish a duality relationship between the scale elasticity measure based on the directional distance function and the scale elasticity measure based on the profit function. Finally, we discuss the estimation issues of the scale...

  16. Species distributions, quantum theory, and the enhancement of biodiversity measures

    DEFF Research Database (Denmark)

    Real, Raimundo; Barbosa, A. Márcia; Bull, Joseph William

    2017-01-01

    Species distributions are typically represented by records of their observed occurrence at a given spatial and temporal scale. Such records are inevitably incomplete and contingent on the spatial–temporal circumstances under which the observations were made. Moreover, organisms may respond...... biodiversity”. We show how conceptualizing species’ distributions in this way could help overcome important weaknesses in current biodiversity metrics, both in theory and by using a worked case study of mammal distributions in Spain over the last decade. We propose that considerable theoretical advances could...

  17. Light scattering by nonspherical particles theory, measurements, and applications

    CERN Document Server

    Mishchenko, Michael I; Travis, Larry D

    1999-01-01

    There is hardly a field of science or engineering that does not have some interest in light scattering by small particles. For example, this subject is important to climatology because the energy budget for the Earth's atmosphere is strongly affected by scattering of solar radiation by cloud and aerosol particles, and the whole discipline of remote sensing relies largely on analyzing the parameters of radiation scattered by aerosols, clouds, and precipitation. The scattering of light by spherical particles can be easily computed using the conventional Mie theory. However, most small solid part

  18. Fundamentals of quantum mechanics

    CERN Document Server

    House, J E

    2017-01-01

    Fundamentals of Quantum Mechanics, Third Edition is a clear and detailed introduction to quantum mechanics and its applications in chemistry and physics. All required math is clearly explained, including intermediate steps in derivations, and concise review of the math is included in the text at appropriate points. Most of the elementary quantum mechanical models-including particles in boxes, rigid rotor, harmonic oscillator, barrier penetration, hydrogen atom-are clearly and completely presented. Applications of these models to selected “real world” topics are also included. This new edition includes many new topics such as band theory and heat capacity of solids, spectroscopy of molecules and complexes (including applications to ligand field theory), and small molecules of astrophysical interest.

  19. Fundamentals of electronic image processing

    CERN Document Server

    Weeks, Arthur R

    1996-01-01

    This book is directed to practicing engineers and scientists who need to understand the fundamentals of image processing theory and algorithms to perform their technical tasks. It is intended to fill the gap between existing high-level texts dedicated to specialists in the field and the need for a more practical, fundamental text on image processing. A variety of example images are used to enhance reader understanding of how particular image processing algorithms work.

  20. Qualitative insights on fundamental mechanics

    OpenAIRE

    Mardari, G. N.

    2002-01-01

    The gap between classical mechanics and quantum mechanics has an important interpretive implication: the Universe must have an irreducible fundamental level, which determines the properties of matter at higher levels of organization. We show that the main parameters of any fundamental model must be theory-independent. They cannot be predicted, because they cannot have internal causes. However, it is possible to describe them in the language of classical mechanics. We invoke philosophical reas...

  1. [Development of a system for static measurement of skin-muscle hardness and a fundamental study on its applications].

    Science.gov (United States)

    Honda, T

    1990-10-01

    There have been many attempts to quantitatively measure the hardness of skin-muscle, but no objective method for doing so has been established, because there is no universal standard for the hardness of organisms. The author considered elasticity and viscosity as the most important mechanical properties of the hardness of skin-muscle and applied the Maxwell model, in which a spring and a dash-pot are arranged in series, to the static mechanical behavior of skin-muscle. A relatively large globular pressing body with a radius of 5 mm was set as a transducer in the measuring system, so that the conformity of the practically measured values to those calculated theoretically by the model was increased. Strain of skin-muscle is expressed as a function of the load, which includes an index of elasticity 1/M, where M (N/mm^2) = E/(1 - λ^2) (E: Young's modulus, λ: Poisson's ratio), and an index of viscosity 1/η (η: modulus of viscosity) in a particular region. Because hardness is defined as the degree of resistance against deformation by loading, decreases in the indices of both elasticity and viscosity mean increases of hardness. With 150 male and female office workers chosen as the subjects, the model was examined and the indices were calculated. The results were as follows. 1) Very good conformity of practically measured values to those calculated theoretically by the Maxwell model was recognized within the range of load velocity G from 0.3 to 3.0 N/sec. 2) In both males and females the regions with values nearest to those of a Newtonian fluid were, in descending order, the distal phalanxes of digiti 2-4, the palm, the distal phalanx of the first digitus and the arm. In reverse order these regions approached complete elasticity. 3) In males it was suggested that the element of viscosity in the region of the biceps brachii muscle and the hardness in the regions of the brachioradialis, the flexor carpi radialis and palmaris longus muscles and the distal phalanxes of the 4
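
    The static behaviour assumed by the Maxwell model can be illustrated with a short sketch. Assuming loading at a constant stress rate R, the Python code below evaluates the resulting strain, in which 1/M contributes the elastic part and 1/η the viscous part; all parameter values are hypothetical and not the study's data.

    ```python
    # Minimal sketch (illustrative values): strain of a Maxwell element (spring and
    # dashpot in series) loaded at a constant stress rate R, as assumed in the record:
    # d(eps)/dt = (1/M) d(sigma)/dt + sigma/eta, hence eps(t) = R*t/M + R*t**2/(2*eta).
    import numpy as np

    M = 0.05      # N/mm^2 (hypothetical); elasticity index is 1/M
    eta = 2.0     # N*s/mm^2 (hypothetical); viscosity index is 1/eta
    R = 0.01      # stress rate, N/mm^2 per second (hypothetical)

    t = np.linspace(0.0, 5.0, 6)
    sigma = R * t
    eps = sigma / M + R * t**2 / (2.0 * eta)      # elastic part + viscous (creep) part
    for ti, si, ei in zip(t, sigma, eps):
        print(f"t = {ti:3.1f} s  sigma = {si:5.3f}  strain = {ei:6.4f}")
    ```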

  2. Performance measurement, expectancy and agency theory: An experimental study

    NARCIS (Netherlands)

    Sloof, R.; van Praag, C.M.

    2008-01-01

    Theoretical analyses of (optimal) performance measures are typically performed within the realm of the linear agency model. This model implies that, for a given compensation scheme, the agent’s optimal effort is unrelated to the amount of noise in the performance measure. In contrast, expectancy

  3. Uncertainty relation and simultaneous measurements in quantum theory

    International Nuclear Information System (INIS)

    Busch, P.

    1982-01-01

    In this thesis the question of the interpretation of the uncertainty relation is taken up, and a program for the justification of its individualistic interpretation is formulated. By means of quantum mechanical models of the position and momentum measurement, a justification of the interpretation is attempted by reconstructing the origin of the uncertainties from the conditions of the measuring devices and determining the relation of the measured results to the object. By means of a model of the joint measurement it could be shown how the uncertainty relation results from the unavoidable mutual disturbance of the devices and the uncertainty relation for the measuring system, so that finally the commutation relation is decisive. As an illustration the slit experiment is discussed, first according to Heisenberg with a fixed slit, then for the quantum mechanical, movable slit (Bohr-Einstein). (orig./HSI) [de]

  4. Implications Of The Crisis Of Objectivity In Accounting Measurement On The Development Of Finance Theory

    OpenAIRE

    Saratiel Wedzerai Musvoto

    2011-01-01

    Studies in accounting measurement indicate the absence of empirical relational structures that should form the basis for accounting measurement. This suggests the lack of objectivity of accounting information. Landmarks in the development of finance theory indicate the use of accounting measurement information as a basis for their development. This indicates that subjective accounting information is incorporated in finance theory. Consequently, this questions the status of finance as a univer...

  5. Strings and fundamental physics

    International Nuclear Information System (INIS)

    Baumgartl, Marco; Brunner, Ilka; Haack, Michael

    2012-01-01

    The basic idea, simple and revolutionary at the same time, to replace the concept of a point particle with a one-dimensional string, has opened up a whole new field of research. Even today, four decades later, its multifaceted consequences are still not fully conceivable. Up to now string theory has offered a new way to view particles as different excitations of the same fundamental object. It has celebrated success in discovering the graviton in its spectrum, and it has naturally led scientists to posit space-times with more than four dimensions - which in turn has triggered numerous interesting developments in fields as varied as condensed matter physics and pure mathematics. This book collects pedagogical lectures by leading experts in string theory, introducing the non-specialist reader to some of the newest developments in the field. The carefully selected topics are at the cutting edge of research in string theory and include new developments in topological strings, AdS/CFT dualities, as well as newly emerging subfields such as doubled field theory and holography in the hydrodynamic regime. The contributions to this book have been selected and arranged in such a way as to form a self-contained, graduate level textbook. (orig.)

  6. Strings and fundamental physics

    Energy Technology Data Exchange (ETDEWEB)

    Baumgartl, Marco [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Brunner, Ilka; Haack, Michael (eds.) [Muenchen Univ. (Germany). Fakultaet fuer Physik

    2012-07-01

    The basic idea, simple and revolutionary at the same time, to replace the concept of a point particle with a one-dimensional string, has opened up a whole new field of research. Even today, four decades later, its multifaceted consequences are still not fully conceivable. Up to now string theory has offered a new way to view particles as different excitations of the same fundamental object. It has celebrated success in discovering the graviton in its spectrum, and it has naturally led scientists to posit space-times with more than four dimensions - which in turn has triggered numerous interesting developments in fields as varied as condensed matter physics and pure mathematics. This book collects pedagogical lectures by leading experts in string theory, introducing the non-specialist reader to some of the newest developments in the field. The carefully selected topics are at the cutting edge of research in string theory and include new developments in topological strings, AdS/CFT dualities, as well as newly emerging subfields such as doubled field theory and holography in the hydrodynamic regime. The contributions to this book have been selected and arranged in such a way as to form a self-contained, graduate level textbook. (orig.)

  7. Asymmetrical effects of mesophyll conductance on fundamental photosynthetic parameters and their relationships estimated from leaf gas exchange measurements.

    Science.gov (United States)

    Sun, Ying; Gu, Lianhong; Dickinson, Robert E; Pallardy, Stephen G; Baker, John; Cao, Yonghui; DaMatta, Fábio Murilo; Dong, Xuejun; Ellsworth, David; Van Goethem, Davina; Jensen, Anna M; Law, Beverly E; Loos, Rodolfo; Martins, Samuel C Vitor; Norby, Richard J; Warren, Jeffrey; Weston, David; Winter, Klaus

    2014-04-01

    Worldwide measurements of nearly 130 C3 species covering all major plant functional types are analysed in conjunction with model simulations to determine the effects of mesophyll conductance (gm) on photosynthetic parameters and their relationships estimated from A/Ci curves. We find that an assumption of infinite gm results in up to 75% underestimation for the maximum carboxylation rate Vcmax, 60% for the maximum electron transport rate Jmax, and 40% for the triose phosphate utilization rate Tu. Vcmax is most sensitive, Jmax is less sensitive, and Tu has the least sensitivity to the variation of gm. Because of this asymmetrical effect of gm, the ratios of Jmax to Vcmax, Tu to Vcmax and Tu to Jmax are all overestimated. An infinite gm assumption also limits the freedom of variation of estimated parameters and artificially constrains parameter relationships to stronger shapes. These findings suggest the importance of quantifying gm for understanding in situ photosynthetic machinery functioning. We show that a nonzero resistance to CO2 movement in chloroplasts has small effects on estimated parameters. A non-linear function with gm as input is developed to convert the parameters estimated under an assumption of infinite gm to proper values. This function will facilitate gm representation in global carbon cycle models. © 2013 John Wiley & Sons Ltd.
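
    The central quantity behind these biases is the CO2 drawdown from the substomatal cavity to the chloroplast. The Python sketch below evaluates the standard relation Cc = Ci - A/gm for hypothetical gas-exchange points; it only illustrates the drawdown that an infinite-gm assumption ignores and is not the authors' non-linear conversion function.

    ```python
    # Minimal sketch (hypothetical numbers): chloroplast CO2 implied by a finite
    # mesophyll conductance, Cc = Ci - A/gm. Assuming infinite gm sets Cc = Ci.
    import numpy as np

    A = np.array([5.0, 12.0, 18.0, 22.0])        # net assimilation, umol m-2 s-1 (invented)
    Ci = np.array([150.0, 300.0, 500.0, 800.0])  # substomatal CO2, ubar (invented)
    gm = 0.2                                     # mesophyll conductance, mol m-2 s-1 bar-1 (invented)

    Cc = Ci - A / gm                             # CO2 actually available to Rubisco
    for a, ci, cc in zip(A, Ci, Cc):
        print(f"A = {a:4.1f}  Ci = {ci:5.0f}  Cc = {cc:5.0f}  drawdown = {ci - cc:4.0f}")
    ```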

  8. Coherent versus Measurement Feedback: Linear Systems Theory for Quantum Information

    Directory of Open Access Journals (Sweden)

    Naoki Yamamoto

    2014-11-01

    To control a quantum system via feedback, we generally have two options in choosing a control scheme. One is coherent feedback, which feeds the output field of the system, through a fully quantum device, back to manipulate the system without involving any measurement process. The other is measurement-based feedback, which measures the output field and performs a real-time manipulation on the system based on the measurement results. Both schemes have advantages and disadvantages, depending on the system and the control goal; hence, their comparison in several situations is important. This paper considers a general open linear quantum system with the following specific control goals: backaction evasion, generation of a quantum nondemolished variable, and generation of a decoherence-free subsystem, all of which have important roles in quantum information science. Some no-go theorems are proven, clarifying that those goals cannot be achieved by any measurement-based feedback control. On the other hand, it is shown that, for each control goal, there exists a coherent feedback controller accomplishing the task. The key idea for obtaining all the results is a system-theoretic characterization of the above three notions in terms of controllability and observability properties or transfer functions of linear systems, which are consistent with their standard definitions.

  9. Fundamentals of functional analysis

    CERN Document Server

    Farenick, Douglas

    2016-01-01

    This book provides a unique path for graduate or advanced undergraduate students to begin studying the rich subject of functional analysis with fewer prerequisites than is normally required. The text begins with a self-contained and highly efficient introduction to topology and measure theory, which focuses on the essential notions required for the study of functional analysis, and which are often buried within full-length overviews of the subjects. This is particularly useful for those in applied mathematics, engineering, or physics who need to have a firm grasp of functional analysis, but not necessarily some of the more abstruse aspects of topology and measure theory normally encountered. The reader is assumed to only have knowledge of basic real analysis, complex analysis, and algebra. The latter part of the text provides an outstanding treatment of Banach space theory and operator theory, covering topics not usually found together in other books on functional analysis. Written in a clear, concise manner,...

  10. Determination of the detection limit and decision threshold for ionizing radiation measurements. Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment

    International Nuclear Information System (INIS)

    2000-01-01

    This part of ISO 11929 addresses the field of ionizing radiation measurements in which events (in particular pulses) are counted by high-resolution gamma spectrometry, registering a pulse-height distribution (acquisition of a multichannel spectrum), for example on samples. It considers exclusively the random character of radioactive decay and of pulse counting and ignores all other influences (e.g. arising from sample treatment, weighing, enrichment or the instability of the test setup). It assumes that the distance between neighbouring peaks of gamma lines is not smaller than four times the full width at half maximum (FWHM) of the gamma line and that the background near the gamma line is nearly a straight line. Otherwise ISO 11929-1 or ISO 11929-2 should be used. ISO 11929 consists of the following parts, under the general title Determination of the detection limit and decision threshold for ionizing radiation measurements: Part 1: Fundamentals and application to counting measurements without the influence of sample treatment; Part 2: Fundamentals and application to counting measurements with the influence of sample treatment; Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment; Part 4: Fundamentals and application to measurements by use of linear scale analogue ratemeters, without the influence of sample treatment. This part of ISO 11929 was prepared in parallel with other International Standards prepared by WG2 (now WG 17): ISO 11932:1996, Activity measurements of solid materials considered for recycling, re-use or disposal as nonradioactive waste, and ISO 11929-1, ISO 11929-2 and ISO 11929-4, and is, consequently, complementary to these documents
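
    For orientation, the Python sketch below evaluates a decision threshold and detection limit for the simplest counting case (a net count rate with a separately measured background), in the spirit of Part 1 of the standard rather than the full gamma-spectrometry treatment of this part; the counts, counting times and quantiles are hypothetical.

    ```python
    # Minimal sketch (simplified counting case, not the Part 3 spectrometry formulae):
    # decision threshold and detection limit for a net count rate.
    import math

    k_alpha = k_beta = 1.645          # Gaussian quantiles for alpha = beta = 0.05
    n_g, t_g = 480.0, 3600.0          # gross counts and counting time (hypothetical)
    n_0, t_0 = 420.0, 3600.0          # background counts and counting time (hypothetical)

    r_0 = n_0 / t_0
    r_net = n_g / t_g - r_0

    # Decision threshold: uncertainty of the net rate when the true net rate is zero.
    u0 = math.sqrt(r_0 / t_g + r_0 / t_0)
    r_star = k_alpha * u0

    # Detection limit: smallest true net rate reliably detected (fixed-point iteration).
    r_sharp = r_star
    for _ in range(50):
        u = math.sqrt((r_sharp + r_0) / t_g + r_0 / t_0)
        r_sharp = r_star + k_beta * u

    print(f"net rate = {r_net:.4f} 1/s, decision threshold = {r_star:.4f} 1/s, detection limit = {r_sharp:.4f} 1/s")
    ```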

  11. Changing Investment in Activities and Interests in Elders' Lives: Theory and Measurement

    Science.gov (United States)

    Adams, Kathryn Betts

    2004-01-01

    Socioemotional selectivity and gerotranscendence, newer theories with roots in the disengagement theory of aging, provided the theoretical framework for a new measure of perceived change in investment in a variety of pursuits. The 30-item Change in Activity and Interest Index (CAII) was given to a sample of 327 outpatients aged 65-94. Items with…

  12. A New Computerised Advanced Theory of Mind Measure for Children with Asperger Syndrome: The ATOMIC

    Science.gov (United States)

    Beaumont, Renae B.; Sofronoff, Kate

    2008-01-01

    This study examined the ability of children with Asperger Syndrome (AS) to attribute mental states to characters in a new computerised, advanced theory of mind measure: The Animated Theory of Mind Inventory for Children (ATOMIC). Results showed that children with AS matched on IQ, verbal comprehension, age and gender performed equivalently on…

  13. Fundamental study on the characteristics of a radiophotoluminescence glass dosemeter with no energy compensation filter for measuring patient entrance doses in cardiac interventional procedures.

    Science.gov (United States)

    Kato, Mamoru; Chida, Koichi; Moritake, Takashi; Koguchi, Yasuhiro; Sato, Tadaya; Oosaka, Hajime; Tosa, Tetsuo; Kadowaki, Ken

    2014-12-01

    Cardiac interventional procedures have been increasing year by year. However, radiation skin injuries are still being reported. There is a need to measure the patient entrance skin dose (ESD), but an accurate dose measurement method has not been established. To measure the ESD, an array of radiophotoluminescence dosemeters (RPLDs) provides an accurate measurement of the actual ESD directly at the points where the dosemeters are placed. The purpose of this study was to examine the characteristics of RPLDs for measuring the ESD. As a result, X-ray-permeable RPLDs (with no tin filter) did not interfere with the percutaneous coronary intervention procedure. The RPLD also had good fundamental performance characteristics. Although the RPLD had a slight energy dependence, it showed excellent dose and dose-rate linearity, and good angular dependence. In conclusion, by calibrating for the energy dependence, RPLDs are useful dosemeters for measuring the ESD in cardiac intervention. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. Investigation of an Error Theory for Conjoint Measurement Methodology.

    Science.gov (United States)

    1983-05-01

    1ybren, 1982; Srinivasan and Shocker, 1973a, 1973b; Ullrich and Cummins, 1973; Takane, Young, and de Leeuw, 1980; Young, 1972). ... procedures as a diagnostic tool. Specifically, they used the computed STRESS value and a measure of fit they called PRECAP that could be obtained

  15. GY SAMPLING THEORY IN ENVIRONMENTAL STUDIES 2: SUBSAMPLING ERROR MEASUREMENTS

    Science.gov (United States)

    Sampling can be a significant source of error in the measurement process. The characterization and cleanup of hazardous waste sites require data that meet site-specific levels of acceptable quality if scientifically supportable decisions are to be made. In support of this effort,...

  16. Measurement of Classroom Teaching Quality with Item Response Theory

    Science.gov (United States)

    Kelcey, Ben; McGinn, Daniel; Hill, Heather

    2013-01-01

    Recent policy has charged schools and districts with maintaining highly qualified teachers and differentiating among teachers in terms of their effectiveness (U.S. Department of Education, 2009). This emphasis has driven the development and implementation of teacher quality measures which are increasingly being used to evaluate teachers with…

  17. Performance measurement, expectancy and agency theory: An experimental study

    NARCIS (Netherlands)

    Sloof, R.; van Praag, C.M.

    2005-01-01

    Theoretical analyses of (optimal) performance measures are typically performed within the realm of the linear agency model. An important implication of this model is that, for a given compensation scheme, the agent's optimal effort choice is unrelated to the amount of noise in the performance

  18. High voltage engineering fundamentals

    CERN Document Server

    Kuffel, E; Hammond, P

    1984-01-01

    Provides a comprehensive treatment of high voltage engineering fundamentals at the introductory and intermediate levels. It covers: techniques used for generation and measurement of high direct, alternating and surge voltages for general application in industrial testing and selected special examples found in basic research; analytical and numerical calculation of electrostatic fields in simple practical insulation system; basic ionisation and decay processes in gases and breakdown mechanisms of gaseous, liquid and solid dielectrics; partial discharges and modern discharge detectors; and over

  19. Biomedical engineering fundamentals

    CERN Document Server

    Bronzino, Joseph D

    2014-01-01

    Known as the bible of biomedical engineering, The Biomedical Engineering Handbook, Fourth Edition, sets the standard against which all other references of this nature are measured. As such, it has served as a major resource for both skilled professionals and novices to biomedical engineering.Biomedical Engineering Fundamentals, the first volume of the handbook, presents material from respected scientists with diverse backgrounds in physiological systems, biomechanics, biomaterials, bioelectric phenomena, and neuroengineering. More than three dozen specific topics are examined, including cardia

  20. Fundamentals of radiological protection

    International Nuclear Information System (INIS)

    Mill, A.J.; Charles, M.W.; Wells, J.

    1978-04-01

    A review is presented of basic radiation physics with particular relevance to radiological protection. The processes leading to the production and absorption of ionising radiation are outlined, as are the important dosimetric quantities and their units of measurement. The review is the first of a series of reports presenting the fundamentals necessary for an understanding of the basis of regulatory criteria such as those recommended by the ICRP. (author)

  1. Protocol: validation of the INCODE barometer to measure the innovation competence through the Rasch Measurement Theory

    Directory of Open Access Journals (Sweden)

    Lidia Sanchez

    2017-06-01

    This communication presents a protocol showing the different phases that must be followed to validate the INCODE barometer, which is used to measure the innovation competence, with Rasch Measurement Theory. Five phases are stated: dimensionality analysis, individual reliability and validity analysis of items and persons, global reliability and validity analysis, and category analysis.

  2. Measurements of proton strength functions and comparisons with theory

    International Nuclear Information System (INIS)

    Arai, E.; Ozawa, Y.

    1986-01-01

    Using a high-resolution proton beam of the Tokyo Institute of Technology Van de Graaff, precise measurements of elastic and inelastic scattering cross sections have been performed in the past 15 years. By directly observing individual proton resonances, their spins, parities and proton decay widths were deduced. From these experiments we have evaluated (1) proton strength functions in terms of target mass number and of incident proton energy and (2) Coulomb matrix elements for split analogue resonances. (orig.)

  3. Invariant Set Theory: Violating Measurement Independence without Fine Tuning, Conspiracy, Constraints on Free Will or Retrocausality

    Directory of Open Access Journals (Sweden)

    Tim Palmer

    2015-11-01

    Invariant Set (IS) theory is a locally causal ontic theory of physics based on the Cosmological Invariant Set postulate that the universe U can be considered a deterministic dynamical system evolving precisely on a (suitably constructed) fractal dynamically invariant set in U's state space. IS theory violates the Bell inequalities by violating Measurement Independence. Despite this, IS theory is not fine tuned, is not conspiratorial, does not constrain experimenter free will and does not invoke retrocausality. The reasons behind these claims are discussed in this paper. These arise from properties not found in conventional ontic models: the invariant set has zero measure in its Euclidean embedding space, has Cantor Set structure homeomorphic to the p-adic integers (p>>0) and is non-computable. In particular, it is shown that the p-adic metric encapsulates the physics of the Cosmological Invariant Set postulate, and provides the technical means to demonstrate no fine tuning or conspiracy. Quantum theory can be viewed as the singular limit of IS theory when p is set equal to infinity. Since it is based around a top-down constraint from cosmology, IS theory suggests that gravitational and quantum physics will be unified by a gravitational theory of the quantum, rather than a quantum theory of gravity. Some implications arising from such a perspective are discussed.

  4. Detecting vocal fatigue in student singers using acoustic measures of mean fundamental frequency, jitter, shimmer, and harmonics-to-noise ratio

    Science.gov (United States)

    Sisakun, Siphan

    2000-12-01

    The purpose of this study is to explore the ability of four acoustic parameters, mean fundamental frequency, jitter, shimmer, and harmonics-to-noise ratio, to detect vocal fatigue in student singers. The participants are 15 voice students, who perform two distinct tasks: a data collection task and a vocal fatiguing task. The data collection task includes sustaining the vowel /a/, reading a standard passage, and self-rating on a vocal fatigue form. The vocal fatiguing task is the vocal practice of musical scores for a total of 45 minutes. The four acoustic parameters are extracted using the software EZVoicePlus. The data analyses are performed to answer eight research questions. The first four questions relate to correlations of the self-rating scale with each of the four parameters. The next four research questions relate to differences in the parameters over time using one-factor repeated-measures analysis of variance (ANOVA). The results yield a proposed acoustic profile of vocal fatigue in student singers. This profile is characterized by increased fundamental frequency; slightly decreased jitter; slightly decreased shimmer; and slightly increased harmonics-to-noise ratio. The proposed profile requires further investigation.
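
    The four acoustic parameters can be illustrated with a short calculation. The Python sketch below derives mean F0, local jitter and local shimmer from a synthetic sequence of glottal periods and peak amplitudes (the harmonics-to-noise ratio needs a spectral estimate and is omitted); it is a generic calculation, not the EZVoicePlus algorithm, and all numbers are invented.

    ```python
    # Minimal sketch (synthetic data): mean fundamental frequency, local jitter and
    # local shimmer from period and amplitude sequences of a sustained /a/.
    import numpy as np

    rng = np.random.default_rng(1)
    periods = 1.0 / 220.0 + rng.normal(0.0, 2e-5, 200)   # seconds; ~220 Hz voice with small perturbation
    amps = 0.70 + rng.normal(0.0, 0.01, 200)             # arbitrary linear amplitude units

    mean_f0 = 1.0 / periods.mean()
    jitter_local = np.mean(np.abs(np.diff(periods))) / periods.mean()   # relative period perturbation
    shimmer_local = np.mean(np.abs(np.diff(amps))) / amps.mean()        # relative amplitude perturbation
    print(f"mean F0 = {mean_f0:.1f} Hz, jitter = {100*jitter_local:.2f} %, shimmer = {100*shimmer_local:.2f} %")
    ```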

  5. Bayesian modeling of measurement error in predictor variables using item response theory

    NARCIS (Netherlands)

    Fox, Gerardus J.A.; Glas, Cornelis A.W.

    2000-01-01

    This paper focuses on handling measurement error in predictor variables using item response theory (IRT). Measurement error is of great importance in the assessment of theoretical constructs, such as intelligence or the school climate. Measurement error is modeled by treating the predictors as unobserved

  6. Comparison of Theory with Rotation Measurements in JET ICRH Plasmas

    International Nuclear Information System (INIS)

    R.V. Budny; C.S. Chang; C. Giroud; R.J. Goldston; D. McCune; J. Ongena; F.W. Perkins; R.B. White; K.-D. Zastrow; and contributors to the EFDA-JET work programme

    2001-01-01

    Plasma rotation appears to improve plasma performance by increasing the E x B flow shearing rate, thus decreasing radial correlations in the microturbulence. Also, plasma rotation can increase the stability to resistive MHD modes. In the Joint European Torus (JET), toroidal rotation rates ω_tor with high Mach numbers are generally measured in NBI-heated plasmas (since the neutral beams aim in the co-plasma current direction). They are considerably lower with only ICRH (and Ohmic) heating, but still surprisingly large considering that ICRH appears to inject relatively small amounts of angular momentum. Either the applied torques are larger than naively expected, or the anomalous transport of angular momentum is smaller than expected. Since ICRH is one of the main candidates for heating next-step tokamaks, and for creating burning plasmas in future tokamak reactors, this paper attempts to understand ICRH-induced plasma rotation

  7. Theory and measurements of emittance preservation in plasma wakefield acceleration

    Energy Technology Data Exchange (ETDEWEB)

    Frederico, Joel

    2016-12-01

    In this dissertation, we examine the preservation and measurement of emittance in the plasma wakefield acceleration blowout regime. Plasma wakefield acceleration (PWFA) is a revolutionary approach to accelerating charged particles that has been demonstrated to have the potential for gradients orders of magnitude greater than traditional approaches. The application of PWFA to the design of a linear collider will make new high energy physics research possible, but the design parameters must first be shown to be competitive with traditional methods. Emittance preservation is necessary in the design of a linear collider in order to maximize luminosity. We examine the conditions necessary for circular symmetry in the PWFA blowout regime, and demonstrate that current proposals meet these bounds. We also present an application of beam filamentation, which describes the process of beam parameter and emittance matching. We show that the emittance growth saturates as a consequence of energy spread in the beam. The initial beam parameters determine the amount of emittance growth, while the contribution of energy spread is negligible. We also present a model for ion motion in the presence of a beam that is much more dense than the plasma. By combining the model of ion motion and emittance growth, we find the emittance growth due to ion motion is minimal in the case of marginal ion motion. In addition, we present a simulation that validates the ion motion model, which is under further development to examine emittance growth of both marginal and pronounced ion motion. Finally, we present a proof-of-concept of an emittance measurement which may enable the analysis of emittance preservation in future PWFA experiments.

  8. Summary: fundamental interactions and processes

    International Nuclear Information System (INIS)

    Koltun, D.S.

    1982-01-01

    The subjects of the talks of the first day of the workshop are discussed in terms of fundamental interactions, dynamical theory, and relevant degrees of freedom. Some general considerations are introduced and are used to confront the various approaches taken in the earlier talks

  9. Superconducting gravity gradiometer for sensitive gravity measurements. I. Theory

    International Nuclear Information System (INIS)

    Chan, H.A.; Paik, H.J.

    1987-01-01

    Because of the equivalence principle, a global measurement is necessary to distinguish gravity from acceleration of the reference frame. A gravity gradiometer is therefore an essential instrument needed for precision tests of gravity laws and for applications in gravity survey and inertial navigation. Superconductivity and SQUID (superconducting quantum interference device) technology can be used to obtain a gravity gradiometer with very high sensitivity and stability. A superconducting gravity gradiometer has been developed for a null test of the gravitational inverse-square law and space-borne geodesy. Here we present a complete theoretical model of this instrument. Starting from dynamical equations for the device, we derive transfer functions, a common mode rejection characteristic, and an error model of the superconducting instrument. Since a gradiometer must detect a very weak differential gravity signal in the midst of large platform accelerations and other environmental disturbances, the scale factor and common mode rejection stability of the instrument are extremely important in addition to its immunity to temperature and electromagnetic fluctuations. We show how flux quantization, the Meissner effect, and properties of liquid helium can be utilized to meet these challenges

  10. Fundamentals of negative refractive index optical trapping: forces and radiation pressures exerted by focused Gaussian beams using the generalized Lorenz-Mie theory.

    Science.gov (United States)

    Ambrosio, Leonardo A; Hernández-Figueroa, Hugo E

    2010-11-04

    Based on the generalized Lorenz-Mie theory (GLMT), this paper reveals, for the first time in the literature, the principal characteristics of the optical forces and radiation pressure cross-sections exerted on homogeneous, linear, isotropic and spherical hypothetical negative refractive index (NRI) particles under the influence of focused Gaussian beams in the Mie regime. Starting with ray optics considerations, the analysis is then extended through calculating the Mie coefficients and the beam-shape coefficients for incident focused Gaussian beams. Results reveal new and interesting trapping properties which are not observed for commonly positive refractive index particles and, in this way, new potential applications in biomedical optics can be devised.

  11. Testing Fundamental Gravitation in Space

    Energy Technology Data Exchange (ETDEWEB)

    Turyshev, Slava G.

    2013-10-15

    General theory of relativity is a standard theory of gravitation; as such, it is used to describe gravity when the problems in astronomy, astrophysics, cosmology, and fundamental physics are concerned. The theory is also relied upon in many modern applications involving spacecraft navigation, geodesy, and time transfer. Here we review the foundations of general relativity and discuss its current empirical status. We describe both the theoretical motivation and the scientific progress that may result from the new generation of high-precision tests that are anticipated in the near future.

  12. Atomic spectroscopy and highly accurate measurement: determination of fundamental constants; Spectroscopie atomique et mesures de grande precision: determination de constantes fondamentales

    Energy Technology Data Exchange (ETDEWEB)

    Schwob, C

    2006-12-15

    This document reviews the theoretical and experimental achievements of the author concerning highly accurate atomic spectroscopy applied to the determination of fundamental constants. A purely optical frequency measurement of the 2S-12D two-photon transitions in atomic hydrogen and deuterium has been performed. The experimental setup is described as well as the data analysis. Optimized values for the Rydberg constant and Lamb shifts have been deduced (R = 109737.31568516(84) cm^-1). An experiment devoted to the determination of the fine structure constant with a targeted relative uncertainty of 10^-9 began in 1999. This experiment is based on the fact that Bloch oscillations in a frequency-chirped optical lattice are a powerful tool to coherently transfer many photon momenta to the atoms. We have used this method to measure accurately the ratio h/m(Rb). The measured value of the fine structure constant is α^-1 = 137.03599884(91) with a relative uncertainty of 6.7×10^-9. The future and perspectives of this experiment are presented. This document, presented before an academic board, will allow its author to supervise research work and particularly to tutor thesis students. (A.C.)
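
    The quoted value of α follows from the measured ratio h/m(Rb) through a standard relation (sketched here from textbook definitions rather than from the manuscript itself):

    \[
      \alpha^{2} \;=\; \frac{2 R_{\infty}}{c}\,\frac{A_r(\mathrm{Rb})}{A_r(e)}\,\frac{h}{m_{\mathrm{Rb}}},
    \]

    where R_∞ is the Rydberg constant and A_r denotes relative atomic masses. Since R_∞ and the mass ratio are known with very small uncertainties, a Bloch-oscillation measurement of h/m(Rb) translates directly into a determination of α.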

  13. Quantum measurement

    CERN Document Server

    Busch, Paul; Pellonpää, Juha-Pekka; Ylinen, Kari

    2016-01-01

    This is a book about the Hilbert space formulation of quantum mechanics and its measurement theory. It contains a synopsis of what became of the Mathematical Foundations of Quantum Mechanics since von Neumann’s classic treatise with this title. Fundamental non-classical features of quantum mechanics—indeterminacy and incompatibility of observables, unavoidable measurement disturbance, entanglement, nonlocality—are explicated and analysed using the tools of operational quantum theory. The book is divided into four parts: 1. Mathematics provides a systematic exposition of the Hilbert space and operator theoretic tools and relevant measure and integration theory leading to the Naimark and Stinespring dilation theorems; 2. Elements develops the basic concepts of quantum mechanics and measurement theory with a focus on the notion of approximate joint measurability; 3. Realisations offers in-depth studies of the fundamental observables of quantum mechanics and some of their measurement implementations; and 4....

  14. The predictive validity of prospect theory versus expected utility in health utility measurement.

    Science.gov (United States)

    Abellan-Perpiñan, Jose Maria; Bleichrodt, Han; Pinto-Prades, Jose Luis

    2009-12-01

    Most health care evaluations today still assume expected utility even though the descriptive deficiencies of expected utility are well known. Prospect theory is the dominant descriptive alternative to expected utility. This paper tests whether prospect theory leads to better health evaluations than expected utility. The approach is purely descriptive: we explore how simple measurements together with prospect theory and expected utility predict choices and rankings between more complex stimuli. For decisions involving risk, prospect theory is significantly more consistent with rankings and choices than expected utility. This conclusion no longer holds when we use prospect theory utilities and expected utilities to predict intertemporal decisions. The latter finding cautions against the common assumption in health economics that health state utilities are transferable across decision contexts. Our results suggest that the standard gamble, and algorithms based on it, should not be used to value health.

  15. Can theory of mind deficits be measured reliably in people with mild and moderate Alzheimer's dementia?

    Science.gov (United States)

    Choong, Caroline Sm; Doody, Gillian A

    2013-01-01

    Patients suffering from Alzheimer's dementia develop difficulties in social functioning. This has led to an interest in the study of "theory of mind" in this population. However, difficulty has arisen because the associated cognitive demands of traditional short story theory of mind assessments result in failure per se in this population, making it challenging to test pure theory of mind ability. Simplified, traditional 1st and 2nd order theory of mind short story tasks and a battery of alternative theory of mind cartoon jokes and control slapstick cartoon jokes, without memory components, were administered to 16 participants with mild-moderate Alzheimer's dementia and 11 age-matched healthy controls. No significant differences were detected between participants with Alzheimer's dementia and controls on the 1st or 2nd order traditional short story theory of mind tasks (p = 0.155 and p = 0.154 respectively). However, in the cartoon joke tasks there were significant differences in performance between the Alzheimer participants and the control group; this was evident for both theory of mind cartoons and the control 'slapstick' jokes. It remains very difficult to assess theory of mind as an isolated phenomenon in populations with global cognitive impairment, such as Alzheimer's dementia, as the tasks used to assess this cognition invariably depend on other cognitive functions. Although a limitation of this study is the small sample size, the results suggest that there is no measurable specific theory of mind deficit in people with Alzheimer's dementia, and that the use of theory of mind representational models to measure social cognitive ability may not be appropriate in this population.

  16. Value of Fundamental Science

    Science.gov (United States)

    Burov, Alexey

    Fundamental science is a hard, long-term human adventure that has required high devotion and social support, especially significant in our epoch of Mega-science. The measure of this devotion and this support expresses the real value of fundamental science in public opinion. Why does fundamental science have value? What determines its strength and what endangers it? The dominant answer is that the value of science arises out of curiosity and is supported by technological progress. Is this really a good, astute answer? When trying to attract public support, we talk about the "mystery of the universe". Why do these words sound so attractive? What is implied by and what is incompatible with them? More than two centuries ago, Immanuel Kant asserted an inseparable entanglement between ethics and metaphysics. Thus, we may ask: which metaphysics supports the value of scientific cognition, and which does not? Should we continue to neglect the dependence of the value of pure science on metaphysics? If not, how can this issue be addressed in public outreach? Is the public alienated by one or another message coming from the face of science? What does it mean to be politically correct in this sort of discussion?

  17. On divergence of finite measures and their applicability in statistics and information theory

    Czech Academy of Sciences Publication Activity Database

    Vajda, Igor; Stummer, W.

    2009-01-01

    Roč. 44, č. 2 (2009), s. 169-187 ISSN 0233-1888 R&D Projects: GA MŠk(CZ) 1M0572; GA ČR(CZ) GA102/07/1131 Institutional research plan: CEZ:AV0Z10750506 Keywords : Local and global divergences of finite measures * Divergences of sigma-finite measures * Statistical censoring * Pinsker's inequality, Ornstein's distance * Differential power entropies Subject RIV: BD - Theory of Information Impact factor: 0.759, year: 2009 http://library.utia.cas.cz/separaty/2009/SI/vajda-on divergence of finite measures and their applicability in statistics and information theory.pdf

  18. Sex and Self-Control Theory: The Measures and Causal Model May Be Different

    Science.gov (United States)

    Higgins, George E.; Tewksbury, Richard

    2006-01-01

    This study examines the distribution differences across sexes in key measures of self-control theory and differences in a causal model. Using cross-sectional data from juveniles ("n" = 1,500), the study shows mean-level differences in many of the self-control, risky behavior, and delinquency measures. Structural equation modeling…

  19. General theory of three-dimensional radiance measurements with optical microprobes RID A-1977-2009

    DEFF Research Database (Denmark)

    FukshanskyKazarinova, N.; Fukshansky, L.; Kuhl, M.

    1997-01-01

    Measurements of the radiance distribution and fluence rate within turbid samples with fiber-optic radiance microprobes contain a large variable instrumental error caused by the nonuniform directional sensitivity of the microprobes. A general theory of three-dimensional radiance measurements...

  20. Open problems in Banach spaces and measure theory | Rodríguez ...

    African Journals Online (AJOL)

    We collect several open questions in Banach spaces, mostly related to measure theoretic aspects of the theory. The problems are divided into five categories: miscellaneous problems in Banach spaces (non-separable Lp spaces, compactness in Banach spaces, w*-null sequences in dual spaces), measurability in Banach ...

  1. Measuring Theory of Mind in Children. Psychometric Properties of the ToM Storybooks

    NARCIS (Netherlands)

    Blijd-Hoogewys, E. M. A.; van Geert, P. L. C.; Serra, M.; Minderaa, R. B.

    Although research on Theory-of-Mind (ToM) is often based on single task measurements, more comprehensive instruments result in a better understanding of ToM development. The ToM Storybooks is a new instrument measuring basic ToM-functioning and associated aspects. There are 34 tasks, tapping various

  2. Measuring Theory of Mind in Children. Psychometric Properties of the ToM Storybooks

    NARCIS (Netherlands)

    Blijd-Hoogewys, E. M. A.; van Geert, P. L. C.; Serra, M.; Minderaa, R. B.

    2008-01-01

    Although research on Theory-of-Mind (ToM) is often based on single task measurements, more comprehensive instruments result in a better understanding of ToM development. The ToM Storybooks is a new instrument measuring basic ToM-functioning and associated aspects. There are 34 tasks, tapping various

  3. Measuring Constructs in Family Science: How Can Item Response Theory Improve Precision and Validity?

    Science.gov (United States)

    Gordon, Rachel A.

    2015-01-01

    This article provides family scientists with an understanding of contemporary measurement perspectives and the ways in which item response theory (IRT) can be used to develop measures with desired evidence of precision and validity for research uses. The article offers a nontechnical introduction to some key features of IRT, including its…

  4. Finite-measuring approximation of operators of scattering theory in representation of wave packets

    International Nuclear Information System (INIS)

    Kukulin, V.I.; Rubtsova, O.A.

    2004-01-01

    Several types of packet quantization of the continuous spectrum in quantum scattering problems are considered. Such a quantization leads to a convenient finite-measuring (i.e. matrix) approximation of the integral operators of scattering theory and makes it possible to reduce the singular integral equations of scattering theory to convenient, purely algebraic equations on an analytical basis, in which all singularities are separated explicitly. The main attention is paid to the practical realization of the method [ru]

  5. Qualitative insights on fundamental mechanics

    International Nuclear Information System (INIS)

    Mardari, Ghenadie N

    2007-01-01

    The gap between classical mechanics and quantum mechanics has an important interpretive implication: the Universe must have an irreducible fundamental level, which determines the properties of matter at higher levels of organization. We show that the main parameters of any fundamental model must be theory-independent. Moreover, such models must also contain discrete identical entities with constant properties. These conclusions appear to support the work of Kaniadakis on subquantum mechanics. A qualitative analysis is offered to suggest compatibility with relevant phenomena, as well as to propose new means for verification

  6. Critical investigation of Jauch's approach to the quantum theory of measurement

    International Nuclear Information System (INIS)

    Herbut, Fedor

    1986-01-01

    To make Jauch's approach more realistic, his assumptions are modified in two ways: (1) On the quantum system plus the measuring apparatus (S + MA) after the measuring interaction has ceased, one can actually measure only operators of the form given. (2) Measurement is defined in the most general way (including, besides first-kind, also second-kind and third-kind or indirect measurements). It is shown that Jauch's basic result that the microstates (statistical operators) of S + MA before and after the collapse correspond to the same macrostate (belong to the same equivalence class of microstates) remains valid under the above modifications, and that the significance of this result goes beyond measurement theory. On the other hand, it is argued that taking the orthodox (i.e. uncompromisingly quantum) view of quantum mechanics, it is not the collapse, but the Jauch-type macrostates that are spurious in a Jauch-type theory. (author)

  7. Brief Report: Preliminary Evaluation of the Theory of Mind Inventory and Its Relationship to Measures of Social Skills

    Science.gov (United States)

    Lerner, Matthew D.; Hutchins, Tiffany L.; Prelock, Patricia A.

    2011-01-01

    This study presents updated information on a parent-report measure of Theory of Mind (ToM), formerly called the Perception of Children's Theory of Mind Measure (Hutchins et al., "J Autism Dev Disord" 38:143-155, 2008), renamed the Theory of Mind Inventory (ToMI), for use with parents of children with autism spectrum disorder (ASD). This study…

  8. Fundamentos epistemológicos da teoria modular da mente de Jerry A. Fodor Epistemological fundaments of Jerry A. Fodor's modular theory of mind

    Directory of Open Access Journals (Sweden)

    Kleber Bez Birolo Candiotto

    2008-01-01

    Full Text Available The aim of this paper is to present the basic elements of the modular theory developed by Jerry A. Fodor and some considerations about its main challenges. Fodor's notion of mind modularity, on the one hand, aims at overcoming the methodological and epistemological gaps of associationism and localizationism concerning the explanations of the structure and functioning of the mind; on the other hand, Fodor's notion stands as an opposition to Vygotsky's culturalist posture, since for the latter the higher functions of the mind, such as cognition, are artificial and cultural products. Chomsky's cognitive psychology has converted this "artificial" product into a "natural" one, postulating the existence of innate modules to perform specific cognitive functions. Based on Chomsky's idea, Fodor describes the mind as a group of modules. However, his main contribution to the cognitive sciences is the presentation of mental architecture on two levels and the affirmation of the existence of central modules responsible for higher cognitive activities, such as creativity, reflection, or imagination.

  9. Theory of equidistant three-dimensional radiance measurements with optical microprobes RID A-1977-2009

    DEFF Research Database (Denmark)

    FukshanskyKazarinova, N.; Fukshansky, L.; Kuhl, Morten

    1996-01-01

    Fiber-optic radiance microprobes, increasingly applied for measurements of internal light fields in living tissues, provide three-dimensional radiance distribution solids and radiant energy fluence rates at different depths of turbid samples. These data are, however, distorted because of an inherent feature of optical fibers: nonuniform angular sensitivity. Because of this property a radiance microprobe during a single measurement partly underestimates light from the envisaged direction and partly senses light from other directions. A theory of three-dimensional equidistant radiance measurements ... of application is presented. The limitations of this theory and the prospects for this approach are discussed.

  10. Covariance operator of functional measure in P(φ)2-quantum field theory

    International Nuclear Information System (INIS)

    Lobanov, Yu.Yu.; Zhidkov, E.P.

    1988-01-01

    Functional integration measure in the Euclidean quantum field theory with polynomial interactions of boson fields with zero spin in two-dimensional space-time is investigated. The representation for the kernel of the measure covariance operator is obtained in the form of an expansion over the eigenfunctions of some boundary problem for the heat equation. Two cases of integration domains with different configurations are considered. Some trends and perspectives of employing the functional integration method in quantum field theory are also discussed. 43 refs

  11. Fundamental Safety Principles

    International Nuclear Information System (INIS)

    Abdelmalik, W.E.Y.

    2011-01-01

    This work presents a summary of the IAEA Safety Standards Series publication No. SF-1, entitled Fundamental Safety Principles, published in 2006. This publication states the fundamental safety objective and ten associated safety principles, and briefly describes their intent and purposes. Safety measures and security measures have in common the aim of protecting human life and health and the environment. These safety principles are: 1) Responsibility for safety, 2) Role of the government, 3) Leadership and management for safety, 4) Justification of facilities and activities, 5) Optimization of protection, 6) Limitation of risks to individuals, 7) Protection of present and future generations, 8) Prevention of accidents, 9) Emergency preparedness and response and 10) Protective action to reduce existing or unregulated radiation risks. The safety principles concern the security of facilities and activities to the extent that they apply to measures that contribute to both safety and security. Safety measures and security measures must be designed and implemented in an integrated manner so that security measures do not compromise safety and safety measures do not compromise security.

  12. Search for fundamental 'God Particle' speeds up

    CERN Multimedia

    Spotts, P N

    2000-01-01

    This month researchers at CERN are driving the accelerator to its limits and beyond to find the missing Higgs boson. Finding it would confirm a 30-yr-old theory about why matter's most fundamental particles have mass (1 page).

  13. A signal detection-item response theory model for evaluating neuropsychological measures.

    Science.gov (United States)

    Thomas, Michael L; Brown, Gregory G; Gur, Ruben C; Moore, Tyler M; Patt, Virginie M; Risbrough, Victoria B; Baker, Dewleen G

    2018-02-05

    Models from signal detection theory are commonly used to score neuropsychological test data, especially tests of recognition memory. Here we show that certain item response theory models can be formulated as signal detection theory models, thus linking two complementary but distinct methodologies. We then use the approach to evaluate the validity (construct representation) of commonly used research measures, demonstrate the impact of conditional error on neuropsychological outcomes, and evaluate measurement bias. Signal detection-item response theory (SD-IRT) models were fitted to recognition memory data for words, faces, and objects. The sample consisted of U.S. Infantry Marines and Navy Corpsmen participating in the Marine Resiliency Study. Data comprised item responses to the Penn Face Memory Test (PFMT; N = 1,338), Penn Word Memory Test (PWMT; N = 1,331), and Visual Object Learning Test (VOLT; N = 1,249), and self-report of past head injury with loss of consciousness. SD-IRT models adequately fitted recognition memory item data across all modalities. Error varied systematically with ability estimates, and distributions of residuals from the regression of memory discrimination onto self-report of past head injury were positively skewed towards regions of larger measurement error. Analyses of differential item functioning revealed little evidence of systematic bias by level of education. SD-IRT models benefit from the measurement rigor of item response theory-which permits the modeling of item difficulty and examinee ability-and from signal detection theory-which provides an interpretive framework encompassing the experimentally validated constructs of memory discrimination and response bias. We used this approach to validate the construct representation of commonly used research measures and to demonstrate how nonoptimized item parameters can lead to erroneous conclusions when interpreting neuropsychological test data. Future work might include the
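
    As a rough sketch of the kind of item model involved (a generic two-parameter logistic IRT item, not the authors' actual SD-IRT formulation; the discrimination and difficulty values are hypothetical), the probability of a correct item response can be written as a function of the latent trait. In the signal-detection reading, the latent trait plays a role analogous to memory discrimination, and the item parameters relate to response criteria.

    ```python
    import numpy as np

    def two_pl_probability(theta, a, b):
        """P(correct response | ability theta) for a 2PL item with discrimination a, difficulty b."""
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    # Hypothetical recognition-memory item: moderate discrimination, average difficulty
    theta = np.linspace(-3, 3, 7)  # examinee trait level in logits
    print(two_pl_probability(theta, a=1.2, b=0.0))
    ```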

  14. Improving measurement of injection drug risk behavior using item response theory.

    Science.gov (United States)

    Janulis, Patrick

    2014-03-01

    Recent research highlights the multiple steps to preparing and injecting drugs and the resultant viral threats faced by drug users. This research suggests that more sensitive measurement of injection drug HIV risk behavior is required. In addition, growing evidence suggests there are gender differences in injection risk behavior. However, the potential for differential item functioning between genders has not been explored. To explore item response theory as an improved measurement modeling technique that provides empirically justified scaling of injection risk behavior and to examine for potential gender-based differential item functioning. Data is used from three studies in the National Institute on Drug Abuse's Criminal Justice Drug Abuse Treatment Studies. A two-parameter item response theory model was used to scale injection risk behavior and logistic regression was used to examine for differential item functioning. Item fit statistics suggest that item response theory can be used to scale injection risk behavior and these models can provide more sensitive estimates of risk behavior. Additionally, gender-based differential item functioning is present in the current data. Improved measurement of injection risk behavior using item response theory should be encouraged as these models provide increased congruence between construct measurement and the complexity of injection-related HIV risk. Suggestions are made to further improve injection risk behavior measurement. Furthermore, results suggest direct comparisons of composite scores between males and females may be misleading and future work should account for differential item functioning before comparing levels of injection risk behavior.
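
    One common way to screen for the gender-based differential item functioning mentioned above is logistic regression of each item response on the trait estimate, group membership, and their interaction (a generic sketch of that approach, not necessarily the author's exact procedure; the data below are simulated and hypothetical):

    ```python
    import numpy as np
    import statsmodels.api as sm

    def dif_logistic_screen(item, trait, group):
        """Fit item ~ trait + group + trait*group; the group and interaction
        coefficients flag uniform and non-uniform DIF, respectively."""
        X = sm.add_constant(np.column_stack([trait, group, trait * group]))
        result = sm.Logit(item, X).fit(disp=False)
        return result.params, result.pvalues

    # Simulated example: 500 respondents, one binary injection-risk item, two groups
    rng = np.random.default_rng(1)
    trait = rng.normal(size=500)
    group = rng.integers(0, 2, size=500)
    p = 1.0 / (1.0 + np.exp(-(trait + 0.5 * group)))  # built-in uniform DIF
    item = (rng.random(500) < p).astype(int)
    print(dif_logistic_screen(item, trait, group))
    ```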

  15. Fundamental structures of algebra and discrete mathematics

    CERN Document Server

    Foldes, Stephan

    2011-01-01

    Introduces and clarifies the basic theories of 12 structural concepts, offering a fundamental theory of groups, rings and other algebraic structures. Identifies essentials and describes interrelationships between particular theories. Selected classical theorems and results relevant to current research are proved rigorously within the theory of each structure. Throughout the text the reader is frequently prompted to perform integrated exercises of verification and to explore examples.

  16. Fundamentals of thermophotovoltaic energy conversion

    CERN Document Server

    Chubb, Donald L

    2007-01-01

    This is a text book presenting the fundamentals of thermophotovoltaic(TPV) energy conversion suitable for an upper undergraduate or first year graduate course. In addition it can serve as a reference or design aid for engineers developing TPV systems. Mathematica design programs for interference filters and a planar TPV system are included on a CD-Rom disk. Each chapter includes a summary and concludes with a set of problems. The first chapter presents the electromagnetic theory and radiation transfer theory necessary to calculate the optical properties of the components in a TPV optical cavity. Using a simplified model, Chapter 2 develops expressions for the maximum efficiency and power density for an ideal TPV system. The next three chapters consider the three major components in a TPV system; the emitter, filter and photovoltaic(PV) array. Chapter 3 applies the electromagnetic theory and radiation transfer theory presented in Chapter 1 in the calculation of spectral emittance. From the spectral emittance t...

  17. Gendered language attitudes: exploring language as a gendered construct using Rasch measurement theory.

    Science.gov (United States)

    Knisely, Kris A; Wind, Stefanie A

    2015-01-01

    Gendered language attitudes (GLAs) are gender-based perceptions of language varieties based on connections between gender-related and linguistic characteristics of individuals, including the perception of language varieties as possessing degrees of masculinity and femininity. This study combines substantive theory about language learning and gender with a model based on Rasch measurement theory to explore the psychometric properties of a new measure of GLAs. Findings suggest that GLAs is a unidimensional construct and that the items used can be used to describe differences among students in terms of the strength of their GLAs. Implications for research, theory, and practice are discussed. Special emphasis is given to the teaching and learning of languages.

  18. Foam engineering fundamentals and applications

    CERN Document Server

    2012-01-01

    Containing contributions from leading academic and industrial researchers, this book provides a much needed update of foam science research. The first section of the book presents an accessible summary of the theory and fundamentals of foams. This includes chapters on morphology, drainage, Ostwald ripening, coalescence, rheology, and pneumatic foams. The second section demonstrates how this theory is used in a wide range of industrial applications, including foam fractionation, froth flotation and foam mitigation. It includes chapters on suprafroths, flotation of oil sands, foams in enhancing petroleum recovery, gas-liquid mass transfer in foam, foams in glass manufacturing, fire-fighting foam technology and consumer product foams.

  19. Emotional vitality in caregivers: application of Rasch Measurement Theory with secondary data to development and test a new measure.

    Science.gov (United States)

    Barbic, Skye P; Bartlett, Susan J; Mayo, Nancy E

    2015-07-01

    To describe the practical steps in identifying items and evaluating scoring strategies for a new measure of emotional vitality in informal caregivers of individuals who have experienced a significant health event. The psychometric properties of responses to selected items from validated health-related quality of life and other psychosocial questionnaires administered four times over a one-year period were evaluated using Rasch Measurement Theory. Community. A total of 409 individuals providing informal care at home to older adults who had experienced a recent stroke. Rasch Measurement Theory was used to test the ordering of response option thresholds, fit, spread of the item locations, residual correlations, person separation index, and stability across time. Based on a theoretical framework developed in earlier work, we identified 22 candidate items from a pool of relevant psychosocial measures available. Of these, additional evaluation resulted in 19 items that could be used to assess the five core domains. The overall model fit was reasonable (χ2 = 202.26, df = 117, p = 0.06), stable across time, with borderline evidence of multidimensionality (10%). Items and people covered a continuum ranging from -3.7 to +2.7 logits, reflecting coverage of the measurement continuum, with a person separation index of 0.85. Mean fit of caregivers was lower than expected (-1.31 ± 1.10 logits). Established methods from Rasch Measurement Theory were applied to develop a prototype measure of emotional vitality that is acceptable, reliable, and can be used to obtain an interval level score for use in future research and clinical settings. © The Author(s) 2014.

  20. New progress of fundamental aspects in quantum mechanics

    International Nuclear Information System (INIS)

    Sun Changpu

    2001-01-01

    The review recalls the conceptual origins of various interpretations of quantum mechanics. With the focus on quantum measurement problems, new developments of fundamental quantum theory are described in association with recent experiments such as the decoherence process in cavity quantum electrodynamics, 'which-way' detection using the Bragg scattering of cold atoms, and quantum interference using the small quantum system of molecular C60. The fundamental problems include the quantum coherence of a macroscopic object, the von Neumann chain in quantum measurement, the Schroedinger cat paradox, and others. Many landmark experiments have been accomplished, with possible important applications in quantum information. The most recent research on the new quantum theory by G. 't Hooft is reviewed, as well as future prospects of quantum mechanics

  1. Measuring Adjunct Instructor Job Satisfaction by Using Herzberg's Motivation-Hygiene Theory

    Science.gov (United States)

    Dickens, Durrell

    2011-01-01

    This study was designed to use Herzberg's motivation-hygiene theory to investigate the different levels of job satisfaction among adjunct college instructors at eight institutions of higher education located in southeast Texas. Differences in job satisfaction were measured by instructor gender, ethnicity, age, teaching experience, type of course…

  2. Generalizability Theory Reliability of Written Expression Curriculum-Based Measurement in Universal Screening

    Science.gov (United States)

    Keller-Margulis, Milena A.; Mercer, Sterett H.; Thomas, Erin L.

    2016-01-01

    The purpose of this study was to examine the reliability of written expression curriculum-based measurement (WE-CBM) in the context of universal screening from a generalizability theory framework. Students in second through fifth grade (n = 145) participated in the study. The sample included 54% female students, 49% White students, 23% African…

  3. Cognitive load measurement as a means to advance cognitive load theory

    NARCIS (Netherlands)

    Paas, F.; Tuovinen, J.E.; Tabbers, H.; van Gerven, P.W.M.

    2003-01-01

    This paper discusses cognitive load measurement techniques with regard to their contribution to cognitive load theory (CLT). CLT is concerned with the design of instructional methods that efficiently use people's limited cognitive processing capacity to apply acquired knowledge and skills to new

  4. Clean test of the electroweak theory by measuring weak boson masses

    International Nuclear Information System (INIS)

    Hioki, Zenro

    1985-01-01

    Role of the weak boson masses in the studies of electroweak higher order effects is surveyed. It is shown that precise measurements of these masses give us quite useful information for performing a clean test of the electroweak theory, and for a heavy fermion search. Effects of supersymmetric particles in these studies are also discussed. (author)

  5. The Theory and Measurement of Interorganizational Collaborative Capacity in the Acquisition and Contracting Context

    Science.gov (United States)

    2009-04-22

    ...members) of some defined population" (Thorndike, 1971, p. 533). Norms in this context would allow an organization to understand its relative standing on... Reference: Thorndike, R.L. (1971). Educational measurement (2nd ed.). Washington, DC: American Council on Education.

  6. Theory of thermal stresses

    CERN Document Server

    Boley, Bruno A

    1997-01-01

    Highly regarded text presents detailed discussion of fundamental aspects of theory, background, problems with detailed solutions. Basics of thermoelasticity, heat transfer theory, thermal stress analysis, more. 1985 edition.

  7. Theoretical prediction and impact of fundamental electric dipole moments

    International Nuclear Information System (INIS)

    Ellis, Sebastian A.R.; Kane, Gordon L.

    2016-01-01

    The predicted Standard Model (SM) electric dipole moments (EDMs) of electrons and quarks are tiny, providing an important window to observe new physics. Theories beyond the SM typically allow relatively large EDMs. The EDMs depend on the relative phases of terms in the effective Lagrangian of the extended theory, which are generally unknown. Underlying theories, such as string/M-theories compactified to four dimensions, could predict the phases and thus EDMs in the resulting supersymmetric (SUSY) theory. Earlier one of us, with collaborators, made such a prediction and found, unexpectedly, that the phases were predicted to be zero at tree level in the theory at the unification or string scale ∼O(10^16 GeV). Electroweak (EW) scale EDMs still arise via running from the high scale, and depend only on the SM Yukawa couplings that also give the CKM phase. Here we extend the earlier work by studying the dependence of the low scale EDMs on the constrained but not fully known fundamental Yukawa couplings. The dominant contribution is from two loop diagrams and is not sensitive to the choice of Yukawa texture. The electron EDM should not be found to be larger than about 5×10^-30 e cm, and the neutron EDM should not be larger than about 5×10^-29 e cm. These values are quite a bit smaller than the reported predictions from Split SUSY and typical effective theories, but much larger than the Standard Model prediction. Also, since models with random phases typically give much larger EDMs, it is a significant testable prediction of compactified M-theory that the EDMs should not be above these upper limits. The actual EDMs can be below the limits, so once they are measured they could provide new insight into the fundamental Yukawa couplings of leptons and quarks. We comment also on the role of strong CP violation. EDMs probe fundamental physics near the Planck scale.

  8. Theoretical prediction and impact of fundamental electric dipole moments

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, Sebastian A.R.; Kane, Gordon L. [Michigan Center for Theoretical Physics (MCTP),Department of Physics, University of Michigan,Ann Arbor, MI 48109 (United States)

    2016-01-13

    The predicted Standard Model (SM) electric dipole moments (EDMs) of electrons and quarks are tiny, providing an important window to observe new physics. Theories beyond the SM typically allow relatively large EDMs. The EDMs depend on the relative phases of terms in the effective Lagrangian of the extended theory, which are generally unknown. Underlying theories, such as string/M-theories compactified to four dimensions, could predict the phases and thus EDMs in the resulting supersymmetric (SUSY) theory. Earlier one of us, with collaborators, made such a prediction and found, unexpectedly, that the phases were predicted to be zero at tree level in the theory at the unification or string scale ∼O(10^16 GeV). Electroweak (EW) scale EDMs still arise via running from the high scale, and depend only on the SM Yukawa couplings that also give the CKM phase. Here we extend the earlier work by studying the dependence of the low scale EDMs on the constrained but not fully known fundamental Yukawa couplings. The dominant contribution is from two loop diagrams and is not sensitive to the choice of Yukawa texture. The electron EDM should not be found to be larger than about 5×10^-30 e cm, and the neutron EDM should not be larger than about 5×10^-29 e cm. These values are quite a bit smaller than the reported predictions from Split SUSY and typical effective theories, but much larger than the Standard Model prediction. Also, since models with random phases typically give much larger EDMs, it is a significant testable prediction of compactified M-theory that the EDMs should not be above these upper limits. The actual EDMs can be below the limits, so once they are measured they could provide new insight into the fundamental Yukawa couplings of leptons and quarks. We comment also on the role of strong CP violation. EDMs probe fundamental physics near the Planck scale.

  9. Exchange Rates and Fundamentals.

    Science.gov (United States)

    Engel, Charles; West, Kenneth D.

    2005-01-01

    We show analytically that in a rational expectations present-value model, an asset price manifests near-random walk behavior if fundamentals are I (1) and the factor for discounting future fundamentals is near one. We argue that this result helps explain the well-known puzzle that fundamental variables such as relative money supplies, outputs,…

  10. Using Generalizability Theory to Disattenuate Correlation Coefficients for Multiple Sources of Measurement Error.

    Science.gov (United States)

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2018-05-02

    Over the years, research in the social sciences has been dominated by reporting of reliability coefficients that fail to account for key sources of measurement error. Use of these coefficients, in turn, to correct for measurement error can hinder scientific progress by misrepresenting true relationships among the underlying constructs being investigated. In the research reported here, we addressed these issues using generalizability theory (G-theory) in both traditional and new ways to account for the three key sources of measurement error (random-response, specific-factor, and transient) that affect scores from objectively scored measures. Results from 20 widely used measures of personality, self-concept, and socially desirable responding showed that conventional indices consistently misrepresented reliability and relationships among psychological constructs by failing to account for key sources of measurement error and correlated transient errors within occasions. The results further revealed that G-theory served as an effective framework for remedying these problems. We discuss possible extensions in future research and provide code from the computer package R in an online supplement to enable readers to apply the procedures we demonstrate to their own research.
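
    The correction at issue, removing the attenuation that measurement error imposes on an observed correlation, can be illustrated with the classical disattenuation formula (a textbook sketch; in the G-theory application the reliabilities below would be replaced by generalizability coefficients that account for the error sources named above):

    ```python
    import math

    def disattenuate(r_observed, rel_x, rel_y):
        """Classical correction for attenuation: r_true = r_observed / sqrt(rel_x * rel_y)."""
        return r_observed / math.sqrt(rel_x * rel_y)

    # Hypothetical values: observed r = .40, reliability/generalizability coefficients .70 and .80
    print(round(disattenuate(0.40, 0.70, 0.80), 3))  # ~0.535
    ```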

  11. Generalized Galilean transformations and the measurement problem in the entropic dynamics approach to quantum theory

    Science.gov (United States)

    Johnson, David T.

    Quantum mechanics is an extremely successful and accurate physical theory, yet since its inception, it has been afflicted with numerous conceptual difficulties. The primary subject of this thesis is the theory of entropic quantum dynamics (EQD), which seeks to avoid these conceptual problems by interpreting quantum theory from an informational perspective. We begin by reviewing Cox's work in describing probability theory as a means of rationally and consistently quantifying uncertainties. We then discuss how probabilities can be updated according to either Bayes' theorem or the extended method of maximum entropy (ME). After that discussion, we review the work of Caticha and Giffin that shows that Bayes' theorem is a special case of ME. This important result demonstrates that the ME method is the general method for updating probabilities. We then review some motivating difficulties in quantum mechanics before discussing Caticha's work in deriving quantum theory from the approach of entropic dynamics, which concludes our review. After entropic dynamics is introduced, we develop the concepts of symmetries and transformations from an informational perspective. The primary result is the formulation of a symmetry condition that any transformation must satisfy in order to qualify as a symmetry in EQD. We then proceed to apply this condition to the extended Galilean transformation. This transformation is of interest as it exhibits features of both special and general relativity. The transformation yields a gravitational potential that arises from an equivalence of information. We conclude the thesis with a discussion of the measurement problem in quantum mechanics. We discuss the difficulties that arise in the standard quantum mechanical approach to measurement before developing our theory of entropic measurement. In entropic dynamics, position is the only observable. We show how a theory built on this one observable can account for the multitude of measurements present in

  12. [Instrument to measure adherence in hypertensive patients: contribution of Item Response Theory].

    Science.gov (United States)

    Rodrigues, Malvina Thaís Pacheco; Moreira, Thereza Maria Magalhaes; Vasconcelos, Alexandre Meira de; Andrade, Dalton Francisco de; Silva, Daniele Braz da; Barbetta, Pedro Alberto

    2013-06-01

    To analyze, by means of Item Response Theory, an instrument to measure adherence to treatment for hypertension. Analytical study with 406 hypertensive patients with associated complications seen in primary care in Fortaleza, CE, Northeastern Brazil, in 2011, using Item Response Theory. The stages were: dimensionality test, calibrating the items, processing data and creating a scale, analyzed using the gradual response model. A study of the dimensionality of the instrument was conducted by analyzing the polychoric correlation matrix and full-information factor analysis. Multilog software was used to calibrate items and estimate the scores. Items relating to drug therapy are the most directly related to adherence, while those relating to drug-free therapy need to be reworked because they have less psychometric information and low discrimination. The independence of items, the small number of levels in the scale and the low explained variance in the adjustment of the models show the main weaknesses of the instrument analyzed. Item Response Theory proved to be a relevant analysis technique because it evaluated respondents for adherence to treatment for hypertension, the level of difficulty of the items and their ability to discriminate between individuals with different levels of adherence, which generates a greater amount of information. The instrument analyzed is limited in measuring adherence to hypertension treatment, as shown by the Item Response Theory analysis of its items, and needs adjustment. The proper formulation of the items is important in order to accurately measure the desired latent trait.

  13. A new non-specificity measure in evidence theory based on belief intervals

    Institute of Scientific and Technical Information of China (English)

    Yang Yi; Han Deqiang; Jean Dezert

    2016-01-01

    In the theory of belief functions, the measure of uncertainty is an important concept, used for representing some types of uncertainty incorporated in bodies of evidence, such as discord and non-specificity. For the non-specificity part, some traditional measures take the Hartley measure in classical set theory as a reference; other traditional measures use simple, heuristic functions of the mass assignments and the cardinality of the focal elements. In this paper, a new non-specificity measure is proposed using the lengths of belief intervals, which represent the degree of imprecision; it therefore has a more intuitive physical meaning. It can be proved that the new measure can be rewritten in a general form for non-specificity, and it is also proved to be a strict non-specificity measure with some desired properties. Numerical examples, simulations, related analyses and proofs are provided to show the characteristics and good properties of the new non-specificity definition. An example of an application of the new non-specificity measure is also presented.
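
    The belief intervals on which the proposed measure is built are standard objects in evidence theory: for each hypothesis the interval runs from its belief to its plausibility, and its length reflects imprecision. The sketch below computes those intervals for a small, hypothetical body of evidence; the final averaged interval length is only an illustrative aggregate, not the paper's actual non-specificity definition.

    ```python
    def belief_plausibility(masses, frame):
        """Bel and Pl of each singleton from a mass function.
        'masses' maps frozenset focal elements to masses summing to 1."""
        bel = {x: sum(m for A, m in masses.items() if A <= {x}) for x in frame}
        pl = {x: sum(m for A, m in masses.items() if x in A) for x in frame}
        return bel, pl

    # Hypothetical body of evidence on the frame {a, b, c}
    frame = {"a", "b", "c"}
    masses = {frozenset({"a"}): 0.5, frozenset({"a", "b"}): 0.3, frozenset(frame): 0.2}
    bel, pl = belief_plausibility(masses, frame)
    intervals = {x: pl[x] - bel[x] for x in frame}
    print(intervals)                             # per-hypothesis imprecision
    print(sum(intervals.values()) / len(frame))  # illustrative aggregate only
    ```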

  14. Metric-independent measures for supersymmetric extended object theories on curved backgrounds

    International Nuclear Information System (INIS)

    Nishino, Hitoshi; Rajpoot, Subhash

    2014-01-01

    For Green–Schwarz superstring σ-model on curved backgrounds, we introduce a non-metric measure Φ ≡ ϵ^{ij} ϵ_{IJ} (∂_i φ^I)(∂_j φ^J) with two scalars φ^I (I = 1, 2) used in 'Two-Measure Theory' (TMT). As in the flat-background case, the string tension T = (2πα')^{-1} emerges as an integration constant for the A_i-field equation. This mechanism is further generalized to supermembrane theory, and to super-p-brane theory, both on general curved backgrounds. This shows the universal applications of dynamical measure of TMT to general supersymmetric extended objects on general curved backgrounds
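
    In generic two-measure models the density Φ replaces the usual world-sheet measure, and the tension arises dynamically. A schematic form (reconstructed from the abstract and the general two-measure-theory literature, not quoted from the paper) is

    \[
      S \;=\; \int d^{2}\sigma\; \Phi\, L, \qquad
      \Phi \;\equiv\; \epsilon^{ij}\,\epsilon_{IJ}\,(\partial_i \varphi^{I})(\partial_j \varphi^{J}),
    \]

    where varying the measure scalars φ^I forces L to be constant on shell; in the paper this constant is tied, through the A_i-field equation, to the string tension T = (2πα')^{-1}.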

  15. Soft Measurement Modeling Based on Chaos Theory for Biochemical Oxygen Demand (BOD

    Directory of Open Access Journals (Sweden)

    Junfei Qiao

    2016-12-01

    Full Text Available The precision of soft measurement for biochemical oxygen demand (BOD) is always restricted due to various factors in the wastewater treatment plant (WWTP). To solve this problem, a new soft measurement modeling method based on chaos theory is proposed and applied to BOD measurement in this paper. Phase space reconstruction (PSR) based on the Takens embedding theorem is used to extract more information from the limited datasets of the chaotic system. The WWTP is first verified to be a chaotic system by computing the correlation dimension (D), the largest Lyapunov exponent (λ1), and the Kolmogorov entropy (K) of the BOD and other water quality parameter time series. A multivariate chaotic time series modeling method with principal component analysis (PCA) and an artificial neural network (ANN) is then adopted to estimate the value of the effluent BOD. Simulation results show that the proposed approach has higher accuracy and better prediction ability than the corresponding modeling approaches not based on chaos theory.
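
    The phase space reconstruction step can be sketched as a simple delay embedding (a generic Takens-style reconstruction; the embedding dimension, delay, and stand-in signal below are hypothetical rather than the paper's tuned values):

    ```python
    import numpy as np

    def delay_embed(series, dim, tau):
        """Takens-style delay embedding: each row is [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
        n = len(series) - (dim - 1) * tau
        return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

    # Stand-in for an effluent-quality time series (noisy sine, hypothetical)
    t = np.arange(2000)
    signal = np.sin(0.05 * t) + 0.1 * np.random.default_rng(2).normal(size=t.size)
    X = delay_embed(signal, dim=3, tau=10)
    print(X.shape)  # (1980, 3): reconstructed state vectors fed to PCA/ANN downstream
    ```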

  16. Testing Our Fundamental Assumptions

    Science.gov (United States)

    Kohler, Susanna

    2016-06-01

    Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests. Explaining different arrival times: [Figure: artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region. Credit: NASA/Swift/Mary Pat Hrybyk-Keith and John Jones] Suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics: (1) intrinsic delay: the photons may simply have been emitted at two different times by the astrophysical source; (2) delay due to Lorentz invariance violation: perhaps the assumption that all massless particles (even two photons with different energies) move at the exact same velocity in a vacuum is incorrect; (3) special-relativistic delay: maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong, which would also make photon velocities energy-dependent; (4) delay due to gravitational potential: perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, also causing different flight times for photons of different energies, which would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect. If we now turn this problem around, then by measuring the arrival time delay between photons of different energies from various astrophysical sources (the further away, the better), we can provide constraints on these
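
    The photon-mass and Lorentz-invariance-violation delays listed above have simple leading-order forms (standard order-of-magnitude expressions quoted here for orientation, not taken from the article; cosmological distance corrections are ignored):

    \[
      \Delta t_{m_\gamma} \;\simeq\; \frac{D}{2c}\,\bigl(m_\gamma c^{2}\bigr)^{2}\left(\frac{1}{E_1^{2}} - \frac{1}{E_2^{2}}\right),
      \qquad
      \Delta t_{\mathrm{LIV}} \;\simeq\; \frac{D}{c}\,\frac{E_2 - E_1}{E_{\mathrm{QG}}},
    \]

    so a small photon rest mass makes lower-energy photons lag, while a linear Lorentz-invariance violation at scale E_QG produces a delay that grows with the energy difference and the source distance D.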

  17. Electric batteries. Fundamental principles and theory, present state of the art of technology and trends of development. 3. rev. and enlarged ed. Batterien. Grundlagen und Theorie, aktueller technischer Stand und Entwicklungstendenzen

    Energy Technology Data Exchange (ETDEWEB)

    Kiehne, H.A.; Berndt, D.; Boettger, K.; Fischer, W.; Franke, H.; Friedheim, G.; Koethe, H.K.; Krakowski, H.; Middendorf, E.; Preuss, P.

    1988-01-01

    This volume gives a comprehensive survey of the present state of the electrochemical power storage with special consideration of their technical characteristics of application. The volume is structured as follows: 1) Electrochemical energy storage, general fundamentals; 2) Batteries for electric-powered industrial trucks; 2a) Energy supply concepts for driverless industrial trucks; 3) Batteries for electric-powered road vehicles; 4) Battery-fed electric drive from the user's point of view (charging, maintenance); 5) Secured power supply with electric batteries; 6) Batteries for stationary power supplies; 7) Operation and use of batteries for a large-scale consumer (emergency power supplies for communication equipment of the Deutsche Bundespost); 8) Starter batteries of vehicles; 9) High-energy batteries (e.g. Zn/Cl2-, Na/S-, Li/FeS-cells, fuel cells); 10) Solar-electric power supply with batteries; 11) Charging methods and charging technique; 12) Technology of battery chargers and current transformer, monitoring methods; 13) Standards and regulations for batteries and battery systems. (MM) With 192 figs.

  18. Quantum theory of multiple-input-multiple-output Markovian feedback with diffusive measurements

    International Nuclear Information System (INIS)

    Chia, A.; Wiseman, H. M.

    2011-01-01

    Feedback control engineers have been interested in multiple-input-multiple-output (MIMO) extensions of single-input-single-output (SISO) results of various kinds due to its rich mathematical structure and practical applications. An outstanding problem in quantum feedback control is the extension of the SISO theory of Markovian feedback by Wiseman and Milburn [Phys. Rev. Lett. 70, 548 (1993)] to multiple inputs and multiple outputs. Here we generalize the SISO homodyne-mediated feedback theory to allow for multiple inputs, multiple outputs, and arbitrary diffusive quantum measurements. We thus obtain a MIMO framework which resembles the SISO theory and whose additional mathematical structure is highlighted by the extensive use of vector-operator algebra.

  19. Comparisons of some scattering theories with recent scatterometer measurements. [sea roughness radar model

    Science.gov (United States)

    Fung, A. K.; Dome, G.; Moore, R. K.

    1977-01-01

    The paper compares the predictions of two different types of sea scatter theories with recent scatterometer measurements which indicate the variations of the backscattering coefficient with polarization, incident angle, wind speed, and azimuth angle. Wright's theory (1968) differs from that of Chan and Fung (1977) in two major aspects: (1) Wright uses Phillips' sea spectrum (1966) while Chan and Fung use that of Mitsuyasu and Honda, and (2) Wright uses a modified slick sea slope distribution by Cox and Munk (1954) while Chan and Fung use the slick sea slope distribution of Cox and Munk defined with respect to the plane perpendicular to the look direction. Satisfactory agreements between theory and experimental data are obtained when Chan and Fung's model is used to explain the wind and azimuthal dependence of the scattering coefficient.

  20. Fundamentals of the theory of plasticity

    CERN Document Server

    Kachanov, L M

    2004-01-01

    Intended for use by advanced engineering students and professionals, this volume focuses on plastic deformation of metals at normal temperatures, as applied to strength of machines and structures. 1971 edition.

  1. Fundamentals of set and number theory

    CERN Document Server

    Rodionov, Timofey V

    2018-01-01

    The series is devoted to the publication of monographs and high-level textbooks in mathematics, mathematical methods and their applications. Apart from covering important areas of current interest, a major aim is to make topics of an interdisciplinary nature accessible to the non-specialist. The works in this series are addressed to advanced students and researchers in mathematics and theoretical physics. In addition, it can serve as a guide for lectures and seminars on a graduate level. The series de Gruyter Studies in Mathematics was founded ca. 30 years ago by the late Professor Heinz Bauer and Professor Peter Gabriel with the aim to establish a series of monographs and textbooks of high standard, written by scholars with an international reputation presenting current fields of research in pure and applied mathematics. While the editorial board of the Studies has changed with the years, the aspirations of the Studies are unchanged. In times of rapid growth of mathematical knowledge carefully written monogr...

  2. Quantum optics and fundamentals of quantum theory

    International Nuclear Information System (INIS)

    Dusek, M.

    1997-01-01

    Quantum optics has opened up new opportunities for experimental verification of the basic principles of quantum mechanics, particularly in the field of quantum interference and so-called non-local phenomena. The results of the experiments described provide unambiguous support to quantum mechanics. (Z.J.)

  3. Merging Psychophysical and Psychometric Theory to Estimate Global Visual State Measures from Forced-Choices

    International Nuclear Information System (INIS)

    Massof, Robert W; Schmidt, Karen M; Laby, Daniel M; Kirschen, David; Meadows, David

    2013-01-01

    Visual acuity, a forced-choice psychophysical measure of visual spatial resolution, is the sine qua non of clinical visual impairment testing in ophthalmology and optometry patients with visual system disorders ranging from refractive error to retinal, optic nerve, or central visual system pathology. Visual acuity measures are standardized against a norm, but it is well known that visual acuity depends on a variety of stimulus parameters, including contrast and exposure duration. This paper asks whether a single global visual state measure, one that represents the patient's overall visual health state with a single variable, can be estimated from visual acuity measured as a function of stimulus parameters. Psychophysical theory (at the sensory level) and psychometric theory (at the decision level) are merged to identify the conditions that must be satisfied to derive a global visual state measure from parameterised visual acuity measures. A global visual state measurement model is developed and tested with forced-choice visual acuity measures from 116 subjects with no visual impairments and 560 subjects with uncorrected refractive error. The results are in agreement with the expectations of the model.

  4. Free release measurement of radioactive waste on the basis of the Bayes theory

    International Nuclear Information System (INIS)

    Sokcic-Kostic, M.; Langer, F.; Schultheis, R.

    2013-01-01

    The application of Bayesian theory in the evaluation of free release measurements requires complex co-ordination between experiment and analysis. The algorithms are more complex than those used in frequentist data analysis and, in part, than those of Monte Carlo methods. The user can obtain an objective treatment of the parameters of the measurement error and, as a result, a reliable indication of confidence intervals. For release measurement, the upper limit of the confidence interval must be compared with the limit given by the Radiation Protection Regulations (StrlSchV) to decide on a possible release of the material under test. (orig.)
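
    A minimal numerical sketch of the decision step described above, assuming a simple Poisson counting model with a flat prior on the count rate; the counts, counting time, efficiency, sample mass and clearance level below are hypothetical values for illustration, not data or limits from the study.

      # Hedged sketch: Bayesian 95% upper limit for a free-release counting measurement.
      # Assumes a Poisson counting model with a flat prior on the count rate; all numbers
      # (counts, time, efficiency, mass, clearance level) are hypothetical.
      from scipy.stats import gamma

      n_counts = 42        # observed gross counts (hypothetical)
      t_count = 600.0      # counting time in seconds (hypothetical)
      efficiency = 0.25    # detection efficiency (hypothetical)
      mass_g = 5000.0      # sample mass in grams (hypothetical)
      clearance = 0.1      # clearance level in Bq/g (hypothetical placeholder, not the StrlSchV value)

      # With a flat prior, the posterior of the count rate is Gamma(n + 1, scale = 1/t).
      rate_upper = gamma.ppf(0.95, n_counts + 1, scale=1.0 / t_count)  # counts per second

      # Convert the upper bound on the count rate to a specific activity in Bq/g.
      activity_upper = rate_upper / efficiency / mass_g

      print(f"95% upper limit: {activity_upper:.2e} Bq/g")
      print("release possible" if activity_upper < clearance else "further assessment needed")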

  5. Astrophysical probes of fundamental physics

    International Nuclear Information System (INIS)

    Martins, C.J.A.P.

    2009-01-01

    I review the motivation for varying fundamental couplings and discuss how these measurements can be used to constrain fundamental physics scenarios that would otherwise be inaccessible to experiment. I highlight the current controversial evidence for varying couplings and present some new results. Finally I focus on the relation between varying couplings and dark energy, and explain how varying coupling measurements might be used to probe the nature of dark energy, with some advantages over standard methods. In particular I discuss what can be achieved with future spectrographs such as ESPRESSO and CODEX.

  6. Astrophysical probes of fundamental physics

    Energy Technology Data Exchange (ETDEWEB)

    Martins, C.J.A.P. [Centro de Astrofisica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal); DAMTP, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA (United Kingdom)

    2009-10-15

    I review the motivation for varying fundamental couplings and discuss how these measurements can be used to constrain fundamental physics scenarios that would otherwise be inaccessible to experiment. I highlight the current controversial evidence for varying couplings and present some new results. Finally I focus on the relation between varying couplings and dark energy, and explain how varying coupling measurements might be used to probe the nature of dark energy, with some advantages over standard methods. In particular I discuss what can be achieved with future spectrographs such as ESPRESSO and CODEX.

  7. Molecular imaging. Fundamentals and applications

    International Nuclear Information System (INIS)

    Tian, Jie

    2013-01-01

    Covers a wide range of new theory, new techniques and new applications. Contributed by many experts in China. The editor has obtained the National Science and Technology Progress Award twice. "Molecular Imaging: Fundamentals and Applications" is a comprehensive monograph which describes not only the theory of the underlying algorithms and key technologies but also introduces a prototype system and its applications, bringing together theory, technology and applications. By explaining the basic concepts and principles of molecular imaging, imaging techniques, as well as research and applications in detail, the book provides both detailed theoretical background information and technical methods for researchers working in medical imaging and the life sciences. Clinical doctors and graduate students will also benefit from this book.

  8. The discriminating capacity of a measuring instrument: Revisiting Bloom's (1942) theory and formula

    Directory of Open Access Journals (Sweden)

    Louis Laurencelle

    2014-04-01

    Full Text Available “Discriminating capacity” is defined as a property of a test, measuring device or scholastic exam, which enables us to segregate and categorize objects or people according to their measured values. The concept, anticipated by Bloom and derived here from Ferguson's index of classificatory power, is developed upon three bases: the probability of categorizing an object (or person) in its proper measuring interval; the sufficient length of measuring intervals; the number of efficacious intervals in an empirical or theoretical distribution of measures. Expressed as a function of the reliability coefficient of a measuring device, discriminating capacity appears as a new tool in the conceptual apparatus of classical test theory.

  9. Measuring Individual Differences in Generic Beliefs in Conspiracy Theories Across Cultures: Conspiracy Mentality Questionnaire

    Science.gov (United States)

    Bruder, Martin; Haffke, Peter; Neave, Nick; Nouripanah, Nina; Imhoff, Roland

    2013-01-01

    Conspiracy theories are ubiquitous when it comes to explaining political events and societal phenomena. Individuals differ not only in the degree to which they believe in specific conspiracy theories, but also in their general susceptibility to explanations based on such theories, that is, their conspiracy mentality. We present the Conspiracy Mentality Questionnaire (CMQ), an instrument designed to efficiently assess differences in the generic tendency to engage in conspiracist ideation within and across cultures. The CMQ is available in English, German, and Turkish. In four studies, we examined the CMQ’s factorial structure, reliability, measurement equivalence across cultures, and its convergent, discriminant, and predictive validity. Analyses based on a cross-cultural sample (Study 1a; N = 7,766) supported the conceptualization of conspiracy mentality as a one-dimensional construct across the three language versions of the CMQ that is stable across time (Study 1b; N = 141). Multi-group confirmatory factor analysis demonstrated cross-cultural measurement equivalence of the CMQ items. The instrument could therefore be used to examine differences in conspiracy mentality between European, North American, and Middle Eastern cultures. In Studies 2–4 (total N = 476), we report (re-)analyses of three datasets demonstrating the validity of the CMQ in student and working population samples in the UK and Germany. First, attesting to its convergent validity, the CMQ was highly correlated with another measure of generic conspiracy belief. Second, the CMQ showed patterns of meaningful associations with personality measures (e.g., Big Five dimensions, schizotypy), other generalized political attitudes (e.g., social dominance orientation and right-wing authoritarianism), and further individual differences (e.g., paranormal belief, lack of socio-political control). Finally, the CMQ predicted beliefs in specific conspiracy theories over and above other individual

  10. Measuring Individual Differences in Generic Beliefs in Conspiracy Theories Across Cultures: The Conspiracy Mentality Questionnaire (CMQ)

    Directory of Open Access Journals (Sweden)

    Martin eBruder

    2013-04-01

    Full Text Available Conspiracy theories are ubiquitous when it comes to explaining political events and societal phenomena. Individuals differ not only in the degree to which they believe in specific conspiracy theories, but also in their general susceptibility to explanations based on such theories, that is, their conspiracy mentality. We present the Conspiracy Mentality Questionnaire (CMQ), an instrument designed to efficiently assess differences in the generic tendency to engage in conspiracist ideation within and across cultures. The CMQ is available in English, German, and Turkish. In four studies, we examined the CMQ’s factorial structure, reliability, measurement equivalence across cultures, and its convergent, discriminant, and predictive validity. Analyses based on a cross-cultural sample (Study 1a; N = 7,766) supported the conceptualization of conspiracy mentality as a one-dimensional construct across the three language versions of the CMQ that is stable across time (Study 1b; N = 141). Multigroup confirmatory factor analysis demonstrated cross-cultural measurement equivalence of the CMQ items. The instrument could therefore be used to examine differences in conspiracy mentality between European, North American, and Middle Eastern cultures. In Studies 2-4 (total N = 476), we report (re-)analyses of 3 datasets demonstrating the validity of the CMQ in student and working population samples in the UK and Germany. First, attesting to its convergent validity, the CMQ was highly correlated with another measure of generic conspiracy belief. Second, the CMQ showed patterns of meaningful associations with personality measures (e.g., Big Five dimensions, schizotypy), other generalized political attitudes (e.g., social dominance orientation and right-wing authoritarianism), and further individual differences (e.g., paranormal belief, lack of socio-political control). Finally, the CMQ predicted beliefs in specific conspiracy theories over and above other individual

  11. Duality and free measures in vector spaces, the spectral theory of actions of non-locally compact groups

    OpenAIRE

    Vershik, A.

    2017-01-01

    The paper presents a general duality theory for vector measure spaces taking its origin in the author's papers written in the 1960s. The main result establishes a direct correspondence between the geometry of a measure in a vector space and the properties of the space of measurable linear functionals on this space regarded as closed subspaces of an abstract space of measurable functions. An example of useful new features of this theory is the notion of a free measure and its applications.

  12. Robust Measurement via A Fused Latent and Graphical Item Response Theory Model.

    Science.gov (United States)

    Chen, Yunxiao; Li, Xiaoou; Liu, Jingchen; Ying, Zhiliang

    2018-03-12

    Item response theory (IRT) plays an important role in psychological and educational measurement. Unlike the classical testing theory, IRT models aggregate the item level information, yielding more accurate measurements. Most IRT models assume local independence, an assumption not likely to be satisfied in practice, especially when the number of items is large. Results in the literature and simulation studies in this paper reveal that misspecifying the local independence assumption may result in inaccurate measurements and differential item functioning. To provide more robust measurements, we propose an integrated approach by adding a graphical component to a multidimensional IRT model that can offset the effect of unknown local dependence. The new model contains a confirmatory latent variable component, which measures the targeted latent traits, and a graphical component, which captures the local dependence. An efficient proximal algorithm is proposed for the parameter estimation and structure learning of the local dependence. This approach can substantially improve the measurement, given no prior information on the local dependence structure. The model can be applied to measure both a unidimensional latent trait and multidimensional latent traits.

  13. Dielectric properties of agricultural products – fundamental principles, influencing factors, and measurement techniques. Chapter 4. Electrotechnologies for Food Processing: Book Series. Volume 3. Radio-Frequency Heating

    Science.gov (United States)

    In this chapter, definitions of dielectric properties, or permittivity, of materials and a brief discussion of the fundamental principles governing their behavior with respect to influencing factors are presented. The basic physics of the influence of frequency of the electric fields and temperatur...

  14. Mercury in Environmental and Biological Samples Using Online Combustion with Sequential Atomic Absorption and Fluorescence Measurements: A Direct Comparison of Two Fundamental Techniques in Spectrometry

    Science.gov (United States)

    Cizdziel, James V.

    2011-01-01

    In this laboratory experiment, students quantitatively determine the concentration of an element (mercury) in an environmental or biological sample while comparing and contrasting the fundamental techniques of atomic absorption spectrometry (AAS) and atomic fluorescence spectrometry (AFS). A mercury analyzer based on sample combustion,…

  15. A game theory-based trust measurement model for social networks.

    Science.gov (United States)

    Wang, Yingjie; Cai, Zhipeng; Yin, Guisheng; Gao, Yang; Tong, Xiangrong; Han, Qilong

    2016-01-01

    In social networks, trust is a complex social relationship. Participants in online social networks want to share information and experiences with as many reliable users as possible. However, the modeling of trust is complicated and application dependent. Modeling trust needs to consider interaction history, recommendations, user behaviors and so on. Therefore, modeling trust is an important focus for online social networks. We propose a game theory-based trust measurement model for social networks. The trust degree is calculated from three aspects (service reliability, feedback effectiveness, and recommendation credibility) to obtain a more accurate result. In addition, to alleviate the free-riding problem, we propose a game theory-based punishment mechanism for specific trust and global trust, respectively. We prove that the proposed trust measurement model is effective. The free-riding problem can be resolved effectively by adding the proposed punishment mechanism.
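
    As a purely illustrative sketch of how the three aspects named above could be aggregated into a single trust degree, the snippet below uses a simple weighted sum; the linear form, the weights and the scores are assumptions made here for illustration and are not the game-theoretic model actually proposed in the paper.

      # Hedged illustration: combine the three trust aspects into one trust degree.
      # The linear weighting and all numbers are assumptions, not the authors' model.

      def trust_degree(service_reliability, feedback_effectiveness,
                       recommendation_credibility, weights=(0.5, 0.3, 0.2)):
          """Each aspect is a score in [0, 1]; the weights are assumed to sum to 1."""
          w1, w2, w3 = weights
          return (w1 * service_reliability
                  + w2 * feedback_effectiveness
                  + w3 * recommendation_credibility)

      # Example: a participant with reliable service but weak recommendations.
      print(trust_degree(0.9, 0.7, 0.4))  # -> 0.74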

  16. Aberration measurement of projection optics in lithographic tools based on two-beam interference theory

    International Nuclear Information System (INIS)

    Ma Mingying; Wang Xiangzhao; Wang Fan

    2006-01-01

    The degradation of image quality caused by aberrations of projection optics in lithographic tools is a serious problem in optical lithography. We propose what we believe to be a novel technique for measuring aberrations of projection optics based on two-beam interference theory. By utilizing the partial coherent imaging theory, a novel model that accurately characterizes the relative image displacement of a fine grating pattern to a large pattern induced by aberrations is derived. Both even and odd aberrations are extracted independently from the relative image displacements of the printed patterns by two-beam interference imaging of the zeroth and positive first orders. The simulation results show that by using this technique we can measure the aberrations present in the lithographic tool with higher accuracy.

  17. Aberration measurement of projection optics in lithographic tools based on two-beam interference theory.

    Science.gov (United States)

    Ma, Mingying; Wang, Xiangzhao; Wang, Fan

    2006-11-10

    The degradation of image quality caused by aberrations of projection optics in lithographic tools is a serious problem in optical lithography. We propose what we believe to be a novel technique for measuring aberrations of projection optics based on two-beam interference theory. By utilizing the partial coherent imaging theory, a novel model that accurately characterizes the relative image displacement of a fine grating pattern to a large pattern induced by aberrations is derived. Both even and odd aberrations are extracted independently from the relative image displacements of the printed patterns by two-beam interference imaging of the zeroth and positive first orders. The simulation results show that by using this technique we can measure the aberrations present in the lithographic tool with higher accuracy.

  18. Islamic fundamentalism in Indonesia

    OpenAIRE

    Nagy, Sandra L.

    1996-01-01

    This is a study of Islamic fundamentalism in Indonesia. Islamic fundamentalism is defined as the return to the foundations and principles of Islam including all movements based on the desire to create a more Islamic society. After describing the practices and beliefs of Islam, this thesis examines the three aspects of universal Islamic fundamentalism: revivalism, resurgence, and radicalism. It analyzes the role of Islam in Indonesia under Dutch colonial rule, an alien Christian imperialist po...

  19. In search of the "lost capital". A theory for valuation, investment decisions, performance measurement

    OpenAIRE

    Magni, Carlo Alberto

    2007-01-01

    This paper presents a theoretical framework for valuation, investment decisions, and performance measurement based on a nonstandard theory of residual income. It is derived from the notion of "unrecovered" capital, which is here named "lost" capital because it represents the capital foregone by the investors. Its theoretical strength and meaningfulness are shown by deriving it from four main perspectives: financial, microeconomic, axiomatic, accounting. Implications for asset valuation, cap...

  20. Fundamentals of gas dynamics

    CERN Document Server

    Babu, V

    2014-01-01

    Fundamentals of Gas Dynamics, Second Edition is a comprehensively updated new edition and now includes a chapter on the gas dynamics of steam. It covers the fundamental concepts and governing equations of different flows, and includes end-of-chapter exercises based on practical applications. A number of useful tables on the thermodynamic properties of steam are also included. Fundamentals of Gas Dynamics, Second Edition begins with an introduction to compressible and incompressible flows before covering the fundamentals of one dimensional flows and normal shock wav

  1. A new measurement for the revised reinforcement sensitivity theory: psychometric criteria and genetic validation

    Directory of Open Access Journals (Sweden)

    Martin eReuter

    2015-03-01

    Full Text Available Jeffrey Gray’s Reinforcement Sensitivity Theory (RST) represents one of the most influential biologically-based personality theories describing individual differences in approach and avoidance tendencies. The most prominent self-report inventory to measure individual differences in approach and avoidance behavior to date is the BIS/BAS scale by Carver & White (1994). As Gray & McNaughton (2000) revised the RST after its initial formulation in the 1970s/80s, and given the Carver & White measure is based on the initial conceptualization of RST, there is a growing need for self-report inventories measuring individual differences in the revised behavioral inhibition system (BIS), behavioral activation system (BAS), and the fight, flight, freezing system (FFFS). Therefore, in this paper we present a new questionnaire measuring individual differences in the revised constructs of the BIS, BAS and FFFS in N = 1814 participants (German sample). An English translated version of the new measure is also presented and tested in N = 299 English language participants. A large number of German participants (N = 1090) also filled in the BIS/BAS scales by Carver & White (1994) and the correlations between these measures are presented. Finally, this same subgroup of participants provided buccal swabs for the investigation of the arginine vasopressin receptor 1a (AVPR1a) gene. Here, a functional genetic polymorphism (rs11174811) on the AVPR1a gene was shown to be associated with individual differences in both the revised BIS and classic BIS dimensions.

  2. Critical Investigation of Jauch's Approach to the Quantum Theory of Measurement

    Science.gov (United States)

    Herbut, Fedor

    1986-08-01

    To make Jauch's approach more realistic, his assumptions are modified in two ways: (1) On the quantum system plus the measuring apparatus (S+MA) after the measuring interaction has ceased, one can actually measure only operators of the form A ⊗ Σ_k b_k Q_k, where A is any Hermitian operator for S, the resolution of the identity Σ_k Q_k = 1 defines MA as a classical system (following von Neumann), and the b_k are real numbers (S and MA are distant). (2) Measurement is defined in the most general way (including, besides first-kind, also second-kind and third-kind or indirect measurements). It is shown that Jauch's basic result that the microstates (statistical operators) of S+MA before and after the collapse correspond to the same macrostate (belong to the same equivalence class of microstates) remains valid under the above modifications, and that the significance of this result goes beyond measurement theory. On the other hand, it is argued that, taking the orthodox (i.e. uncompromisingly quantum) view of quantum mechanics, it is not the collapse but the Jauch-type macrostates that are spurious in a Jauch-type theory.

  3. An Introduction to Item Response Theory for Patient-Reported Outcome Measurement

    Science.gov (United States)

    Nguyen, Tam H.; Han, Hae-Ra; Kim, Miyong T.

    2015-01-01

    The growing emphasis on patient-centered care has accelerated the demand for high-quality data from patient-reported outcome (PRO) measures. Traditionally, the development and validation of these measures has been guided by classical test theory. However, item response theory (IRT), an alternate measurement framework, offers promise for addressing practical measurement problems found in health-related research that have been difficult to solve through classical methods. This paper introduces foundational concepts in IRT, as well as commonly used models and their assumptions. Existing data on a combined sample (n = 636) of Korean American and Vietnamese American adults who responded to the High Blood Pressure Health Literacy Scale and the Patient Health Questionnaire-9 are used to exemplify typical applications of IRT. These examples illustrate how IRT can be used to improve the development, refinement, and evaluation of PRO measures. Greater use of methods based on this framework can increase the accuracy and efficiency with which PROs are measured. PMID:24403095
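
    Since the abstract stays at the conceptual level, a minimal sketch of the kind of model it refers to may help: the two-parameter logistic (2PL) IRT model, a standard choice in this literature. The item parameters below are hypothetical and serve only to show how an item characteristic curve is computed.

      # Hedged sketch of the standard two-parameter logistic (2PL) IRT model.
      # Item parameters are hypothetical.
      import numpy as np

      def p_endorse(theta, a, b):
          """Probability of endorsing an item given latent trait theta,
          discrimination a, and difficulty (severity) b."""
          return 1.0 / (1.0 + np.exp(-a * (theta - b)))

      theta = np.linspace(-3, 3, 7)           # grid of latent-trait values
      print(p_endorse(theta, a=1.5, b=0.0))   # item characteristic curve for one item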

  4. Differential item functioning magnitude and impact measures from item response theory models.

    Science.gov (United States)

    Kleinman, Marjorie; Teresi, Jeanne A

    2016-01-01

    Measures of the magnitude and impact of differential item functioning (DIF), at the item and scale level respectively, are presented and reviewed in this paper. Most measures are based on item response theory models. Magnitude refers to item-level effect sizes, whereas impact refers to differences between groups at the scale score level. Reviewed are magnitude measures based on group differences in the expected item scores and impact measures based on differences in the expected scale scores. The similarities among these indices are demonstrated. Various software packages are described that provide magnitude and impact measures, and new software is presented that computes all of the available statistics conveniently in one program with explanations of their relationships to one another.
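
    As an illustration of a magnitude measure of the kind described (group differences in expected item scores), the sketch below averages the signed difference between the reference-group and focal-group item characteristic curves over a latent-trait grid; the 2PL form, the normal weighting and the parameter values are assumptions for illustration, not the specific indices reviewed in the paper.

      # Hedged sketch: signed item-level DIF magnitude as the weighted average difference
      # in expected item scores between a reference and a focal group (hypothetical 2PL item).
      import numpy as np

      def expected_score(theta, a, b):
          return 1.0 / (1.0 + np.exp(-a * (theta - b)))

      theta_grid = np.linspace(-4, 4, 81)
      weights = np.exp(-0.5 * theta_grid ** 2)     # standard-normal weights (unnormalized)
      weights /= weights.sum()

      ref = expected_score(theta_grid, a=1.2, b=0.0)   # reference-group parameters
      foc = expected_score(theta_grid, a=1.2, b=0.4)   # focal-group parameters

      signed_dif = float(np.sum(weights * (ref - foc)))  # signed DIF magnitude for this item
      print(round(signed_dif, 3))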

  5. Reliability and validity of advanced theory-of-mind measures in middle childhood and adolescence.

    Science.gov (United States)

    Hayward, Elizabeth O; Homer, Bruce D

    2017-09-01

    Although theory-of-mind (ToM) development is well documented for early childhood, there is increasing research investigating changes in ToM reasoning in middle childhood and adolescence. However, the psychometric properties of most advanced ToM measures for use with older children and adolescents have not been firmly established. We report on the reliability and validity of widely used, conventional measures of advanced ToM with this age group. Notable issues with both reliability and validity of several of the measures were evident in the findings. With regard to construct validity, results do not reveal a clear empirical commonality between tasks, and, after accounting for comprehension, developmental trends were evident in only one of the tasks investigated. Statement of contribution What is already known on this subject? Second-order false belief tasks have acceptable internal consistency. The Eyes Test has poor internal consistency. Validity of advanced theory-of-mind tasks is often based on the ability to distinguish clinical from typical groups. What does this study add? This study examines internal consistency across six widely used advanced theory-of-mind tasks. It investigates validity of tasks based on comprehension of items by typically developing individuals. It further assesses construct validity, or commonality between tasks. © 2017 The British Psychological Society.

  6. Fundamentals of estuarine physical oceanography

    CERN Document Server

    Bruner de Miranda, Luiz; Kjerfve, Björn; Castro Filho, Belmiro Mendes de

    2017-01-01

    This book provides an introduction to the complex system functions, variability and human interference in ecosystem between the continent and the ocean. It focuses on circulation, transport and mixing of estuarine and coastal water masses, which is ultimately related to an understanding of the hydrographic and hydrodynamic characteristics (salinity, temperature, density and circulation), mixing processes (advection and diffusion), transport timescales such as the residence time and the exposure time. In the area of physical oceanography, experiments using these water bodies as a natural laboratory and interpreting their circulation and mixing processes using theoretical and semi-theoretical knowledge are of fundamental importance. Small-scale physical models may also be used together with analytical and numerical models. The book highlights the fact that research and theory are interactive, and the results provide the fundamentals for the development of the estuarine research.

  7. A pilot study to validate measures of the theory of reasoned action for organ donation behavior.

    Science.gov (United States)

    Wong, Shui Hung; Chow, Amy Yin Man

    2018-04-01

    The present study aimed at taking the first attempt in validating the measures generated based on the theory of reasoned action (TRA). A total of 211 university students participated in the study, 95 were included in the exploratory factor analysis and 116 were included in the confirmatory factor analysis. The TRA measurements were established with adequate psychometric properties, internal consistency, and construct validity. Findings also suggested that attitude toward organ donation has both a cognitive and affective nature, while the subjective norm of the family seems to be important to students' views on organ donation.

  8. Utilizing measure-based feedback in control-mastery theory: A clinical error.

    Science.gov (United States)

    Snyder, John; Aafjes-van Doorn, Katie

    2016-09-01

    Clinical errors and ruptures are an inevitable part of clinical practice. Oftentimes, therapists are unaware that a clinical error or rupture has occurred, leaving no space for repair, and potentially leading to patient dropout and/or less effective treatment. One way to overcome our blind spots is by frequently and systematically collecting measure-based feedback from the patient. Patient feedback measures that focus on the process of psychotherapy such as the Patient's Experience of Attunement and Responsiveness scale (PEAR) can be used in conjunction with treatment outcome measures such as the Outcome Questionnaire 45.2 (OQ-45.2) to monitor the patient's therapeutic experience and progress. The regular use of these types of measures can aid clinicians in the identification of clinical errors and the associated patient deterioration that might otherwise go unnoticed and unaddressed. The current case study describes an instance of clinical error that occurred during the 2-year treatment of a highly traumatized young woman. The clinical error was identified using measure-based feedback and subsequently understood and addressed from the theoretical standpoint of the control-mastery theory of psychotherapy. An alternative hypothetical response is also presented and explained using control-mastery theory. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  9. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    Science.gov (United States)

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
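
    The workflow described above (fit an emulator to expensive model runs, then propagate posterior parameter samples through it) can be sketched schematically; the toy one-parameter "model", the Gaussian posterior and the use of scikit-learn's GaussianProcessRegressor are assumptions made for illustration and are not the Skyrme-functional calculations of the study.

      # Hedged, schematic sketch of emulator-based uncertainty propagation.
      # The toy model and all numbers are illustrative, not the nuclear DFT setup.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(0)

      # Design points: the expensive model evaluated at a few parameter values.
      X_train = np.linspace(-2.0, 2.0, 8).reshape(-1, 1)
      y_train = np.sin(X_train).ravel()        # stand-in for a computed observable

      emulator = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
      emulator.fit(X_train, y_train)

      # Posterior parameter samples (here simply Gaussian, purely illustrative).
      posterior_samples = rng.normal(loc=0.3, scale=0.2, size=(500, 1))

      # Propagate: emulator predictions at the posterior samples give the predictive spread.
      pred = emulator.predict(posterior_samples)
      print("observable: mean %.3f, std %.3f" % (pred.mean(), pred.std()))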

  10. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    Energy Technology Data Exchange (ETDEWEB)

    McDonnell, J. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schunck, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Higdon, D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sarich, J. [Argonne National Lab. (ANL), Argonne, IL (United States); Wild, S. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Nazarewicz, W. [Michigan State Univ., East Lansing, MI (United States); Oak Ridge National Lab., Oak Ridge, TN (United States); Univ. of Warsaw, Warsaw (Poland)

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  11. Act No. 23.592 adopting measures against those who arbitrarily impede the full exercise of fundamental rights and guarantees recognized in the National Constitution, 23 August 1988.

    Science.gov (United States)

    1988-01-01

    This Act provides that whoever arbitrarily impedes, obstructs, restricts, or in any other way limits the full exercise on an equal basis of fundamental rights and guarantees recognized by the National Constitution of Argentina will be obligated, at the request of the injured person, to render the discriminatory act without effect or cease in its realization and repair the moral and material damage that has resulted. To be particularly scrutinized are discriminatory acts or omissions undertaken for sexual reasons, among others.

  12. Fundamental neutron physics

    International Nuclear Information System (INIS)

    Deslattes, R.; Dombeck, T.; Greene, G.; Ramsey, N.; Rauch, H.; Werner, S.

    1984-01-01

    Fundamental physics experiments of merit can be conducted at the proposed intense neutron sources. Areas of interest include: neutron particle properties, neutron wave properties, and fundamental physics utilizing reactor produced γ-rays. Such experiments require intense, full-time utilization of a beam station for periods ranging from several months to a year or more

  13. Overview: Parity Violation and Fundamental Symmetries

    Science.gov (United States)

    Carlini, Roger

    2017-09-01

    The fields of nuclear and particle physics have undertaken extensive programs of research to search for evidence of new phenomena via the precision measurement of observables that are well predicted within the standard model of electroweak interaction. It is already known that the standard model is incomplete as it does not include gravity and dark matter/energy, and is therefore likely the low-energy approximation of a more complex theory. This talk will be an overview of the motivation, experimental methods and status of some of these efforts (past and future) related to precision indirect searches that are complementary to the direct searches underway at the Large Hadron Collider. This abstract is for the invited talk associated with the Mini-symposium titled "Electro-weak Physics and Fundamental Symmetries" organized by Julie Roche.

  14. Current challenges in fundamental physics

    Science.gov (United States)

    Egana Ugrinovic, Daniel

    The discovery of the Higgs boson at the Large Hadron Collider completed the Standard Model of particle physics. The Standard Model is a remarkably successful theory of fundamental physics, but it suffers from severe problems. It does not provide an explanation for the origin or stability of the electroweak scale nor for the origin and structure of flavor and CP violation. It predicts vanishing neutrino masses, in disagreement with experimental observations. It also fails to explain the matter-antimatter asymmetry of the universe, and it does not provide a particle candidate for dark matter. In this thesis we provide experimentally testable solutions for most of these problems and we study their phenomenology.

  15. Macroscopic fundamental strings in cosmology

    Energy Technology Data Exchange (ETDEWEB)

    Aharonov, Y; Englert, F; Orloff, J

    1987-12-24

    We show that, when D ≥ 4, theories of closed strings in D non-compact space-time dimensions exhibit a phase transition. The high-temperature phase is characterized by a condensate of arbitrarily long strings with Hausdorff dimension two (area-filling curves). We suggest that this stringy phase is the ancestor of the adiabatic era. Fundamental strings could then both drive the inflation and seed, in a way reminiscent of the cosmic string mechanism, the large structures in the universe.

  16. Towards integrating control and information theories from information-theoretic measures to control performance limitations

    CERN Document Server

    Fang, Song; Ishii, Hideaki

    2017-01-01

    This book investigates the performance limitation issues in networked feedback systems. The fact that networked feedback systems consist of control and communication devices and systems calls for the integration of control theory and information theory. The primary contributions of this book lie in two aspects: the newly-proposed information-theoretic measures and the newly-discovered control performance limitations. We first propose a number of information notions to facilitate the analysis. Using those notions, classes of performance limitations of networked feedback systems, as well as state estimation systems, are then investigated. In general, the book presents a unique, cohesive treatment of performance limitation issues of networked feedback systems via an information-theoretic approach. This book is believed to be the first to treat the aforementioned subjects systematically and in a unified manner, offering a unique perspective differing from existing books.

  17. Experimental test of proximity effect theories by surface impedance measurements on the Pb-Sn system

    International Nuclear Information System (INIS)

    Hook, J.R.; Battilana, J.A.

    1976-01-01

    The proximity effect in the Pb-Sn system in zero magnetic field has been studied by measuring the surface impedance at 3 GHz of a thin film of tin evaporated on to a bulk lead substrate. The results are compared with the predictions of theories of the proximity effect. It is found that good agreement can be obtained by using a theory due to Hook and Waldram of the spatial variation of the superconducting order parameter Δ inside each metal together with suitable boundary conditions on Δ at the interface between the metals. The required boundary conditions are a generalization to the case of non-zero electron reflection at the interface of the boundary conditions given by Zaitsev for the Ginsburg-Landau equation. (author)

  18. Fundamentals and advanced techniques in derivatives hedging

    CERN Document Server

    Bouchard, Bruno

    2016-01-01

    This book covers the theory of derivatives pricing and hedging as well as techniques used in mathematical finance. The authors use a top-down approach, starting with fundamentals before moving to applications, and present theoretical developments alongside various exercises, providing many examples of practical interest. A large spectrum of concepts and mathematical tools that are usually found in separate monographs are presented here. In addition to the no-arbitrage theory in full generality, this book also explores models and practical hedging and pricing issues. Fundamentals and Advanced Techniques in Derivatives Hedging further introduces advanced methods in probability and analysis, including Malliavin calculus and the theory of viscosity solutions, as well as the recent theory of stochastic targets and its use in risk management, making it the first textbook covering this topic. Graduate students in applied mathematics with an understanding of probability theory and stochastic calculus will find this b...

  19. Fundamental Physics with Antihydrogen

    Science.gov (United States)

    Hangst, J. S.

    Antihydrogen—the antimatter equivalent of the hydrogen atom—is of fundamental interest as a test bed for universal symmetries—such as CPT and the Weak Equivalence Principle for gravitation. Invariance under CPT requires that hydrogen and antihydrogen have the same spectrum. Antimatter is of course intriguing because of the observed baryon asymmetry in the universe—currently unexplained by the Standard Model. At the CERN Antiproton Decelerator (AD) [1], several groups have been working diligently since 1999 to produce, trap, and study the structure and behaviour of the antihydrogen atom. One of the main thrusts of the AD experimental program is to apply precision techniques from atomic physics to the study of antimatter. Such experiments complement the high-energy searches for physics beyond the Standard Model. Antihydrogen is the only atom of antimatter to be produced in the laboratory. This is not so unfortunate, as its matter equivalent, hydrogen, is one of the most well-understood and accurately measured systems in all of physics. It is thus very compelling to undertake experimental examinations of the structure of antihydrogen. As experimental spectroscopy of antihydrogen has yet to begin in earnest, I will give here a brief introduction to some of the ion and atom trap developments necessary for synthesizing and trapping antihydrogen, so that it can be studied.

  20. Psychometric properties of three measures assessing advanced theory of mind: Evidence from people with schizophrenia.

    Science.gov (United States)

    Chen, Kuan-Wei; Lee, Shih-Chieh; Chiang, Hsin-Yu; Syu, Ya-Cing; Yu, Xiao-Xuan; Hsieh, Ching-Lin

    2017-11-01

    Patients with schizophrenia tend to have deficits in advanced Theory of Mind (ToM). The "Reading the mind in the eyes" test (RMET), the Faux Pas Task, and the Strange Stories are commonly used for assessing advanced ToM. However, most of the psychometric properties of these 3 measures in patients with schizophrenia are unknown. The aims of this study were to validate the psychometric properties of the 3 advanced ToM measures in patients with schizophrenia, including: (1) test-retest reliability; (2) random measurement error; (3) practice effect; (4) concurrent validity; and (5) ecological validity. We recruited 53 patients with schizophrenia, who completed the 3 measures twice, 4 weeks apart. The Revised Social Functioning Scale-Taiwan short version (R-SFST) was completed within 3 days of first session of assessments. We found that the intraclass correlation coefficients of the RMET, Strange Stories, and Faux Pas Task were 0.24, 0.5, and 0.76. All 3 advanced ToM measures had large random measurement error, trivial to small practice effects, poor concurrent validity, and low ecological validity. We recommend that the scores of the 3 advanced ToM measures be interpreted with caution because these measures may not provide reliable and valid results on patients' advanced ToM abilities. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Overview of Classical Test Theory and Item Response Theory for Quantitative Assessment of Items in Developing Patient-Reported Outcome Measures

    Science.gov (United States)

    Cappelleri, Joseph C.; Lundy, J. Jason; Hays, Ron D.

    2014-01-01

    Introduction The U.S. Food and Drug Administration’s patient-reported outcome (PRO) guidance document defines content validity as “the extent to which the instrument measures the concept of interest” (FDA, 2009, p. 12). “Construct validity is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity” (Strauss & Smith, 2009, p. 7). Hence both qualitative and quantitative information are essential in evaluating the validity of measures. Methods We review classical test theory and item response theory approaches to evaluating PRO measures including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized “difficulty” (severity) order of items is represented by observed responses. Conclusion Classical test theory and item response theory can be useful in providing a quantitative assessment of items and scales during the content validity phase of patient-reported outcome measures. Depending on the particular type of measure and the specific circumstances, either one or both approaches should be considered to help maximize the content validity of PRO measures. PMID:24811753

  2. Overview of classical test theory and item response theory for the quantitative assessment of items in developing patient-reported outcomes measures.

    Science.gov (United States)

    Cappelleri, Joseph C; Jason Lundy, J; Hays, Ron D

    2014-05-01

    The US Food and Drug Administration's guidance for industry document on patient-reported outcomes (PRO) defines content validity as "the extent to which the instrument measures the concept of interest" (FDA, 2009, p. 12). According to Strauss and Smith (2009), construct validity "is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity" (p. 7). Hence, both qualitative and quantitative information are essential in evaluating the validity of measures. We review classical test theory and item response theory (IRT) approaches to evaluating PRO measures, including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized "difficulty" (severity) order of items is represented by observed responses. If a researcher has few qualitative data and wants to get preliminary information about the content validity of the instrument, then descriptive assessments using classical test theory should be the first step. As the sample size grows during subsequent stages of instrument development, confidence in the numerical estimates from Rasch and other IRT models (as well as those of classical test theory) would also grow. Classical test theory and IRT can be useful in providing a quantitative assessment of items and scales during the content-validity phase of PRO-measure development. Depending on the particular type of measure and the specific circumstances, the classical test theory and/or the IRT should be considered to help maximize the content validity of PRO measures. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.

  3. Multiphase flow dynamics 1 fundamentals

    CERN Document Server

    Kolev, Nikolay Ivanov

    2004-01-01

    Multi-phase flows are part of our natural environment such as tornadoes, typhoons, air and water pollution and volcanic activities as well as part of industrial technology such as power plants, combustion engines, propulsion systems, or chemical and biological industry. The industrial use of multi-phase systems requires analytical and numerical strategies for predicting their behavior. In its third extended edition this monograph contains theory, methods and practical experience for describing complex transient multi-phase processes in arbitrary geometrical configurations, providing a systematic presentation of the theory and practice of numerical multi-phase fluid dynamics. In the present first volume the fundamentals of multiphase dynamics are provided. This third edition includes various updates, extensions and improvements in all book chapters.

  4. SU (2) with fundamental fermions and scalars

    DEFF Research Database (Denmark)

    Hansen, Martin; Janowski, Tadeusz; Pica, Claudio

    2018-01-01

    We present preliminary results on the lattice simulation of an SU(2) gauge theory with two fermion flavors and one strongly interacting scalar field, all in the fundamental representation of SU(2). The motivation for this study comes from the recent proposal of "fundamental" partial compositeness...... the properties of light meson resonances previously obtained for the SU(2) model. Preprint: CP3-Origins-2017-047 DNRF90...

  5. Fundamentals of electronics

    CERN Document Server

    Schubert, Thomas F

    2015-01-01

    This book, Electronic Devices and Circuit Application, is the first of four books of a larger work, Fundamentals of Electronics. It is comprised of four chapters describing the basic operation of each of the four fundamental building blocks of modern electronics: operational amplifiers, semiconductor diodes, bipolar junction transistors, and field effect transistors. Attention is focused on the reader obtaining a clear understanding of each of the devices when it is operated in equilibrium. Ideas fundamental to the study of electronic circuits are also developed in the book at a basic level to

  6. Fundamental principles, measurement techniques and data analysis in a ion accelerator; Principios fundamentales, tecnicas de medicion y analisis de datos en un acelerador de iones

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez M, O. [Facultad de Ciencias, UNAM, Ciudad Universitaria, 04510 Mexico D. F. (Mexico); Gleason, C. [Facultad de Ciencias, Universidad Autonoma del Estado de Morelos, Cuernavaca, Morelos (Mexico); Hinojosa, G. [Instituto de Ciencias Fisicas, UNAM, Ciudad Universitaria, 04510 Mexico D. F. (Mexico)]. e-mail: hinojosa@fis.unam.mx

    2008-07-01

    The present work is intended to be a general reference for students and professionals interested in the field. Here, we present an introduction to the analysis techniques and fundamental principles for data processing and operation of a typical ion accelerator that operates in the low energy range. We also present a detailed description of the apparatus and propose new analysis methods for the results. In addition, we introduce illustrative simulations of the ion's trajectories in the different components of the apparatus performed with specialized software and, a new computer data acquisition and control interface. (Author)

  7. Maximum Entropy Fundamentals

    Directory of Open Access Journals (Sweden)

    F. Topsøe

    2001-09-01

    Full Text Available Abstract: In its modern formulation, the Maximum Entropy Principle was promoted by E.T. Jaynes, starting in the mid-fifties. The principle dictates that one should look for a distribution, consistent with available information, which maximizes the entropy. However, this principle focuses only on distributions and it appears advantageous to bring information theoretical thinking more prominently into play by also focusing on the "observer" and on coding. This view was brought forward by the second named author in the late seventies and is the view we will follow up on here. It leads to the consideration of a certain game, the Code Length Game and, via standard game theoretical thinking, to a principle of Game Theoretical Equilibrium. This principle is more basic than the Maximum Entropy Principle in the sense that the search for one type of optimal strategies in the Code Length Game translates directly into the search for distributions with maximum entropy. In the present paper we offer a self-contained and comprehensive treatment of fundamentals of both principles mentioned, based on a study of the Code Length Game. Though new concepts and results are presented, the reading should be instructional and accessible to a rather wide audience, at least if certain mathematical details are left aside at a first reading. The most frequently studied instance of entropy maximization pertains to the Mean Energy Model which involves a moment constraint related to a given function, here taken to represent "energy". This type of application is very well known from the literature with hundreds of applications pertaining to several different fields and will also here serve as important illustration of the theory. But our approach reaches further, especially regarding the study of continuity properties of the entropy function, and this leads to new results which allow a discussion of models with so-called entropy loss. These results have tempted us to speculate over
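
    A small numerical sketch of the Mean Energy Model mentioned above: among all distributions over a finite set of "energy" levels with a prescribed mean energy, the entropy-maximizing distribution has the Gibbs form p_i ∝ exp(-β E_i), with β fixed by the moment constraint. The energy levels and target mean below are illustrative assumptions.

      # Hedged sketch: maximum-entropy distribution under a mean-energy constraint.
      # Energy levels and the target mean are hypothetical.
      import numpy as np
      from scipy.optimize import brentq

      E = np.array([0.0, 1.0, 2.0, 3.0])   # energy levels (hypothetical)
      target_mean = 1.2                    # prescribed mean energy (hypothetical)

      def gibbs(beta):
          w = np.exp(-beta * E)
          return w / w.sum()

      def mean_energy_gap(beta):
          return gibbs(beta) @ E - target_mean

      beta_star = brentq(mean_energy_gap, -50.0, 50.0)   # solve the moment constraint
      p = gibbs(beta_star)
      entropy = float(-(p * np.log(p)).sum())
      print("beta =", round(beta_star, 4), "p =", np.round(p, 4), "H =", round(entropy, 4))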

  8. Einstein Gravity Explorer–a medium-class fundamental physics mission

    NARCIS (Netherlands)

    Schiller, S.; Tino, G.M.; Gill, E.

    2008-01-01

    The Einstein Gravity Explorer mission (EGE) is devoted to a precise measurement of the properties of space-time using atomic clocks. It tests one of the most fundamental predictions of Einstein’s Theory of General Relativity, the gravitational redshift, and thereby searches for hints of quantum
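
    For orientation only (a standard first-order result, stated here for context rather than taken from the mission paper): the gravitational redshift tested by such clock comparisons is a fractional frequency shift proportional to the gravitational potential difference between the clocks,

      Δν/ν ≈ ΔU/c²  (≈ g·h/c² ≈ 1.1 × 10⁻¹⁶ per metre of height difference near the Earth's surface).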

  9. Mass anomalous dimension and running of the coupling in SU(2) with six fundamental fermions

    DEFF Research Database (Denmark)

    Bursa, Francis; Del Debbio, Luigi; Keegan, Liam

    2010-01-01

    We simulate SU(2) gauge theory with six massless fundamental Dirac fermions. By using the Schrödinger Functional method we measure the running of the coupling and the fermion mass over a wide range of length scales. We observe very slow running of the coupling and construct an estimator for the

  10. A new measure of skill mismatch: theory and evidence from PIAAC

    Directory of Open Access Journals (Sweden)

    Michele Pellizzari

    2017-01-01

    Full Text Available Abstract This paper proposes a new measure of skill mismatch to be applied to the recent OECD Survey of Adult Skills (PIAAC). The measure is derived from a formal theory and combines information about skill proficiency, self-reported mismatch and skill use. The theoretical foundations underlying this measure allow identifying minimum and maximum skill requirements for each occupation and classifying workers into three groups: the well-matched, the under-skilled and the over-skilled. The availability of skill use data further permits the computation of the degree of under- and over-usage of skills in the economy. The empirical analysis is carried out using the first round of the PIAAC data, allowing comparisons across skill domains, labour market statuses and countries.
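
    A minimal sketch of the classification step described above: given the minimum and maximum skill requirements identified for an occupation, a worker's proficiency score is labelled well-matched, under-skilled or over-skilled. The thresholds and scores below are illustrative, not values derived from PIAAC.

      # Hedged sketch of the three-way mismatch classification; all numbers are hypothetical.

      def classify_worker(score, min_required, max_required):
          if score < min_required:
              return "under-skilled"
          if score > max_required:
              return "over-skilled"
          return "well-matched"

      # Hypothetical occupation-level requirements on a PIAAC-like 0-500 proficiency scale.
      print(classify_worker(310, min_required=250, max_required=330))  # well-matched
      print(classify_worker(210, min_required=250, max_required=330))  # under-skilled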

  11. Measuring organizational effectiveness in information and communication technology companies using item response theory.

    Science.gov (United States)

    Trierweiller, Andréa Cristina; Peixe, Blênio César Severo; Tezza, Rafael; Pereira, Vera Lúcia Duarte do Valle; Pacheco, Waldemar; Bornia, Antonio Cezar; de Andrade, Dalton Francisco

    2012-01-01

    The aim of this paper is to measure the effectiveness of Information and Communication Technology (ICT) organizations from the point of view of the manager, using Item Response Theory (IRT). There is a need to verify the effectiveness of these organizations, which are normally associated with complex, dynamic, and competitive environments. In the academic literature, there is disagreement surrounding the concept of organizational effectiveness and its measurement. A construct was elaborated based on dimensions of effectiveness to guide the construction of the items of the questionnaire, which was submitted to specialists for evaluation. The approach proved viable for measuring the organizational effectiveness of ICT companies from the manager's point of view using the Two-Parameter Logistic Model (2PLM) of IRT. This modeling permits the quality and properties of each item to be evaluated and places items and respondents on a single scale, which is not possible when using other similar tools.

  12. Psychometric properties of the Triarchic Psychopathy Measure: An item response theory approach.

    Science.gov (United States)

    Shou, Yiyun; Sellbom, Martin; Xu, Jing

    2018-05-01

    There is cumulative evidence for the cross-cultural validity of the Triarchic Psychopathy Measure (TriPM; Patrick, 2010) among non-Western populations. Recent studies using correlational and regression analyses show promising construct validity of the TriPM in Chinese samples. However, little is known about the efficiency of items in TriPM in assessing the proposed latent traits. The current study evaluated the psychometric properties of the Chinese TriPM at the item level using item response theory analyses. It also examined the measurement invariance of the TriPM between the Chinese and the U.S. student samples by applying differential item functioning analyses under the item response theory framework. The results supported the unidimensional nature of the Disinhibition and Meanness scales. Both scales had a greater level of precision in the respective underlying constructs at the positive ends. The two scales, however, had several items that were weakly associated with their respective latent traits in the Chinese student sample. Boldness, on the other hand, was found to be multidimensional, and reflected a more normally distributed range of variation. The examination of measurement bias via differential item functioning analyses revealed that a number of items of the TriPM were not equivalent across the Chinese and the U.S. Some modification and adaptation of items might be considered for improving the precision of the TriPM for Chinese participants. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  13. Formulation of uncertainty relation of error and disturbance in quantum measurement by using quantum estimation theory

    International Nuclear Information System (INIS)

    Yu Watanabe; Masahito Ueda

    2012-01-01

    Full text: When we try to obtain information about a quantum system, we need to perform a measurement on the system. The measurement process causes unavoidable state change. Heisenberg discussed a thought experiment of the position measurement of a particle by using a gamma-ray microscope, and found a trade-off relation between the error of the measured position and the disturbance in the momentum caused by the measurement process. The trade-off relation epitomizes the complementarity in quantum measurements: we cannot perform a measurement of an observable without causing disturbance in its canonically conjugate observable. However, at the time Heisenberg found the complementarity, quantum measurement theory was not established yet, and Kennard and Robertson's inequality was erroneously interpreted as a mathematical formulation of the complementarity. Kennard and Robertson's inequality actually implies the indeterminacy of the quantum state: non-commuting observables cannot have definite values simultaneously. However, Kennard and Robertson's inequality reflects the inherent nature of a quantum state alone, and does not concern any trade-off relation between the error and disturbance in the measurement process. In this talk, we report a resolution to the complementarity in quantum measurements. First, we find that it is necessary to involve the estimation process from the outcome of the measurement for quantifying the error and disturbance in the quantum measurement. We clarify the implicitly involved estimation process in Heisenberg's gamma-ray microscope and other measurement schemes, and formulate the error and disturbance for an arbitrary quantum measurement by using quantum estimation theory. The error and disturbance are defined in terms of the Fisher information, which gives the upper bound of the accuracy of the estimation. Second, we obtain uncertainty relations between the measurement errors of two observables [1], and between the error and disturbance in the
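
    As background for the Fisher-information statement at the end of the abstract, the classical Cramér-Rao bound can be written as follows; this is standard estimation-theory notation, not the talk's specific definitions of error and disturbance.

```latex
% Cramér-Rao bound: any unbiased estimator \hat{\theta} of a parameter \theta
% satisfies Var(\hat{\theta}) >= 1/F(\theta), where F is the Fisher information
% of the outcome distribution p(x|\theta).
\operatorname{Var}(\hat{\theta}) \;\ge\; \frac{1}{F(\theta)},
\qquad
F(\theta) \;=\; \int p(x\mid\theta)\left(\frac{\partial \ln p(x\mid\theta)}{\partial \theta}\right)^{2} dx .
```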

  14. Fundamentals of electrochemical science

    CERN Document Server

    Oldham, Keith

    1993-01-01

    Key Features* Deals comprehensively with the basic science of electrochemistry* Treats electrochemistry as a discipline in its own right and not as a branch of physical or analytical chemistry* Provides a thorough and quantitative description of electrochemical fundamentals

  15. Fundamentals of ion exchange

    International Nuclear Information System (INIS)

    Townsend, R.P.

    1993-01-01

    In this paper the fundamentals of ion exchange mechanisms and their thermodynamics are described. A range of ion exchange materials is considered and problems of communication and technology transfer between scientists working in the field are discussed. (UK)

  16. Linear algebraic theory of partial coherence: discrete fields and measures of partial coherence.

    Science.gov (United States)

    Ozaktas, Haldun M; Yüksel, Serdar; Kutay, M Alper

    2002-08-01

    A linear algebraic theory of partial coherence is presented that allows precise mathematical definitions of concepts such as coherence and incoherence. This not only provides new perspectives and insights but also allows us to employ the conceptual and algebraic tools of linear algebra in applications. We define several scalar measures of the degree of partial coherence of an optical field that are zero for full incoherence and unity for full coherence. The mathematical definitions are related to our physical understanding of the corresponding concepts by considering them in the context of Young's experiment.
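
    As an illustration of the kind of scalar measure described, the sketch below computes a normalized purity of the mutual intensity (coherence) matrix of a discrete field; this particular normalization is an assumption made for illustration and is not necessarily one of the exact measures defined in the paper.

```python
import numpy as np

def degree_of_coherence(J):
    """Illustrative scalar coherence measure for a discrete field.

    J : Hermitian, positive semidefinite mutual intensity matrix (N x N).
    Returns a value rescaled to [0, 1]: 0 for a fully incoherent field
    (all eigenvalues equal) and 1 for a fully coherent one (rank 1).
    Note: an assumed example, not the paper's exact definition.
    """
    N = J.shape[0]
    purity = np.trace(J @ J).real / np.trace(J).real ** 2  # lies in [1/N, 1]
    return (N * purity - 1.0) / (N - 1.0)

# Fully coherent field: rank-one J -> measure close to 1
v = np.ones((3, 1)) / np.sqrt(3)
print(degree_of_coherence(v @ v.conj().T))   # ~1.0
# Fully incoherent field: J proportional to the identity -> measure 0
print(degree_of_coherence(np.eye(3) / 3))    # 0.0
```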

  17. Land Prices and Fundamentals

    OpenAIRE

    Koji Nakamura; Yumi Saita

    2007-01-01

    This paper examines the long-term relationship between macroeconomic fundamentals and the weighted-average land price indicators, which are supposed to be more appropriate than the official land price indicators when analyzing their impacts on the macro economy. In many cases, we find cointegrating relationships between the weighted-average land price indicators and the discounted present value of land calculated based on the macroeconomic fundamentals indicators. We also find that the ...

  18. Information security fundamentals

    CERN Document Server

    Peltier, Thomas R

    2013-01-01

    Developing an information security program that adheres to the principle of security as a business enabler must be the first step in an enterprise's effort to build an effective security program. Following in the footsteps of its bestselling predecessor, Information Security Fundamentals, Second Edition provides information security professionals with a clear understanding of the fundamentals of security required to address the range of issues they will experience in the field. The book examines the elements of computer security, employee roles and r

  19. Religious fundamentalism and conflict

    OpenAIRE

    Muzaffer Ercan Yılmaz

    2006-01-01

    This study provides an analytical discussion for the issue of religious fundamentalism and its relevance to conflict, in its broader sense. It is stressed that religious fundamentalism manifests itself in two ways: nonviolent intolerance and violent intolerance. The sources of both types of intolerance and their connection to conflict are addressed and discussed in detail. Further research is also suggested on conditions connecting religion to nonviolent intolerance so as to cope with the problem...

  20. arXiv Minimal Fundamental Partial Compositeness

    CERN Document Server

    Cacciapaglia, Giacomo; Sannino, Francesco; Thomsen, Anders Eller

    Building upon the fundamental partial compositeness framework we provide consistent and complete composite extensions of the standard model. These are used to determine the effective operators emerging at the electroweak scale in terms of the standard model fields upon consistently integrating out the heavy composite dynamics. We exhibit the first effective field theories matching these complete composite theories of flavour and analyse their physical consequences for the third generation quarks. Relations with other approaches, ranging from effective analyses for partial compositeness to extra dimensions as well as purely fermionic extensions, are briefly discussed. Our methodology is applicable to any composite theory of dynamical electroweak symmetry breaking featuring a complete theory of flavour.

  1. Fundamentalism and science

    Directory of Open Access Journals (Sweden)

    Massimo Pigliucci

    2006-06-01

    Full Text Available The many facets of fundamentalism. There has been much talk about fundamentalism of late. While most people's thoughts on the topic go to the 9/11 attacks against the United States, or to the ongoing war in Iraq, fundamentalism is affecting science and its relationship to society in a way that may have dire long-term consequences. Of course, religious fundamentalism has always had a history of antagonism with science, and – before the birth of modern science – with philosophy, the age-old vehicle of the human attempt to exercise critical thinking and rationality to solve problems and pursue knowledge. “Fundamentalism” is defined by the Oxford Dictionary of the Social Sciences as “A movement that asserts the primacy of religious values in social and political life and calls for a return to a 'fundamental' or pure form of religion.” In its broadest sense, however, fundamentalism is a form of ideological intransigence which is not limited to religion, but includes political positions as well (for example, in the case of some extreme forms of “environmentalism”).

  2. A laboratory scale fundamental time?

    International Nuclear Information System (INIS)

    Mendes, R.V.

    2012-01-01

    The existence of a fundamental time (or fundamental length) has been conjectured in many contexts. However, the ''stability of physical theories principle'' seems to be the one that provides, through the tools of algebraic deformation theory, an unambiguous derivation of the stable structures that Nature might have chosen for its algebraic framework. It is well-known that c and ℎ are the deformation parameters that stabilize the Galilean and the Poisson algebra. When the stability principle is applied to the Poincare-Heisenberg algebra, two deformation parameters emerge which define two time (or length) scales. In addition there are, for each of them, a plus or minus sign possibility in the relevant commutators. One of the deformation length scales, related to non-commutativity of momenta, is probably related to the Planck length scale but the other might be much larger and already detectable in laboratory experiments. In this paper, this is used as a working hypothesis to look for physical effects that might settle this question. Phase-space modifications, resonances, interference, electron spin resonance and non-commutative QED are considered. (orig.)

  3. Quantum dissipation theory and applications to quantum transport and quantum measurement in mesoscopic systems

    Science.gov (United States)

    Cui, Ping

    The thesis comprises two major themes of quantum statistical dynamics. One is the development of quantum dissipation theory (QDT). It covers the establishment of some basic relations of quantum statistical dynamics, the construction of several nonequivalent complete second-order formulations, and the development of exact QDT. Another is related to the applications of quantum statistical dynamics to a variety of research fields. In particular, unconventional but novel theories of the electron transfer in Debye solvents, quantum transport, and quantum measurement are developed on the basis of QDT formulations. The thesis is organized as follows. In Chapter 1, we present some background knowledge in relation to the aforementioned two themes of this thesis. The key quantity in QDT is the reduced density operator ρ(t) ≡ tr_B ρ_T(t); i.e., the partial trace of the total system-and-bath composite ρ_T(t) over the bath degrees of freedom. QDT governs the evolution of the reduced density operator, where the effects of the bath are treated in a quantum statistical manner. In principle, the reduced density operator contains all dynamical information of interest. However, the conventional quantum transport theory is formulated in terms of nonequilibrium Green's functions. The newly emerging field of quantum measurement in relation to quantum information and quantum computing does exploit a sort of QDT formalism. Besides the background of the relevant theoretical development, some representative experiments on molecular nanojunctions are also briefly discussed. In Chapter 2, we outline some basic (including new) relations that highlight several important issues on QDT. The content includes the background of nonequilibrium quantum statistical mechanics, the general description of the total composite Hamiltonian with stochastic system-bath interaction, a novel parameterization scheme for bath correlation functions, a newly developed exact theory of driven Brownian oscillator (DBO

  4. Triangles in ROC space: History and theory of "nonparametric" measures of sensitivity and response bias.

    Science.gov (United States)

    Macmillan, N A; Creelman, C D

    1996-06-01

    Can accuracy and response bias in two-stimulus, two-response recognition or detection experiments be measured nonparametrically? Pollack and Norman (1964) answered this question affirmatively for sensitivity, Hodos (1970) for bias: Both proposed measures based on triangular areas in receiver-operating characteristic space. Their papers, and especially a paper by Grier (1971) that provided computing formulas for the measures, continue to be heavily cited in a wide range of content areas. In our sample of articles, most authors described triangle-based measures as making fewer assumptions than measures associated with detection theory. However, we show that statistics based on products or ratios of right triangle areas, including a recently proposed bias index and a not-yet-proposed but apparently plausible sensitivity index, are consistent with a decision process based on logistic distributions. Even the Pollack and Norman measure, which is based on non-right triangles, is approximately logistic for low values of sensitivity. Simple geometric models for sensitivity and bias are not nonparametric, even if their implications are not acknowledged in the defining publications.
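
    For reference, the widely cited computing formulas for the triangle-based indices discussed here are sketched below (Pollack and Norman's A' and Grier's B''); the function names are ours, and the formulas assume a hit rate H at least as large as the false-alarm rate F.

```python
def a_prime(H, F):
    """Pollack & Norman's triangle-based sensitivity index (for H >= F)."""
    if H == F:
        return 0.5
    return 0.5 + ((H - F) * (1 + H - F)) / (4 * H * (1 - F))

def b_double_prime(H, F):
    """Grier's triangle-based bias index (for H >= F)."""
    num = H * (1 - H) - F * (1 - F)
    den = H * (1 - H) + F * (1 - F)
    return num / den if den != 0 else 0.0

# Example: hit rate 0.8, false-alarm rate 0.2
print(a_prime(0.8, 0.2), b_double_prime(0.8, 0.2))
```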

  5. The aerodynamic cost of flight in bats--comparing theory with measurement

    Science.gov (United States)

    von Busse, Rhea; Waldman, Rye M.; Swartz, Sharon M.; Breuer, Kenneth S.

    2012-11-01

    Aerodynamic theory has long been used to predict the aerodynamic power required for animal flight. However, even though the actuator disk model does not account for the flapping motion of a wing, it is used for lack of any better model. The question remains: how close are these predictions to reality? We designed a study to compare predicted aerodynamic power to measured power from the kinetic energy contained in the wake shed behind a bat flying in a wind tunnel. A high-accuracy displaced light-sheet stereo PIV system was used in the Trefftz plane to capture the wake behind four bats flown over a range of flight speeds (1-6 m/s). The total power in the wake was computed from the wake vorticity and these estimates were compared with the power predicted using Pennycuick's model for bird flight as well as estimates derived from measurements of the metabolic cost of flight, previously acquired from the same individuals.
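
    As context for the comparison, the momentum-theory (actuator-disc) estimate of induced power in forward flight that Pennycuick-style models build on has roughly the following form; the symbols and the induced-power factor k are stated as general assumptions, not values used in the study.

```latex
% Actuator-disc induced power in steady forward flight (momentum-theory estimate):
% W - body weight (N), V - flight speed (m/s), \rho - air density (kg/m^3),
% S_d = \pi b^2 / 4 - disc area swept by wings of span b, k - induced power factor.
P_{\mathrm{ind}} \;\approx\; \frac{k\,W^{2}}{2\,\rho\,V\,S_{d}} .
```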

  6. Measuring theory of mind in children. Psychometric properties of the ToM Storybooks.

    Science.gov (United States)

    Blijd-Hoogewys, E M A; van Geert, P L C; Serra, M; Minderaa, R B

    2008-11-01

    Although research on Theory-of-Mind (ToM) is often based on single task measurements, more comprehensive instruments result in a better understanding of ToM development. The ToM Storybooks is a new instrument measuring basic ToM-functioning and associated aspects. There are 34 tasks, tapping various emotions, beliefs, desires and mental-physical distinctions. Four studies on the validity and reliability of the test are presented, in typically developing children (n = 324, 3-12 years) and children with PDD-NOS (n = 30). The ToM Storybooks have good psychometric qualities. A component analysis reveals five components corresponding with the underlying theoretical constructs. The internal consistency, test-retest reliability, inter-rater reliability, construct validity and convergent validity are good. The ToM Storybooks can be used in research as well as in clinical settings.

  7. Is PT -symmetric quantum theory false as a fundamental theory?

    Czech Academy of Sciences Publication Activity Database

    Znojil, Miloslav

    2016-01-01

    Roč. 56, č. 3 (2016), s. 254-257 ISSN 1210-2709 R&D Projects: GA ČR GA16-22945S Institutional support: RVO:61389005 Keywords: quantum mechanics * PT-symmetric representations of observables * measurement outcomes Subject RIV: BE - Theoretical Physics

  8. The Influence of Preprocessing Steps on Graph Theory Measures Derived from Resting State fMRI.

    Science.gov (United States)

    Gargouri, Fatma; Kallel, Fathi; Delphine, Sebastien; Ben Hamida, Ahmed; Lehéricy, Stéphane; Valabregue, Romain

    2018-01-01

    Resting state functional MRI (rs-fMRI) is an imaging technique that allows the spontaneous activity of the brain to be measured. Measures of functional connectivity highly depend on the quality of the BOLD signal data processing. In this study, our aim was to study the influence of preprocessing steps and their order of application on small-world topology and their efficiency in resting state fMRI data analysis using graph theory. We applied the most standard preprocessing steps: slice-timing, realign, smoothing, filtering, and the tCompCor method. In particular, we were interested in how preprocessing can retain the small-world economic properties and how to maximize the local and global efficiency of a network while minimizing the cost. Tests that we conducted in 54 healthy subjects showed that the choice and ordering of preprocessing steps impacted the graph measures. We found that the csr (where we applied realignment, smoothing, and tCompCor as a final step) and the scr (where we applied realignment, tCompCor and smoothing as a final step) strategies had the highest mean values of global efficiency (eg) . Furthermore, we found that the fscr strategy (where we applied realignment, tCompCor, smoothing, and filtering as a final step), had the highest mean local efficiency (el) values. These results confirm that the graph theory measures of functional connectivity depend on the ordering of the processing steps, with the best results being obtained using smoothing and tCompCor as the final steps for global efficiency with additional filtering for local efficiency.
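
    The global and local efficiency measures described here are available in standard graph libraries; the sketch below builds a binary graph from a thresholded connectivity matrix and computes both, with the threshold and the toy matrix being illustrative assumptions rather than the authors' pipeline.

```python
import numpy as np
import networkx as nx

def efficiency_from_connectivity(corr, threshold=0.3):
    """Build an undirected graph from a correlation matrix by thresholding and
    return its global and local efficiency, as used in small-world analyses of
    rs-fMRI networks. The threshold choice is illustrative."""
    adj = (np.abs(corr) > threshold).astype(int)
    np.fill_diagonal(adj, 0)                 # no self-loops
    G = nx.from_numpy_array(adj)
    return nx.global_efficiency(G), nx.local_efficiency(G)

# Toy example with a random symmetric "connectivity" matrix
rng = np.random.default_rng(0)
m = rng.uniform(-1, 1, size=(10, 10))
corr = (m + m.T) / 2
eg, el = efficiency_from_connectivity(corr)
print(f"global efficiency = {eg:.3f}, local efficiency = {el:.3f}")
```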

  9. Using the theory of planned behaviour to measure motivation for recovery in anorexia nervosa.

    Science.gov (United States)

    Dawson, Lisa; Mullan, Barbara; Sainsbury, Kirby

    2015-01-01

    Anorexia nervosa (AN) is a difficult to treat mental illness associated with low motivation for change. Despite criticisms of the transtheoretical stages of change model, both generally and in the eating disorders (EDs), this remains the only model to have been applied to the understanding of motivation to recover from AN. The aim of this pilot study was to determine whether the theory of planned behaviour (TPB) would provide a good fit for understanding and predicting motivation to recover from AN. Two studies were conducted - in the first study eight women who had recovered from chronic AN were interviewed about their experiences of recovery. The interview data were subsequently used to inform the development of a purpose-designed questionnaire to measure the components of the TPB in relation to recovery. In the second study, the resultant measure was administered to 67 females with a current diagnosis of AN, along with measures of eating disorder psychopathology, psychological symptoms, and an existing measure of motivation to recover (based on the transtheoretical model). Data from the interview study confirmed that the TPB is an appropriate model for understanding the factors that influence motivation to recover from AN. The results of the questionnaire study indicated that the pre-intention variables of the TPB accounted for large proportions of variance in the intention to recover (72%), and more specifically the intention to eat normally and gain weight (51%). Perceived behavioural control was the strongest predictor of intention to recover, while attitudes were more important in the prediction of the intention to eat normally/gain weight. The positive results suggest that the TPB is an appropriate model for understanding and predicting motivation in AN. Implications for theory and practice are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. The Influence of Preprocessing Steps on Graph Theory Measures Derived from Resting State fMRI

    Directory of Open Access Journals (Sweden)

    Fatma Gargouri

    2018-02-01

    Full Text Available Resting state functional MRI (rs-fMRI) is an imaging technique that allows the spontaneous activity of the brain to be measured. Measures of functional connectivity highly depend on the quality of the BOLD signal data processing. In this study, our aim was to study the influence of preprocessing steps and their order of application on small-world topology and their efficiency in resting state fMRI data analysis using graph theory. We applied the most standard preprocessing steps: slice-timing, realign, smoothing, filtering, and the tCompCor method. In particular, we were interested in how preprocessing can retain the small-world economic properties and how to maximize the local and global efficiency of a network while minimizing the cost. Tests that we conducted in 54 healthy subjects showed that the choice and ordering of preprocessing steps impacted the graph measures. We found that the csr (where we applied realignment, smoothing, and tCompCor as a final step) and the scr (where we applied realignment, tCompCor and smoothing as a final step) strategies had the highest mean values of global efficiency (eg). Furthermore, we found that the fscr strategy (where we applied realignment, tCompCor, smoothing, and filtering as a final step), had the highest mean local efficiency (el) values. These results confirm that the graph theory measures of functional connectivity depend on the ordering of the processing steps, with the best results being obtained using smoothing and tCompCor as the final steps for global efficiency with additional filtering for local efficiency.

  11. The Influence of Preprocessing Steps on Graph Theory Measures Derived from Resting State fMRI

    Science.gov (United States)

    Gargouri, Fatma; Kallel, Fathi; Delphine, Sebastien; Ben Hamida, Ahmed; Lehéricy, Stéphane; Valabregue, Romain

    2018-01-01

    Resting state functional MRI (rs-fMRI) is an imaging technique that allows the spontaneous activity of the brain to be measured. Measures of functional connectivity highly depend on the quality of the BOLD signal data processing. In this study, our aim was to study the influence of preprocessing steps and their order of application on small-world topology and their efficiency in resting state fMRI data analysis using graph theory. We applied the most standard preprocessing steps: slice-timing, realign, smoothing, filtering, and the tCompCor method. In particular, we were interested in how preprocessing can retain the small-world economic properties and how to maximize the local and global efficiency of a network while minimizing the cost. Tests that we conducted in 54 healthy subjects showed that the choice and ordering of preprocessing steps impacted the graph measures. We found that the csr (where we applied realignment, smoothing, and tCompCor as a final step) and the scr (where we applied realignment, tCompCor and smoothing as a final step) strategies had the highest mean values of global efficiency (eg). Furthermore, we found that the fscr strategy (where we applied realignment, tCompCor, smoothing, and filtering as a final step), had the highest mean local efficiency (el) values. These results confirm that the graph theory measures of functional connectivity depend on the ordering of the processing steps, with the best results being obtained using smoothing and tCompCor as the final steps for global efficiency with additional filtering for local efficiency. PMID:29497372

  12. DOE Fundamentals Handbook: Classical Physics

    International Nuclear Information System (INIS)

    1992-06-01

    The Classical Physics Fundamentals Handbook was developed to assist nuclear facility operating contractors provide operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of physical forces and their properties. The handbook includes information on the units used to measure physical properties; vectors, and how they are used to show the net effect of various forces; Newton's Laws of motion, and how to use these laws in force and motion applications; and the concepts of energy, work, and power, and how to measure and calculate the energy involved in various applications. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility systems and equipment

  13. Development of the effectiveness measure for an advanced alarm system using signal detection theory

    International Nuclear Information System (INIS)

    Park, J.K.; Choi, S.S.; Hong, J.H.; Chang, S.H.

    1997-01-01

    Since the many alarms activated during major process deviations or accidents in nuclear power plants can have negative effects on operators, various types of advanced alarm systems that select the important alarms needed to identify a process deviation have been developed to reduce the operator's workload. However, an irrelevant selection of important alarms could distract the operator from correctly identifying the process deviation. Therefore, to evaluate the effectiveness of an advanced alarm system, the tradeoff between the alarm reduction rate (how many alarms are reduced?) and informativeness (how many important alarms that are conducive to identifying the process deviation are provided?) should be considered. In this paper, a new measure is proposed to evaluate the effectiveness of an advanced alarm system with regard to the identification of process deviations. The effectiveness measure combines the informativeness measure with the reduction rate; the informativeness measure captures the information processing capability of the advanced alarm system, including wrong rejection and wrong acceptance, and can be calculated using signal detection theory (SDT). The effectiveness of a prototype alarm system was evaluated using a loss of coolant accident (LOCA) scenario, and the validity of the effectiveness measure was examined using two types of operator response: identification accuracy and the operator's preference for the identification of LOCA.
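
    A minimal illustration of the signal detection quantity involved is sketched below, treating presented important alarms as hits and presented unimportant alarms as false alarms; the d' index is shown only as the standard SDT sensitivity measure, since the abstract does not specify how informativeness and reduction rate are combined.

```python
from scipy.stats import norm

def d_prime(hit_rate, false_alarm_rate):
    """Signal detection theory sensitivity index d' = z(H) - z(F)."""
    return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

# Example: 90% of truly important alarms are presented (hits),
# while unimportant alarms are presented 15% of the time (false alarms).
print(round(d_prime(0.90, 0.15), 2))
```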

  14. Different Variants of Fundamental Portfolio

    Directory of Open Access Journals (Sweden)

    Tarczyński Waldemar

    2014-06-01

    Full Text Available The paper proposes a fundamental portfolio of securities. This portfolio, which combines fundamental analysis with portfolio analysis, is an alternative to the classic Markowitz model. The method's main idea is based on the use of the TMAI synthetic measure and, in the limiting conditions, the use of risk and the portfolio's rate of return in the objective function. Different variants of the fundamental portfolio have been considered in an empirical study. The effectiveness of the proposed solutions has been related to the classic portfolio constructed with the help of the Markowitz model and the WIG20 market index's rate of return. All portfolios were constructed with data on rates of return for 2005. Their effectiveness in 2006-2013 was then evaluated. The studied period comprises the end of the bull market, the 2007-2009 crisis, the 2010 bull market and the 2011 crisis. This allows for the evaluation of the solutions' flexibility in various extreme situations. For the construction of the fundamental portfolio's objective function and the TMAI, the study made use of financial and economic data on selected indicators retrieved from Notoria Serwis for 2005.
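
    For readers unfamiliar with the Markowitz baseline against which the fundamental portfolio is compared, a minimal long-only mean-variance formulation is sketched below; the input data and target return are illustrative, and the TMAI-based fundamental term of the paper's objective function is deliberately omitted.

```python
import numpy as np
from scipy.optimize import minimize

def markowitz_min_variance(mu, cov, target_return):
    """Classic Markowitz portfolio: minimise w' Sigma w subject to
    w' mu >= target_return, sum(w) = 1, w >= 0 (long-only)."""
    n = len(mu)
    constraints = (
        {"type": "eq", "fun": lambda w: np.sum(w) - 1.0},
        {"type": "ineq", "fun": lambda w: w @ mu - target_return},
    )
    bounds = [(0.0, 1.0)] * n
    res = minimize(lambda w: w @ cov @ w, x0=np.full(n, 1.0 / n),
                   bounds=bounds, constraints=constraints)
    return res.x

# Illustrative inputs (expected returns and covariance of three assets)
mu = np.array([0.08, 0.12, 0.10])
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.06]])
print(markowitz_min_variance(mu, cov, target_return=0.10).round(3))
```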

  15. Inequivalent quantizations and fundamentally perfect spaces

    International Nuclear Information System (INIS)

    Imbo, T.D.; Sudarshan, E.C.G.

    1987-06-01

    We investigate the problem of inequivalent quantizations of a physical system with multiply connected configuration space X. For scalar quantum theory on X we show that state vectors must be single-valued if and only if the first homology group H_1(X) is trivial, or equivalently the fundamental group π_1(X) is perfect. The θ-structure of quantum gauge and gravitational theories is discussed in light of this result.
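
    The stated equivalence is the standard consequence of the Hurewicz theorem, by which the first homology group is the abelianization of the fundamental group:

```latex
% Hurewicz: H_1(X) is the abelianization of \pi_1(X), so
H_1(X) \;\cong\; \pi_1(X)\big/\,[\pi_1(X),\pi_1(X)],
% hence H_1(X) = 0 \iff \pi_1(X) = [\pi_1(X),\pi_1(X)], i.e. \pi_1(X) is perfect.
```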

  16. String theory or field theory?

    International Nuclear Information System (INIS)

    Marshakov, A.V.

    2002-01-01

    The status of string theory is reviewed, and major recent developments - especially those in going beyond perturbation theory in the string theory and quantum field theory frameworks - are analyzed. This analysis helps better understand the role and place of string theory in the description of experimental phenomena. It is emphasized that there are some insurmountable problems inherent in quantum field theory - notably the impossibility of formulating a quantum theory of gravity on its basis - which prevent it from being a fundamental physical theory of the world of microscopic distances. It is this task, the creation of such a theory, which string theory, currently far from completion, is expected to solve. In spite of its somewhat vague current form, string theory has already led to a number of serious results and greatly contributed to progress in the understanding of quantum field theory. It is these developments that are our concern in this review.

  17. Fundamentals of piping design

    CERN Document Server

    Smith, Peter

    2013-01-01

    Written for the piping engineer and designer in the field, this two-part series helps to fill a void in piping literature, since the Rip Weaver books of the '90s were taken out of print at the advent of the Computer Aided Design (CAD) era. Technology may have changed; however, the fundamentals of piping rules still apply in the digital representation of process piping systems. The Fundamentals of Piping Design is an introduction to the design of piping systems, various processes and the layout of pipe work connecting the major items of equipment for the new hire, the engineering student and the vetera

  18. Infosec management fundamentals

    CERN Document Server

    Dalziel, Henry

    2015-01-01

    Infosec Management Fundamentals is a concise overview of the Information Security management concepts and techniques, providing a foundational template for both experienced professionals and those new to the industry. This brief volume will also appeal to business executives and managers outside of infosec who want to understand the fundamental concepts of Information Security and how it impacts their business decisions and daily activities. Teaches ISO/IEC 27000 best practices on information security management Discusses risks and controls within the context of an overall information securi

  19. Homeschooling and religious fundamentalism

    Directory of Open Access Journals (Sweden)

    Robert Kunzman

    2010-10-01

    Full Text Available This article considers the relationship between homeschooling and religious fundamentalism by focusing on their intersection in the philosophies and practices of conservative Christian homeschoolers in the United States. Homeschooling provides an ideal educational setting to support several core fundamentalist principles: resistance to contemporary culture; suspicion of institutional authority and professional expertise; parental control and centrality of the family; and interweaving of faith and academics. It is important to recognize, however, that fundamentalism exists on a continuum; conservative religious homeschoolers resist liberal democratic values to varying degrees, and efforts to foster dialogue and accommodation with religious homeschoolers can ultimately help strengthen the broader civic fabric.

  20. Fundamentals of continuum mechanics

    CERN Document Server

    Rudnicki, John W

    2014-01-01

    A concise introductory course text on continuum mechanics Fundamentals of Continuum Mechanics focuses on the fundamentals of the subject and provides the background for formulation of numerical methods for large deformations and a wide range of material behaviours. It aims to provide the foundations for further study, not just of these subjects, but also the formulations for much more complex material behaviour and their implementation computationally.  This book is divided into 5 parts, covering mathematical preliminaries, stress, motion and deformation, balance of mass, momentum and energ