WorldWideScience

Sample records for fundamental measure theory

  1. Fundamental statistical theories

    International Nuclear Information System (INIS)

    Demopoulos, W.

    1976-01-01

    Einstein argued that since quantum mechanics is not a fundamental theory it cannot be regarded as in any sense final. The pure statistical states of the quantum theory are not dispersion-free; in this sense, the theory is significantly statistical. The problem investigated in this paper is to determine under what conditions a significantly statistical theory is correctly regarded as fundamental. The solution developed in this paper is that a statistical theory is fundamental only if it is complete; moreover, the quantum theory is complete. (B.R.H.)

  2. Inhomogeneous fluids of colloidal hard dumbbells: fundamental measure theory and Monte Carlo simulations.

    Science.gov (United States)

    Marechal, Matthieu; Goetzke, Hanns Hagen; Härtel, Andreas; Löwen, Hartmut

    2011-12-21

    Recently, a density functional theory for hard particles with shape anisotropy was developed, the extended deconvolution fundamental measure theory (edFMT). We apply edFMT to hard dumbbells, arguably the simplest non-convex shape and readily available experimentally in the form of colloids. We obtain good agreement between edFMT and Monte Carlo simulations for fluids of dumbbells in a slit and for the same system under gravity. This indicates that edFMT can be successfully applied to nearly all colloidal shapes, not just for the convex shapes for which edFMT was originally derived. A theory, such as edFMT, that allows a fast and general way of mapping the phase behavior of anisotropic colloids, can act as a useful guide for the design of colloidal shapes for various applications.
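
    For background (not taken from the record above): in Rosenfeld's original fundamental measure theory the Mayer function of two hard spheres is deconvolved exactly into one-body weight functions; for anisotropic convex bodies the deconvolution is only approximate, and edFMT corrects it with additional tensorial weights. A sketch of the hard-sphere starting point:

      -f_{ij}(\mathbf{r}) = w_3^{i} \otimes w_0^{j} + w_0^{i} \otimes w_3^{j}
                          + w_2^{i} \otimes w_1^{j} + w_1^{i} \otimes w_2^{j}
                          - \vec{w}_2^{\,i} \otimes \vec{w}_1^{\,j}
                          - \vec{w}_1^{\,i} \otimes \vec{w}_2^{\,j},
      \qquad (a \otimes b)(\mathbf{r}) \equiv \int \mathrm{d}\mathbf{r}'\, a(\mathbf{r}')\, b(\mathbf{r}-\mathbf{r}') .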

  3. Theory of fundamental interactions

    International Nuclear Information System (INIS)

    Pestov, A.B.

    1992-01-01

    In the present article the theory of fundamental interactions is derived in a systematic way from first principles. In the developed theory there is no separation between space-time and internal gauge space. The main equations for the basic fields are derived. It is shown that the theory satisfies the correspondence principle and gives rise to new notions in the domain considered. In particular, the conclusion is drawn that there exist particles which are characterized not only by mass, spin, and charge but also by a moment of inertia. These are rotating particles, particles which realize the notion of a rigid body on the microscopic level and provide a key to understanding strong interactions. The main concepts and dynamical laws for these particles are formulated. The basic principles of the theory may be tested experimentally in the not-too-distant future. 29 refs

  4. Fundamentals of queueing theory

    CERN Document Server

    Gross, Donald; Thompson, James M; Harris, Carl M

    2013-01-01

    Praise for the Third Edition: "This is one of the best books available. Its excellent organizational structure allows quick reference to specific models and its clear presentation . . . solidifies the understanding of the concepts being presented." (IIE Transactions on Operations Engineering). Thoroughly revised and expanded to reflect the latest developments in the field, Fundamentals of Queueing Theory, Fourth Edition continues to present the basic statistical principles that are necessary to analyze the probabilistic nature of queues. Rather than pre...

  5. Recent developments in classical density functional theory: Internal energy functional and diagrammatic structure of fundamental measure theory

    Directory of Open Access Journals (Sweden)

    M. Schmidt

    2012-12-01

    An overview of several recent developments in density functional theory for classical inhomogeneous liquids is given. We show how Levy's constrained search method can be used to derive the variational principle that underlies density functional theory. An advantage of the method is that the Helmholtz free energy as a functional of a trial one-body density is given as an explicit expression, without reference to an external potential as is the case in the standard Mermin-Evans proof by reductio ad absurdum. We show how to generalize the approach in order to express the internal energy as a functional of the one-body density distribution and of the local entropy distribution. Here the local chemical potential and the bulk temperature play the role of Lagrange multipliers in the Euler-Lagrange equations for minimization of the functional. As an explicit approximation for the free-energy functional of hard sphere mixtures, the diagrammatic structure of Rosenfeld's fundamental measure density functional is laid out. Recent extensions, based on the Kierlik-Rosinberg scalar weight functions, to binary and ternary non-additive hard sphere mixtures are described.
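
    For orientation, the generic structure of Rosenfeld's hard-sphere functional discussed above can be written as follows (standard textbook form, not quoted from the article):

      \beta F_{\mathrm{exc}}[\{\rho_i\}] = \int \mathrm{d}\mathbf{r}\; \Phi\bigl(\{n_\alpha(\mathbf{r})\}\bigr),
      \qquad
      n_\alpha(\mathbf{r}) = \sum_i \int \mathrm{d}\mathbf{r}'\, \rho_i(\mathbf{r}')\, w_\alpha^{(i)}(\mathbf{r}-\mathbf{r}') ,

    where the weight functions w_alpha^(i) encode the fundamental geometric measures of species i (in the Kierlik-Rosinberg formulation, four scalar weights) and Phi is a local function of the weighted densities n_alpha.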

  6. Fundamental measure theory for non-spherical hard particles: predicting liquid crystal properties from the particle shape.

    Science.gov (United States)

    Wittmann, René; Marechal, Matthieu; Mecke, Klaus

    2016-06-22

    Density functional theory (DFT) for hard bodies provides a theoretical description of the effect of particle shape on inhomogeneous fluids. We present improvements of the DFT framework fundamental measure theory (FMT) for hard bodies and validate these improvements for hard spherocylinders. To keep the paper self-contained, we first discuss the recent advances in FMT for hard bodies that lead to the introduction of fundamental mixed measure theory (FMMT) in our previous paper (2015 Europhys. Lett. 109 26003). Subsequently, we provide an efficient semi-empirical alternative to FMMT and show that the phase diagram for spherocylinders is described with similar accuracy in both versions of the theory. Finally, we present a semi-empirical modification of FMMT whose predictions for the phase diagram for spherocylinders are in excellent quantitative agreement with computer simulation results.

  7. Fundamental number theory with applications

    CERN Document Server

    Mollin, Richard A

    2008-01-01

    An update of the most accessible introductory number theory text available, Fundamental Number Theory with Applications, Second Edition presents a mathematically rigorous yet easy-to-follow treatment of the fundamentals and applications of the subject. The substantial amount of reorganizing makes this edition clearer and more elementary in its coverage. New to the Second Edition: removal of all advanced material to be even more accessible in scope; new fundamental material, including partition theory, generating functions, and combinatorial number theory; expa...

  8. Fundamentals of electroweak theory

    CERN Document Server

    Hořejší, Jiří

    2002-01-01

    This monograph by Prof. Hořejší is based on a series of his lectures given at the Faculty of Mathematics and Physics of Charles University during the 1990s. The author gives a thorough and easy-to-read account of the basic principles of the standard model of electroweak interactions, describes various theories of electromagnetic and weak interactions, and explains the gauge theory of electroweak interactions. Five appendices expound on some special techniques of the Standard Model used in the main body of the text. Thanks to the author's pedagogical skills and professional erudition, the book can be read with just a preliminary knowledge of quantum field theory.

  9. Fundamentals of number theory

    CERN Document Server

    LeVeque, William J

    1996-01-01

    This excellent textbook introduces the basics of number theory, incorporating the language of abstract algebra. A knowledge of such algebraic concepts as group, ring, field, and domain is not assumed, however; all terms are defined and examples are given - making the book self-contained in this respect. The author begins with an introductory chapter on number theory and its early history. Subsequent chapters deal with unique factorization and the GCD, quadratic residues, number-theoretic functions and the distribution of primes, sums of squares, quadratic equations and quadratic fields, diopha...

  10. Hydromechanics theory and fundamentals

    CERN Document Server

    Sinaiski, Emmanuil G

    2010-01-01

    Written by an experienced author with a strong background in applications of this field, this monograph provides a comprehensive and detailed account of the theory behind hydromechanics. The author includes numerous appendices with mathematical tools, backed by extensive illustrations. The result is a must-have for all those needing to apply the methods in their research, be it in industry or academia.

  11. Fundamental principles of quantum theory

    International Nuclear Information System (INIS)

    Bugajski, S.

    1980-01-01

    After introducing general versions of three fundamental quantum postulates - the superposition principle, the uncertainty principle and the complementarity principle - the question of whether the three principles are sufficiently strong to restrict the general Mackey description of quantum systems to the standard Hilbert-space quantum theory is discussed. An example which shows that the answer must be negative is constructed. An abstract version of the projection postulate is introduced and it is demonstrated that it could serve as the missing physical link between the general Mackey description and the standard quantum theory. (author)

  12. Radiometric temperature measurements fundamentals

    CERN Document Server

    Zhang, Zhuomin M; Machin, Graham

    2009-01-01

    This book describes the theory of radiation thermometry, both at a primary level and for a variety of applications, such as in the materials processing industries and remote sensing. It is written for those who will apply radiation thermometry in industrial practice or use radiation thermometers for scientific research; for the radiation thermometry specialist in a national measurement institute; for developers of radiation thermometers who are working to innovate products for instrument manufacturers; and for developers of non-contact thermometry methods to address challenging thermometry problems.

  13. Origins and fundamentals of nodal aberration theory

    Science.gov (United States)

    Rogers, John R.

    2017-11-01

    Nodal Aberration Theory, developed by Kevin Thompson and Roland Shack, predicts several important aberration phenomena but remains poorly understood. To de-mystify the theory, we describe the origins and fundamental concepts of the theory.

  14. Fundamental papers in wavelet theory

    CERN Document Server

    Walnut, David F

    2006-01-01

    This book traces the prehistory and initial development of wavelet theory, a discipline that has had a profound impact on mathematics, physics, and engineering. Interchanges between these fields during the last fifteen years have led to a number of advances in applications such as image compression, turbulence, machine vision, radar, and earthquake prediction. This book contains the seminal papers that presented the ideas from which wavelet theory evolved, as well as those major papers that developed the theory into its current form. These papers originated in a variety of journals from differ

  15. Measurement and Fundamental Processes in Quantum Mechanics

    Science.gov (United States)

    Jaeger, Gregg

    2015-07-01

    In the standard mathematical formulation of quantum mechanics, measurement is an additional, exceptional fundamental process rather than an often complex, but ordinary process which happens also to serve a particular epistemic function: during a measurement of one of its properties which is not already determined by a preceding measurement, a measured system, even if closed, is taken to change its state discontinuously rather than continuously as is usual. Many, including Bell, have been concerned about the fundamental role thus given to measurement in the foundation of the theory. Others, including the early Bohr and Schwinger, have suggested that quantum mechanics naturally incorporates the unavoidable uncontrollable disturbance of physical state that accompanies any local measurement without the need for an exceptional fundamental process or a special measurement theory. Disturbance is unanalyzable for Bohr, but for Schwinger it is due to physical interactions' being borne by fundamental particles having discrete properties and behavior which is beyond physical control. Here, Schwinger's approach is distinguished from more well known treatments of measurement, with the conclusion that, unlike most, it does not suffer under Bell's critique of quantum measurement. Finally, Schwinger's critique of measurement theory is explicated as a call for a deeper investigation of measurement processes that requires the use of a theory of quantum fields.

  16. Twenty five years of fundamental theory

    International Nuclear Information System (INIS)

    Bell, J.S.

    1980-01-01

    In reviewing the last twenty-five years in fundamental physics theory it is stated that there has been no revolution in this field. In the absence of gravitation, Lorentz invariance remains a requirement on fundamental laws. Einstein's theory of gravitation inspires increasing conviction on the astronomical scale. Quantum theory remains the framework for all serious effort in microphysics, and quantum electrodynamics remains the model of a fully articulated microphysical theory, completely successful in its domain. However, a number of ideas have appeared, of great theoretical interest and some phenomenological success, which may well contribute to the next decisive step. Recent work on the following topics is mentioned: gravitational radiation, singularities, black body radiation from black holes, gauge and hidden symmetry in quantum electrodynamics, the renormalization of electromagnetic and weak interaction theory, non-Abelian gauge theories, magnetic monopoles as the most striking example of solitons, and supersymmetry. (UK)

  17. Modern measurements fundamentals and applications

    CERN Document Server

    Petri, D; Carbone, P; Catelani, M

    2015-01-01

    This book explores the modern role of measurement science in both the most technically advanced applications and in everyday life, and will help readers gain the necessary skills to specialize their knowledge for a specific field in measurement. Modern Measurements is divided into two parts. Part I (Fundamentals) presents a model of the modern measurement activity and the already recalled fundamental bricks. It starts with a general description that introduces these bricks and the uncertainty concept. The next chapters provide an overview of these bricks and finish (Chapter 7) with a more general and complex model that encompasses both traditional (hard) measurements and (soft) measurements, aimed at quantifying non-physical concepts, such as quality, satisfaction, comfort, etc. Part II (Applications) is aimed at showing how the concepts presented in Part I can be usefully applied to design and implement measurements in some very important and broad fields. The editors cover System Identification (Chapter 8...

  18. Fundamentals of information theory and coding design

    CERN Document Server

    Togneri, Roberto

    2003-01-01

    In a clear, concise, and modular format, this book introduces the fundamental concepts and mathematics of information and coding theory. The authors emphasize how a code is designed and discuss the main properties and characteristics of different coding algorithms along with strategies for selecting the appropriate codes to meet specific requirements. They provide comprehensive coverage of source and channel coding, address arithmetic, BCH, and Reed-Solomon codes and explore some more advanced topics such as PPM compression and turbo codes. Worked examples and sets of basic and advanced exercises in each chapter reinforce the text's clear explanations of all concepts and methodologies.

  19. Fundamental Neutron Physics: Theory and Analysis

    International Nuclear Information System (INIS)

    Gudkov, Vladimir

    2016-01-01

    The goal of the proposal was to study the possibility of searching for manifestations of new physics beyond the Standard Model in fundamental neutron physics experiments. This involves detailed theoretical analyses of parity- and time reversal invariance-violating processes in neutron-induced reactions, properties of neutron β-decay, and the precise description of properties of neutron interactions with nuclei. To describe neutron-nuclear interactions, we use both the effective field theory approach and the theory of nuclear reactions with phenomenological nucleon potentials for the systematic description of parity- and time reversal-violating effects in a consistent way. A major emphasis of our research during the funding period has been the study of parity violation (PV) and time reversal invariance violation (TRIV) in few-body systems. We studied PV effects in non-elastic processes in the three-nucleon system using both "DDH-like" and effective field theory (EFT) approaches. The wave functions were obtained by solving three-body Faddeev equations in configuration space for a number of realistic strong potentials. The observed model dependence for the DDH approach indicates an intrinsic difficulty in the description of nuclear PV effects, and it could be the reason for the observed discrepancies in the nuclear PV data analysis. It shows that the DDH approach could be a reasonable approach for the analysis of PV effects only if exactly the same strong and weak potentials are used in calculating all PV observables in all nuclei. However, the existing calculations of nuclear PV effects were performed using different potentials; therefore, strictly speaking, one cannot compare the existing results of these calculations among themselves.

  20. The Fundamentals of Kant's Moral Theory

    Directory of Open Access Journals (Sweden)

    Adriana Saraiva Lamounier Rodrigues

    2015-12-01

    The article intends to study the moral thought in the philosophy of Immanuel Kant, considered the first of the philosophers who compose the movement known as German Idealism, especially in the book "Critique of Practical Reason". To achieve this objective, the article begins with the traces of the moral studies at Kant's time and its fundamental questions, as well as traces of his formation that influenced his writings. Soon after, it analyzes the Kantian thought itself, through the work "The Idea of Justice in Kant", by Joaquim Carlos Salgado, the theoretical framework of this research. The article then analyzes the postulate of freedom and its relationship with the sollen and the moral law, the species of imperatives, the categorical imperative and equality, and the connections that the moral theory makes with the existence of positive law, which the author considers the greatest pillar of the Idea of justice from the Prussian philosopher's point of view. The methodology used in the research is theoretical, based on the analysis of the theoretical framework and its relationship to other publications concerning the same subject.

  1. Measurement theory for engineers

    CERN Document Server

    Gertsbakh, Ilya

    2003-01-01

    The emphasis of this textbook is on industrial applications of Statistical Measurement Theory. It deals with the principal issues of measurement theory, is concise and intelligibly written, and is largely self-contained. Difficult theoretical issues are separated from the mainstream presentation. Each topic starts with an informal introduction followed by an example, the rigorous problem formulation, solution method, and a detailed numerical solution. Each chapter concludes with a set of exercises of increasing difficulty, mostly with solutions. The book is meant as a text for graduate students and a reference for researchers and industrial experts specializing in measurement and measurement data analysis for quality control, quality engineering and industrial process improvement using statistical methods. Knowledge of calculus and fundamental probability and statistics is required for the understanding of its contents.

  2. An integration of Integrated Information Theory with fundamental physics

    Directory of Open Access Journals (Sweden)

    Adam B Barrett

    2014-02-01

    To truly eliminate Cartesian ghosts from the science of consciousness, we must describe consciousness as an aspect of the physical. Integrated Information Theory states that consciousness arises from intrinsic information generated by dynamical systems; however, existing formulations of this theory are not applicable to standard models of fundamental physical entities. Modern physics has shown that fields are fundamental entities, and in particular that the electromagnetic field is fundamental. Here I hypothesise that consciousness arises from information intrinsic to fundamental fields. This hypothesis unites fundamental physics with what we know empirically about the neuroscience underlying consciousness, and it bypasses the need to consider quantum effects.

  3. An integration of integrated information theory with fundamental physics.

    Science.gov (United States)

    Barrett, Adam B

    2014-01-01

    To truly eliminate Cartesian ghosts from the science of consciousness, we must describe consciousness as an aspect of the physical. Integrated Information Theory states that consciousness arises from intrinsic information generated by dynamical systems; however, existing formulations of this theory are not applicable to standard models of fundamental physical entities. Modern physics has shown that fields are fundamental entities, and in particular that the electromagnetic field is fundamental. Here I hypothesize that consciousness arises from information intrinsic to fundamental fields. This hypothesis unites fundamental physics with what we know empirically about the neuroscience underlying consciousness, and it bypasses the need to consider quantum effects.

  4. Fundamental structures of M(brane) theory

    International Nuclear Information System (INIS)

    Hoppe, Jens

    2011-01-01

    A dynamical symmetry, as well as special diffeomorphism algebras generalizing the Witt-Virasoro algebra, is found to exist in the theory of relativistic extended objects of any dimension; these structures are related to Poincaré invariance and are crucial with regard to quantization, questions of integrability, and M(atrix) theory.

  5. Fundamentals of the physical theory of diffraction

    CERN Document Server

    Ufimtsev, Pyotr Ya

    2014-01-01

    A complete presentation of the modern physical theory of diffraction and its applications, by the world's leading authority on the topic. Extensive revisions and additions to the first edition yield a second edition that is 492 pages in length, with 122 figures. New sections examine the nature of polarization coupling, and extend the theory of shadow radiation and reflection to opaque objects. This book features end-of-chapter problems and a solutions manual for university professors and graduate students. MATLAB codes presented in appendices allow for quick numeric calculations of diffracted waves.

  6. Fundamentals of the theory of metals

    CERN Document Server

    Abrikosov, A A

    2017-01-01

    This primer by a Nobel Prize-winning physicist offers detailed coverage of all aspects of the energy spectra of electrons in metals and the theory of superconductivity. Topics include electrical and thermal conductivities, galvanomagnetic and thermoelectrical phenomena, the behavior of metals in high-frequency fields, sound absorption, and Fermi-liquid phenomena. In addition to its value as a reference, this volume is suitable as a text for undergraduate and graduate students.

  7. Fundamentals of machine theory and mechanisms

    CERN Document Server

    Simón Mata, Antonio; Cabrera Carrillo, Juan Antonio; Ezquerro Juanco, Francisco; Guerra Fernández, Antonio Jesús; Nadal Martínez, Fernando; Ortiz Fernández, Antonio

    2016-01-01

    This book covers the basic contents for an introductory course in Mechanism and Machine Theory. The topics dealt with are: kinematic and dynamic analysis of machines, introduction to vibratory behaviour, rotor and piston balance, kinematics of gears, ordinary and planetary gear trains and synthesis of planar mechanisms. A new approach to dimensional synthesis of mechanisms based on turning functions has been added for closed and open path generation using an optimization method based on evolutionary algorithms. The text, developed by a group of experts in kinematics and dynamics of mechanisms at the University of Málaga, Spain, is clear and is supported by more than 350 images. More than 60 outlined and explained problems have been included to clarify the theoretical concepts.

  8. Group theory for chemists fundamental theory and applications

    CERN Document Server

    Molloy, K C

    2010-01-01

    The basics of group theory and its applications to themes such as the analysis of vibrational spectra and molecular orbital theory are essential knowledge for the undergraduate student of inorganic chemistry. The second edition of Group Theory for Chemists uses diagrams and problem-solving to help students test and improve their understanding, including a new section on the application of group theory to electronic spectroscopy. Part one covers the essentials of symmetry and group theory, including symmetry, point groups and representations. Part two deals with the application of group theory t...

  9. Random measures, theory and applications

    CERN Document Server

    Kallenberg, Olav

    2017-01-01

    Offering the first comprehensive treatment of the theory of random measures, this book has a very broad scope, ranging from basic properties of Poisson and related processes to the modern theories of convergence, stationarity, Palm measures, conditioning, and compensation. The three large final chapters focus on applications within the areas of stochastic geometry, excursion theory, and branching processes. Although this theory plays a fundamental role in most areas of modern probability, much of it, including the most basic material, has previously been available only in scores of journal articles. The book is primarily directed towards researchers and advanced graduate students in stochastic processes and related areas.

  10. Fundamental link between system theory and statistical mechanics

    International Nuclear Information System (INIS)

    Atmanspacher, H.; Scheingraber, H.

    1987-01-01

    A fundamental link between system theory and statistical mechanics has been found to be established by the Kolmogorov entropy. By this quantity the temporal evolution of dynamical systems can be classified into regular, chaotic, and stochastic processes. Since K represents a measure for the internal information creation rate of dynamical systems, it provides an approach to irreversibility. The formal relationship to statistical mechanics is derived by means of an operator formalism originally introduced by Prigogine. For a Liouville operator L and an information operator M tilde acting on a distribution in phase space, it is shown that i[L, M tilde] = KI (I = identity operator). As a first consequence of this equivalence, a relation is obtained between the chaotic correlation time of a system and Prigogine's concept of a finite duration of presence. Finally, the existence of chaos in quantum systems is discussed with respect to the existence of a quantum mechanical time operator
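
    Restating the central relation of the abstract in equation form (operator names as in the abstract; details of Prigogine's formalism are not reproduced here):

      i\,[L, \tilde{M}] \;=\; i\,(L\tilde{M} - \tilde{M}L) \;=\; K\,I ,

    with L the Liouville operator, M-tilde the information operator acting on phase-space distributions, I the identity, and K the Kolmogorov entropy; K = 0 corresponds to regular motion, finite K > 0 to chaotic motion, and diverging K to stochastic processes.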

  11. DOE fundamentals handbook: Nuclear physics and reactor theory

    International Nuclear Information System (INIS)

    1993-01-01

    The Nuclear Physics and Reactor Theory Handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of nuclear physics and reactor theory. The handbook includes information on atomic and nuclear physics; neutron characteristics; reactor theory and nuclear parameters; and the theory of reactor operation. This information will provide personnel with a foundation for understanding the scientific principles that are associated with various DOE nuclear facility operations and maintenance

  12. Fundamental U-Theory of Time. Part 1

    Directory of Open Access Journals (Sweden)

    Yuvraj J. Gopaul

    2016-02-01

    The Fundamental U-Theory of Time (Part 1) is an original theory that aims to unravel the mystery of what exactly is ‘time’. To date, very few explanations, from the branches of physics or cosmology, have succeeded in providing an accurate and comprehensive depiction of time. Most explanations have only managed to provide partial understanding or, at best, glimpses of its true nature. The U-Theory uses ‘Thought Experiments’ to uncover the determining characteristics of time. In part 1 of this theory, the focus is not so much on the mathematics as on the accuracy of the depiction of time. Moreover, it challenges current views on theoretical physics, particularly on the idea of ‘time travel’. Notably, it is a theory seeking to present a fresh approach for reviewing Einstein’s Theory of Relativity, while unlocking new pathways for upcoming research in the field of physics and cosmology.

  13. SU(2) Gauge Theory with Two Fundamental Flavours

    DEFF Research Database (Denmark)

    Arthur, Rudy; Drach, Vincent; Hansen, Martin

    2016-01-01

    We investigate the continuum spectrum of the SU(2) gauge theory with $N_f=2$ flavours of fermions in the fundamental representation. This model provides a minimal template which is ideal for a wide class of Standard Model extensions featuring novel strong dynamics that range from composite...

  14. Measurement theory in quantum mechanics

    International Nuclear Information System (INIS)

    Klein, G.

    1980-01-01

    It is assumed that consciousness, memory and liberty (within the limits of quantum mechanical indeterminism) are fundamental properties of elementary particles. Then, using this assumption, it is shown how measurements and observers may be introduced in a natural way into quantum mechanics. There are no longer fundamental differences between macroscopic and microscopic objects, between classical and quantum objects, between observer and object. Thus, discrepancies and paradoxes disappear from the conventional quantum theory. One consequence of the cumulative memory of the particles is that the sum of negentropy plus information is a constant. Using this theory it is also possible to explain 'paranormal' phenomena and how they differ from 'normal' ones [fr]

  15. Measurement Decision Theory.

    Science.gov (United States)

    Rudner, Lawrence M.

    This paper describes and evaluates the use of decision theory as a tool for classifying examinees based on their item response patterns. Decision theory, developed by A. Wald (1947) and now widely used in engineering, agriculture, and computing, provides a simple model for the analysis of categorical data. Measurement decision theory requires only…
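
    A minimal sketch of the kind of classification described above, assuming two latent mastery classes, dichotomous items, and known item-response probabilities; all numbers are illustrative placeholders, not data from the paper:

      import numpy as np

      # Classify an examinee into "master" / "non-master" by Bayes' rule,
      # given dichotomous item responses and assumed item-response probabilities.
      p_correct = np.array([
          [0.85, 0.80, 0.75, 0.90],   # P(correct | master), per item (assumed)
          [0.40, 0.35, 0.50, 0.30],   # P(correct | non-master), per item (assumed)
      ])
      priors = np.array([0.5, 0.5])        # prior class probabilities (assumed)
      responses = np.array([1, 1, 0, 1])   # observed item scores (1 = correct)

      # Likelihood of the response pattern under each class (local independence)
      likelihood = np.prod(np.where(responses == 1, p_correct, 1.0 - p_correct), axis=1)
      posterior = priors * likelihood
      posterior /= posterior.sum()

      labels = ["master", "non-master"]
      print(dict(zip(labels, posterior.round(3))))
      print("decision:", labels[int(np.argmax(posterior))])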

  16. M(atrix) theory: matrix quantum mechanics as a fundamental theory

    International Nuclear Information System (INIS)

    Taylor, Washington

    2001-01-01

    This article reviews the matrix model of M theory. M theory is an 11-dimensional quantum theory of gravity that is believed to underlie all superstring theories. M theory is currently the most plausible candidate for a theory of fundamental physics which reconciles gravity and quantum field theory in a realistic fashion. Evidence for M theory is still only circumstantial -- no complete background-independent formulation of the theory exists as yet. Matrix theory was first developed as a regularized theory of a supersymmetric quantum membrane. More recently, it has appeared in a different guise as the discrete light-cone quantization of M theory in flat space. These two approaches to matrix theory are described in detail and compared. It is shown that matrix theory is a well-defined quantum theory that reduces to a supersymmetric theory of gravity at low energies. Although its fundamental degrees of freedom are essentially pointlike, higher-dimensional fluctuating objects (branes) arise through the non-Abelian structure of the matrix degrees of freedom. The problem of formulating matrix theory in a general space-time background is discussed, and the connections between matrix theory and other related models are reviewed

  17. Explaining crude oil prices using fundamental measures

    International Nuclear Information System (INIS)

    Coleman, Les

    2012-01-01

    Oil is the world's most important commodity, and improving the understanding of drivers of its price is a longstanding research objective. This article analyses real oil prices during 1984–2007 using a monthly dataset of fundamental and market parameters that cover financial markets, global economic growth, demand and supply of oil, and geopolitical measures. The innovation is to incorporate proxies for speculative and terrorist activity and dummies for major industry events, and to quantify the price impact of each. New findings are positive links between oil prices and speculative activity, bond yields, an interaction term incorporating OPEC market share and OECD import dependence, and the number of US troops and frequency of terrorist attacks in the Middle East. Shocks also prove significant, with a $6–18 per barrel impact on price for several months. Highlights: the article introduces new variables to the study of oil prices; the new variables are terrorist incidents, military activity, and oil futures market size; shocks prove important, affecting prices by $6–18 per barrel for several months; OPEC market influence rises with OECD import dependence.
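
    The style of analysis described above amounts to a monthly regression of real oil prices on fundamental and market variables plus event dummies. A hedged sketch with placeholder variables (none of the data or coefficients below come from the study):

      import numpy as np

      # Placeholder monthly dataset; each column stands in for one explanatory variable.
      rng = np.random.default_rng(0)
      n = 288  # e.g. 1984-2007, monthly
      X = np.column_stack([
          np.ones(n),                  # intercept
          rng.normal(size=n),          # speculative-activity proxy (placeholder)
          rng.normal(size=n),          # bond yields (placeholder)
          rng.normal(size=n),          # OPEC share x OECD import dependence (placeholder)
          rng.integers(0, 2, size=n),  # dummy for a major industry event (placeholder)
      ])
      y = X @ np.array([30.0, 2.5, 1.0, 4.0, 8.0]) + rng.normal(scale=5.0, size=n)

      # Ordinary least squares fit of the real oil price on the explanatory variables
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      print("estimated coefficients:", beta.round(2))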

  18. Fundamentals of functions and measure theory

    CERN Document Server

    Mikhalev, Alexander V; Zakharov, Valeriy K

    2018-01-01

    The series is devoted to the publication of monographs and high-level textbooks in mathematics, mathematical methods and their applications. Apart from covering important areas of current interest, a major aim is to make topics of an interdisciplinary nature accessible to the non-specialist. The works in this series are addressed to advanced students and researchers in mathematics and theoretical physics. In addition, it can serve as a guide for lectures and seminars on a graduate level. The series de Gruyter Studies in Mathematics was founded ca. 30 years ago by the late Professor Heinz Bauer and Professor Peter Gabriel with the aim to establish a series of monographs and textbooks of high standard, written by scholars with an international reputation presenting current fields of research in pure and applied mathematics. While the editorial board of the Studies has changed with the years, the aspirations of the Studies are unchanged. In times of rapid growth of mathematical knowledge carefully written monogr...

  19. Essentials of measure theory

    CERN Document Server

    Kubrusly, Carlos S

    2015-01-01

    Classical in its approach, this textbook is thoughtfully designed and composed in two parts. Part I is meant for a one-semester beginning graduate course in measure theory, proposing an “abstract” approach to measure and integration, where the classical concrete cases of Lebesgue measure and Lebesgue integral are presented as an important particular case of general theory. Part II of the text is more advanced and is addressed to a more experienced reader. The material is designed to cover another one-semester graduate course subsequent to a first course, dealing with measure and integration in topological spaces. The final section of each chapter in Part I presents problems that are integral to each chapter, the majority of which consist of auxiliary results, extensions of the theory, examples, and counterexamples. Problems which are highly theoretical have accompanying hints. The last section of each chapter of Part II consists of Additional Propositions containing auxiliary and complementary results. Th...

  20. Geometric measure theory

    CERN Document Server

    Waerden, B

    1996-01-01

    From the reviews: "... Federer's timely and beautiful book indeed fills the need for a comprehensive treatise on geometric measure theory, and his detailed exposition leads from the foundations of the theory to the most recent discoveries. ... The author writes with a distinctive style which is both natural and powerfully economical in treating a complicated subject. This book is a major treatise in mathematics and is essential in the working library of the modern analyst." Bulletin of the London Mathematical Society.

  1. Laser measurement technology fundamentals and applications

    CERN Document Server

    Donges, Axel

    2015-01-01

    Laser measurement technology has evolved in recent years in a versatile way. Today, its methods are indispensable for research and development activities as well as for production technology. Every physicist and engineer should therefore gain a working knowledge of laser measurement technology. This book closes a gap left by existing textbooks. It introduces laser measurement technology in all its aspects in a comprehensible presentation. Numerous figures, graphs and tables allow fast access to the material. In the first part of the book the important physical and optical basics necessary to understand laser measurement technology are described. In the second part technically significant measuring methods are explained and application examples are presented. Target groups of this textbook are students of natural and engineering sciences as well as working physicists and engineers who are interested in familiarizing themselves with laser measurement technology and its fascinating p...

  2. The photoelectric effect. Fundamentals of radiation measurement

    International Nuclear Information System (INIS)

    Herrmann, K.H.

    1994-01-01

    This handbook presents the physical backgrounds of the photoelectric effect (emission and capture) and indicates applications in solid state physics, such as photoelectron spectroscopy and photoelectric radiation measurements.
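
    The basic relation behind photoelectric emission (standard physics, stated here for reference rather than quoted from the handbook):

      E_{\mathrm{kin,max}} = h\nu - W ,

    where h nu is the photon energy and W the work function of the emitting material; emission requires h nu > W.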

  3. Fundamental data analyses for measurement control

    International Nuclear Information System (INIS)

    Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.

    1987-02-01

    A set of measurement control data analyses was selected for use by analysts responsible for maintaining the measurement quality of nuclear materials accounting instrumentation. The analyses consist of control charts for bias and precision and statistical tests used as analytic supplements to the control charts. They provide the desired detection sensitivity and yet can be interpreted locally, quickly, and easily. The control charts provide for visual inspection of data and enable an alert reviewer to spot problems possibly before statistical tests detect them. The statistical tests are useful for automating the detection of departures from the controlled state or from the underlying assumptions (such as normality). 8 refs., 3 figs., 5 tabs
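
    A minimal sketch of one of the analyses named above, a control chart for measurement bias; the three-sigma Shewhart-style limits are an assumption for illustration and do not reproduce the report's actual limits or supplementary tests:

      import numpy as np

      # Control chart for instrument bias: flag control measurements of a reference
      # standard that fall outside mean +/- 3 sigma limits set during a baseline period.
      baseline = np.array([0.02, -0.01, 0.00, 0.03, -0.02, 0.01, 0.00, -0.01])  # assumed data
      center = baseline.mean()
      sigma = baseline.std(ddof=1)
      ucl, lcl = center + 3.0 * sigma, center - 3.0 * sigma  # upper/lower control limits

      new_biases = np.array([0.01, 0.00, 0.09, -0.02])  # subsequent control measurements
      for i, b in enumerate(new_biases, start=1):
          status = "out of control" if (b > ucl or b < lcl) else "in control"
          print(f"measurement {i}: bias = {b:+.2f}  ({status})")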

  4. Five fundamental constraints on theories of the origins of music.

    Science.gov (United States)

    Merker, Bjorn; Morley, Iain; Zuidema, Willem

    2015-03-19

    The diverse forms and functions of human music place obstacles in the way of an evolutionary reconstruction of its origins. In the absence of any obvious homologues of human music among our closest primate relatives, theorizing about its origins, in order to make progress, needs constraints from the nature of music, the capacities it engages, and the contexts in which it occurs. Here we propose and examine five fundamental constraints that bear on theories of how music and some of its features may have originated. First, cultural transmission, bringing the formal powers of cultural as contrasted with Darwinian evolution to bear on its contents. Second, generativity, i.e. the fact that music generates infinite pattern diversity by finite means. Third, vocal production learning, without which there can be no human singing. Fourth, entrainment with perfect synchrony, without which there is neither rhythmic ensemble music nor rhythmic dancing to music. And fifth, the universal propensity of humans to gather occasionally to sing and dance together in a group, which suggests a motivational basis endemic to our biology. We end by considering the evolutionary context within which these constraints had to be met in the genesis of human musicality. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  5. Measurement Errors and Uncertainties Theory and Practice

    CERN Document Server

    Rabinovich, Semyon G

    2006-01-01

    Measurement Errors and Uncertainties addresses the most important problems that physicists and engineers encounter when estimating errors and uncertainty. Building from the fundamentals of measurement theory, the author develops the theory of accuracy of measurements and offers a wealth of practical recommendations and examples of applications. This new edition covers a wide range of subjects, including: - Basic concepts of metrology - Measuring instruments characterization, standardization and calibration - Estimation of errors and uncertainty of single and multiple measurements - Modern probability-based methods of estimating measurement uncertainty. With this new edition, the author completes the development of the new theory of indirect measurements. This theory provides more accurate and efficient methods for processing indirect measurement data. It eliminates the need to calculate the correlation coefficient - a stumbling block in measurement data processing - and offers for the first time a way to obtain...
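
    A small illustration of one topic listed above, the uncertainty of repeated direct measurements; the readings and the coverage factor k = 2 are assumptions for the example, not worked data from the book:

      import numpy as np

      # Type A evaluation: standard uncertainty of the mean of repeated readings,
      # expanded with an assumed coverage factor k = 2.
      readings = np.array([10.03, 10.01, 10.04, 9.99, 10.02, 10.00])  # assumed data
      mean = readings.mean()
      u = readings.std(ddof=1) / np.sqrt(readings.size)  # standard uncertainty of the mean
      k = 2.0
      print(f"result: {mean:.3f} +/- {k * u:.3f} (coverage factor k = {k})")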

  6. Infusing fundamental cause theory with features of Pierre Bourdieu's theory of symbolic power.

    Science.gov (United States)

    Veenstra, Gerry

    2018-02-01

    The theory of fundamental causes is one of the more influential attempts to provide a theoretical infrastructure for the strong associations between indicators of socioeconomic status (education, income, occupation) and health. It maintains that people of higher socioeconomic status have greater access to flexible resources such as money, knowledge, prestige, power, and beneficial social connections that they can use to reduce their risks of morbidity and mortality and minimize the consequences of disease once it occurs. However, several key aspects of the theory remain underspecified, compromising its ability to provide truly compelling explanations for socioeconomic health inequalities. In particular, socioeconomic status is an assembly of indicators that do not necessarily cohere in a straightforward way, the flexible resources that disproportionately accrue to higher status people are not clearly defined, and the distinction between socioeconomic status and resources is ambiguous. I attempt to address these definitional issues by infusing fundamental cause theory with features of a well-known theory of socioeconomic stratification in the sociological literature, Pierre Bourdieu's theory of symbolic power.

  7. Coarsening in Solid Liquid Systems: A Verification of Fundamental Theory

    Science.gov (United States)

    Thompson, John D.

    Coarsening is a process that occurs in nearly all multi-phase materials in which the total energy of a system is reduced through the reduction of total interfacial energy. The theoretical description of this process is of central importance to materials design, yet remains controversial. In order to directly compare experiment to theoretical predictions, low solid volume fraction PbSn alloys were coarsened in a microgravity environment aboard the International Space Station (ISS) as part of the Coarsening in Solid Liquid Mixtures (CSLM) project. PbSn samples with solid volume fractions of 15%, 20% and 30% were characterized in 2D and 3D using mechanical serial sectioning. The systems were observed in the self-similar regime predicted by theory and the particle size and particle density obeyed the temporal power laws predicted by theory. However, the magnitudes of the rate constants governing those temporal laws as well as the forms of the particle size distributions were not described well by theoretical predictions. Additionally, in the 30% solid volume fraction system, the higher volume fraction results in a non-spherical particle shape and a more closely packed spatial distribution. The presence of slow particle motion induced by vibrations on the ISS is presented as an explanation for this discrepancy. To model the effect of this particle motion, the Akaiwa-Voorhees multiparticle diffusion simulations are modified to treat coarsening in the presence of a small convection term, such as that of sedimentation, corresponding to low Peclet numbers. The simulations indicate that the particle size dependent velocity of the sedimentation increases the rate at which the system coarsens. This is due to the larger particles traveling farther than normal, resulting in them encountering more small particles, which favors their growth. Additionally, sedimentation resulted in broader PSDs with a peak located at the average particle size. When the simulations are modified to
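
    For reference, the temporal power laws usually meant by "predicted by theory" in this context are those of classical LSW-type coarsening (stated here as background; the rate constant K depends on the solid volume fraction):

      \langle R \rangle^{3}(t) - \langle R \rangle^{3}(t_0) = K\,(t - t_0),
      \qquad
      N_v(t) \propto t^{-1},

    where <R> is the average particle radius and N_v the particle number density.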

  8. The Kadomtsev-Petviashvili equations and fundamental string theory

    International Nuclear Information System (INIS)

    Gilbert, G.

    1988-01-01

    In this paper the infinite sequence of non-linear partial differential equations known as the Kadomtsev-Petviashvili equations is described in simple terms and possible applications to a fundamental description of interacting strings are addressed. Lines of research likely to prove useful in formulating a description of non-perturbative string configurations are indicated. (orig.)
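
    The first member of the hierarchy referred to above is the Kadomtsev-Petviashvili equation itself, added here in its standard form for orientation:

      \partial_x\!\left(\partial_t u + 6\,u\,\partial_x u + \partial_x^{3} u\right) + 3\,\sigma^{2}\,\partial_y^{2} u = 0,
      \qquad \sigma^{2} = \pm 1 ,

    with sigma^2 = -1 giving the KP I equation and sigma^2 = +1 the KP II equation.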

  9. Fundamentals of robotic mechanical systems theory, methods, and algorithms

    CERN Document Server

    Angeles, Jorge

    2014-01-01

    The 4th edition includes updated and additional examples and exercises on the core fundamental concepts of mechanics, robots, and kinematics of serial robots. New images of CAD models and physical robots help to motivate concepts being introduced. Each chapter of the book can be read independently of the others as it addresses a separate issue in robotics.

  10. Relativistic quantum chemistry the fundamental theory of molecular science

    CERN Document Server

    Reiher, Markus

    2014-01-01

    Einstein proposed his theory of special relativity in 1905. For a long time it was believed that this theory had no significant impact on chemistry. This view changed in the 1970s when it was realized that (nonrelativistic) Schrödinger quantum mechanics yields results on molecular properties that depart significantly from experimental results. Especially when heavy elements are involved, these quantitative deviations can be so large that qualitative chemical reasoning and understanding is affected. To grasp this, the appropriate many-electron theory has rapidly evolved. Nowadays relativist...

  11. Elementary Concepts and Fundamental Laws of the Theory of Heat

    Science.gov (United States)

    de Oliveira, Mário J.

    2018-03-01

    The elementary concepts and fundamental laws concerning the science of heat are examined from the point of view of its development with special attention to its theoretical structure. The development is divided into four periods, each one characterized by the concept that was attributed to heat. The transition from one to the next period was marked by the emergence of new concepts and new laws, and by singular events. We point out that thermodynamics, as it emerged, is founded on the elementary concepts of temperature and adiabatic wall, and on the fundamental laws: Mayer-Joule principle, or law of conservation of energy; Carnot principle, which leads to the definition of entropy; and the Clausius principle, or law of increase in entropy.
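
    The fundamental laws named in the abstract, in their usual compact form (standard thermodynamics, added for reference; W denotes the work done on the system):

      \Delta U = Q + W
      \qquad\text{(Mayer-Joule principle: conservation of energy)},
      \qquad
      \mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T}
      \qquad\text{(Carnot principle: definition of entropy)},
      \qquad
      \Delta S \ge 0 \ \text{for an isolated system}
      \qquad\text{(Clausius principle: law of increase in entropy)}.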

  12. Hybrid Fundamental Solution Based Finite Element Method: Theory and Applications

    Directory of Open Access Journals (Sweden)

    Changyong Cao

    2015-01-01

    An overview of the development of the hybrid fundamental solution based finite element method (HFS-FEM) and its application to engineering problems is presented in this paper. The framework and formulations of HFS-FEM for potential problems, plane elasticity, three-dimensional elasticity, thermoelasticity, anisotropic elasticity, and plane piezoelectricity are presented. In this method, two independent assumed fields (an intra-element field and an auxiliary frame field) are employed. The formulations for all cases are derived from the modified variational functionals and the fundamental solutions to a given problem. Generation of elemental stiffness equations from the modified variational principle is also described. Typical numerical examples are given to demonstrate the validity and performance of the HFS-FEM. Finally, a brief summary of the approach is provided and future trends in this field are identified.
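
    A sketch of the two assumed fields mentioned above, in the generic form used in hybrid fundamental-solution elements (the notation is generic and not taken from the article):

      u_e(\mathbf{x}) = \sum_{j=1}^{n_s} N_j(\mathbf{x})\, c_{ej}
      \quad\text{(intra-element field, built from fundamental solutions } N_j \text{ centred at source points outside the element)},
      \qquad
      \tilde{u}_e(\mathbf{x}) = \tilde{\mathbf{N}}(\mathbf{x})\, \mathbf{d}_e
      \quad\text{(auxiliary frame field, defined on the element boundary in terms of nodal values } \mathbf{d}_e\text{)}.

    Because the N_j are fundamental solutions, the intra-element field satisfies the governing equation identically, while the frame field serves to enforce continuity between elements through the modified variational functional.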

  13. Atom Interferometry for Fundamental Physics and Gravity Measurements in Space

    Science.gov (United States)

    Kohel, James M.

    2012-01-01

    Laser-cooled atoms are used as freefall test masses. The gravitational acceleration on atoms is measured by atom-wave interferometry. The fundamental concept behind atom interferometry is the quantum mechanical particle-wave duality. One can exploit the wave-like nature of atoms to construct an atom interferometer based on matter waves analogous to laser interferometers.
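
    The standard relation exploited in such gravimeters, for a three-pulse Mach-Zehnder atom interferometer (textbook result, added for orientation rather than quoted from the record):

      \Delta\phi = k_{\mathrm{eff}}\, g\, T^{2} ,

    where k_eff is the effective two-photon wave number of the beamsplitter pulses, g the local gravitational acceleration, and T the time between pulses.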

  14. Theory of Effectiveness Measurement

    National Research Council Canada - National Science Library

    Bullock, Richard K

    2006-01-01

    .... Currently, however, there is no formal foundation for formulating effectiveness measures. This research presents a new framework for effectiveness measurement from both a theoretical and practical view...

  15. Tales of the quantum understanding physics' most fundamental theory

    CERN Document Server

    Hobson, Art

    2017-01-01

    Everybody has heard that we live in a world made of atoms. But far more fundamentally, we live in a universe made of quanta. Many things are not made of atoms: light, radio waves, electric current, magnetic fields, Earth's gravitational field, not to mention exotica such as neutron stars, black holes, dark energy, and dark matter. But everything, including atoms, is made of highly unified or "coherent" bundles of energy called "quanta" that (like everything else) obey certain rules. In the case of the quantum, these rules are called "quantum physics." This is a book about quanta and their unexpected, some would say peculiar, behavior--tales, if you will, of the quantum. The quantum has developed the reputation of being capricious, bewildering, even impossible to understand. The peculiar habits of quanta are certainly not what we would have expected to find at the foundation of physical reality, but these habits are not necessarily bewildering and not at all impossible or paradoxical. This book explains those h...

  16. Complex analysis fundamentals of the classical theory of functions

    CERN Document Server

    Stalker, John

    1998-01-01

    This clear, concise introduction to the classical theory of one complex variable is based on the premise that "anything worth doing is worth doing with interesting examples." The content is driven by techniques and examples rather than definitions and theorems. This self-contained monograph is an excellent resource for a self-study guide and should appeal to a broad audience. The only prerequisite is a standard calculus course. The first chapter deals with a beautiful presentation of special functions. . . . The third chapter covers elliptic and modular functions. . . in much more detail, and from a different point of view, than one can find in standard introductory books. . . . For [the] subjects that are omitted, the author has suggested some excellent references for the reader who wants to go through these topics. The book is read easily and with great interest. It can be recommended to both students as a textbook and to mathematicians and physicists as a useful reference. ---Mathematical Reviews Mainly or...

  17. A fundamental study of ''contribution'' transport theory and channel theory applications

    International Nuclear Information System (INIS)

    Williams, M.L.

    1992-01-01

    The objective of this three-year study is to develop a technique called "channel theory" that can be used in interpreting particle transport analyses such as those frequently required in radiation shielding design and assessment. Channel theory is a technique used to provide insight into the mechanisms by which particles emitted from a source are transported through a complex system and register a response on some detector. It is based on the behavior of a pseudo particle called a "contributon," which is the response carrier through space and energy channels that connect the source and detector. "Contributons" are those particles among all the ones contained in the system which will eventually contribute some amount of response to the detector. The specific goals of this project are to provide a more fundamental theoretical understanding of the method, and to develop computer programs to apply the techniques to practical problems encountered in radiation transport analysis. The overall project can be divided into three components to meet these objectives: (a) Theoretical Development, (b) Code Development, and (c) Sample Applications. During the present third year of this study, an application of contributon theory to the analysis of radiation heating in a nuclear rocket has been completed, and a paper on the assessment of radiation damage response of an LWR pressure vessel and analysis of radiation propagation through space and energy channels in air at the Hiroshima weapon burst was accepted for publication. A major effort was devoted to developing a new "Contributon Monte Carlo" method, which can improve the efficiency of Monte Carlo calculations of radiation transport by tracking only contributons. The theoretical basis for Contributon Monte Carlo has been completed, and the implementation and testing of the technique are presently being performed.
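
    The basic quantity behind the "contributon" picture is the product of forward and adjoint fluxes (standard definition in contributon/channel theory; the notation here is generic):

      C(\mathbf{r}, E, \mathbf{\Omega}) = \phi(\mathbf{r}, E, \mathbf{\Omega})\,\phi^{\dagger}(\mathbf{r}, E, \mathbf{\Omega}),
      \qquad
      R = \int \phi^{\dagger}\, S \;\mathrm{d}\mathbf{r}\,\mathrm{d}E\,\mathrm{d}\mathbf{\Omega} ,

    where phi is the forward angular flux, phi-dagger the adjoint flux (importance) defined with respect to the detector response, S the source, and R the detector response; C describes the density of response carried from the source to the detector through space and energy channels.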

  18. Measurement of quantum memory effects and its fundamental limitations

    Science.gov (United States)

    Wittemer, Matthias; Clos, Govinda; Breuer, Heinz-Peter; Warring, Ulrich; Schaetz, Tobias

    2018-02-01

    We discuss that the nature of projective measurements in quantum mechanics can lead to a nontrivial bias in non-Markovianity measures, quantifying the flow of information between a system and its environment. Consequently, in the current form, envisioned applications are fundamentally limited. In our trapped-ion system, we precisely quantify such bias and perform local quantum probing to demonstrate corresponding limitations. The combination of extended measures and our scalable experimental approach can provide a versatile reference, relevant for understanding more complex systems.
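
    For context, a widely used trace-distance measure of the memory effects referred to above is the Breuer-Laine-Piilo construction (whether this exact measure is the one used in the article is an assumption):

      \mathcal{N} = \max_{\rho_{1,2}(0)} \int_{\sigma > 0} \sigma(t)\,\mathrm{d}t,
      \qquad
      \sigma(t) = \frac{\mathrm{d}}{\mathrm{d}t}\, D\bigl(\rho_1(t), \rho_2(t)\bigr),
      \qquad
      D(\rho_1, \rho_2) = \tfrac{1}{2}\,\lVert \rho_1 - \rho_2 \rVert_1 ,

    where an increase of the trace distance D between two evolving system states signals a back-flow of information from the environment to the system.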

  19. Modeling, Measurements, and Fundamental Database Development for Nonequilibrium Hypersonic Aerothermodynamics

    Science.gov (United States)

    Bose, Deepak

    2012-01-01

    The design of entry vehicles requires predictions of the aerothermal environment during the hypersonic phase of their flight trajectories. These predictions are made using computational fluid dynamics (CFD) codes that often rely on physics and chemistry models of nonequilibrium processes. The primary processes of interest are gas phase chemistry, internal energy relaxation, electronic excitation, nonequilibrium emission and absorption of radiation, and gas-surface interaction leading to surface recession and catalytic recombination. NASA's Hypersonics Project is advancing the state of the art in modeling of nonequilibrium phenomena by making detailed spectroscopic measurements in shock tubes and arcjets, using ab initio quantum mechanical techniques to develop fundamental chemistry and spectroscopic databases, making fundamental measurements of finite-rate gas-surface interactions, and implementing detailed mechanisms in state-of-the-art CFD codes. The development of new models is based on validation with relevant experiments. We will present the latest developments and a roadmap for the technical areas mentioned above.

  20. Theory of Effectiveness Measurement

    Science.gov (United States)

    2006-09-01

    Operations Research Society, January 2003. Hill, Raymond, Gregory A. McIntyre, Thomas R. Tighe, and Richard K. Bullock, “Some Experiments with Agent...Measurement, Physica-Verlag, Wurzburg, 1971. Pinker, Aron, Aryeh H. Samuel, and Robert Batcher, “On Measures of Effectiveness,” PHALANX, pp. 8-12

  1. Theory of Effectiveness Measurement

    National Research Council Canada - National Science Library

    Bullock, Richard K

    2006-01-01

    Effectiveness measures provide decision makers feedback on the impact of deliberate actions and affect critical issues such as allocation of scarce resources, as well as whether to maintain or change existing strategy...

  2. Scattering lengths in SU(2) gauge theory with two fundamental fermions

    DEFF Research Database (Denmark)

    Arthur, R.; Drach, V.; Hansen, Martin Rasmus Lundquist

    2014-01-01

    We investigate non perturbatively scattering properties of Goldstone Bosons in an SU(2) gauge theory with two Wilson fermions in the fundamental representation. Such a theory can be used to build extensions of the Standard Model that unifies Technicolor and pseudo Goldstone composite Higgs models...

  3. Criticism of generally accepted fundamentals and methodologies of traffic and transportation theory

    Energy Technology Data Exchange (ETDEWEB)

    Kerner, Boris S. [Physics of Transportation and Traffic, University Duisburg-Essen, 47048 Duisburg (Germany)

    2015-03-10

    It is explained why the set of the fundamental empirical features of traffic breakdown (a transition from free flow to congested traffic) should be the empirical basis for any traffic and transportation theory that can be reliably used for control and optimization in traffic networks. It is shown that generally accepted fundamentals and methodologies of traffic and transportation theory are not consistent with the set of the fundamental empirical features of traffic breakdown at a highway bottleneck. To these fundamentals and methodologies of traffic and transportation theory belong (i) Lighthill-Whitham-Richards (LWR) theory, (ii) the General Motors (GM) model class (for example, Herman, Gazis et al. GM model, Gipps’s model, Payne’s model, Newell’s optimal velocity (OV) model, Wiedemann’s model, Bando et al. OV model, Treiber’s IDM, Krauß’s model), (iii) the understanding of highway capacity as a particular stochastic value, and (iv) principles for traffic and transportation network optimization and control (for example, Wardrop’s user equilibrium (UE) and system optimum (SO) principles). As an alternative to these generally accepted fundamentals and methodologies of traffic and transportation theory, we discuss three-phase traffic theory as the basis for traffic flow modeling as well as briefly consider the network breakdown minimization (BM) principle for the optimization of traffic and transportation networks with road bottlenecks.

  4. Towards the Fundamental Quantum Limit of Linear Measurements of Classical Signals.

    Science.gov (United States)

    Miao, Haixing; Adhikari, Rana X; Ma, Yiqiu; Pang, Belinda; Chen, Yanbei

    2017-08-04

    The quantum Cramér-Rao bound (QCRB) sets a fundamental limit for the measurement of classical signals with detectors operating in the quantum regime. Using linear-response theory and the Heisenberg uncertainty relation, we derive a general condition for achieving such a fundamental limit. When applied to classical displacement measurements with a test mass, this condition leads to an explicit connection between the QCRB and the standard quantum limit that arises from a tradeoff between the measurement imprecision and quantum backaction; the QCRB can be viewed as an outcome of a quantum nondemolition measurement with the backaction evaded. Additionally, we show that the test mass is more a resource for improving measurement sensitivity than a victim of the quantum backaction, which suggests a new approach to enhancing the sensitivity of a broad class of sensors. We illustrate these points with laser interferometric gravitational-wave detectors.
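
    As a hedged aside (standard relations are assumed here; this is not a restatement of the paper's own derivation): for a single parameter $x$ estimated from measurements on a quantum probe, the QCRB reads $\mathrm{Var}(\hat{x}) \ge 1/F_Q$, with $F_Q$ the quantum Fisher information of the probe state; for a classical waveform $X(t)$ coupled to the probe through an interaction $-\hat{G}X$, the spectral form is $S_{XX}^{\mathrm{est}}(\Omega) \ge \hbar^{2}/[4\,S_{GG}(\Omega)]$, where $S_{GG}(\Omega)$ is the power spectral density of the quantum fluctuations of $\hat{G}$. The condition discussed in the abstract for attaining this bound can then be read as a statement about evading the backaction associated with $\hat{G}$.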

  5. Fundamental questions before recording or measuring functioning and disability.

    Science.gov (United States)

    Madden, Ros; Fortune, Nicola; Cheeseman, Danielle; Mpofu, Elias; Bundy, Anita

    2013-06-01

    This paper seeks to contribute to thoughtful description, recording and measurement of functioning, by discussing some fundamental questions to consider before starting, framed as: why, what, how and who. Generic literature on measurement methods and the more specialised literature on application of the ICF over the last decade inform the consideration of these questions. The context of recording or measurement is examined, including the moral and legal framework of the UN Convention on the Rights of Persons with Disabilities (UNCRPD) and the technical framework of the International Classification of Functioning, Disability and Health (ICF). Whatever the setting in which describing, recording or measuring is being undertaken - in policy development, service planning and management, clinical management or population health monitoring - determining the purpose is the key starting point. Purpose (why) frames the consideration of content (what), method (how) and source (who). Many generic measurement methods can be applied in the disability field, but there are challenges particular to the field. The perspectives of people with disabilities and "patients" require consideration, especially with the trend to person-centred care and the social justice principles emanating from the UNCRPD. Considering these basic questions is a pre-requisite to meaningful recording and measurement of functioning and disability. Future challenges include: incorporating environmental factors into measurement; setting thresholds on the disability spectrum; and combining the views of the person concerned with those of various professionals.

  6. SU(2) Gauge Theory with Two Fundamental Flavours: a Minimal Template for Model Building

    CERN Document Server

    Arthur, Rudy; Hansen, Martin; Hietanen, Ari; Pica, Claudio; Sannino, Francesco

    2016-01-01

    We investigate the continuum spectrum of the SU(2) gauge theory with $N_f=2$ flavours of fermions in the fundamental representation. This model provides a minimal template which is ideal for a wide class of Standard Model extensions featuring novel strong dynamics that range from composite (Goldstone) Higgs theories to several intriguing types of dark matter candidates, such as the SIMPs. We improve our previous lattice analysis [1] by adding more data at light quark masses, at two additional lattice spacings, by determining the lattice cutoff via a Wilson flow measure of the $w_0$ parameter, and by measuring the relevant renormalisation constants non-perturbatively in the RI'-MOM scheme. Our results for the lightest isovector states in the vector and axial channels, in units of the pseudoscalar decay constant, are $m_V/F_{\rm PS} \sim 13.1(2.2)$ and $m_A/F_{\rm PS} \sim 14.5(3.6)$ (combining statistical and systematic errors). In the context of the composite (Goldstone) Higgs models, our result for the spin-...

  7. An ultraviolet chiral theory of the top for the fundamental composite (Goldstone) Higgs

    Energy Technology Data Exchange (ETDEWEB)

    Cacciapaglia, Giacomo, E-mail: g.cacciapaglia@ipnl.in2p3.fr [Univ Lyon, Université Lyon 1, CNRS/IN2P3, IPNL, 4 rue Enrico Fermi, F-69622 Villeurbanne Cedex (France); Sannino, Francesco, E-mail: sannino@cp3.dias.sdu.dk [CP3-Origins and the Danish IAS, University of Southern Denmark, Campusvej 55, DK-5230 Odense M (Denmark)

    2016-04-10

    We introduce a scalar-less anomaly free chiral gauge theory that serves as natural ultraviolet completion of models of fundamental composite (Goldstone) Higgs dynamics. The new theory is able to generate the top mass and furthermore features a built-in protection mechanism that naturally suppresses the bottom mass. At low energies the theory predicts new fractionally charged fermions, and a number of four-fermion operators that, besides being relevant for the generation of the top mass, also lead to an intriguing phenomenology for the new states predicted by the theory.

  8. Measuring Financial Risk using Extreme Value Theory: evidence from Pakistan

    OpenAIRE

    Qayyum, Abdul; Nawaz, Faisal

    2010-01-01

    The purpose of the paper is to show some methods of extreme value theory through analysis of Pakistani financial data. It also introduces the fundamentals of extreme value theory, as well as practical aspects of estimating and assessing financial models for tail-related risk measures.
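
    To make the "practical aspects" concrete, the following is a minimal peaks-over-threshold sketch in Python; it assumes a generalized Pareto tail with shape parameter $0 \ne \xi < 1$ and is illustrative only — the paper's own data handling and model choices may differ.

      import numpy as np
      from scipy import stats

      def pot_var_es(losses, threshold_q=0.95, alpha=0.99):
          """Peaks-over-threshold estimates of Value-at-Risk and Expected Shortfall.

          Fits a generalized Pareto distribution (GPD) to exceedances over a high
          threshold and applies the standard POT formulas (valid for 0 != xi < 1).
          """
          losses = np.asarray(losses, dtype=float)
          u = np.quantile(losses, threshold_q)            # high threshold
          excess = losses[losses > u] - u                 # exceedances over the threshold
          xi, _, beta = stats.genpareto.fit(excess, floc=0.0)
          n, n_u = losses.size, excess.size
          var = u + (beta / xi) * (((n / n_u) * (1.0 - alpha)) ** (-xi) - 1.0)
          es = (var + beta - xi * u) / (1.0 - xi)         # expected shortfall of the tail model
          return var, es

      # Example with synthetic heavy-tailed losses (illustrative only).
      sample = stats.t.rvs(df=3, size=5000, random_state=0)
      print(pot_var_es(sample))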

  9. The quantum theory of measurement

    CERN Document Server

    Busch, Paul; Mittelstaedt, Peter

    1996-01-01

    The amazing accuracy in verifying quantum effects experimentally has recently renewed interest in quantum mechanical measurement theory. In this book the authors give within the Hilbert space formulation of quantum mechanics a systematic exposition of the quantum theory of measurement. Their approach includes the concepts of unsharp objectification and of nonunitary transformations needed for a unifying description of various detailed investigations. The book addresses advanced students and researchers in physics and philosophy of science. In this second edition Chaps. II-IV have been substantially rewritten. In particular, an insolubility theorem for the objectification problem has been formulated in full generality, which includes unsharp object observables and unsharp pointers.

  10. Measurement and probability a probabilistic theory of measurement with applications

    CERN Document Server

    Rossi, Giovanni Battista

    2014-01-01

    Measurement plays a fundamental role both in physical and behavioral sciences, as well as in engineering and technology: it is the link between abstract models and empirical reality and is a privileged method of gathering information from the real world. Is it possible to develop a single theory of measurement for the various domains of science and technology in which measurement is involved? This book takes the challenge by addressing the following main issues: What is the meaning of measurement? How do we measure? What can be measured? A theoretical framework that could truly be shared by scientists in different fields, ranging from physics and engineering to psychology is developed. The future in fact will require greater collaboration between science and technology and between different sciences. Measurement, which played a key role in the birth of modern science, can act as an essential interdisciplinary tool and language for this new scenario. A sound theoretical basis for addressing key problems in mea...

  11. Dreams of a Final Theory-The Search for the Fundamental Laws of ...

    Indian Academy of Sciences (India)

    Dreams of a Final Theory – The Search for the Fundamental Laws of Nature. V Balakrishnan. Book Review, Resonance – Journal of Science Education, Volume 3, Issue 2, February 1998, pp 83-85.

  12. Atomic spectroscopy and highly accurate measurement: determination of fundamental constants

    International Nuclear Information System (INIS)

    Schwob, C.

    2006-12-01

    This document reviews the theoretical and experimental achievements of the author concerning highly accurate atomic spectroscopy applied to the determination of fundamental constants. A pure optical frequency measurement of the 2S-12D 2-photon transitions in atomic hydrogen and deuterium has been performed. The experimental setup is described, as well as the data analysis. Optimized values for the Rydberg constant and Lamb shifts have been deduced (R = 109737.31568516(84) cm⁻¹). An experiment devoted to the determination of the fine structure constant with an aimed relative uncertainty of 10⁻⁹ began in 1999. This experiment is based on the fact that Bloch oscillations in a frequency-chirped optical lattice are a powerful tool to transfer coherently many photon momenta to the atoms. We have used this method to measure accurately the ratio h/m(Rb). The measured value of the fine structure constant is α⁻¹ = 137.03599884(91), with a relative uncertainty of 6.7×10⁻⁹. The future prospects of this experiment are presented. This document, presented before an academic board, will allow its author to supervise research work and particularly to tutor thesis students. (A.C.)
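
    For context, the standard metrology relation underlying such h/m determinations (quoted here as background, not as a detail of the thesis) is $\alpha^{2} = \frac{2R_{\infty}}{c}\,\frac{A_r(\mathrm{Rb})}{A_r(e)}\,\frac{h}{m_{\mathrm{Rb}}}$, so a Bloch-oscillation measurement of $h/m_{\mathrm{Rb}}$, combined with the Rydberg constant and the relative atomic masses of the electron and of rubidium, yields the fine structure constant with only a weak dependence on QED calculations.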

  13. Magnetocardiogram measured by fundamental mode orthogonal fluxgate array

    Science.gov (United States)

    Karo, Hikaru; Sasada, Ichiro

    2015-05-01

    Magnetocardiography (MCG) of healthy volunteers has been measured by using a fundamental mode orthogonal fluxgate magnetometer array of 32 channels in a magnetically shielded room (MSR). The sensor heads employed consist of a 45 mm long U-shaped amorphous wire core and a 1000-turn solenoid pick-up coil 30 mm in length and 3 mm in outer diameter. An excitation current of 100 kHz with a large dc bias current is fed directly into the wire cores, which are connected in series, whereas a signal detection circuit is provided for each of the sensor heads. A special technique to avoid mutual interaction between sensor heads is implemented, in which all the sensor heads are excited synchronously by using a single ac source. A 2-D array having 32 sensors with 4 cm grid spacing was used to measure MCG signals inside an MSR. Measured data from each channel were first filtered (0.16-100 Hz pass band), then averaged for 2 min synchronously with the electrocardiogram peaks taken from both hands. Noise remaining after averaging is about 1.8 pT rms for the bandwidth of 0.16-100 Hz.
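
    A minimal sketch in Python of the kind of off-line processing described above — band-pass filtering followed by averaging of epochs synchronised to the ECG R-peaks; the array names, sampling rate and peak indices are hypothetical, not taken from the paper:

      import numpy as np
      from scipy.signal import butter, filtfilt

      def synchronous_average(mcg, fs, r_peaks, pre=0.3, post=0.5, band=(0.16, 100.0)):
          """Band-pass filter one MCG channel and average epochs around ECG R-peaks.

          mcg     : 1-D array, raw signal of a single magnetometer channel
          fs      : sampling rate in Hz
          r_peaks : sample indices of the ECG R-peaks used as synchronisation marks
          """
          b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
          filtered = filtfilt(b, a, mcg)                  # zero-phase 0.16-100 Hz band-pass
          n_pre, n_post = int(pre * fs), int(post * fs)
          epochs = [filtered[p - n_pre:p + n_post]
                    for p in r_peaks if p - n_pre >= 0 and p + n_post <= filtered.size]
          return np.mean(epochs, axis=0)                  # averaged heartbeat waveform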

  14. A theory of international bioethics: multiculturalism, postmodernism, and the bankruptcy of fundamentalism.

    Science.gov (United States)

    Baker, Robert

    1998-09-01

    The first of two articles analyzing the justifiability of international bioethical codes and of cross-cultural moral judgments reviews "moral fundamentalism," the theory that cross-cultural moral judgments and international bioethical codes are justified by certain "basic" or "fundamental" moral principles that are universally accepted in all cultures and eras. Initially propounded by the judges at the 1947 Nuremberg Tribunal, moral fundamentalism has become the received justification of international bioethics, and of cross-temporal and cross-cultural moral judgments. Yet today we are said to live in a multicultural and postmodern world. This article assesses the challenges that multiculturalism and postmodernism pose to fundamentalism and concludes that these challenges render the position philosophically untenable, thereby undermining the received conception of the foundations of international bioethics. The second article, which follows, offers an alternative model -- a model of negotiated moral order -- as a viable justification for international bioethics and for transcultural and transtemporal moral judgments.

  15. Physical Quantities, Measurement Sets, Theories

    Science.gov (United States)

    Viallefond, F.

    2012-09-01

    A methodology is proposed to develop efficient, robust and expressive data models. The idea is to transform objects described using our human language into mathematical objects which can then be used efficiently in information systems. This is done using topological spaces and algebras to model data types. Technically it is implemented using parametric polymorphism. Two examples are shown: 1) a simple, well-known object, the physical quantities, and 2) a database object, the measurement sets, which bind the measurements to their experimental contexts. This leads to theories. The result is high expressiveness, achieved by formulating equations and database operations by means of λ-calculi. The theory of the measurement set encapsulates the relational model. Using topoi, it is a generalization: a category above the sets.

  16. Measurements of Fundamental Fluid Physics of SNF Storage Canisters

    Energy Technology Data Exchange (ETDEWEB)

    Condie, Keith Glenn; Mc Creery, Glenn Ernest; McEligot, Donald Marinus

    2001-09-01

    With the University of Idaho, Ohio State University and Clarksean Associates, this research program has the long-term goal to develop reliable predictive techniques for the energy, mass and momentum transfer plus chemical reactions in drying / passivation (surface oxidation) operations in the transfer and storage of spent nuclear fuel (SNF) from wet to dry storage. Such techniques are needed to assist in design of future transfer and storage systems, prediction of the performance of existing and proposed systems and safety (re)evaluation of systems as necessary at later dates. Many fuel element geometries and configurations are accommodated in the storage of spent nuclear fuel. Consequently, there is no one generic fuel element / assembly, storage basket or canister and, therefore, no single generic fuel storage configuration. One can, however, identify generic flow phenomena or processes which may be present during drying or passivation in SNF canisters. The objective of the INEEL tasks was to obtain fundamental measurements of these flow processes in appropriate parameter ranges.

  17. Applied Physics of Carbon Nanotubes Fundamentals of Theory, Optics and Transport Devices

    CERN Document Server

    Rotkin, Slava V

    2005-01-01

    The book describes the state-of-the-art in fundamental, applied and device physics of nanotubes, including fabrication, manipulation and characterization for device applications; optics of nanotubes; transport and electromechanical devices and fundamentals of theory for applications. This information is critical to the field of nanoscience since nanotubes have the potential to become a very significant electronic material for decades to come. The book will benefit all readers interested in the application of nanotubes, either in their theoretical foundations or in newly developed characterization tools that may enable practical device fabrication.

  18. Two-colour QCD at finite fundamental quark-number density and related theories

    International Nuclear Information System (INIS)

    Hands, S.J.; Kogut, J.B.; Morrison, S.E.; Sinclair, D.K.

    2001-01-01

    We are simulating SU(2) Yang-Mills theory with four flavours of dynamical quarks in the fundamental representation of SU(2) 'colour' at finite chemical potential, μ for quark number, as a model for QCD at finite baryon number density. In particular we observe that for μ large enough this theory undergoes a phase transition to a state with a diquark condensate which breaks quark-number symmetry. In this phase we examine the spectrum of light scalar and pseudoscalar bosons and see evidence for the Goldstone boson associated with this spontaneous symmetry breaking. This theory is closely related to QCD at finite chemical potential for isospin, a theory which we are now studying for SU(3) colour

  19. Two-colour QCD at finite fundamental quark-number density and related theories

    International Nuclear Information System (INIS)

    Hands, S. J.; Kogut, J. B.; Morrison, S. E.; Sinclair, D. K.

    2000-01-01

    We are simulating SU(2) Yang-Mills theory with four flavours of dynamical quarks in the fundamental representation of SU(2) colour at finite chemical potential, μ for quark number, as a model for QCD at finite baryon number density. In particular we observe that for μ large enough this theory undergoes a phase transition to a state with a diquark condensate which breaks quark-number symmetry. In this phase we examine the spectrum of light scalar and pseudoscalar bosons and see evidence for the Goldstone boson associated with this spontaneous symmetry breaking. This theory is closely related to QCD at finite chemical potential for isospin, a theory which we are now studying for SU(3) colour

  20. Positivism and Constitutional Post-positivism: A Debate at the Heart of the Theory of Fundamental Rights

    Directory of Open Access Journals (Sweden)

    Matheus Felipe de Castro

    2016-05-01

    Full Text Available This article, based on the theoretical framework of the philosophy of praxis, aims to discuss the strained relations between power and justice in the enforcement of fundamental rights, making a comparison between the theoretical concepts of Hans Kelsen and Robert Alexy. In this sense, the thoughts of these two authors are compared, emphasizing the central role that power has in the legal conception of the first, as opposed to the theory of justice that animates the legal conceptions of the second. We discuss how this tension, which appears in the theoretical confrontation of the two authors, is actually a moment of the real, yet constitutes a dialectical interaction which must be observed and deciphered in the concrete application of the law. It concludes by seeking to separate what is real from what is ideological in this debate, aiming to deepen the discussion of fundamental rights as the core of the modern structural theory of law.

  1. Social Security in Light of the Theory of the Fundamental Rights: A Right of Legal Personality

    Directory of Open Access Journals (Sweden)

    Edgar Dener Rodrigues

    2015-12-01

    Full Text Available This article is a study of the theory of Fundamental Rights and the Social Security System currently existing in this country, with the objective of approaching the classification of fundamental rights in its main dimensions and characteristics, placing Social Security among those of the so-called second dimension. This is because the pension system consists of a set of principles that impose a duty of material provision on the State towards its beneficiaries when they are exposed to certain social risks, which subject the person to a situation of social vulnerability. Social Security thus presents itself as a genuine right of legal personality, for it seeks to confer means of livelihood upon the human being. Along these lines, the study sought, through theoretical and legislative research, to identify the concepts of fundamental rights, social security and social risk, combining them with their provision in the legal system, in order to identify the appropriate legal protection that the institute of Social Security must have in light of the theory of fundamental rights.

  2. Positivism and Constitutional Post-positivism: A Debate at the Heart of the Theory of Fundamental Rights

    OpenAIRE

    Castro, Matheus Felipe de; Sanches, Samyra Haydée Dal Farra Naspolini

    2016-01-01

    This article, based on the theoretical framework of the philosophy of praxis, aims to discuss the strained relations between power and justice in the enforcement of fundamental rights, making a comparison between the theoretical concepts of Hans Kelsen and Robert Alexy. In this sense, the thoughts of these two authors are compared, emphasizing the central role that power has in the legal conception of the first as opposed to the theory of justice that animates the legal conceptions of the second. ...

  3. Introduction to probability and measure theories

    International Nuclear Information System (INIS)

    Partasarati, K.

    1983-01-01

    Chapters on probability and measure theory are presented. Borel mappings of measure spaces into one another and into separable metric spaces are studied. The Kolmogorov theorem on the extension of probabilities is derived from the theorem on the extension of measures to projective limits of measure spaces. Integration theory is developed, and measures on products of spaces are studied. The theory of conditional mathematical expectations via projections in Hilbert space is presented. In conclusion, the theory of weak convergence of measures, elements of the theory of characteristic functions, and the theory of invariant and quasi-invariant measures on groups and homogeneous spaces are given.

  4. Scaling properties of SU(2) gauge theory with mixed fundamental-adjoint action

    CERN Document Server

    Rinaldi, Enrico; Lucini, Biagio; Patella, Agostino; Rago, Antonio

    2012-01-01

    We study the phase diagram of the SU(2) lattice gauge theory with fundamental-adjoint Wilson plaquette action. We confirm the presence of a first order bulk phase transition and we estimate the location of its end-point in the bare parameter space. If this point is second order, the theory is one of the simplest realizations of a lattice gauge theory admitting a continuum limit at finite bare couplings. All the relevant gauge observables are monitored in the vicinity of the fixed point with very good control over finite-size effects. The scaling properties of the low-lying glueball spectrum are studied while approaching the end-point in a controlled manner.

  5. Considerations about the Influence of Values and Fundamental rigths over the Procedural law Theory

    Directory of Open Access Journals (Sweden)

    Eduardo de Avelar Lamy

    2014-12-01

    Full Text Available http://dx.doi.org/10.5007/2177-7055.2014v35n69p301 The study analyses the influence that the norms regulating fundamental rights, and their values, have come to exert over norms concerning procedural law and over procedural law theory, especially within the Brazilian legal system, discussing, in its final part, the possibility of taking values into account in procedural norms.

  6. A fundamental special-relativistic theory valid for all real-valued speeds

    Directory of Open Access Journals (Sweden)

    Vedprakash Sewjathan

    1984-01-01

    Full Text Available This paper constitutes a fundamental rederivation of special relativity based on the c-invariance postulate but independent of the assumption ds′² = ±ds² (Einstein [1], Kittel et al [2], Recami [3]), the equivalence principle, homogeneity of space-time, isotropy of space, group properties and linearity of space-time transformations, or the coincidence of the origins of inertial space-time frames. The mathematical formalism is simpler than Einstein's [4] and Recami's [3]. Whilst Einstein's subluminal and Recami's superluminal theories are rederived in this paper by further assuming the equivalence principle and “mathematical inverses” [4,3], this paper derives (independently of these assumptions, with physico-mathematical motivation) an alternate singularity-free special-relativistic theory which replaces Einstein's factor [1/(1−V²/c²)]^(1/2) and Recami's extended-relativistic factor [1/(V²/c²−1)]^(1/2) by [(1−(V²/c²)^n)/(1−V²/c²)]^(1/2), where n equals the value of (m(V)/m₀)² as |V|→c. In this theory both Newton's and Einstein's subluminal theories are experimentally valid on account of negligible terms. This theory implies that non-zero rest mass luxons will not be detected as ordinary non-zero rest mass bradyons because of spatial collapse, and non-zero rest mass tachyons are undetectable because they exist in another cosmos, resulting in a supercosmos of matter, with the possibility of infinitely many such supercosmoses, all moving forward in time. Furthermore this theory is not based on any assumption giving rise to the twin paradox controversy. The paper concludes with a discussion of the implications of this theory for general relativity.

  7. Boolean Approach to Dichotomic Quantum Measurement Theories

    Energy Technology Data Exchange (ETDEWEB)

    Nagata, K. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Nakamura, T. [Keio University, Yokohama (Japan); Batle, J. [Universitat de les Illes Balears, Balearic Islands (Spain); Abdalla, S. [King Abdulaziz University Jeddah, Jeddah (Saudi Arabia); Farouk, A. [Al-Zahra College for Women, Muscat (Egypt)

    2017-02-15

    Recently, a new measurement theory based on truth values was proposed by Nagata and Nakamura [Int. J. Theor. Phys. 55, 3616 (2016)], that is, a theory where the results of measurements are either 0 or 1. The standard measurement theory accepts a hidden variable model for a single Pauli observable. Hence, we can introduce a classical probability space for the measurement theory in this particular case. Additionally, we discuss in the present contribution the fact that projective measurement theories (the results of which are either +1 or −1) imply the Bell, Kochen, and Specker (BKS) paradox for a single Pauli observable. To justify our assertion, we present the BKS theorem in almost all the two-dimensional states by using a projective measurement theory. As an example, we present the BKS theorem in two-dimensions with white noise. Our discussion provides new insight into the quantum measurement problem by using this measurement theory based on the truth values.

  8. Impact of Neutrino Oscillation Measurements on Theory

    Energy Technology Data Exchange (ETDEWEB)

    Murayama, Hitoshi

    2003-11-30

    Neutrino oscillation data had been a big surprise to theorists, and indeed they have ongoing impact on theory. I review what the impact has been, and what measurements will have critical impact on theory in the future.

  9. Justification for measurement equation: a fundamental issue in theoretical metrology

    Directory of Open Access Journals (Sweden)

    Aleksander V. Prokopov

    2013-11-01

    Full Text Available A review and a critical analysis of the specialized literature on justification for the measurement equation and an estimation of a methodical error (uncertainty of the measurement result are presented in the paper, and some prospects for solving of the issue are discussed herein.

  10. Justification for measurement equation: a fundamental issue in theoretical metrology

    OpenAIRE

    Aleksander V. Prokopov

    2013-01-01

    A review and a critical analysis of the specialized literature on justification for the measurement equation and an estimation of a methodical error (uncertainty) of the measurement result are presented in the paper, and some prospects for solving of the issue are discussed herein.

  11. Commencement measurements giving fundamental surface tension determinations in tensiometry

    International Nuclear Information System (INIS)

    Carbery, D; Morrin, D; O'Rourke, B; McMillan, N D; O'Neill, M; Riedel, S; Pringuet, P; Smith, S R P

    2011-01-01

    This study provides experimental testing of a ray-tracing model of the tensiotrace that explores the measurement potential of a well-defined optical position in the tensiotrace signal known as the 'commencement'. This point is defined as the first measurable optical coupling in the fiber drophead between source and collector fibers for light injected inside a growing drop. The tensiotrace ray-tracing model is briefly introduced. Empirical relationships of commencement measures from a wide-ranging study are presented. A number of conclusions can be drawn from the successful linking of computer predictions to these experimental relationships.

  12. The theory of Lebesgue measure and integration

    CERN Document Server

    Hartman, S; Sneddon, I N; Stark, M; Ulam, S

    1961-01-01

    The Theory of Lebesgue Measure and Integration deals with the theory of Lebesgue measure and integration and introduces the reader to the theory of real functions. The subject matter comprises concepts and theorems that are now considered classical, including the Yegorov, Vitali, and Fubini theorems. The Lebesgue measure of linear sets is discussed, along with measurable functions and the definite Lebesgue integral.Comprised of 13 chapters, this volume begins with an overview of basic concepts such as set theory, the denumerability and non-denumerability of sets, and open sets and closed sets

  13. The First Fundamental Theorem of Invariant Theory for the Orthosymplectic Supergroup

    Science.gov (United States)

    Lehrer, G. I.; Zhang, R. B.

    2017-01-01

    We give an elementary and explicit proof of the first fundamental theorem of invariant theory for the orthosymplectic supergroup by generalising the geometric method of Atiyah, Bott and Patodi to the supergroup context. We use methods from super-algebraic geometry to convert invariants of the orthosymplectic supergroup into invariants of the corresponding general linear supergroup on a different space. In this way, super Schur-Weyl-Brauer duality is established between the orthosymplectic supergroup of superdimension (m|2n) and the Brauer algebra with parameter m − 2n. The result may be interpreted either in terms of the group scheme OSp(V) over C, where V is a finite dimensional super space, or as a statement about the orthosymplectic Lie supergroup over the infinite dimensional Grassmann algebra Λ. We take the latter point of view here, and also state a corresponding theorem for the orthosymplectic Lie superalgebra, which involves an extra invariant generator, the super-Pfaffian.

  14. Quantum theory of measurements as quantum decision theory

    International Nuclear Information System (INIS)

    Yukalov, V I; Sornette, D

    2015-01-01

    Theory of quantum measurements is often classified as decision theory. An event in decision theory corresponds to the measurement of an observable. This analogy looks clear for operationally testable simple events. However, the situation is essentially more complicated in the case of composite events. The most difficult point is the relation between decisions under uncertainty and measurements under uncertainty. We suggest a unified language for describing the processes of quantum decision making and quantum measurements. The notion of quantum measurements under uncertainty is introduced. We show that the correct mathematical foundation for the theory of measurements under uncertainty, as well as for quantum decision theory dealing with uncertain events, requires the use of positive operator-valued measure that is a generalization of projection-valued measure. The latter is appropriate for operationally testable events, while the former is necessary for characterizing operationally uncertain events. In both decision making and quantum measurements, one has to distinguish composite nonentangled events from composite entangled events. Quantum probability can be essentially different from classical probability only for entangled events. The necessary condition for the appearance of an interference term in the quantum probability is the occurrence of entangled prospects and the existence of an entangled strategic state of a decision maker or of an entangled statistical state of a measuring device

  15. Measurement of Fiber Saturation Point of Wood Using Differential Scanning Calorimetry: Measurement Fundamentals and Experimental Results

    Directory of Open Access Journals (Sweden)

    Asghar Tarmian

    2017-02-01

    Full Text Available In this research, the measurement fundamentals of the fiber saturation point (FSP) using the differential scanning calorimetry (DSC) method were explained. This method is based on the assumption that free water is frozen but bound water remains unfrozen at low temperatures. Thus, the FSP can be calculated by measuring the enthalpy of melting of frozen wet samples. This method measures the amount of energy absorbed or released by a sample when it is heated or cooled. Results showed that the DSC method may yield a higher FSP value compared to the widely accepted value of 30%, depending on the wood species. Both thermal and chemical (acetylation) modification methods reduced the FSP, but in the acetylation method there was no linear correlation between the weight gain percentage (WPG) and FSP.
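
    As a hedged illustration of the energy balance behind the method (the symbols and the numerical value below are assumptions of this note, not taken from the paper): if $\Delta H_{\mathrm{melt}}$ is the measured melting enthalpy of the frozen water in a wet specimen, $m_{\mathrm{dry}}$ its oven-dry mass, $\Delta h_f \approx 334\ \mathrm{J/g}$ the latent heat of fusion of water, and $MC$ the total moisture content, then the freezable (free) water content is $u_{\mathrm{free}} = \Delta H_{\mathrm{melt}}/(m_{\mathrm{dry}}\,\Delta h_f)$ and $\mathrm{FSP} \approx MC - u_{\mathrm{free}}$; only the free water contributes to the melting endotherm, and the unfrozen remainder is attributed to bound water held below the FSP.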

  16. The two Archival Theories by John Roberts: a contribution to the fundamentals of the field.

    Directory of Open Access Journals (Sweden)

    Shirley Carvalhêdo Franco

    2017-10-01

    Full Text Available Introduction: a review of the literature on archival science fundamentals. Goal: to present some of the points addressed by authors specialized in records and archival science as they relate to the "principle of provenance" and to the concept of "fund", and the problems mentioned by some of these authors, especially the thought of the American archivist John Roberts. Methodology: analysis of 23 scientific works, including books and articles, covering the period between 1922 and 2015. Results: a concise chart organized by "year", "author", "title of the work" and "country of origin of the author", dividing classical and contemporary authors. Conclusion: classical authors reaffirm the validity of archival foundations, contributing to the construct of the discipline; conversely, however, contemporary authors criticize the concept of fund, propose its reformulation, and aim to expand the understanding of the principle of provenance, for example, by proposing the "notion of ramification". Among the contemporary authors, John Roberts, an American professor specialized in Records and Archival Science, produced a set of three texts which criticise consecrated elements of archival theory. Based on his answers to the three questions, he presented the genesis of his ideas and formulated a synthesis of two archival theories, one denominated "necessary" and the other "superfluous".

  17. Measuring signal generators theory & design

    CERN Document Server

    Rybin, Yuriy K

    2014-01-01

    The book brings together the following issues: Theory of deterministic, random and discrete signals reproducible in oscillatory systems of generators; Generation of periodic signals with a specified spectrum, harmonic distortion factor and random signals with specified probability density function and spectral density; Synthesis of oscillatory system structures; Analysis of oscillatory systems with non-linear elements and oscillation amplitude stabilization systems; It considers the conditions and criteria of steady-state modes in signal generators on active four-pole elements with unidirectional and bidirectional transmission of signals and on two-pole elements; analogues of Barkhausen criteria; Optimization of oscillatory system structures by harmonic distortion level, minimization of a frequency error and set-up time of the steady state mode; Theory of construction of random signal generators; Construction of discrete and digital signal generators; Practical design of main units of generators; Practical bl...

  18. Evaluating fundamentals of care: The development of a unit-level quality measurement and improvement programme.

    Science.gov (United States)

    Parr, Jenny M; Bell, Jeanette; Koziol-McLain, Jane

    2018-01-02

    The project aimed to develop a unit-level quality measurement and improvement programme using evidence-based fundamentals of care. Feedback from patients, families, whānau, staff and audit data in 2014 indicated variability in the delivery of fundamental aspects of care such as monitoring, nutrition, pain management and environmental cleanliness at a New Zealand District Health Board. A general inductive approach was used to explore the fundamentals of care and design a measurement and improvement programme, the Patient and Whānau Centred Care Standards (PWCCS), focused on fundamental care. Five phases were used to explore the evidence, and design and test a measurement and improvement framework. Nine identified fundamental elements of care were used to define expected standards of care and develop and test a measurement and improvement framework. Four six-monthly peer reviews have been undertaken since June 2015. Charge Nurse Managers used results to identify quality improvements. Significant improvement was demonstrated overall, in six of the 27 units, in seven of the nine standards and three of the four measures. In all, 89% (n = 24) of units improved their overall result. The PWCCS measurement and improvement framework makes visible nursing fundamentals of care in line with continuous quality improvement to increase quality of care. Delivering fundamentals of care is described by nurses as getting 'back to basics'. Patient and family feedback supports the centrality of fundamentals of care to their hospital experience. Implementing a unit-level fundamentals of care quality measurement and improvement programme clarifies expected standards of care, highlights the contribution of fundamentals of care to quality and provides a mechanism for ongoing improvements. © 2018 John Wiley & Sons Ltd.

  19. Discrete time interval measurement system: fundamentals, resolution and errors in the measurement of angular vibrations

    International Nuclear Information System (INIS)

    Gómez de León, F C; Meroño Pérez, P A

    2010-01-01

    The traditional method for measuring the velocity and the angular vibration in the shaft of rotating machines using incremental encoders is based on counting the pulses at given time intervals. This method is generically called the time interval measurement system (TIMS). A variant of this method that we have developed in this work consists of measuring the corresponding time of each pulse from the encoder and sampling the signal by means of an A/D converter as if it were an analog signal, that is to say, in discrete time. For this reason, we have denominated this method the discrete time interval measurement system (DTIMS). This measurement system provides a substantial improvement in the precision and frequency resolution compared with the traditional method of counting pulses. In addition, this method permits modification of the width of some pulses in order to obtain a mark-phase on every lap. This paper explains the theoretical fundamentals of the DTIMS and its application for measuring the angular vibrations of rotating machines. It also displays the required relationship between the sampling rate of the signal, the number of pulses of the encoder and the rotating velocity in order to obtain the required resolution and to delimit the methodological errors in the measurement.
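
    A minimal sketch in Python of the DTIMS idea — time-stamping every encoder pulse rather than counting pulses per gate interval; the variable names and example figures are illustrative, not taken from the paper:

      import numpy as np

      def angular_velocity_from_pulses(pulse_times, pulses_per_rev):
          """Instantaneous angular velocity from encoder pulse time stamps.

          pulse_times    : 1-D array of arrival times of successive encoder pulses [s]
          pulses_per_rev : number of encoder pulses per shaft revolution
          Returns mid-interval time stamps and velocity estimates [rad/s].
          """
          pulse_times = np.asarray(pulse_times, dtype=float)
          d_theta = 2.0 * np.pi / pulses_per_rev           # angle swept between pulses
          dt = np.diff(pulse_times)                        # measured pulse intervals
          omega = d_theta / dt                             # one velocity estimate per interval
          t_mid = 0.5 * (pulse_times[:-1] + pulse_times[1:])
          return t_mid, omega

      # Example: a shaft at a constant 1500 rpm read by a 1024-line encoder.
      t = np.cumsum(np.full(4096, 60.0 / (1500 * 1024)))
      t_mid, omega = angular_velocity_from_pulses(t, 1024)
      print(omega[:3])   # ~157.1 rad/s, i.e. 1500 rpm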

  20. ANALYSIS OF PUBLIC COURT-ORDERED-DEBT DISCLOSURE: INFLUENCE OF LEGISLATION AND FUNDAMENTALS OF ACCOUNTING THEORY

    Directory of Open Access Journals (Sweden)

    Lucas Oliveira Gomes Ferreira

    2012-03-01

    Full Text Available The purpose of the present study is to analyze the accounting disclosure of judicial payment warrants (precatórios), issued when governmental entities are found liable for pecuniary awards in lawsuits, according to accounting theory, and to verify if the current legislation interferes in the accounting treatment of these instruments. In this sense, we performed a documental and literature review about the legal framework and accounting procedures adopted, as well as gathered data from the National Treasury Secretariat Data Collection System (SISTN) in the period 2004-2009 and consulted a study carried out by the Supreme Court (STF) in 2004. The study’s justification is based on the perception that more than half of judicial payment warrants are not registered in the public accounts. Consequently, whereas these warrants represent (i) vested rights of the plaintiffs and (ii) debts of the public entity, the lack of accounting disclosure jeopardizes both the beneficiary, whose right is not reflected in the public accounts, thus casting doubt on the expectation to receive payment, and government managers and society, who do not have reliable information that allows effective management. The innovation of this paper consists in discussing the identification of the appropriate moment of the generating event of the underlying debts and the proposal of disclosure considering the risk classification. In conclusion, the influence of the current legislation and the failure to observe accounting fundamentals are among the likely factors that have affected the proper accounting of judicial payment warrants within the Brazilian public administration.

  1. Fundamental Theories and Key Technologies for Smart and Optimal Manufacturing in the Process Industry

    Directory of Open Access Journals (Sweden)

    Feng Qian

    2017-04-01

    Full Text Available Given the significant requirements for transforming and promoting the process industry, we present the major limitations of current petrochemical enterprises, including limitations in decision-making, production operation, efficiency and security, information integration, and so forth. To promote a vision of the process industry with efficient, green, and smart production, modern information technology should be utilized throughout the entire optimization process for production, management, and marketing. To focus on smart equipment in manufacturing processes, as well as on the adaptive intelligent optimization of the manufacturing process, operating mode, and supply chain management, we put forward several key scientific problems in engineering in a demand-driven and application-oriented manner, namely: ① intelligent sensing and integration of all process information, including production and management information; ② collaborative decision-making in the supply chain, industry chain, and value chain, driven by knowledge; ③ cooperative control and optimization of plant-wide production processes via human-cyber-physical interaction; and ④ life-cycle assessments for safety and environmental footprint monitoring, in addition to tracing analysis and risk control. In order to solve these limitations and core scientific problems, we further present fundamental theories and key technologies for smart and optimal manufacturing in the process industry. Although this paper discusses the process industry in China, the conclusions in this paper can be extended to the process industry around the world.

  2. Quantum decision theory as quantum theory of measurement

    International Nuclear Information System (INIS)

    Yukalov, V.I.; Sornette, D.

    2008-01-01

    We present a general theory of quantum information processing devices, that can be applied to human decision makers, to atomic multimode registers, or to molecular high-spin registers. Our quantum decision theory is a generalization of the quantum theory of measurement, endowed with an action ring, a prospect lattice and a probability operator measure. The algebra of probability operators plays the role of the algebra of local observables. Because of the composite nature of prospects and of the entangling properties of the probability operators, quantum interference terms appear, which make actions noncommutative and the prospect probabilities nonadditive. The theory provides the basis for explaining a variety of paradoxes typical of the application of classical utility theory to real human decision making. The principal advantage of our approach is that it is formulated as a self-consistent mathematical theory, which allows us to explain not just one effect but actually all known paradoxes in human decision making. Being general, the approach can serve as a tool for characterizing quantum information processing by means of atomic, molecular, and condensed-matter systems

  3. Quantum measure and integration theory

    International Nuclear Information System (INIS)

    Gudder, Stan

    2009-01-01

    This article begins with a review of quantum measure spaces. Quantum forms and indefinite inner-product spaces are then discussed. The main part of the paper introduces a quantum integral and derives some of its properties. The quantum integral's form for simple functions is characterized and it is shown that the quantum integral generalizes the Lebesgue integral. A bounded, monotone convergence theorem for quantum integrals is obtained and it is shown that a Radon-Nikodym-type theorem does not hold for quantum measures. As an example, a quantum-Lebesgue integral on the real line is considered.

  4. Foams theory, measurements, and applications

    CERN Document Server

    Khan, Saad A

    1996-01-01

    This volume discusses the physics and physical processes of foam and foaming. It delineates various measurement techniques for characterizing foams and foam properties as well as the chemistry and application of foams. The use of foams in the textile industry, personal care products, enhanced oil recovery, firefighting and mineral floatation are highlighted, and the connection between the microstructure and physical properties of foam are detailed. Coverage includes nonaqueous foams and silicone antifoams, and more.

  5. Immersed in media telepresence theory, measurement & technology

    CERN Document Server

    Lombard, Matthew; Freeman, Jonathan; IJsselsteijn, Wijnand; Schaevitz, Rachel J

    2015-01-01

    Highlights key research currently being undertaken within the field of telepresence, providing the most detailed account of the field to date, advancing our understanding of a fundamental property of all media - the illusion of presence; the sense of "being there" inside a virtual environment, with actual or virtual others. This collection has been put together by leading international scholars from America, Europe, and Asia. Together, they describe the state-of-the-art in presence theory, research and technology design for an advanced academic audience. Immersed in Media provides research t

  6. Theory of Bessel Functions of High Rank - I: Fundamental Bessel Functions

    OpenAIRE

    Qi, Zhi

    2014-01-01

    In this article we introduce a new category of special functions called fundamental Bessel functions arising from the Voronoi summation formula for $\mathrm{GL}_n(\mathbb{R})$. The fundamental Bessel functions of rank one and two are the oscillatory exponential functions $e^{\pm ix}$ and the classical Bessel functions respectively. The main implements and subjects of our study of fundamental Bessel functions are their formal integral representations and Bessel equations.

  7. Measure theory and fine properties of functions

    CERN Document Server

    Evans, Lawrence Craig

    2015-01-01

    Measure Theory and Fine Properties of Functions, Revised Edition provides a detailed examination of the central assertions of measure theory in n-dimensional Euclidean space. The book emphasizes the roles of Hausdorff measure and capacity in characterizing the fine properties of sets and functions. Topics covered include a quick review of abstract measure theory, theorems and differentiation in ℝn, Hausdorff measures, area and coarea formulas for Lipschitz mappings and related change-of-variable formulas, and Sobolev functions as well as functions of bounded variation.The text provides complete proofs of many key results omitted from other books, including Besicovitch's covering theorem, Rademacher's theorem (on the differentiability a.e. of Lipschitz functions), area and coarea formulas, the precise structure of Sobolev and BV functions, the precise structure of sets of finite perimeter, and Aleksandrov's theorem (on the twice differentiability a.e. of convex functions).This revised edition includes countl...

  8. Measuring uncertainty within the theory of evidence

    CERN Document Server

    Salicone, Simona

    2018-01-01

    This monograph considers the evaluation and expression of measurement uncertainty within the mathematical framework of the Theory of Evidence. With a new perspective on the metrology science, the text paves the way for innovative applications in a wide range of areas. Building on Simona Salicone’s Measurement Uncertainty: An Approach via the Mathematical Theory of Evidence, the material covers further developments of the Random Fuzzy Variable (RFV) approach to uncertainty and provides a more robust mathematical and metrological background to the combination of measurement results that leads to a more effective RFV combination method. While the first part of the book introduces measurement uncertainty, the Theory of Evidence, and fuzzy sets, the following parts bring together these concepts and derive an effective methodology for the evaluation and expression of measurement uncertainty. A supplementary downloadable program allows the readers to interact with the proposed approach by generating and combining ...

  9. Ocean Ambient Noise Measurement and Theory

    CERN Document Server

    Carey, William M

    2011-01-01

    This book develops the theory of ocean ambient noise mechanisms and measurements, and also describes general noise characteristics and computational methods.  It concisely summarizes the vast ambient noise literature using theory combined with key representative results.  The air-sea boundary interaction zone is described in terms of non-dimensional variables requisite for future experiments.  Noise field coherency, rare directional measurements, and unique basin scale computations and methods are presented.  The use of satellite measurements in these basin scale models is demonstrated.  Finally, this book provides a series of appendices giving in-depth mathematical treatments.  With its complete and careful discussions of both theory and experimental results, this book will be of the greatest interest to graduate students and active researchers working in fields related to ambient noise in the ocean.

  10. The Quest for a Fundamental Theory of Physics - Rise and Demise of the Field Paradigm

    NARCIS (Netherlands)

    Holman, M.

    2014-01-01

    Quite remarkably, the two physical theories that describe extremely well physical phenomena on the largest and smallest distance scales in our universe, viz. general relativity and quantum theory, respectively, are radically disparate. Both theories are now almost a century old and have passed with

  11. Lattice study for conformal windows of SU(2) and SU(3) gauge theories with fundamental fermions

    CERN Document Server

    Huang, Cynthia Y.-H.; Lin, C.-J. David; Ogawa, Kenji; Ohki, Hiroshi; Ramos, Alberto; Rinaldi, Enrico

    2015-10-30

    We present our investigation of SU(2) gauge theory with 8 flavours, and SU(3) gauge theory with 12 flavours. For the SU(2) case, at strong bare coupling, $\beta \lesssim 1.45$, the distribution of the lowest eigenvalue of the Dirac operator can be described by chiral random matrix theory for the Gaussian symplectic ensemble. Our preliminary result indicates that the chiral phase transition in this theory is of bulk nature. For the SU(3) theory, we use high-precision lattice data to perform the step-scaling study of the coupling, $g_{\rm GF}$, in the Gradient Flow scheme. We carefully examine the reliability of the continuum extrapolation in the analysis, and conclude that the scaling behaviour of this SU(3) theory is not governed by possible infrared conformality at $g_{\rm GF}^{2} \lesssim 6$.

  12. Field algebras in quantum theory with indefinite metric. III. Spectrum of modular operator and Tomita's fundamental theorem

    International Nuclear Information System (INIS)

    Dadashyan, K.Yu.; Khoruzhii, S.S.

    1987-01-01

    The construction of a modular theory for weakly closed J-involutive algebras of bounded operators on Pontryagin spaces is continued. The spectrum of the modular operator Δ of such an algebra is investigated, the existence of a strongly continuous J-unitary group is established and, under the condition that the spectrum lies in the right half-plane, Tomita's fundamental theorem is proved

  13. Is Education a Fundamental Right? People's Lay Theories About Intellectual Potential Drive Their Positions on Education.

    Science.gov (United States)

    Savani, Krishna; Rattan, Aneeta; Dweck, Carol S

    2017-09-01

    Does every child have a fundamental right to receive a high-quality education? We propose that people's beliefs about whether "nearly everyone" or "only some people" have high intellectual potential drive their positions on education. Three studies found that the more people believed that nearly everyone has high potential, the more they viewed education as a fundamental human right. Furthermore, people who viewed education as a fundamental right, in turn (a) were more likely to support the institution of free public education, (b) were more concerned upon learning that students in the country were not performing well academically compared with students in peer nations, and (c) were more likely to support redistributing educational funds more equitably across wealthier and poorer school districts. The studies show that people's beliefs about intellectual potential can influence their positions on education, which can affect the future quality of life for countless students.

  14. Dreams of a Final Theory - The Search for the Fundamental Laws of ...

    Indian Academy of Sciences (India)

    fashion. In the euphoria induced by these successes, many physicists began to feel that a 'theory of everything', a 'final' theory, was in the offing. These initial expectations have certainly not been borne out. However, some progress ... make their appearance, and may even be justified on purely technical grounds. However ...

  15. Fundamental Flaws In The Derivation Of Stevens' Law For Taste Within Norwich's Entropy Theory of Perception

    International Nuclear Information System (INIS)

    Nizami, Lance

    2010-01-01

    Norwich's Entropy Theory of Perception (1975-present) is a general theory of perception, based on Shannon's Information Theory. Among many bold claims, the Entropy Theory presents a truly astounding result: that Stevens' Law with an Index of 1, an empirical power relation of direct proportionality between perceived taste intensity and stimulus concentration, arises from theory alone. Norwich's theorizing starts with several extraordinary hypotheses. First, 'multiple, parallel receptor-neuron units' without collaterals 'carry essentially the same message to the brain', i.e. the rate-level curves are identical. Second, sensation is proportional to firing rate. Third, firing rate is proportional to the taste receptor's 'resolvable uncertainty'. Fourth, the 'resolvable uncertainty' is obtained from Shannon's Information Theory. Finally, 'resolvable uncertainty' also depends upon the microscopic thermodynamic density fluctuation of the tasted solute. Norwich proves that density fluctuation is density variance, which is proportional to solute concentration, all based on the theory of fluctuations in fluid composition from Tolman's classic physics text, 'The Principles of Statistical Mechanics'. Altogether, according to Norwich, perceived taste intensity is theoretically proportional to solute concentration. Such a universal rule for taste, one that is independent of solute identity, personal physiological differences, and psychophysical task, is truly remarkable and is well-deserving of scrutiny. Norwich's crucial step was the derivation of density variance. That step was meticulously reconstructed here. It transpires that the appropriate fluctuation is Tolman's mean-square fractional density fluctuation, not density variance as used by Norwich. Tolman's algebra yields a 'Stevens Index' of -1 rather than 1. As 'Stevens Index' empirically always exceeds zero, the Index of -1 suggests that it is risky to infer psychophysical laws of sensory response from information theory
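
    The sign flip at the heart of this critique can be made concrete with a standard fluctuation estimate (ours, not Norwich's or Tolman's exact algebra): for a dilute solute in a small open subvolume the particle number is essentially Poissonian, so

        \langle (\Delta N)^{2} \rangle \simeq \langle N \rangle \propto c,
        \qquad
        \frac{\langle (\Delta N)^{2} \rangle}{\langle N \rangle^{2}} \simeq \frac{1}{\langle N \rangle} \propto \frac{1}{c}.

    A sensation tracking the variance therefore rises with concentration (index +1), whereas one tracking the mean-square fractional fluctuation falls with it (index -1), which is the substitution at issue above.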

  16. Two-ion theory of energy coupling in ATP synthesis rectifies a fundamental flaw in the governing equations of the chemiosmotic theory.

    Science.gov (United States)

    Nath, Sunil

    2017-11-01

    The vital coupled processes of oxidative phosphorylation and photosynthetic phosphorylation synthesize molecules of adenosine-5'-triphosphate (ATP), the universal biological energy currency, and sustain all life on our planet. The chemiosmotic theory of energy coupling in oxidative and photophosphorylation was proposed by Mitchell >50 years ago. It has had a contentious history, with part of the accumulated body of experimental evidence supporting it, and part of it in conflict with the theory. Although the theory was strongly criticized by many prominent scientists, the controversy has never been resolved. Here, the mathematical steps of Mitchell's original derivation leading to the principal equation of the chemiosmotic theory are scrutinized, and a fundamental flaw in them has been identified. Surprisingly, this flaw had not been detected earlier. Discovery of such a defect negates, or at least considerably weakens, the theoretical foundations on which the chemiosmotic theory is based. Ad hoc or simplistic ways to remedy this defect are shown to be scientifically unproductive and sterile. A novel two-ion theory of biological energy coupling salvages the situation by rectifying the fundamental flaw in the chemiosmotic theory, and the governing equations of the new theory have been shown to accurately quantify and predict extensive recent experimental data on ATP synthesis by F1FO-ATP synthase without using adjustable parameters. Some major biological implications arising from the new thinking are discussed. The principles of energy transduction and coupling proposed in the new paradigm are shown to be of a very general and universal nature. It is concluded that the timely availability after a 25-year research struggle of Nath's torsional mechanism of energy transduction and ATP synthesis is a rational alternative that has the power to solve the problems arising from the past, and also meet present and future challenges in this important interdisciplinary field.

  17. Accurate Estimation of Low Fundamental Frequencies from Real-Valued Measurements

    DEFF Research Database (Denmark)

    Christensen, Mads Græsbøll

    2013-01-01

    In this paper, the difficult problem of estimating low fundamental frequencies from real-valued measurements is addressed. The methods commonly employed do not take the phenomena encountered in this scenario into account and thus fail to deliver accurate estimates. The reason for this is that they employ asymptotic approximations that are violated when the harmonics are not well-separated in frequency, something that happens when the observed signal is real-valued and the fundamental frequency is low. To mitigate this, we analyze the problem and present some exact fundamental frequency estimators that are aimed at solving this problem. These estimators are based on the principles of nonlinear least-squares, harmonic fitting, optimal filtering, subspace orthogonality, and shift-invariance, and they all reduce to already published methods for a high number of observations. In experiments, the methods...
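
    As an illustration of the nonlinear least-squares principle listed above (an independent sketch in Python, not code from the cited work; the function name, the grid-search strategy, and all parameters are our own choices), a real-valued harmonic model can be fitted exactly, with no asymptotic approximation:

        import numpy as np

        def nls_f0_estimate(x, fs, f0_grid, num_harmonics):
            """Grid-search nonlinear least-squares estimate of the fundamental
            frequency of a real-valued signal x sampled at fs Hz."""
            n = np.arange(len(x))
            best_f0, best_cost = None, np.inf
            for f0 in f0_grid:
                # Real-valued harmonic model: cos/sin pairs at k*f0, k = 1..L.
                cols = []
                for k in range(1, num_harmonics + 1):
                    w = 2.0 * np.pi * k * f0 / fs
                    cols.append(np.cos(w * n))
                    cols.append(np.sin(w * n))
                Z = np.column_stack(cols)
                # Exact least-squares amplitudes; no assumption that the
                # harmonics are well separated in frequency.
                amp, *_ = np.linalg.lstsq(Z, x, rcond=None)
                cost = np.sum((x - Z @ amp) ** 2)
                if cost < best_cost:
                    best_f0, best_cost = f0, cost
            return best_f0

    Because the amplitudes are solved for exactly at each candidate f0, the estimator remains usable when the fundamental is low and the harmonics overlap spectrally, which is precisely the regime the abstract addresses.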

  18. Theory of Self- vs. Externally-Regulated LearningTM: Fundamentals, Evidence, and Applicability

    Science.gov (United States)

    de la Fuente-Arias, Jesús

    2017-01-01

    The Theory of Self- vs. Externally-Regulated LearningTM has integrated the variables of SRL theory, the DEDEPRO model, and the 3P model. This new Theory has proposed: (a) in general, the importance of the cyclical model of individual self-regulation (SR) and of external regulation stemming from the context (ER), as two different and complementary variables, both in combination and in interaction; (b) specifically, in the teaching-learning context, the relevance of different types of combinations between levels of self-regulation (SR) and of external regulation (ER) in the prediction of self-regulated learning (SRL), and of cognitive-emotional achievement. This review analyzes the assumptions, conceptual elements, empirical evidence, benefits and limitations of SRL vs. ERL Theory. Finally, professional fields of application and future lines of research are suggested. PMID:29033872

  19. Gauges and functional measures in quantum gravity I: Einstein theory

    Energy Technology Data Exchange (ETDEWEB)

    Ohta, N. [Department of Physics, Kindai University,Higashi-Osaka, Osaka 577-8502 (Japan); Percacci, R. [International School for Advanced Studies,via Bonomea 265, 34136 Trieste (Italy); INFN, Sezione di Trieste,Trieste (Italy); Pereira, A.D. [International School for Advanced Studies,via Bonomea 265, 34136 Trieste (Italy); Universidade Federal Fluminense, Instituto de Física, Campus da Praia Vermelha, Avenida General Milton Tavares de Souza s/n, 24210-346, Niterói, RJ (Brazil); Max Planck Institute for Gravitational Physics (Albert Einstein Institute),Am Mühlenberg 1, Potsdam 14476 (Germany)

    2016-06-20

    We perform a general computation of the off-shell one-loop divergences in Einstein gravity, in a two-parameter family of path integral measures, corresponding to different ways of parametrizing the graviton field, and a two-parameter family of gauges. Trying to reduce the gauge- and measure-dependence selects certain classes of measures and gauges respectively. There is a choice of two parameters (corresponding to the exponential parametrization and the partial gauge condition that the quantum field be traceless) that automatically eliminates the dependence on the remaining two parameters and on the cosmological constant. We observe that the divergences are invariant under a Z_2 “duality” transformation that (in a particularly important special case) involves the replacement of the densitized metric by a densitized inverse metric as the fundamental quantum variable. This singles out a formulation of unimodular gravity as the unique “self-dual” theory in this class.

  20. Historical-systematic fundaments of the Trinitarian theory of the liturgical event

    Directory of Open Access Journals (Sweden)

    Robert Woźniak

    2011-12-01

    Full Text Available The object of the present research is to develop some fundamental traces of the Trinitarian understanding of the Christian liturgy. The article attempts to point out the fundamental coordinates of a Trinitarian comprehension of the liturgy from the historical perspective. In order to do this, it traces the links between the first formulations of Trinitarian faith and the early development of the Christian liturgy. The argument starts with a consideration of some new biblical approaches to the phenomena of early Christian cult seen in their theological (Christological and Trinitarian) constellation (Bauckham, Hurtado). After this preliminary biblical-theological inquiry, some fundamental patristic texts are taken into account. The last stage of the investigation is a presentation of the Second Vatican Council’s account of the theology of liturgy, which proves itself to be openly Trinitarian.

  1. Measurement Models for Reasoned Action Theory

    OpenAIRE

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-01-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are ...

  2. Geometric measure theory a beginner's guide

    CERN Document Server

    Morgan, Frank

    1995-01-01

    Geometric measure theory is the mathematical framework for the study of crystal growth, clusters of soap bubbles, and similar structures involving minimization of energy. Morgan emphasizes geometry over proofs and technicalities, and includes a bibliography and abundant illustrations and examples. This Second Edition features a new chapter on soap bubbles as well as updated sections addressing volume constraints, surfaces in manifolds, free boundaries, and Besicovitch constant results. The text will introduce newcomers to the field and appeal to mathematicians working in the field.

  3. Measurement-induced two-qubit entanglement in a bad cavity: Fundamental and practical considerations

    DEFF Research Database (Denmark)

    Julsgaard, Brian; Mølmer, Klaus

    2012-01-01

    An entanglement-generating protocol is described for two qubits coupled to a cavity field in the bad-cavity limit. By measuring the amplitude of a field transmitted through the cavity, an entangled spin-singlet state can be established probabilistically. Both fundamental limitations and practical measurement schemes are discussed, and the influence of dissipative processes and inhomogeneities in the qubits are analyzed. The measurement-based protocol provides criteria for selecting states with an infidelity scaling linearly with the qubit-decoherence rate.

  4. Measurement-induced two-qubit entanglement in a bad cavity: Fundamental and practical considerations

    Science.gov (United States)

    Julsgaard, Brian; Mølmer, Klaus

    2012-03-01

    An entanglement-generating protocol is described for two qubits coupled to a cavity field in the bad-cavity limit. By measuring the amplitude of a field transmitted through the cavity, an entangled spin-singlet state can be established probabilistically. Both fundamental limitations and practical measurement schemes are discussed, and the influence of dissipative processes and inhomogeneities in the qubits are analyzed. The measurement-based protocol provides criteria for selecting states with an infidelity scaling linearly with the qubit-decoherence rate.

  5. FUNDAMENTALS OF THE THEORY OF VENTILLATION PROCESSES IN THE STEAM TURBINES TPP

    Directory of Open Access Journals (Sweden)

    V. M. Neuimin

    2015-01-01

    Full Text Available The article proposes a theoretical framework for the ventilation processes that emerge and develop in the stages of TPP steam turbines during operating regimes with small volumetric flow rates in the low-pressure cylinder. The basic theory includes new physico-mathematical models for estimating the ventilation capacity losses and the ventilation heating of the steam and of the air-gas channel of the turbine, together with a search for and investigation of the factors causing increased moment loads on the blade wheels of the final stages, which are likely to lead to destruction of the rotating blades. The paper also presents practical results of utilizing this theoretical framework. The author obtains a new mathematical relation for high-accuracy assessment of the ventilation capacity losses that accounts for the full range of parameters defining the level of these losses (it is established that the Coriolis force contributes twice as much to the ventilation capacity losses as the centrifugal force). Seven simple formulae obtained on its basis allow an immediate evaluation of the ventilation losses of a separate stage (with the rotating blades of the final stage not unwinding from the turning, with the rotating blades of the final and intermediate stages unwinding from the turning, and with the turbine entirely evacuated of steam), including by readings of the regular instruments located at the connectors of the exhaust part of the low-pressure cylinder. As the cornerstone of the new system for evaluating ventilation heating, the author takes two experimentally established facts: the ventilation capacity losses are practically constant at negligible volumetric flow rates of the working steam, and the symmetrical ventilation flows in the blade channel mix completely before splitting up at the periphery. This makes it possible to estimate the complete enthalpy increment of the steam being discharged from a stage in relation to the enthalpy of the steam being

  6. Fundamentals of the fuzzy logic-based generalized theory of decisions

    CERN Document Server

    Aliev, Rafik Aziz

    2013-01-01

    Everyday decision making and decision making in complex human-centric systems are characterized by imperfect decision-relevant information. The main drawback of the existing decision theories is their incapability to deal with imperfect information and to model vague preferences. Actually, a paradigm of non-numerical probabilities in decision making has a long history and arose also in Keynes’s analysis of uncertainty. There is a need for further generalization – a move to decision theories with perception-based imperfect information described in natural language (NL). The languages of new decision models for human-centric systems should not be languages based on binary logic but human-centric computational schemes able to operate on NL-described information. Development of new theories is now possible due to the increased computational power of information processing systems, which allows for computations with imperfect information, particularly imprecise and partially true information, which are much more complex than comput...

  7. Establishment of a room temperature molten salt capability to measure fundamental thermodynamic properties of actinide elements

    International Nuclear Information System (INIS)

    Smith, W.H.; Costa, D.A.

    1998-01-01

    This is the final report of a six-month, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). The goal of this work was to establish a capability for the measurement of fundamental thermodynamic properties of actinide elements in room temperature molten salts. This capability will be used to study in detail the actinide chloro- and oxo- coordination chemistries that dominate in the chloride-based molten salt media. Uranium will be the first actinide element under investigation

  8. The basics of information security understanding the fundamentals of InfoSec in theory and practice

    CERN Document Server

    Andress, Jason

    2014-01-01

    As part of the Syngress Basics series, The Basics of Information Security provides you with fundamental knowledge of information security in both theoretical and practical aspects. Author Jason Andress gives you the basic knowledge needed to understand the key concepts of confidentiality, integrity, and availability, and then dives into practical applications of these ideas in the areas of operational, physical, network, application, and operating system security. The Basics of Information Security gives you clear, non-technical explanations of how infosec works and how to apply these princi

  9. Ends, fundamental tones and capacity of minimal submanifolds via extrinsic comparison theory

    DEFF Research Database (Denmark)

    Gimeno, Vicent; Markvorsen, Steen

    2015-01-01

    We study the volume of extrinsic balls and the capacity of extrinsic annuli in minimal submanifolds which are properly immersed with controlled radial sectional curvatures into an ambient manifold with a pole. The key results are concerned with the comparison of those volumes and capacities with the corresponding entities in a rotationally symmetric model manifold. Using the asymptotic behavior of the volumes and capacities we then obtain upper bounds for the number of ends as well as estimates for the fundamental tone of the submanifolds in question.

  10. Measurement Theory, Nomological Machine And Measurement Uncertainties (In Classical Physics

    Directory of Open Access Journals (Sweden)

    Ave Mets

    2012-12-01

    Full Text Available Measurement is said to be the basis of exact sciences as the process of assigning numbers to matter (things or their attributes), thus making it possible to apply the mathematically formulated laws of nature to the empirical world. Mathematics and empiria are best accorded to each other in laboratory experiments which function as what Nancy Cartwright calls a nomological machine: an arrangement generating (mathematical) regularities. On the basis of accounts of measurement errors and uncertainties, I will argue for two claims: (1) both fundamental laws of physics, corresponding to the ideal nomological machine, and phenomenological laws, corresponding to the material nomological machine, lie, being highly idealised relative to the empirical reality; and also laboratory measurement data do not describe properties inherent to the world independently of human understanding of it. (2) Therefore the naive, representational view of measurement and experimentation should be replaced with a more pragmatic or practice-based view.

  11. Distribution of the energy levels of individual interface traps and a fundamental refinement in charge pumping theory

    Science.gov (United States)

    Tsuchiya, Toshiaki; Lenahan, Patrick M.

    2017-03-01

    We carried out a unique and systematic characterization of single amphoteric Si/SiO2 interface traps using the charge pumping (CP) method. As a result, we obtained the distribution of the energy levels of these traps for the first time. The distribution is reasonably similar to that of the Pb0 density of states reported previously. By considering the essential nature of these traps (i.e., those with two energy levels), factors depending on the energy levels, and the Coulomb interactions between traps, we fundamentally corrected the conventional CP theory.

  12. Applications of measure theory to statistics

    CERN Document Server

    Pantsulaia, Gogi

    2016-01-01

    This book aims to put strong reasonable mathematical senses in notions of objectivity and subjectivity for consistent estimations in a Polish group by using the concept of Haar null sets in the corresponding group. This new approach – naturally dividing the class of all consistent estimates of an unknown parameter in a Polish group into disjoint classes of subjective and objective estimates – helps the reader to clarify some conjectures arising in the criticism of null hypothesis significance testing. The book also acquaints readers with the theory of infinite-dimensional Monte Carlo integration recently developed for estimation of the value of infinite-dimensional Riemann integrals over infinite-dimensional rectangles. The book is addressed both to graduate students and to researchers active in the fields of analysis, measure theory, and mathematical statistics.

  13. Views of a devil's advocate -- Fundamental challenges to effective field theory treatments of nuclear physics

    International Nuclear Information System (INIS)

    Cohen, T.D.

    1998-04-01

    The physics goals of the effective field theory program for nuclear phenomena are outlined. It is pointed out that there are multiple schemes for implementing EFT and it is presently not clear if any of these schemes is viable. Most of the applications of effective field theory ideas have been on nucleon-nucleon scattering. It is argued that this is little more than curve fitting and that other quantities need to be calculated to test the ideas. It is shown that EFT methods work well for certain bound state properties of the deuteron electric form factor. However, it is also shown that this success depends sensitively on the fact that the majority of the probability of the deuteron's wave function is beyond the range of the potential. This circumstance is special to the deuteron suggesting that it will be very difficult to achieve the same kinds of success for tightly bound nuclei

  14. Education and liberty: the habitus formation as a fundamental critical element to the theory of emancipation

    Directory of Open Access Journals (Sweden)

    Adreana Dulcina Platt

    2017-11-01

    Full Text Available The study evaluates the cumulative path traveled by the subject originally submitted to the state of nature (hominization) to the condition of 'human being' made of instrumental and cognitive complexity (humanization). Education becomes the axis of the constitution from hominization to humanization, through the assumption of the incorporation of habitus into human nature. Through Historical Materialist Theory, we describe habitus as the exercise of practices repeated and incorporated into the formation of the subject, thus becoming a 'second nature'. Through the Theory of Emancipation described in Kant and in Horkheimer and Adorno, we check the status of liberty of the subjects by the incorporation of a practice primarily belonging to the world of reason (ethics and aesthetics) or by negative educational action.

  15. Nuclear Fission: from more phenomenology and adjusted parameters to more fundamental theory and increased predictive power

    Science.gov (United States)

    Bulgac, Aurel; Jin, Shi; Magierski, Piotr; Roche, Kenneth; Schunck, Nicolas; Stetcu, Ionel

    2017-11-01

    Two major recent developments in theory and computational resources created the favorable conditions for achieving a microscopic description of fission dynamics in classically allowed regions of the collective potential energy surface, almost eighty years after its discovery in 1939 by Hahn and Strassmann [1]. The first major development was in theory, the extension of the Time-Dependent Density Functional Theory (TDDFT) [2-5] to superfluid fermion systems [6]. The second development was in computing, the emergence of powerful enough supercomputers capable of solving the complex systems of equations describing the time evolution in three dimensions, without any restrictions, of hundreds of strongly interacting nucleons. Thus the conditions have been created to renounce phenomenological models and incomplete microscopic treatments with uncontrollable approximations and/or assumptions in the description of the complex dynamics of fission. Even though the available nuclear energy density functionals (NEDFs) are still phenomenological, their accuracy is improving steadily, opening the prospect of being able to perform calculations of the nuclear fission dynamics and to predict many properties of the fission fragments otherwise not possible to extract from experiments.

  16. Fundamental cause theory, technological innovation, and health disparities: the case of cholesterol in the era of statins.

    Science.gov (United States)

    Chang, Virginia W; Lauderdale, Diane S

    2009-09-01

    Although fundamental cause theory has been highly influential in shaping the research literature on health disparities, there have been few empirical demonstrations of the theory, particularly in dynamic perspective. In this study, we examine how income disparities in cholesterol levels have changed with the emergence of statins, an expensive and potent new drug technology. Using nationally representative data from 1976 to 2004, we find that income gradients for cholesterol were initially positive, but then reversed and became negative in the era of statin use. While the advantaged were previously more likely to have high levels of cholesterol, they are now less likely. We consider our case study against a broader theoretical framework outlining the relationship between technology innovation and health disparities. We find that the influence of technologies on socioeconomic disparities is subject to two important modifiers: (1) the nature of the technological change and (2) the extent of its diffusion and adoption.

  17. Influence of the time conventions and of the corresponding measuring conditions on the fundamental equations of relativistic thermodynamics and hydrodynamics

    International Nuclear Information System (INIS)

    Arzelies, H.

    1977-01-01

    We may imagine two measuring conditions: we adopt simultaneity in only one reference system (NS theory) or in every reference system (S theory). According to the adopted convention we obtain in hydrodynamics the usual equation (S theory) or the equation previously proposed by the author (NS theory). The operational definition of the measuring conditions seems to favour the new equation. (author)

  18. The Theory and Fundamentals of Bioimpedance Analysis in Clinical Status Monitoring and Diagnosis of Diseases

    Directory of Open Access Journals (Sweden)

    Sami F. Khalil

    2014-06-01

    Full Text Available Bioimpedance analysis is a noninvasive, low-cost and commonly used approach for body composition measurements and assessment of clinical condition. There are a variety of methods applied for interpretation of measured bioimpedance data and a wide range of utilizations of bioimpedance in body composition estimation and evaluation of clinical status. This paper reviews the main concepts of bioimpedance measurement techniques including the frequency based, the allocation based, bioimpedance vector analysis and the real time bioimpedance analysis systems. Commonly used prediction equations for body composition assessment and the influence of anthropometric measurements, gender, ethnic groups, postures, measurement protocols and electrode artifacts on estimated values are also discussed. In addition, this paper contributes to the deliberations on bioimpedance-based assessment of abnormal loss of lean body mass and unbalanced shifts in body fluids, and summarizes diagnostic usage in different kinds of conditions such as cardiac, pulmonary, renal, neurological, and infectious diseases.
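
    As background on how the prediction equations discussed above are typically structured, single-frequency equations are usually built around the impedance index (height squared over resistance); the Python sketch below is generic and hedged, and the coefficients a, b, c are hypothetical placeholders, not values from any equation reviewed in the paper:

        def fat_free_mass_kg(height_cm, resistance_ohm, weight_kg, a, b, c):
            """Generic single-frequency BIA prediction of fat-free mass.
            The impedance index height**2 / resistance is proportional to the
            conductive (lean) volume under a uniform-cylinder assumption;
            a, b, c must come from a population-specific regression."""
            impedance_index = height_cm ** 2 / resistance_ohm
            return a * impedance_index + b * weight_kg + c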

  19. Quantum theory from a nonlinear perspective Riccati equations in fundamental physics

    CERN Document Server

    Schuch, Dieter

    2018-01-01

    This book provides a unique survey displaying the power of Riccati equations to describe reversible and irreversible processes in physics and, in particular, quantum physics. Quantum mechanics is supposedly linear, invariant under time-reversal, conserving energy and, in contrast to classical theories, essentially based on the use of complex quantities. However, on a macroscopic level, processes apparently obey nonlinear irreversible evolution equations and dissipate energy. The Riccati equation, a nonlinear equation that can be linearized, has the potential to link these two worlds when applied to complex quantities. The nonlinearity can provide information about the phase-amplitude correlations of the complex quantities that cannot be obtained from the linearized form. As revealed in this wide ranging treatment, Riccati equations can also be found in many diverse fields of physics from Bose-Einstein-condensates to cosmology. The book will appeal to graduate students and theoretical physicists interested in ...

  20. Fundamental Frequencies of Vibration of Footbridges in Portugal: From In Situ Measurements to Numerical Modelling

    Directory of Open Access Journals (Sweden)

    C. S. Oliveira

    2014-01-01

    Full Text Available Since 1995, we have been measuring the in situ dynamic characteristics of different types of footbridges built in Portugal (essentially steel and precast reinforced concrete decks) with single spans running from 11 to 110 m long, using expedite exciting and measuring techniques. A database has been created, containing not only the fundamental dynamic characteristics of those structures (transversal, longitudinal, and vertical frequencies) but also their most important geometric and mechanical properties. This database, with 79 structures organized into 5 main typologies, allows the setting of correlations of fundamental frequencies as a negative power function of the span length L (L^-0.6 to L^-1.4). For 63 footbridges of more simple geometry, it was possible to obtain these correlations by typology. A few illustrative cases representing the most common typologies show that linear numerical models can reproduce the in situ measurements with great accuracy, not only matching the frequencies of vibration but also the amplitudes of motion caused by several pedestrian load patterns.
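
    To make the reported trend concrete, such a frequency-span correlation can be recovered from a database of this kind by an ordinary power-law fit; the short Python sketch below is illustrative only (the function and variable names are ours, and no data from the study are reproduced):

        import numpy as np

        def fit_power_law(spans_m, freqs_hz):
            """Fit f = a * L**b by linear regression in log-log space and
            return (a, b); the cited study reports exponents b roughly
            between -0.6 and -1.4 depending on typology."""
            log_L = np.log(np.asarray(spans_m, dtype=float))
            log_f = np.log(np.asarray(freqs_hz, dtype=float))
            b, log_a = np.polyfit(log_L, log_f, 1)
            return np.exp(log_a), b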

  1. Fundamental measurement by in-line typed high-precision polarization lidar

    Science.gov (United States)

    Shiina, Tatsuo; Miyamoto, Masakazu; Umaki, Dai; Noguchi, Kazuo; Fukuchi, Tetsuo

    2008-12-01

    A new-concept in-line lidar system for high-precision polarization measurement was developed. A specially designed polarization-independent optical circulator, composed of Glan laser prisms and highly transparent Faraday rotators, was developed; its isolation between the orthogonal polarizations was improved to more than 30 dB. This is sufficient to detect the small rotation of the polarization plane of the propagating beam caused by lightning discharges through the Faraday effect. The rotation angle of the polarization plane is estimated by differential detection between the orthogonal polarization components of the lidar echoes. The in-line optics enables measurement from the near range of >30 m with a narrow field of view of 0.17 mrad. Fundamental measurements of lidar echoes in the near and far fields, and of low cloud activity, were examined.

  2. Testing the Standard Model and Fundamental Symmetries in Nuclear Physics with Lattice QCD and Effective Field Theory

    Energy Technology Data Exchange (ETDEWEB)

    Walker-Loud, Andre [College of William and Mary, Williamsburg, VA (United States)

    2016-10-14

    The research supported by this grant is aimed at probing the limits of the Standard Model through precision low-energy nuclear physics. The work of the PI (AWL) and additional personnel is to provide theory input needed for a number of potentially high-impact experiments, notably, hadronic parity violation, Dark Matter direct detection and searches for permanent electric dipole moments (EDMs) in nucleons and nuclei. In all these examples, a quantitative understanding of low-energy nuclear physics from the fundamental theory of strong interactions, Quantum Chromo-Dynamics (QCD), is necessary to interpret the experimental results. The main theoretical tools used and developed in this work are the numerical solution to QCD known as lattice QCD (LQCD) and Effective Field Theory (EFT). This grant is supporting a new research program for the PI, and as such, needed to be developed from the ground up. Therefore, the first fiscal year of this grant, 08/01/2014-07/31/2015, has been spent predominantly establishing this new research effort. Very good progress has been made, although, at this time, there are not many publications to show for the effort. After one year, the PI accepted a job at Lawrence Berkeley National Laboratory, so this final report covers just a single year of five years of the grant.

  3. Measurement Models for Reasoned Action Theory.

    Science.gov (United States)

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-03-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.

  4. Light Scattering Tests of Fundamental Theories of Transport Properties in the Critical Region

    Science.gov (United States)

    Gammon, R. W.; Moldover, M. R.

    1985-01-01

    The objective of this program is to measure the decay rates of critical density fluctuations in a simple fluid (xenon) very near its liquid-vapor critical point using laser light scattering and photon correlation spectroscopy. Such experiments have been severely limited on Earth by the presence of gravity which causes large density gradients in the sample when the compressibility diverges approaching the critical point. The goal is to measure decay rates deep in the critical region where the scaled wavevector is the order of 1000. This will require loading the sample to within 0.01% of the critical density and taking data as close as 3 microKelvin to the critical temperature (Tc = 289.72 K). Other technical problems have to be addressed such as multiple scattering and the effect of wetting layers. The ability to avoid multiple scattering by using a thin sample (100 microns) was demonstrated, as well as a temperature history which can avoid wetting layers, satisfactory temperature control and measurement, and accurate sample loading. Thus the questions of experimental art are solved, leaving the important engineering tasks of mounting the experiment to maintain alignment during flight and automating the state-of-the-art temperature bridges for microcomputer control of the experiment.

  5. Intense Interactions of Molecules with a Short-Wavelength Electromagnetic Radiation Field: I. The Fundamentals of the Nonadiabatic Theory

    Science.gov (United States)

    Pegarkov, A. I.

    2001-07-01

    The intense interactions between short-wavelength (SW) electromagnetic radiation with a wavelength λ ≥ 1 Å and intensity up to 1014 W/cm2 and simple and polyatomic molecules are studied with the coherent excitations of high-lying Rydberg and autoionizing states taken into account. The Hamiltonian of a system "molecule + SW radiation" is obtained by using the methods of quantum electrodynamics. Conditions for the applicability of the dipole approximation to describe the interactions of molecules with radiation of the UV, VUV, XUV, and soft X-ray range are found. The fundamentals of the theory of resonance scattering of SW radiation from diatomic, triatomic, and symmetric-and asymmetric-top polyatomic molecules are outlined.

  6. Educacion Fundamental Integral #2: Teoria y Aplicacion en el Caso de ACPO. (Fundamental Integral Education #2: Theory and Application in the Case of ACPO.)

    Science.gov (United States)

    Alarcon, Hernando Bernal

    Educacion Fundamental Integral (EFI) is an educational process which aims to help Colombia's rural population to improve their living conditions. EFI adapts to the concrete circumstances of the person in his own environment. Objectives of EFI are to make the rural people: responsible for the work necessary for their own development; work together;…

  7. Fundamental length

    International Nuclear Information System (INIS)

    Pradhan, T.

    1975-01-01

    The concept of fundamental length was first put forward by Heisenberg for purely dimensional reasons. From a study of the observed masses of the elementary particles known at that time, it is surmised that this length should be of the order of magnitude of 10^-13 cm. It was Heisenberg's belief that introduction of such a fundamental length would eliminate the divergence difficulties from relativistic quantum field theory by cutting off the high energy regions of the 'proper fields'. Since the divergence difficulties arise primarily due to the infinite number of degrees of freedom, one simple remedy would be the introduction of a principle that limits these degrees of freedom by removing the effectiveness of the waves with a frequency exceeding a certain limit without destroying the relativistic invariance of the theory. The principle can be stated as follows: It is in principle impossible to invent an experiment of any kind that will permit a distinction between the positions of two particles at rest, the distance between which is below a certain limit. A more elegant way of introducing fundamental length into quantum theory is through commutation relations between two position operators. In quantum field theory such as quantum electrodynamics, it can be introduced through the commutation relation between two interpolating photon fields (vector potentials). (K.B.)

  8. ACADEMIC TRAINING: Low Energy Experiments that Measure Fundamental Constants and Test Basic Symmetries

    CERN Multimedia

    Françoise Benz

    2002-01-01

    17, 18, 19, 21 June LECTURE SERIES from 11.00 to 12.00 hrs - Auditorium, bldg. 500 Low Energy Experiments that Measure Fundamental Constants and Test Basic Symmetries by G. GABRIELSE / Professor of Physics and Chair of the Harvard Physics Department, Spokesperson for the ATRAP Collaboration Lecture 1: Particle Traps: the World's Tiniest Accelerators A single elementary particle, or a single ion, can be confined in a tiny accelerator called a particle trap. A single electron was held this way for more than ten months, and antiprotons for months. Mass spectroscopy of exquisite precision is possible with such systems. CERN's TRAP Collaboration thereby compared the charge-to-mass ratios of the antiproton and proton to a precision of 90 parts per trillion, by far the most stringent CPT test done with a baryon system. The important ratio of the masses of the electron and proton has been similarly measured, as have a variety of ion masses, and the neutron mass is most accurately known from such measurements. An i...

  9. Philosophy of mathematics set theory, measuring theories, and nominalism

    CERN Document Server

    Preyer, Gerhard

    2008-01-01

    One main interest of philosophy is to become clear about the assumptions, premisses and inconsistencies of our thoughts and theories. And even for a formal language like mathematics it is controversial if consistency is achievable or necessary, as the articles in the first part of the publication show. Also the role of formal derivations, the role of the concept of apriority, and the intuitions of mathematical principles and properties need to be discussed. The second part is a contribution on nominalistic and platonistic views in mathematics, like the "indispensability argument" of W. v. O.

  10. Detailed examination of 'standard elementary particle theories' based on measurement with Tristan

    International Nuclear Information System (INIS)

    Kamae, Tsuneyoshi

    1989-01-01

    The report discusses possible approaches to detailed analysis of 'standard elementary particle theories' on the basis of measurements made with Tristan. The first section of the report addresses major elementary particles involved in the 'standard theories'. The nature of the gauge particles, leptons, quarks and Higgs particle is briefly outlined. The Higgs particle and top quark have not been discovered, though the Higgs particle is essential in the Weinberg-Salam theory. Another important issue in this field is the cause of the collapse of the CP symmetry. The second section deals with problems which arise in universalizing the concept of the 'standard theories'. What is required to solve these problems includes the discovery of supersymmetric particles, discovery of conflicts in the 'standard theories', and accurate determination of fundamental constants used in the 'standard theories' by various different methods. The third and fourth sections address the Weinberg-Salam theory and quantum chromodynamics (QCD). There are four essential parameters for the 'standard theories', three of which are associated with the W-S theory. The mass of the W and Z bosons measured in proton-antiproton collision experiments is compared with that determined by applying the W-S theory to electron-positron experiments. For QCD, it is essential to determine the lambda constant. (N.K.)

  11. A relativistic theory for continuous measurement of quantum fields

    International Nuclear Information System (INIS)

    Diosi, L.

    1990-04-01

    A formal theory for the continuous measurement of relativistic quantum fields is proposed. The corresponding scattering equations were derived. The proposed formalism reduces to known equations in the Markovian case. Two recent models for spontaneous quantum state reduction have been recovered in the framework of this theory. A possible example of the relativistic continuous measurement has been outlined in standard Quantum Electrodynamics. The continuous measurement theory possesses an alternative formulation in terms of interacting quantum and stochastic fields. (author) 23 refs

  12. ELECTROMAGNETIC FIELD MEASUREMENT OF FUNDAMENTAL AND HIGHER-ORDER MODES FOR 7-CELL CAVITY OF PETRA-II

    Energy Technology Data Exchange (ETDEWEB)

    Kawashima, Y.; Blednykh, A.; Cupolo, J.; Davidsaver, M.; Holub, B.; Ma, H.; Oliva, J.; Rose, J.; Sikora, R.; Yeddulla, M.

    2011-03-28

    The booster synchrotron for NSLS-II will include a 7-cell PETRA cavity, which was manufactured for the PETRA-II project at DESY. The cavity fundamental frequency operates at 500 MHz. In order to verify the impedances of the fundamental and higher-order modes (HOM), which were calculated by computer code, we measured the magnitude of the electromagnetic field of the fundamental acceleration mode and HOM using the bead-pull method. To keep the cavity body temperature constant, we used a chiller system to supply cooling water at 20 degrees C. The bead-pull measurement was automated with a computer. We encountered some issues during the measurement process due to the difficulty in measuring the electromagnetic field magnitude in a multi-cell cavity. We describe the method and apparatus for the field measurement, and the obtained results.
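
    As general background on the bead-pull technique (a textbook relation stated in our notation, not a description of the specific apparatus above): the Slater perturbation theorem relates the resonant-frequency shift produced by a small bead to the local field energy density, so that on the axis of an accelerating mode, where the magnetic contribution is negligible,

        \frac{\Delta f(z)}{f_{0}} = -\, k_{\mathrm{bead}} \, \frac{\epsilon_{0} \, |E_{z}(z)|^{2}}{4 U}
        \quad \Longrightarrow \quad
        |E_{z}(z)| \propto \sqrt{-\Delta f(z)},

    where U is the stored energy and k_bead is a form factor fixed by the bead geometry and material, usually obtained by calibration.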

  13. Speech task effects on acoustic measure of fundamental frequency in Cantonese-speaking children.

    Science.gov (United States)

    Ma, Estella P-M; Lam, Nina L-N

    2015-12-01

    Speaking fundamental frequency (F0) is a voice measure frequently used to document changes in vocal performance over time. Knowing the intra-subject variability of speaking F0 has implications on its clinical usefulness. The present study examined the speaking F0 elicited from three speech tasks in Cantonese-speaking children. The study also compared the variability of speaking F0 elicited from different speech tasks. Fifty-six vocally healthy Cantonese-speaking children (31 boys and 25 girls) aged between 7.0 and 10.11 years participated. For each child, speaking F0 was elicited using speech tasks at three linguistic levels (sustained vowel /a/ prolongation, reading aloud a sentence and passage). Two types of variability, within-session (trial-to-trial) and across-session (test-retest) variability, were compared across speech tasks. Significant differences in mean speaking F0 values were found between speech tasks. Mean speaking F0 value elicited from sustained vowel phonations was significantly higher than those elicited from the connected speech tasks. The variability of speaking F0 was higher in sustained vowel prolongation than that in connected speech. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  14. Quantum Measurements using Diamond Spins : From Fundamental Tests to Long-Distance Teleportation

    NARCIS (Netherlands)

    Hanson, R.

    2014-01-01

    Spin qubits in diamond provide an excellent platform both for fundamental tests and for realizing extended quantum networks. Here we present our latest results, including the deterministic teleportation over three meters.

  15. Electromagnetic and quantum measurements a bitemporal neoclassical theory

    CERN Document Server

    Wessel-Berg, Tore

    2001-01-01

    It is a pleasure to write a foreword for Professor Tore Wessel-Berg's book, "Electromagnetic and Quantum Measurements: A Bitemporal Neoclassical Theory." This book appeals to me for several reasons. The most important is that, in this book, Wessel-Berg breaks from the pack. The distinguished astrophysicist Thomas Gold has written about the pressures on scientists to move in tight formation, to avoid having their legs nipped by the sheepdogs of science. This book demonstrates that Wessel-Berg is willing to take that risk. I confess that I do not sufficiently understand this book to be able to either agree or disagree with its thesis. Nevertheless, Wessel-Berg makes very cogent arguments for setting out on his journey. The basic equations of physics are indeed time-reversible. Our experience, that leads us to the concept of an "arrow of time," is derived from macroscopic phenomena, not from fundamental microscopic phenomena. For this reason, it makes very good sense to explore the consequences of treating mi...

  16. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    Science.gov (United States)

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

    Conceptually and methodologically distinct models exist for assessing quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently-rated attachment narrative representations and peer nominations. Results indicated that Attachment theory-based and Social Learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms.

  17. Path-integral measure for topological field theories

    International Nuclear Information System (INIS)

    Cugliandolo, L.F.; Lozano, G.; Schaposnik, F.A.

    1990-01-01

    We discuss how the dependence of the path-integral measure on the metric affects the properties of topological quantum field theories. We show that the choice of an invariant measure (under general coordinate transformations) preserves the topological character of these theories. We also discuss how topological invariants should be computed within this approach. (orig.)

  18. Fundamental ecology is fundamental.

    Science.gov (United States)

    Courchamp, Franck; Dunne, Jennifer A; Le Maho, Yvon; May, Robert M; Thébaud, Christophe; Hochberg, Michael E

    2015-01-01

    The primary reasons for conducting fundamental research are satisfying curiosity, acquiring knowledge, and achieving understanding. Here we develop why we believe it is essential to promote basic ecological research, despite increased impetus for ecologists to conduct and present their research in the light of potential applications. This includes the understanding of our environment, for intellectual, economical, social, and political reasons, and as a major source of innovation. We contend that we should focus less on short-term, objective-driven research and more on creativity and exploratory analyses, quantitatively estimate the benefits of fundamental research for society, and better explain the nature and importance of fundamental ecology to students, politicians, decision makers, and the general public. Our perspective and underlying arguments should also apply to evolutionary biology and to many of the other biological and physical sciences. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. The New Unified Theory of ATP Synthesis/Hydrolysis and Muscle Contraction, Its Manifold Fundamental Consequences and Mechanistic Implications and Its Applications in Health and Disease

    Directory of Open Access Journals (Sweden)

    Sunil Nath

    2008-09-01

    Full Text Available Complete details of the thermodynamics and molecular mechanisms of ATP synthesis/hydrolysis and muscle contraction are offered from the standpoint of the torsional mechanism of energy transduction and ATP synthesis and the rotation-uncoiling-tilt (RUT) energy storage mechanism of muscle contraction. The manifold fundamental consequences and mechanistic implications of the unified theory for oxidative phosphorylation and muscle contraction are explained. The consistency of current mechanisms of ATP synthesis and muscle contraction with experiment is assessed, and the novel insights of the unified theory are shown to take us beyond the binding change mechanism, the chemiosmotic theory and the lever arm model. It is shown from first principles how previous theories of ATP synthesis and muscle contraction violate both the first and second laws of thermodynamics, necessitating their revision. It is concluded that the new paradigm, ten years after making its first appearance, is now perfectly poised to replace the older theories. Finally, applications of the unified theory in cell life and cell death are outlined and prospects for future research are explored. While it is impossible to cover each and every specific aspect of the above, an attempt has been made here to address all the pertinent details and what is presented should be sufficient to convince the reader of the novelty, originality, breakthrough nature and power of the unified theory, its manifold fundamental consequences and mechanistic implications, and its applications in health and disease.

  20. Aligning the Measurement of Microbial Diversity with Macroecological Theory

    Energy Technology Data Exchange (ETDEWEB)

    Stegen, James C.; Hurlbert, Allen H.; Bond-Lamberty, Ben; Chen, Xingyuan; Anderson, Carolyn G.; Chu, Rosalie K.; Dini-Andreote, Francisco; Fansler, Sarah J.; Hess, Nancy J.; Tfaily, Malak

    2016-09-23

    The number of microbial operational taxonomic units (OTUs) within a community is akin to species richness within plant/animal (‘macrobial’) systems. A large literature documents OTU richness patterns, drawing comparisons to macrobial theory. There is, however, an unrecognized fundamental disconnect between OTU richness and macrobial theory: OTU richness is commonly estimated on a per-individual basis, while macrobial richness is estimated per-area. Furthermore, the range or extent of sampled environmental conditions can strongly influence a study’s outcomes and conclusions, but this is not commonly addressed when studying OTU richness. Here we (i) propose a new sampling approach that estimates OTU richness per-mass of soil, which results in strong support for species energy theory, (ii) use data reduction to show how support for niche conservatism emerges when sampling across a restricted range of environmental conditions, and (iii) show how additional insights into drivers of OTU richness can be generated by combining different sampling methods while simultaneously considering patterns that emerge by restricting the range of environmental conditions. We propose that a more rigorous connection between microbial ecology and macrobial theory can be facilitated by exploring how changes in OTU richness units and environmental extent influence outcomes of data analysis. While fundamental differences between microbial and macrobial systems persist (e.g., species concepts), we suggest that closer attention to units and scale provide tangible and immediate improvements to our understanding of the processes governing OTU richness and how those processes relate to drivers of macrobial species richness.

  1. On the theory of SODAR measurement techniques

    DEFF Research Database (Denmark)

    Antoniou, I.; Ejsing Jørgensen, Hans; Bradley, S.

    2003-01-01

    The SODAR offers the possibility of measuring both the wind speed distribution with height and the wind direction. At the same time, the SODAR presents a number of serious drawbacks, such as the low number of measurements per time period, the dependence of the ability to measure on the atmospheric conditions, and the difficulty of measuring at higher wind speeds due to either background noise or the neutral condition of the atmosphere. Within the WISE project (EU project number NNE5-2001-297), a number of work packages have been defined in order to deal with the SODAR. The present report is the result of work package 1. Within this package...

  2. Quantum theory of successive projective measurements

    International Nuclear Information System (INIS)

    Johansen, Lars M.

    2007-01-01

    We show that a quantum state may be represented as the sum of a joint probability and a complex quantum modification term. The joint probability and the modification term can both be observed in successive projective measurements. The complex modification term is a measure of measurement disturbance. A selective phase rotation is needed to obtain the imaginary part. This leads to a complex quasiprobability: The Kirkwood distribution. We show that the Kirkwood distribution contains full information about the state if the two observables are maximal and complementary. The Kirkwood distribution gives another picture of state reduction. In a nonselective measurement, the modification term vanishes. A selective measurement leads to a quantum state as a non-negative conditional probability. We demonstrate the special significance of the Schwinger basis
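
    For reference, the Kirkwood quasiprobability mentioned above is conventionally written, for two orthonormal bases {|a>} and {|b>} and a state rho (our notation), as

        K_{\rho}(a,b) = \langle b | a \rangle \, \langle a | \rho | b \rangle ,
        \qquad
        \sum_{b} K_{\rho}(a,b) = \langle a | \rho | a \rangle ,
        \quad
        \sum_{a} K_{\rho}(a,b) = \langle b | \rho | b \rangle ,

    so its marginals reproduce the ordinary outcome probabilities of the two projective measurements, while its generally complex values carry the measurement-disturbance information discussed in the abstract.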

  3. Elimination of Multiple Estimation for Fault Location in Radial Power Systems by Using Fundamental Single-End Measurements

    NARCIS (Netherlands)

    Morales-Espana, G.; Mora-Floréz, J.; Vargas-Torres, H.

    2009-01-01

    This paper presents a conceptual approach for eliminating the multiple estimation problem of impedance-based fault location methods applied to power distribution systems, using the available measurements of current and voltage fundamentals at the power substation. Three test systems are used to

  4. Fundamental frequency and voice perturbation measures in smokers and non-smokers: An acoustic and perceptual study

    Science.gov (United States)

    Freeman, Allison

    This research examined the fundamental frequency and perturbation (jitter % and shimmer %) measures in young adult (20-30 year-old) and middle-aged adult (40-55 year-old) smokers and non-smokers; there were 36 smokers and 36 non-smokers. Acoustic analysis was carried out utilizing one task: production of sustained /a/. These voice samples were analyzed utilizing Multi-Dimensional Voice Program (MDVP) software, which provided values for fundamental frequency, jitter %, and shimmer %. These values were analyzed for trends regarding smoking status, age, and gender. Statistical significance was found regarding the fundamental frequency, jitter %, and shimmer % for smokers as compared to non-smokers; smokers were found to have significantly lower fundamental frequency values, and significantly higher jitter % and shimmer % values. Statistical significance was not found regarding fundamental frequency, jitter %, and shimmer % for age group comparisons. With regard to gender, statistical significance was found regarding fundamental frequency; females were found to have statistically higher fundamental frequencies as compared to males. However, the relationships between gender and jitter % and shimmer % lacked statistical significance. These results indicate that smoking negatively affects voice quality. This study also examined the ability of untrained listeners to identify smokers and non-smokers based on their voices. Results of this voice perception task suggest that listeners are not accurately able to identify smokers and non-smokers, as statistical significance was not reached. However, despite a lack of significance, trends in data suggest that listeners are able to utilize voice quality to identify smokers and non-smokers.
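
    For readers unfamiliar with these perturbation measures, the sketch below shows one standard way of computing mean fundamental frequency, local jitter %, and local shimmer % from per-cycle periods and peak amplitudes. It is a minimal illustration under textbook definitions, not a reproduction of the MDVP algorithms, and the input values are hypothetical.

```python
# Minimal sketch: mean F0, local jitter (%), and local shimmer (%) from
# per-cycle periods (seconds) and per-cycle peak amplitudes (arbitrary units).
# Textbook "local" definitions; not the exact MDVP implementation.

def mean(values):
    return sum(values) / len(values)

def fundamental_frequency_hz(periods):
    """Mean F0 in Hz from per-cycle periods in seconds."""
    return 1.0 / mean(periods)

def local_jitter_percent(periods):
    """Mean absolute difference of consecutive periods, relative to the mean period."""
    diffs = [abs(a - b) for a, b in zip(periods, periods[1:])]
    return 100.0 * mean(diffs) / mean(periods)

def local_shimmer_percent(amplitudes):
    """The same definition applied to per-cycle peak amplitudes."""
    diffs = [abs(a - b) for a, b in zip(amplitudes, amplitudes[1:])]
    return 100.0 * mean(diffs) / mean(amplitudes)

if __name__ == "__main__":
    # Hypothetical sustained /a/ cycle data (not drawn from the study).
    periods = [0.00495, 0.00502, 0.00498, 0.00505, 0.00500]
    amplitudes = [0.81, 0.79, 0.82, 0.80, 0.78]
    print(round(fundamental_frequency_hz(periods), 1), "Hz")
    print(round(local_jitter_percent(periods), 2), "% jitter")
    print(round(local_shimmer_percent(amplitudes), 2), "% shimmer")
```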

  5. Transit time measures in scattering theory

    International Nuclear Information System (INIS)

    MacMillan, L.W.; Osborn, T.A.

    1980-01-01

    This paper studies the time evolution of state vectors that are the solutions of the time-dependent Schroedinger equation, characterized by a Hamiltonian h. We employ trace-theorem methods to prove that the transit time of state vectors through a finite space region, Σ, may be used to construct a family in the energy variable, epsilon, of unique, positive, trace-class operators. The matrix elements of these operators give the transit time of any vector through Σ. It is proved that the trace of these operators, for a fixed energy epsilon, provides a function which simultaneously gives the sum of all orbital transit times through region Σ and represents the state density of all vectors that have support on Σ and energy epsilon. We use the transit-time operators to recover the usual theory of time delay for single-channel scattering systems. In the process we extend the known results on time delay to include scattering by fixed impurities in a periodic medium
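
    For orientation (our gloss on the standard construction, not a formula quoted from the paper), the transit or sojourn time of a state psi through the region Σ is usually the time integral of the probability of finding the evolving state inside Σ,

```latex
\[
  T_{\Sigma}(\psi) \;=\; \int_{-\infty}^{\infty} \mathrm{d}t
  \int_{\Sigma} \bigl| \bigl( e^{-iht}\psi \bigr)(x) \bigr|^{2} \, \mathrm{d}^{3}x ,
\]
```

    and the energy-resolved, trace-class operators described above can be read as decomposing this quantity over the energy variable epsilon.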

  6. Measuring Theory of Mind in Adults with Autism Spectrum Disorder

    Science.gov (United States)

    Brewer, Neil; Young, Robyn L.; Barnett, Emily

    2017-01-01

    Deficits in Theory of Mind (ToM)--the ability to interpret others' beliefs, intentions and emotions--undermine the ability of individuals with Autism Spectrum Disorder (ASD) to interact in socially normative ways. This study provides psychometric data for the Adult-Theory of Mind (A-ToM) measure using video-scenarios based in part on Happé's…

  7. MEASUREMENT OF FINANCIAL REPORTING QUALITY BASED ON IFRS CONCEPTUAL FRAMEWORK’S FUNDAMENTAL QUALITATIVE CHARACTERISTICS

    OpenAIRE

    Alexios KYTHREOTIS

    2014-01-01

    The IASB creates the standards and the conceptual framework in an attempt to create higher quality financial statements. Through this article, the extent to which this objective has been achieved is examined. An important characteristic of this research is the fact that the quality of financial statements is examined in light of the Conceptual Framework. Specifically, the two fundamental qualitative characteristics - relevance and faithful representation (reliability) - set by the IAS Committ...

  8. On music performance, theories, measurement and diversity

    NARCIS (Netherlands)

    Timmers, R.; Honing, H.J.

    2002-01-01

    Measurement of musical performances is of interest to studies in musicology, music psychology and music performance practice, but in general it has not been considered the main issue: when analyzing Western classical music, these disciplines usually focus on the score rather than the performance.

  9. Team synergies in sport: Theory and measures

    Directory of Open Access Journals (Sweden)

    Duarte Araújo

    2016-09-01

    Full Text Available Individual players act as a coherent unit during team sports performance, forming a team synergy. A synergy is a collective property of a task-specific organization of individuals, such that the degrees of freedom of each individual in the system are coupled, enabling the degrees of freedom of different individuals to co-regulate each other. Here, we present an explanation for the emergence of such collective behaviors, indicating how these can be assessed and understood through the measurement of key system properties that exist considering the contribution of each individual and beyond. These include: (i) dimensional compression, a process resulting in independent degrees of freedom becoming coupled so that the synergy has fewer degrees of freedom than the set of components from which it arises; (ii) reciprocal compensation, whereby if one element does not produce its function, other elements should display changes in their contributions so that task goals are still attained; (iii) interpersonal linkages, the specific contribution of each element to a group task; and (iv) degeneracy, structurally different components performing a similar, but not necessarily identical, function with respect to context. A primary goal of our analysis is to highlight the principles and tools required to understand coherent and dynamic team behaviors, as well as the performance conditions that make such team synergies possible, through perceptual attunement to shared affordances in individual performers. A key conclusion is that teams can be trained to perceive how to use and share specific affordances, explaining how individuals' behaviours self-organize into a group synergy. Ecological dynamics explanations of team behaviors can transit beyond mere ratification of sport performance, providing a comprehensive conceptual framework to guide the implementation of diagnostic measures by sport scientists, sport psychologists and performance analysts.

  10. Team Synergies in Sport: Theory and Measures

    Science.gov (United States)

    Araújo, Duarte; Davids, Keith

    2016-01-01

    Individual players act as a coherent unit during team sports performance, forming a team synergy. A synergy is a collective property of a task-specific organization of individuals, such that the degrees of freedom of each individual in the system are coupled, enabling the degrees of freedom of different individuals to co-regulate each other. Here, we present an explanation for the emergence of such collective behaviors, indicating how these can be assessed and understood through the measurement of key system properties that exist considering the contribution of each individual and beyond. These include: (i) dimensional compression, a process resulting in independent degrees of freedom becoming coupled so that the synergy has fewer degrees of freedom than the set of components from which it arises; (ii) reciprocal compensation, whereby if one element does not produce its function, other elements should display changes in their contributions so that task goals are still attained; (iii) interpersonal linkages, the specific contribution of each element to a group task; and (iv) degeneracy, structurally different components performing a similar, but not necessarily identical, function with respect to context. A primary goal of our analysis is to highlight the principles and tools required to understand coherent and dynamic team behaviors, as well as the performance conditions that make such team synergies possible, through perceptual attunement to shared affordances in individual performers. A key conclusion is that teams can be trained to perceive how to use and share specific affordances, explaining how individuals' behaviors self-organize into a group synergy. Ecological dynamics explanations of team behaviors can transit beyond mere ratification of sport performance, providing a comprehensive conceptual framework to guide the implementation of diagnostic measures by sport scientists, sport psychologists and performance analysts. Complex adaptive systems, synergies, group

  11. Fundamentals of differential beamforming

    CERN Document Server

    Benesty, Jacob; Pan, Chao

    2016-01-01

    This book provides a systematic study of the fundamental theory and methods of beamforming with differential microphone arrays (DMAs), or differential beamforming in short. It begins with a brief overview of differential beamforming and some popularly used DMA beampatterns such as the dipole, cardioid, hypercardioid, and supercardioid, before providing essential background knowledge on orthogonal functions and orthogonal polynomials, which form the basis of differential beamforming. From a physical perspective, a DMA of a given order is defined as an array that measures the differential acoustic pressure field of that order; such an array has a beampattern in the form of a polynomial whose degree is equal to the DMA order. Therefore, the fundamental and core problem of differential beamforming boils down to the design of beampatterns with orthogonal polynomials. But certain constraints also have to be considered so that the resulting beamformer does not seriously amplify the sensors’ self noise and the mism...

  12. Clinical outcome measurement: Models, theory, psychometrics and practice.

    Science.gov (United States)

    McClimans, Leah; Browne, John; Cano, Stefan

    In the last decade much has been made of the role that models play in the epistemology of measurement. Specifically, philosophers have been interested in the role of models in producing measurement outcomes. This discussion has proceeded largely within the context of the physical sciences, with notable exceptions considering measurement in economics. However, models also play a central role in the methods used to develop instruments that purport to quantify psychological phenomena. These methods fall under the umbrella term 'psychometrics'. In this paper, we focus on Clinical Outcome Assessments (COAs) and discuss two measurement theories and their associated models: Classical Test Theory (CTT) and Rasch Measurement Theory. We argue that models have an important role to play in coordinating theoretical terms with empirical content, but to do so they must serve: 1) as a representation of the measurement interaction; and 2) in conjunction with a theory of the attribute in which we are interested. We conclude that Rasch Measurement Theory is a more promising approach than CTT in these regards despite the latter's popularity with health outcomes researchers. Copyright © 2017. Published by Elsevier Ltd.
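
    To make the Rasch side of the comparison concrete, the sketch below gives the item response function of the dichotomous Rasch model: the probability that a respondent of ability theta answers (or endorses) an item of difficulty b correctly. This is generic illustrative code, not anything from the paper; the variable names are ours.

```python
import math

def rasch_probability(theta, b):
    """Dichotomous Rasch model: P(positive response | ability theta, item difficulty b)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A respondent slightly above an item's difficulty responds positively more often than not.
print(round(rasch_probability(theta=0.5, b=0.0), 3))   # approximately 0.622
print(round(rasch_probability(theta=-1.0, b=0.0), 3))  # approximately 0.269
```

    The property exploited by Rasch Measurement Theory is that person and item parameters enter only through their difference, which is a large part of its measurement-theoretic appeal relative to CTT.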

  13. Particle Size Measurements From the First Fundamentals of Ice Crystal Icing Physics Test in the NASA Propulsion Systems Laboratory

    Science.gov (United States)

    King, Michael C.; Bachalo, William; Kurek, Andrzej

    2017-01-01

    This paper presents particle measurements by the Artium Technologies, Inc. Phase Doppler Interferometer and High Speed Imaging instruments from the first Fundamental Ice Crystal Icing Physics test conducted in the NASA Propulsion Systems Laboratory. The work focuses on humidity sweeps at a larger and a smaller median volumetric diameter. The particle size distribution, number density, and water content measured by the Phase Doppler Interferometer and High Speed Imaging instruments from the sweeps are presented and compared. The current capability for these two instruments to measure and discriminate ICI conditions is examined.

  14. Personality theory, abnormal psychology, and psychological measurement. A psychological behaviorism.

    Science.gov (United States)

    Staats, A W

    1993-01-01

    Behaviorism, because it has not had a theory of personality, has been separated from the rest of psychology, unable in large part to draw from or contribute to it. Traditional psychology has not had a theory of personality that says what personality is, how it comes about, or how it functions. An antagonism has resulted that weakens rather than complements each tradition. Psychological behaviorism presents a new type of theory of personality. Derived from experimentation, it is constructed from basic theories of emotion, language, and sensory-motor behavior. It says personality is composed of learned basic behavioral repertoires (BBRs) that affect behavior. Personality measurement instruments are analyzed in terms of the BBRs, beginning the behaviorization of this field and calling for much additional research. These multilevel developments are then basic in psychological behaviorism's theory of abnormal behavior and of clinical treatment. The approach opens many new avenues of empirical and theoretical work.

  15. The perturbation theory in the fundamental mode. Its application to the analysis of neutronic experiments involving small amounts of materials in fast neutron multiplying media

    International Nuclear Information System (INIS)

    Remsak, Stanislav.

    1975-01-01

    The formalism of the perturbation theory at the first order is developed in its simplest form: diffusion theory in the fundamental mode, and then the more complex formalism of the transport theory in the fundamental mode. A comparison shows the effect of the angular correlation between the fine structures of the flux and its adjoint function, the difference in the treatment of neutron leakage phenomena, and the existence of new terms in the perturbation formula, entailing a reactivity representation in the diffusion theory that is not quite exact. Problems of using the formalism developed are considered: application of the multigroup formalism, transients of the flux and its adjoint function, validity of the first order approximation, etc. A detailed analysis allows the formulation of a criterion specifying the validity range. Transients occurring in the reference medium are also treated. A set of numerical tests for determining a method of elimination of transient effects is presented. Some differential experiments are then discussed: sodium blowdown in enriched uranium or plutonium cores, experiments utilizing some structural materials (iron and oxygen) and plutonium sample oscillations. The Cadarache version II program was systematically used but the analysis of the experiments of plutonium sample oscillation in Ermine required the Cadarache version III program [fr

  16. Measurement Invariance: A Foundational Principle for Quantitative Theory Building

    Science.gov (United States)

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    This article describes why measurement invariance is a critical issue to quantitative theory building within the field of human resource development. Readers will learn what measurement invariance is and how to test for its presence using techniques that are accessible to applied researchers. Using data from a LibQUAL+[TM] study of user…

  17. Theory and Application of an Economic Performance Measure of Risk

    NARCIS (Netherlands)

    C. Niu (Cuizhen); X. Guo (Xu); M.J. McAleer (Michael); W.-K. Wong (Wing-Keung)

    2017-01-01

    textabstractHomm and Pigorsch (2012a) use the Aumann and Serrano index to develop a new economic performance measure (EPM), which is well known to have advantages over other measures. In this paper, we extend the theory by constructing a one-sample confidence interval of EPM, and construct

  18. Complexity measurement based on information theory and kolmogorov complexity.

    Science.gov (United States)

    Lui, Leong Ting; Terrazas, Germán; Zenil, Hector; Alexander, Cameron; Krasnogor, Natalio

    2015-01-01

    In the past decades many definitions of complexity have been proposed. Most of these definitions are based either on Shannon's information theory or on Kolmogorov complexity; these two are often compared, but very few studies integrate the two ideas. In this article we introduce a new measure of complexity that builds on both of these theories. As a demonstration of the concept, the technique is applied to elementary cellular automata and simulations of the self-organization of porphyrin molecules.
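
    As a rough illustration of the two ingredients named above, the sketch below computes (i) the Shannon entropy of a symbol string and (ii) a compression-based stand-in for its Kolmogorov complexity. This is our own toy example, not the measure proposed by the authors; note that the perfectly regular string has maximal per-symbol entropy yet compresses very well, which is exactly why the two notions complement each other.

```python
import math
import random
import zlib
from collections import Counter

def shannon_entropy_bits_per_symbol(s):
    """Shannon entropy of the symbol frequencies in s, in bits per symbol."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compressed_size_bytes(s):
    """Crude upper bound on Kolmogorov complexity: zlib-compressed size in bytes."""
    return len(zlib.compress(s.encode("utf-8"), 9))

if __name__ == "__main__":
    random.seed(0)
    ordered = "01" * 128                                      # highly regular string
    noisy = "".join(random.choice("01") for _ in range(256))  # irregular string
    for name, s in [("ordered", ordered), ("noisy", noisy)]:
        print(name,
              round(shannon_entropy_bits_per_symbol(s), 3), "bits/symbol,",
              compressed_size_bytes(s), "bytes compressed")
```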

  19. Fundamentals of overlay measurement and inspection using scanning electron-microscope

    Science.gov (United States)

    Kato, T.; Okagawa, Y.; Inoue, O.; Arai, K.; Yamaguchi, S.

    2013-04-01

    Scanning electron-microscope (SEM) has been successfully applied to CD measurement as a promising tool for qualifying and controlling the quality of semiconductor devices in the in-line manufacturing process since 1985. Furthermore, SEM is proposed for in-die overlay monitoring in local areas that are too small to be measured by optical overlay measurement tools, now that the overlay control limit is becoming more stringent and shows non-negligible dependence on device pattern layout, in-die location, singular locations at the wafer edge, etc. In this paper, we propose a new overlay measurement and inspection system that makes effective use of in-line SEM images, in consideration of the trade-off between measurement uncertainty and measurement pattern density under each SEM condition. In parallel, we make it clear that the best hybrid overlay metrology comes from considering each tool's technology portfolio.

  20. Hydrogen-Helium Mixtures: Fundamental Measurements, Neutral Droplet Buoyancy, Evaporation, and Boiling

    Data.gov (United States)

    National Aeronautics and Space Administration — Research groups at the Marshall Space Flight Center (MSFC) and the Kennedy Space Center (KSC) have contacted our laboratory in need of experimental measurements for...

  1. Fundamental Attributes of the Theory of Consumption in the Work of Jean Baudrillard, Pierre Bourdieu, and George Ritzer

    Directory of Open Access Journals (Sweden)

    Sanja Stanić

    2016-07-01

    Full Text Available The paper presents elements of the theories of J. Baudrillard, P. Bourdieu, and G. Ritzer that are relevant for the context of the development of social thought on the phenomenon of consumption. Although the three authors held different perspectives on consumption, they shared the notion of an increasingly important role of consumption in the time and society in which they acted. In their work, consumption was analysed as a salient determinant of social life. The theories of those authors are presented here as dominant within the given periods of the development of social theory on consumption. Baudrillard recognised consumption as a new and important issue, and his criticism of the consumer society was far ahead of the time in which he wrote. In his view, consumer goods are signs and consumption is a type of language. In analogy with Marx's concept of means of production, Baudrillard proposed the concept of means of consumption as consumer sites that are a synthesis of abundance and calculation. Bourdieu constructed a class theory of consumption based on cultural practices. He considered consumer behaviour to be an expression of class position. Class is determined by its position in the system of differences or distinctions from other classes based on cultural practices, objects, and taste. In Ritzer's work, consumption becomes a powerful driving force of contemporary society, with the ultimate purpose of profit-making. He explained changes in contemporary society by the process of McDonaldization that has been increasingly spreading to various areas of social life. Transformations in the structures and interpersonal relationships in contemporary society were explained by Ritzer as a change from the old to the new means of consumption. The concluding part of the paper provides an overview of the three authors' theories in the context of consumer society, emphasising their contribution to the body of theoretical analyses of the phenomenon of

  2. Passage from the fundamental tensor gsub(μv) of the gravitation theory to the field structure gsub(μv) of the unified theories

    International Nuclear Information System (INIS)

    Rao, J.R.; Tiwari, R.N.

    1974-01-01

    A theorem on obtaining exact solutions for a particular field structure from those of the vacuum field equations of the general theory, as well as from some simpler solutions of unified theories, is derived. With the help of this result the most general solution for the particular field structure is developed from the already known simpler solutions. The physical implications of this theorem in relation to some of the parallel work of other authors are discussed. (author)

  3. Measurement Conversions: English and Metric Systems. Fundamentals of Occupational Mathematics. Module 9.

    Science.gov (United States)

    Engelbrecht, Nancy; And Others

    This module is the ninth in a series of 12 learning modules designed to teach occupational mathematics. Blocks of informative material and rules are followed by examples and practice problems. The solutions to the practice problems are found at the end of the module. Specific topics covered include measurement conversions, the English system of…

  4. A novel quantitative histochemical assay to measure endogenous substrate concentrations in tissue sections. Fundamental aspects

    NARCIS (Netherlands)

    Koopdonk-Kool, J. M.; van Noorden, C. J.

    1995-01-01

    A quantitative histochemical assay has been developed for measurement of endogenous substrate concentrations in cryostat sections using a colorimetric visualization technique. Model sections of frozen gelatin solutions with known concentrations of glucose-6-phosphate (G6P) were sandwiched with a

  5. a Global Shear Velocity Model of the Upper Mantle from New Fundamental and Higher Rayleigh Mode Measurements

    Science.gov (United States)

    Debayle, E.; Ricard, Y. R.

    2011-12-01

    We present a global SV-wave tomographic model of the upper mantle, built from a new dataset of fundamental and higher mode Rayleigh waveforms. We use an extension of the automated waveform inversion approach of Debayle (1999) designed to improve the extraction of fundamental and higher mode information from a single surface wave seismogram. The improvement is shown to be significant in the transition zone structure, which is constrained by the higher modes. The new approach is fully automated and can be run on a Beowulf computer to process massive surface wave datasets. It has been used to match successfully over 350 000 fundamental and higher mode Rayleigh waveforms, corresponding to about 20 million new measurements extracted from the seismograms. For each seismogram, we obtain a path average shear velocity and quality factor model, and a set of fundamental and higher mode dispersion and attenuation curves compatible with the recorded waveform. The set of dispersion curves provides a global database for future finite frequency inversion. Our new 3D SV-wave tomographic model takes into account the effect of azimuthal anisotropy and is constrained with a lateral resolution of several hundred kilometers and a vertical resolution of a few tens of kilometers. In the uppermost 200 km, our model shows a very strong correlation with surface tectonics. The slow velocity signature of mid-oceanic ridges extends down to ~100 km depth while the high velocity signature of cratons vanishes below 200 km depth. At depths greater than 400 km, the pattern of seismic velocities appears relatively homogeneous at large scale, except for high velocity slabs which produce broad high velocity regions within the transition zone. Although resolution is still good, the region between 200 and 400 km is associated with a complex pattern of seismic heterogeneities showing no simple correlation with the shallower or deeper structure.

  6. Dependence and Fundamentality

    Directory of Open Access Journals (Sweden)

    Justin Zylstra

    2014-12-01

    Full Text Available I argue that dependence is neither necessary nor sufficient for relative fundamentality. I then introduce the notion of 'likeness in nature' and provide an account of relative fundamentality in terms of it and the notion of dependence. Finally, I discuss some puzzles that arise in Aristotle's Categories, to which the theory developed is applied.

  7. Nonperturbative theory of weak pre- and post-selected measurements

    Energy Technology Data Exchange (ETDEWEB)

    Kofman, Abraham G., E-mail: kofmana@gmail.com; Ashhab, Sahel; Nori, Franco

    2012-11-01

    This paper starts with a brief review of the topic of strong and weak pre- and post-selected (PPS) quantum measurements, as well as weak values, and afterwards presents original work. In particular, we develop a nonperturbative theory of weak PPS measurements of an arbitrary system with an arbitrary meter, for arbitrary initial states of the system and the meter. New and simple analytical formulas are obtained for the average and the distribution of the meter pointer variable. These formulas hold to all orders in the weak value. In the case of a mixed preselected state, in addition to the standard weak value, an associated weak value is required to describe weak PPS measurements. In the linear regime, the theory provides the generalized Aharonov–Albert–Vaidman formula. Moreover, we reveal two new regimes of weak PPS measurements: the strongly-nonlinear regime and the inverted region (the regime with a very large weak value), where the system-dependent contribution to the pointer deflection decreases with increasing measurement strength. The optimal conditions for weak PPS measurements are obtained in the strongly-nonlinear regime, where the magnitude of the average pointer deflection is equal to or close to the maximum. This maximum is independent of the measurement strength, being typically of the order of the pointer uncertainty. In the optimal regime, the small parameter of the theory is comparable to the overlap of the pre- and post-selected states. We show that the amplification coefficient in the weak PPS measurements is generally a product of two qualitatively different factors. The effects of the free system and meter Hamiltonians are discussed. We also estimate the size of the ensemble required for a measurement and identify optimal and efficient meters for weak measurements. Exact solutions are obtained for a certain class of measured observables. These solutions are used for numerical calculations, the results of which agree with the theory
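
    For orientation (standard definitions, not results of this paper): with a preselected state, a postselected state, and a measured observable, the conventional weak value that enters the linear-regime (Aharonov-Albert-Vaidman) pointer shift is

```latex
\[
  A_{w} \;=\; \frac{\langle f \mid \hat{A} \mid i \rangle}{\langle f \mid i \rangle} ,
\]
```

    with the mean pointer deflection proportional to its real or imaginary part depending on which pointer variable is read out; the nonperturbative treatment summarized above keeps all orders in this quantity rather than truncating at the linear term.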

  8. On the Interpretation of Measurement Within the Quantum Theory

    Science.gov (United States)

    Cooper, Leon N.; Van Vechten, Deborah

    1969-01-01

    An interpretation of the process of measurement is proposed which can be placed wholly within the quantum theory. The entire system, including the apparatus and even the mind of the observer, can be considered to develop according to the Schrödinger equation. (RR)

  9. The Tevatron tune tracker pll - theory, implementation and measurements

    Energy Technology Data Exchange (ETDEWEB)

    Tan, Cheng-Yang; /Fermilab

    2004-12-01

    The Tevatron tune tracker is based on the idea that the transverse phase response of the beam can be measured quickly and accurately enough to allow us to track the betatron tune with a phase locked loop (PLL). The goal of this paper is to show the progress of the PLL project at Fermilab. We will divide this paper into three parts: theory, implementation and measurements. In the theory section, we will use a simple linear model to show that our design will track the betatron tune under conditions that occur in the Tevatron. In the implementation section we will break down and examine each part of the PLL and in some cases calculate the actual PLL parameters used in our system from beam measurements. And finally in the measurements section we will show the results of the PLL performance.

  10. Low-energy experiments that measure fundamental constants and test basic symmetries

    CERN Multimedia

    CERN. Geneva. Audiovisual Unit

    2002-01-01

    Cold Antihydrogen: Are We There? Cold antihydrogen offers the possibility to precisely compare the structure of antihydrogen and hydrogen atoms, using the well developed tools of laser spectroscopy with antihydrogen atoms cold enough to be trapped in the minimum of a magnetic field gradient. Progress made at CERN's new Antiproton Decelerator will be discussed, along with goals and aspirations, such as measuring the anti-Rydberg constant. ATRAP has observed and studied the interaction of low energy antiprotons and positrons for more than a year, and ATHENA hopes to soon make antiprotons and positrons to interact as well.

  11. Quantum dissipative systems from theory of continuous measurements

    International Nuclear Information System (INIS)

    Mensky, Michael B.; Stenholm, Stig

    2003-01-01

    We apply the restricted-path-integral (RPI) theory of non-minimally disturbing continuous measurements for a correct description of frictional Brownian motion. The resulting master equation is automatically of the Lindblad form, so that the difficulties typical of other approaches do not exist. In the special case of the harmonic oscillator, the familiar master equation describing its frictionally driven Brownian motion is obtained. A thermal reservoir as a measuring environment is considered

  12. Fundamental and biotechnological applications of neutron scattering measurements for macromolecular dynamics.

    Science.gov (United States)

    Tehei, Moeava; Daniel, Roy; Zaccai, Giuseppe

    2006-09-01

    To explore macromolecular dynamics on the picosecond timescale, we used neutron spectroscopy. First, molecular dynamics were analyzed for the hyperthermophile malate dehydrogenase from Methanococcus jannaschii and a mesophilic homologue, the lactate dehydrogenase from Oryctolagus cuniculus muscle. Hyperthermophiles have elaborate molecular mechanisms of adaptation to extremely high temperature. Using a novel elastic neutron scattering approach that provides independent measurements of the global flexibility and of the structural resilience (rigidity), we have demonstrated that macromolecular dynamics represents one of these molecular mechanisms of thermoadaptation. The flexibilities were found to be similar for both enzymes at their optimal activity temperature and the resilience is higher for the hyperthermophilic protein. Secondly, macromolecular motions were examined in a native and immobilized dihydrofolate reductase (DHFR) from Escherichia coli. The immobilized mesophilic enzyme has increased stability and decreased activity, so that its properties are changed to resemble those of the thermophilic enzyme. Are these changes reflected in dynamical behavior? For this study, we performed quasielastic neutron scattering measurements to probe the protein motions. The residence time is 7.95 ps for the native DHFR and 20.36 ps for the immobilized DHFR. The average height of the potential barrier to local motions is therefore increased in the immobilized DHFR, with a difference in activation energy equal to 0.54 kcal/mol, which is, using the theoretical rate equation, of the same order as expected from calculation.
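
    The quoted barrier difference can be checked against the two residence times with a simple Arrhenius-type estimate (our reading of the "theoretical rate equation"; the room-temperature value of kT used below is an assumption):

```latex
\[
  \Delta E \;\simeq\; k_{B}T \,
  \ln\!\left( \frac{\tau_{\mathrm{immobilized}}}{\tau_{\mathrm{native}}} \right)
  \;\approx\; \bigl( 0.59\ \mathrm{kcal\,mol^{-1}} \bigr)
  \ln\!\left( \frac{20.36}{7.95} \right)
  \;\approx\; 0.56\ \mathrm{kcal\,mol^{-1}} ,
\]
```

    of the same order as the 0.54 kcal/mol quoted above, with the small difference plausibly reflecting the actual measurement temperature and rounding.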

  13. High-Speed Photography and Digital Optical Measurement Techniques for Geomaterials: Fundamentals and Applications

    Science.gov (United States)

    Xing, H. Z.; Zhang, Q. B.; Braithwaite, C. H.; Pan, B.; Zhao, J.

    2017-06-01

    Geomaterials (i.e. rock, sand, soil and concrete) are increasingly being encountered and used in extreme environments, in terms of the pressure magnitude and the loading rate. Advancing the understanding of the mechanical response of materials to impact loading relies heavily on having suitable high-speed diagnostics. One such diagnostic is high-speed photography, which combined with a variety of digital optical measurement techniques can provide detailed insights into phenomena including fracture, impact, fragmentation and penetration in geological materials. This review begins with a brief history of high-speed imaging. Section 2 discusses the current state of the art of high-speed cameras, which includes a comparison between charge-coupled device and complementary metal-oxide semiconductor sensors. The application of high-speed photography to geomechanical experiments is summarized in Sect. 3. Section 4 is concerned with digital optical measurement techniques including photoelastic coating, Moiré, caustics, holographic interferometry, particle image velocimetry, digital image correlation and infrared thermography, in combination with high-speed photography to capture transient phenomena. The last section provides a brief summary and discussion of future directions in the field.

  14. Addressing the Lack of Measurement Invariance for the Measure of Acceptance of the Theory of Evolution

    Science.gov (United States)

    Wagler, Amy; Wagler, Ron

    2013-01-01

    The Measure of Acceptance of the Theory of Evolution (MATE) was constructed to be a single-factor instrument that assesses an individual's overall acceptance of evolutionary theory. The MATE was validated and the scores resulting from the MATE were found to be reliable for the population of inservice high school biology teachers. However, many…

  15. Readability of Orthopaedic Patient-reported Outcome Measures: Is There a Fundamental Failure to Communicate?

    Science.gov (United States)

    Perez, Jorge L; Mosher, Zachary A; Watson, Shawna L; Sheppard, Evan D; Brabston, Eugene W; McGwin, Gerald; Ponce, Brent A

    2017-08-01

    Patient-reported outcome measures (PROMs) are increasingly used to quantify patients' perceptions of functional ability. The American Medical Association and NIH suggest patient materials be written at or below 6th to 8th grade reading levels, respectively, yet one recent study asserts that few PROMs comply with these recommendations, and suggests that the majority of PROMs are written at too high of a reading level for self-administered patient use. Notably, this study was limited in its use of only one readability algorithm, although there is no commonly accepted, standard readability algorithm for healthcare-related materials. Our study, using multiple readability equations and giving equal weight to each, hopes to yield a broader, all-encompassing estimate of readability, thereby offering a more accurate assessment of the readability of orthopaedic PROMs. (1) What proportion of orthopaedic-related PROMs and orthopaedic-related portions of the NIH Patient Reported Outcomes Measurement Information System (PROMIS ® ) are written at or below the 6th and 8th grade levels? (2) Is there a correlation between the number of questions in the PROM and reading level? (3) Using systematic edits based on guidelines from the Centers for Medicare and Medicaid Services, what proportion of PROMs achieved American Medical Association and NIH-recommended reading levels? Eighty-six (86) independent, orthopaedic and general wellness PROMs, drawn from commonly referenced orthopaedic websites and prior studies, were chosen for analysis. Additionally, owing to their increasing use in orthopaedics, four relevant short forms, and 11 adult, physical health question banks from the PROMIS ® , were included for analysis. All documents were analyzed for reading grade levels using 19 unique readability algorithms. Descriptive statistics were performed using SPSS Version 22.0. The majority of the independent PROMs (64 of 86; 74%) were written at or below the 6th grade level, with 81 of 86
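
    Of the many readability algorithms alluded to above, one of the most commonly used is the Flesch-Kincaid grade level. The sketch below shows that formula with a deliberately crude syllable counter; it is illustrative only, not one of the study's 19 algorithm implementations, and the example item is invented rather than taken from any PROM.

```python
import re

def count_syllables(word):
    """Very crude vowel-group syllable counter (illustrative only)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    """Flesch-Kincaid grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

# Invented PROM-style item, not drawn from any instrument in the study.
item = "Do you have trouble climbing one flight of stairs? Do your knees hurt when you walk?"
print(round(flesch_kincaid_grade(item), 1))
```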

  16. Review of the fundamental theories behind small angle X-ray scattering, molecular dynamics simulations, and relevant integrated application

    Directory of Open Access Journals (Sweden)

    Lauren Boldon

    2015-02-01

    Full Text Available In this paper, the fundamental concepts and equations necessary for performing small angle X-ray scattering (SAXS experiments, molecular dynamics (MD simulations, and MD-SAXS analyses were reviewed. Furthermore, several key biological and non-biological applications for SAXS, MD, and MD-SAXS are presented in this review; however, this article does not cover all possible applications. SAXS is an experimental technique used for the analysis of a wide variety of biological and non-biological structures. SAXS utilizes spherical averaging to produce one- or two-dimensional intensity profiles, from which structural data may be extracted. MD simulation is a computer simulation technique that is used to model complex biological and non-biological systems at the atomic level. MD simulations apply classical Newtonian mechanics’ equations of motion to perform force calculations and to predict the theoretical physical properties of the system. This review presents several applications that highlight the ability of both SAXS and MD to study protein folding and function in addition to non-biological applications, such as the study of mechanical, electrical, and structural properties of non-biological nanoparticles. Lastly, the potential benefits of combining SAXS and MD simulations for the study of both biological and non-biological systems are demonstrated through the presentation of several examples that combine the two techniques.

  17. The calculus of variations on jet bundles as a universal approach for a variational formulation of fundamental physical theories

    Directory of Open Access Journals (Sweden)

    Musilová Jana

    2016-12-01

    Full Text Available As widely accepted, justified by the historical developments of physics, the background for the standard formulation of postulates of physical theories leading to equations of motion, or even the form of equations of motion themselves, comes from empirical experience. Equations of motion are then a starting point for obtaining specific conservation laws, as, for example, the well-known conservation laws of momenta and mechanical energy in mechanics. On the other hand, there are numerous examples of physical laws or equations of motion which can be obtained from a certain variational principle as Euler-Lagrange equations and their solutions, meaning that the "true trajectories" of the physical systems represent stationary points of the corresponding functionals.
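
    For readers who want the local form behind that statement: for a first-order mechanical Lagrangian depending on time, coordinates and velocities, the stationary ("true") trajectories are the solutions of the Euler-Lagrange equations

```latex
\[
  \frac{\partial L}{\partial q^{\sigma}}
  \;-\; \frac{\mathrm{d}}{\mathrm{d}t} \frac{\partial L}{\partial \dot{q}^{\sigma}} \;=\; 0 ,
  \qquad \sigma = 1, \dots, m ,
\]
```

    which is the coordinate expression of the jet-bundle formulation discussed in the paper; for field theories the total time derivative is replaced by a divergence over all independent variables.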

  18. Assessment of Student Performance for Course Examination Using Rasch Measurement Model: A Case Study of Information Technology Fundamentals Course

    Directory of Open Access Journals (Sweden)

    Amir Mohamed Talib

    2018-01-01

    Full Text Available This paper describes a measurement model that is used to measure the student performance in the final examination of the Information Technology (IT) Fundamentals (IT280) course in the Information Technology (IT) Department, College of Computer & Information Sciences (CCIS), Al-Imam Mohammad Ibn Saud Islamic University (IMSIU). The assessment model is developed based on students' mark entries of final exam results for the second year IT students, which are compiled and tabulated for evaluation using the Rasch Measurement Model, and it can be used to measure the students' performance towards the final examination of the course. A study on 150 second year students (male = 52; female = 98) was conducted to measure students' knowledge and understanding for the IT280 course according to the three levels of Bloom's Taxonomy. The results concluded that students can be categorized as poor (10%), moderate (42%), good (18%), and successful (24%) in achieving Level 3 of Bloom's Taxonomy. This study shows that the students' performance for the set of IT280 final exam questions was comparatively good. The result generated from this study can be used to guide us to determine the appropriate improvement of teaching method and the quality of question prepared.

  19. Real analysis measure theory, integration, and Hilbert spaces

    CERN Document Server

    Stein, Elias M

    2005-01-01

    Real Analysis is the third volume in the Princeton Lectures in Analysis, a series of four textbooks that aim to present, in an integrated manner, the core areas of analysis. Here the focus is on the development of measure and integration theory, differentiation and integration, Hilbert spaces, and Hausdorff measure and fractals. This book reflects the objective of the series as a whole: to make plain the organic unity that exists between the various parts of the subject, and to illustrate the wide applicability of ideas of analysis to other fields of mathematics and science. After

  20. Educational measurement for applied researchers theory into practice

    CERN Document Server

    Wu, Margaret; Jen, Tsung-Hau

    2016-01-01

    This book is a valuable read for a diverse group of researchers and practitioners who analyze assessment data and construct test instruments. It focuses on the use of classical test theory (CTT) and item response theory (IRT), which are often required in the fields of psychology (e.g. for measuring psychological traits), health (e.g. for measuring the severity of disorders), and education (e.g. for measuring student performance), and makes these analytical tools accessible to a broader audience. Having taught assessment subjects to students from diverse backgrounds for a number of years, the three authors have a wealth of experience in presenting educational measurement topics, in-depth concepts and applications in an accessible format. As such, the book addresses the needs of readers who use CTT and IRT in their work but do not necessarily have an extensive mathematical background. The book also sheds light on common misconceptions in applying measurement models, and presents an integrated approach to differ...

  1. Fundamental limits of measurement in telecommunications: Experimental and modeling studies in a test optical network on proposal for the reform of telecommunication quantitations

    International Nuclear Information System (INIS)

    Egan, James; McMillan, Normal; Denieffe, David

    2011-01-01

    Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the area of chemical metrology for the field of telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits of measurement. These are the critical level, the detection limit and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures and discussion is given into the advantages of using these fundamental quantitations over existing methods.
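
    For context (textbook values, not taken from this record): for a Gaussian blank of standard deviation sigma_0 and the conventional 5% false-positive and false-negative rates, Currie's three limits are usually quoted as

```latex
\[
  L_{C} \approx 1.645\,\sigma_{0}, \qquad
  L_{D} \approx 3.29\,\sigma_{0}, \qquad
  L_{Q} = 10\,\sigma_{0} ,
\]
```

    the critical level, the detection limit, and the determination (quantification) limit respectively; the proposal above is to carry these definitions over from chemical metrology to telecommunication measurements.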

  2. Fundamental limits of measurement in telecommunications: Experimental and modeling studies in a test optical network on proposal for the reform of telecommunication quantitations

    Energy Technology Data Exchange (ETDEWEB)

    Egan, James; McMillan, Normal; Denieffe, David, E-mail: eganj@itcarlow.ie [IT Carlow (Ireland)

    2011-08-17

    Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the area of chemical metrology for the field of telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits of measurement. These are the critical level, the detection limit and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures and discussion is given into the advantages of using these fundamental quantitations over existing methods.

  3. INTERPRETATION OF FUNDAMENTAL RIGHTS: SOME THOUGHTS ON THE THEORY OF WEIGHTS AND THE PRINCIPLE OF PROPORTION IN THE WORK OF R. ALEXY

    Directory of Open Access Journals (Sweden)

    Nuria Belloso Martín

    2016-12-01

    Full Text Available The interpretation of fundamental rights offers a rich debate in relation to what may be the best criteria for interpretation. Robert Alexy's theory of weighting is analyzed. Some of the criticisms and objections that have been made regarding weighting and the principle of reasonableness are discussed: whether it is a rational procedure for the application of legal rules or a mere rhetorical subterfuge, useful to justify certain judicial decisions. This leads to a further problem: how constitutional courts apply principles by weighting, and whether they have the legitimacy to do so. Several authors have argued that weighting is nothing more than an arbitrary and Solomonic judgment and that therefore neither the judges nor the Constitutional Court have sufficient constitutional legitimacy to implement the principles by this procedure.

  4. Bulk measurements of messy chemistries are needed for a theory of the origins of life

    Science.gov (United States)

    Guttenberg, Nicholas; Virgo, Nathaniel; Chandru, Kuhan; Scharf, Caleb; Mamajanov, Irena

    2017-11-01

    A feature of many of the chemical systems plausibly involved in the origins of terrestrial life is that they are complex and messy, producing a wide range of compounds via a wide range of mechanisms. However, the fundamental behaviour of such systems is currently not well understood; we do not have the tools to make statistical predictions about such complex chemical networks. This is, in part, due to a lack of quantitative data from which such a theory could be built; specifically, functional measurements of messy chemical systems. Here, we propose that the pantheon of experimental approaches to the origins of life should be expanded to include the study of 'functional measurements': the direct study of bulk properties of chemical systems and their interactions with other compounds, the formation of structures and other behaviours, even in cases where the precise composition and mechanisms are unknown. This article is part of the themed issue 'Reconceptualizing the origins of life'.

  5. Density measurement using gamma radiation - theory and application

    International Nuclear Information System (INIS)

    Springer, E.K.

    1979-01-01

    There are still widespread uncertainties about the use and safety of gamma radiation in industry. This paper describes, by the example of radiometric density measurement, the theory of gamma radiation. The differences and advantages of both types of detectors, the ionization chamber and the scintillation counter, are discussed. The degree of accuracy which can be expected from the radiometric density meter will be defined, and the inter-relationship of source strength, measuring range and measuring length (normally the pipe diameter) in relation to the required measuring accuracy will be explained in detail. The use of radioactive material requires the permission of the Atomic Energy Board. The formalities involved in receiving a user's licence and the implementation of safety standards set by the local authorities are discussed in depth [af
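
    The relation underlying such a gauge is ordinary exponential attenuation. The sketch below (our illustration, with assumed numbers roughly representative of a Cs-137 source and a water-like fluid) inverts the Beer-Lambert law for density at a fixed path length and mass attenuation coefficient.

```python
import math

def density_from_attenuation(i_transmitted, i_incident, mu_mass, path_length_cm):
    """
    Invert Beer-Lambert attenuation I = I0 * exp(-mu_mass * rho * L) for the
    density rho in g/cm^3.

    mu_mass        -- mass attenuation coefficient of the fluid (cm^2/g), assumed known
    path_length_cm -- absorber thickness, e.g. the pipe inner diameter (cm)
    """
    return math.log(i_incident / i_transmitted) / (mu_mass * path_length_cm)

# Assumed example values (hypothetical installation):
mu_mass = 0.0857        # cm^2/g, roughly water at 662 keV
pipe_diameter = 20.0    # cm
i_empty = 100000.0      # count rate with empty pipe
i_full = 18000.0        # count rate with process fluid
print(round(density_from_attenuation(i_full, i_empty, mu_mass, pipe_diameter), 3), "g/cm^3")
```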

  6. A short course on measure and probability theories

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre

    2004-02-01

    This brief Introduction to Measure Theory, and its applications to Probabilities, corresponds to the lecture notes of a seminar series given at Sandia National Laboratories in Livermore, during the spring of 2003. The goal of these seminars was to provide a minimal background to Computational Combustion scientists interested in using more advanced stochastic concepts and methods, e.g., in the context of uncertainty quantification. Indeed, most mechanical engineering curricula do not provide students with formal training in the field of probability, and even less in measure theory. However, stochastic methods have been used more and more extensively in the past decade, and have provided more successful computational tools. Scientists at the Combustion Research Facility of Sandia National Laboratories have been using computational stochastic methods for years. Addressing more and more complex applications, and facing difficult problems that arose in applications, showed the need for a better understanding of theoretical foundations. This is why the seminar series was launched, and these notes summarize most of the concepts which have been discussed. The goal of the seminars was to bring a group of mechanical engineers and computational combustion scientists to a full understanding of N. Wiener's polynomial chaos theory. Therefore, these lecture notes are built along those lines, and are not intended to be exhaustive. In particular, the author welcomes any comments or criticisms.

  7. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    Energy Technology Data Exchange (ETDEWEB)

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

    to advance the fundamental understanding of the hydrodynamics by systematically investigating the effect of design and operating variables, to evaluate the reported dimensionless groups as scaling factors, and to establish a reliable scale-up methodology for the TRISO fuel particle spouted bed coaters based on hydrodynamic similarity via advanced measurement and computational techniques. An additional objective is to develop an on-line non-invasive measurement technique based on gamma ray densitometry (i.e. Nuclear Gauge Densitometry) that can be installed and used for coater process monitoring to ensure proper performance and operation and to facilitate the developed scale-up methodology. To achieve the objectives set for the project, the work will use optical probes and gamma ray computed tomography (CT) (for the measurements of solids/voidage holdup cross-sectional distribution and radial profiles along the bed height, spouted diameter, and fountain height) and radioactive particle tracking (RPT) (for the measurements of the 3D solids flow field, velocity, turbulent parameters, circulation time, solids lagrangian trajectories, and many other of spouted bed related hydrodynamic parameters). In addition, gas dynamic measurement techniques and pressure transducers will be utilized to complement the obtained information. The measurements obtained by these techniques will be used as benchmark data to evaluate and validate the computational fluid dynamic (CFD) models (two fluid model or discrete particle model) and their closures. The validated CFD models and closures will be used to facilitate the developed methodology for scale-up, design and hydrodynamic similarity. Successful execution of this work and the proposed tasks will advance the fundamental understanding of the coater flow field and quantify it for proper and safe design, scale-up, and performance. Such achievements will overcome the barriers to AGR applications and will help assure that the US maintains

  8. DOE Fundamentals Handbook: Electrical Science, Volume 1

    International Nuclear Information System (INIS)

    1992-06-01

    The Electrical Science Fundamentals Handbook was developed to assist nuclear facility operating contractors provide operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of electrical theory, terminology, and application. The handbook includes information on alternating current (AC) and direct current (DC) theory, circuits, motors, and generators; AC power and reactive components; batteries; AC and DC voltage regulators; transformers; and electrical test instruments and measuring devices. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility electrical equipment

  9. DOE Fundamentals Handbook: Electrical Science, Volume 4

    International Nuclear Information System (INIS)

    1992-06-01

    The Electrical Science Fundamentals Handbook was developed to assist nuclear facility operating contractors provide operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of electrical theory, terminology, and application. The handbook includes information on alternating current (AC) and direct current (DC) theory, circuits, motors, and generators; AC power and reactive components; batteries; AC and DC voltage regulators; transformers; and electrical test instruments and measuring devices. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility electrical equipment

  10. From Theory to Practice: Measuring end-of-life communication quality using multiple goals theory.

    Science.gov (United States)

    Van Scoy, L J; Scott, A M; Reading, J M; Chuang, C H; Chinchilli, V M; Levi, B H; Green, M J

    2017-05-01

    To describe how multiple goals theory can be used as a reliable and valid measure (i.e., coding scheme) of the quality of conversations about end-of-life issues. We analyzed 17 conversations in which 68 participants (mean age = 51 years) played a game that prompted discussion in response to open-ended questions about end-of-life issues. Conversations (mean duration = 91 min) were audio-recorded and transcribed. Communication quality was assessed by three coders who assigned numeric scores rating how well individuals accomplished task, relational, and identity goals in the conversation. The coding measure, which results in a quantifiable outcome, yielded strong reliability (intra-class correlation range = 0.73-0.89 and Cronbach's alpha range = 0.69-0.89 for each of the coded domains) and validity (using multilevel nonlinear modeling, we detected significant variability in scores between games for each of the coded domains, all p-values significant). The result is a theory-based measure of end-of-life conversation quality that is superior to other methods of measuring communication quality. Our description of the coding method enables researchers to adapt and apply this measure to communication interventions in other clinical contexts. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
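
    As a pointer to how reliability figures of this kind are computed, the sketch below implements Cronbach's alpha for a coders-by-conversations score matrix. The data are invented and the function is generic; it is not the authors' analysis code, and intra-class correlations would be computed separately.

```python
def cronbach_alpha(item_scores):
    """
    Cronbach's alpha for a list of 'items' (here: coders), each a list of scores
    over the same observations:
        alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).
    """
    def variance(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    k = len(item_scores)
    n = len(item_scores[0])
    sum_item_vars = sum(variance(item) for item in item_scores)
    totals = [sum(item[j] for item in item_scores) for j in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / variance(totals))

# Three hypothetical coders rating the same six conversations on a 1-5 scale.
coder_scores = [
    [4, 3, 5, 2, 4, 3],
    [4, 3, 4, 2, 5, 3],
    [5, 3, 4, 1, 4, 2],
]
print(round(cronbach_alpha(coder_scores), 2))
```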

  11. Understanding modern physics by symmetry. A new approach to the fundamental theories; Durch Symmetrie die moderne Physik verstehen. Ein neuer Zugang zu den fundamentalen Theorien

    Energy Technology Data Exchange (ETDEWEB)

    Schwichtenberg, Jakob

    2017-09-01

    The following topics are dealt with: Special relativity theory, theory of Lie groups, the Lagrangian formalism for field theories, quantum operators, quantum wave equations, the theory of interactions, quantum mechanics, quantum field theory, classical mechanics, electrodynamics. (HSI)

  12. Fundamentals of structural dynamics

    CERN Document Server

    Craig, Roy R

    2006-01-01

    From theory and fundamentals to the latest advances in computational and experimental modal analysis, this is the definitive, updated reference on structural dynamics.This edition updates Professor Craig's classic introduction to structural dynamics, which has been an invaluable resource for practicing engineers and a textbook for undergraduate and graduate courses in vibrations and/or structural dynamics. Along with comprehensive coverage of structural dynamics fundamentals, finite-element-based computational methods, and dynamic testing methods, this Second Edition includes new and e

  13. Quantum Measurement Theory in Gravitational-Wave Detectors

    Directory of Open Access Journals (Sweden)

    Stefan L. Danilishin

    2012-04-01

    The fast progress in improving the sensitivity of gravitational-wave detectors, which we have all witnessed in recent years, has propelled the scientific community to the point at which the quantum behavior of such immense measurement devices as kilometer-long interferometers starts to matter. The time when their sensitivity will be mainly limited by the quantum noise of light is around the corner, and finding ways to reduce it will become a necessity. Therefore, the primary goal we pursued in this review was to familiarize a broad spectrum of readers with the theory of quantum measurements in the very form it finds application in the area of gravitational-wave detection. We focus on how quantum noise arises in gravitational-wave interferometers and what limitations it imposes on the achievable sensitivity. We start from the very basic concepts and gradually advance to the general linear quantum measurement theory and its application to the calculation of quantum noise in the contemporary and planned interferometric detectors of gravitational radiation of the first and second generation. Special attention is paid to the concept of the Standard Quantum Limit and the methods of its surmounting.
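
    For reference, the free-mass Standard Quantum Limit mentioned above can be written out in a few lines. This is the standard textbook argument in assumed notation, not a quotation from the review, and the overall factor depends on whether one- or double-sided spectral densities are used:

        % displacement noise = measurement (shot) noise + back-action (radiation-pressure) noise
        S_x^{\mathrm{total}}(\Omega) \;=\; S_x(\Omega) + \frac{S_F(\Omega)}{m^2\Omega^4}
        \;\geq\; \frac{2\sqrt{S_x(\Omega)\,S_F(\Omega)}}{m\Omega^2}
        % with the measurement--back-action uncertainty relation  S_x(\Omega)\,S_F(\Omega) \geq \hbar^2/4 :
        \;\geq\; \frac{\hbar}{m\Omega^2} \;\equiv\; S_x^{\mathrm{SQL}}(\Omega)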

  14. Plasma scattering of electromagnetic radiation theory and measurement techniques

    CERN Document Server

    Froula, Dustin H; Luhmann, Neville C Jr; Sheffield, John

    2011-01-01

    This work presents one of the most powerful methods of plasma diagnosis in exquisite detail to guide researchers in the theory and measurement techniques of light scattering in plasmas. Light scattering in plasmas is essential in the research and development of fusion energy, environmental solutions, and electronics. Referred to as the "Bible" by researchers, the work encompasses fusion and industrial applications essential in plasma research. It is the only comprehensive resource specific to the plasma scattering technique. It provides a wide range of experimental examples and discussion of the

  15. Interpreting Measures of Fundamental Movement Skills and Their Relationship with Health-Related Physical Activity and Self-Concept

    Science.gov (United States)

    Jarvis, Stuart; Williams, Morgan; Rainer, Paul; Jones, Eleri Sian; Saunders, John; Mullen, Richard

    2018-01-01

    The aims of this study were to determine proficiency levels of fundamental movement skills using cluster analysis in a cohort of U.K. primary school children; and to further examine the relationships between fundamental movement skills proficiency and other key aspects of health-related physical activity behavior. Participants were 553 primary…

  16. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil

    2016-12-14

    This paper considers the problem of selecting a set of $k$ measurements from $n$ available sensor observations. The selected measurements should minimize a certain error function assessing the error in estimating a certain $m$ dimensional parameter vector. The exhaustive search inspecting each of the $\binom{n}{k}$ possible choices would require a very high computational complexity and as such is not practical for large $n$ and $k$. Alternative methods with low complexity have recently been investigated but their main drawbacks are that 1) they require perfect knowledge of the measurement matrix and 2) they need to be applied at the pace of change of the measurement matrix. To overcome these issues, we consider the asymptotic regime in which $k$, $n$ and $m$ grow large at the same pace. Tools from random matrix theory are then used to approximate in closed form the most important error measures that are commonly used. The asymptotic approximations are then leveraged to properly select $k$ measurements exhibiting low values for the asymptotic error measures. Two heuristic algorithms are proposed: the first one merely consists in applying the convex optimization artifice to the asymptotic error measure. The second algorithm is a low-complexity greedy algorithm that attempts to look for a sufficiently good solution for the original minimization problem. The greedy algorithm can be applied to both the exact and the asymptotic error measures and can thus be implemented in blind and channel-aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also presented and support the efficiency of the proposed blind methods in reaching the performance of channel-aware algorithms.
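
    A minimal sketch of the greedy, channel-aware variant described above (hypothetical code, not the authors' implementation; the error function is taken here to be the A-optimality proxy tr((A_S^T A_S)^{-1}) on the selected rows, with a small ridge term purely for numerical stability):

        import numpy as np

        def greedy_select(A, k, ridge=1e-6):
            """Greedily pick k of A's n rows (sensor observations) to approximately
            minimize tr((A_S^T A_S + ridge*I)^{-1}) over the selected set S."""
            n, m = A.shape
            selected, remaining = [], set(range(n))
            for _ in range(k):
                best_row, best_cost = None, np.inf
                for i in remaining:
                    rows = selected + [i]
                    G = A[rows].T @ A[rows] + ridge * np.eye(m)  # regularized Gram matrix
                    cost = np.trace(np.linalg.inv(G))            # A-optimality criterion
                    if cost < best_cost:
                        best_row, best_cost = i, cost
                selected.append(best_row)
                remaining.remove(best_row)
            return selected

        # toy usage: choose 5 of 50 candidate measurements of a 10-dimensional parameter
        rng = np.random.default_rng(0)
        print(greedy_select(rng.standard_normal((50, 10)), k=5))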

  17. Postpartum Physical Activity: Measuring Theory of Planned Behavior Constructs

    Science.gov (United States)

    Hales, Derek; Evenson, Kelly R.; Wen, Fang; Wilcox, Sara

    2012-01-01

    Objective: To develop and examine evidence for factor validity and longitudinal invariance of scales used to measure theory of planned behavior constructs applied to physical activity. Methods: Self-report questionnaires were administered at 3 months (n=267) and 12 months (n=333) postpartum. Longitudinal data were available for 185 of these women. Results: A single factor model fit data from the normative beliefs, perceived behavioral control, and behavioral beliefs scales. Attitude and control beliefs were found to be multidimensional. Longitudinal invariance of all scales was supported. Conclusions: Each scale had strong validity evidence. Future research using these measures will help identify areas for intervention and reveal how changes in constructs influence physical activity over time. PMID:20218751

  18. Radiology fundamentals

    CERN Document Server

    Singh, Harjit

    2011-01-01

    ""Radiology Fundamentals"" is a concise introduction to the dynamic field of radiology for medical students, non-radiology house staff, physician assistants, nurse practitioners, radiology assistants, and other allied health professionals. The goal of the book is to provide readers with general examples and brief discussions of basic radiographic principles and to serve as a curriculum guide, supplementing a radiology education and providing a solid foundation for further learning. Introductory chapters provide readers with the fundamental scientific concepts underlying the medical use of imag

  19. Experimental Test of the “Special State” Theory of Quantum Measurement

    Directory of Open Access Journals (Sweden)

    Lawrence S. Schulman

    2012-04-01

    An experimental test of the “special state” theory of quantum measurement is proposed. It should be feasible with present-day laboratory equipment and involves a slightly elaborated Stern–Gerlach setup. The “special state” theory is conservative with respect to quantum mechanics, but radical with respect to statistical mechanics, in particular regarding the arrow of time. In this article background material is given on both quantum measurement and statistical mechanics aspects. For example, it is shown that future boundary conditions would not contradict experience, indicating that the fundamental equal-a-priori-probability assumption at the foundations of statistical mechanics is far too strong (since future conditioning reduces the class of allowed states). The test is based on a feature of this theory that was found necessary in order to recover standard (Born) probabilities in quantum measurements. Specifically, certain systems should have “noise” whose amplitude follows the long-tailed Cauchy distribution. This distribution is marked by the occasional occurrence of extremely large signals as well as a non-self-averaging property. The proposed test is a variant of the Stern–Gerlach experiment in which protocols are devised, some of which will require the presence of this noise, some of which will not. The likely observational schemes would involve the distinction between detection and non-detection of that “noise”. The signal to be detected (or not) would be either single photons or electric fields (and related excitations) in the neighborhood of the ends of the magnets.

  20. Fundamentals of plasma physics

    CERN Document Server

    Bittencourt, J A

    1986-01-01

    A general introduction designed to present a comprehensive, logical and unified treatment of the fundamentals of plasma physics based on statistical kinetic theory. Its clarity and completeness make it suitable for self-learning and self-paced courses. Problems are included.

  1. The Yang-Mills gradient flow and SU(3) gauge theory with 12 massless fundamental fermions in a colour-twisted box

    CERN Document Server

    Lin, C -J David; Ramos, Alberto

    2015-01-01

    We perform the step-scaling investigation of the running coupling constant, using the gradient-flow scheme, in SU(3) gauge theory with twelve massless fermions in the fundamental representation. The Wilson plaquette gauge action and massless unimproved staggered fermions are used in the simulations. Our lattice data are prepared at high accuracy, such that the statistical error for the renormalised coupling, g_GF, is at the subpercentage level. To investigate the reliability of the continuum extrapolation, we employ two different lattice discretisations to obtain g_GF. For our simulation setting, the corresponding gauge-field averaging radius in the gradient flow has to be almost half of the lattice size, in order to have this extrapolation under control. We can determine the renormalisation group evolution of the coupling up to g^2_GF ~ 6, before the onset of the bulk phase structure. In this infrared regime, the running of the coupling is significantly slower than the two-loop perturbative prediction, altho...

  2. APPLICATION OF EXTREME VALUE THEORY AND FUNDAMENTAL ANALYSIS IN LONG-SHORT STRATEGIES: AN ANALYSIS OF PAIR TRADINGS IN THE BRAZILIAN MARKET

    Directory of Open Access Journals (Sweden)

    Danilo Soares Monte-Mor

    2014-09-01

    In recent decades, the number of funds that aim to exploit market inefficiencies through arbitrage strategies, among which the long-short strategy stands out, has increased. A large part of the analyses used to obtain the pair tradings, however, does not consider the extreme deviations that exist in the interdependence process between the assets involved and the firms’ operational quality indicators. Extreme Value Theory and Fundamental Analysis were used in this study to model the series of asset-pair price indices obtained on the basis of the accounting indicator structure proposed by Piotroski (2000). These approaches made it possible to consider companies with positive signs of profitability, operational capital structure and efficiency, together with distributions that are capable of capturing the extreme co-movements associated with the selected pair tradings. Based on this model, a new quantitative approach was created for the long-short strategy, called the GEV Long-Short. The obtained results suggest that a better adjustment of the extreme quantiles through the extreme value distribution can provide more refined probabilistic support for the reversion to the mean that justifies the possibility of long-short arbitrage.
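
    One building block of such a strategy can be illustrated briefly: fit a generalized extreme value (GEV) distribution to block maxima of a pair's price-index spread and read an extreme quantile off the fit as a deviation threshold. This is a sketch on assumed toy data, not the authors' GEV Long-Short implementation:

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(1)
        spread = rng.normal(0.0, 1.0, size=2520)  # hypothetical daily spread between the pair's price indices

        # monthly block maxima (21 trading days per block)
        blocks = spread[: len(spread) // 21 * 21].reshape(-1, 21)
        maxima = blocks.max(axis=1)

        # fit the GEV and take its 95% quantile as an "extreme deviation" threshold
        shape, loc, scale = genextreme.fit(maxima)
        threshold = genextreme.ppf(0.95, shape, loc=loc, scale=scale)
        print(f"GEV fit: shape={shape:.3f}, loc={loc:.3f}, scale={scale:.3f}; threshold={threshold:.3f}")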

  3. Cancer: Towards a general theory of the target: All successful cancer therapies, actual or potential, are reducible to either (or both) of two fundamental strategies.

    Science.gov (United States)

    Vincent, Mark D

    2017-09-01

    General theories (GT) are reductionist explications of apparently independent facts. Here, in reviewing the literature, I develop a GT to simplify the cluttered landscape of cancer therapy targets by revealing they cluster parsimoniously according to only a few underlying principles. The first principle is that targets can be only exploited by either or both of two fundamentally different approaches: causality-inhibition, and 'acausal' recognition of some marker or signature. Nonetheless, each approach must achieve both of two separate goals, efficacy (reduction in cancer burden) and selectivity (sparing of normal cells); if the mechanisms are known, this provides a definition of rational treatment. The second principle is target fragmentation, whereby the target may perform up to three categoric functions (cytoreduction, modulation, cytoprotection), potentially mediated by physically different target molecules, even on different cell types, or circulating freely. This GT remains incomplete until the minimal requirements for cure, or alternatively, proof that cure is impossible, become predictable. © 2017 The Authors. BioEssays Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  4. Fundamentals of ultrasonic phased arrays

    CERN Document Server

    Schmerr, Lester W

    2014-01-01

    This book describes in detail the physical and mathematical foundations of ultrasonic phased array measurements. The book uses linear systems theory to develop a comprehensive model of the signals and images that can be formed with phased arrays. Engineers working in the field of ultrasonic nondestructive evaluation (NDE) will find in this approach a wealth of information on how to design, optimize and interpret ultrasonic inspections with phased arrays. The fundamentals and models described in the book will also be of significant interest to other fields, including the medical ultrasound and

  5. Individual differences in fundamental social motives.

    Science.gov (United States)

    Neel, Rebecca; Kenrick, Douglas T; White, Andrew Edward; Neuberg, Steven L

    2016-06-01

    Motivation has long been recognized as an important component of how people both differ from, and are similar to, each other. The current research applies the biologically grounded fundamental social motives framework, which assumes that human motivational systems are functionally shaped to manage the major costs and benefits of social life, to understand individual differences in social motives. Using the Fundamental Social Motives Inventory, we explore the relations among the different fundamental social motives of Self-Protection, Disease Avoidance, Affiliation, Status, Mate Seeking, Mate Retention, and Kin Care; the relationships of the fundamental social motives to other individual difference and personality measures including the Big Five personality traits; the extent to which fundamental social motives are linked to recent life experiences; and the extent to which life history variables (e.g., age, sex, childhood environment) predict individual differences in the fundamental social motives. Results suggest that the fundamental social motives are a powerful lens through which to examine individual differences: They are grounded in theory, have explanatory value beyond that of the Big Five personality traits, and vary meaningfully with a number of life history variables. A fundamental social motives approach provides a generative framework for considering the meaning and implications of individual differences in social motivation. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  6. Interrelation of the concept of “uncertainty” in the information theory and theory of measurements

    Directory of Open Access Journals (Sweden)

    Kopp Vadim

    2017-01-01

    The problem of determining the optimal number of multiple measurements based on the type of the error distribution density of the measuring means is considered. Laws of distribution of a random variable are obtained that ensure extreme values of dispersion at specified entropy. It is indicated that the given law corresponds to the known compositional one, which ensures maximum entropy under restrictions on the variation limits of the random variable and at specified dispersion. In particular, under certain conditions, it corresponds to the abridged normal, uniform and bimodal laws. It is shown that two dispersion values correspond to a single entropy value. This effect arises because the random variable is concentrated on a finite interval. A theorem on the interrelation of entropy and dispersion of a random variable is proved, which allows us to reconcile the concept of uncertainty used in information theory with the same concept used in the up-to-date standards of technical measurements. It is shown that the abridged normal distribution law, which provides maximum entropy at specified dispersion, at the same time provides minimum dispersion at specified entropy, and that the bimodal law provides maximum entropy at maximum dispersion. The conclusions are based on the solution of two variational problems with isoperimetric constraints. Modeling results that allow the correctness of the conclusions to be assessed are presented.
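
    The variational statement behind these claims can be written compactly; the following is the standard maximum-entropy formulation in assumed notation, not the article's own derivation:

        % maximize the differential entropy  H[p] = -\int_a^b p(x)\,\ln p(x)\,dx
        % subject to  \int_a^b p(x)\,dx = 1  and  \int_a^b (x-\mu)^2\,p(x)\,dx = \sigma^2 .
        % The Euler--Lagrange condition gives
        p(x) \;=\; \exp\!\big(\lambda_0 + \lambda_1 (x-\mu)^2\big), \qquad x \in [a,b],
        % i.e. a normal law truncated ("abridged") to [a,b] when \lambda_1 < 0,
        % reducing to the uniform law when the variance constraint becomes inactive (\lambda_1 \to 0).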

  7. Fundamental Astronomy

    CERN Document Server

    Karttunen, Hannu; Oja, Heikki; Poutanen, Markku; Donner, Karl Johan

    2007-01-01

    Fundamental Astronomy gives a well-balanced and comprehensive introduction to the topics of classical and modern astronomy. While emphasizing both the astronomical concepts and the underlying physical principles, the text provides a sound basis for more profound studies in the astronomical sciences. The fifth edition of this successful undergraduate textbook has been extensively modernized and extended in the parts dealing with the Milky Way, extragalactic astronomy and cosmology as well as with extrasolar planets and the solar system (as a consequence of recent results from satellite missions and the new definition by the International Astronomical Union of planets, dwarf planets and small solar-system bodies). Furthermore a new chapter on astrobiology has been added. Long considered a standard text for physical science majors, Fundamental Astronomy is also an excellent reference and entrée for dedicated amateur astronomers.

  8. Torsor Theory of Physical Quantities and their Measurement

    Directory of Open Access Journals (Sweden)

    Domotor Zoltan

    2017-08-01

    The principal objective of this paper is to provide a torsor theory of physical quantities and basic operations thereon. Torsors are introduced in a bottom-up fashion as actions of scale transformation groups on spaces of unitized quantities. In contrast, the shortcomings of other accounts of quantities that proceed in a top-down axiomatic manner are also discussed. In this paper, quantities are presented as dual counterparts of physical states. States serve as truth-makers of metrological statements about quantity values and are crucial in specifying alternative measurement units for base quantities. For illustration and ease of presentation, the classical notions of length, time, and instantaneous velocity are used as primordial examples. It is shown how torsors provide an effective description of the structure of quantities, systems of quantities, and transformations between them. Using the torsor framework, time-dependent quantities and their unitized derivatives are also investigated. Lastly, the torsor apparatus is applied to deterministic measurement of quantities.

  9. One-Group Perturbation Theory Applied to Measurements with Void

    Energy Technology Data Exchange (ETDEWEB)

    Persson, Rolf

    1966-09-15

    Formulas suitable for evaluating progressive as well as single rod substitution measurements are derived by means of one-group perturbation theory. The diffusion coefficient may depend on direction and position. By using the buckling concept one can derive expressions which are quite simple and the perturbed flux can be taken into account in a comparatively simple way. By using an unconventional definition of cells a transition region is introduced quite logically. Experiments with voids around metal rods, diam. 3.05 cm, have been analysed. The agreement between extrapolated and directly measured buckling values is excellent, the buckling difference between lattices with water-filled and voided shrouds being 0.263 ± 0.015/m² and 0.267 ± 0.005/m² resp. From single-rod experiments differences between diffusion coefficients are determined to δD_r/D = 0.083 ± 0.004 and δD_z/D = 0.120 ± 0.018. With air-filled shrouds there is consequently anisotropy in the neutron diffusion and we have (D_z/D_r)_air = 1.034 ± 0.020.

  10. One-Group Perturbation Theory Applied to Measurements with Void

    International Nuclear Information System (INIS)

    Persson, Rolf

    1966-09-01

    Formulas suitable for evaluating progressive as well as single rod substitution measurements are derived by means of one-group perturbation theory. The diffusion coefficient may depend on direction and position. By using the buckling concept one can derive expressions which are quite simple and the perturbed flux can be taken into account in a comparatively simple way. By using an unconventional definition of cells a transition region is introduced quite logically. Experiments with voids around metal rods, diam. 3.05 cm, have been analysed. The agreement between extrapolated and directly measured buckling values is excellent, the buckling difference between lattices with water-filled and voided shrouds being 0.263 ± 0.015/m² and 0.267 ± 0.005/m² resp. From single-rod experiments differences between diffusion coefficients are determined to δD_r/D = 0.083 ± 0.004 and δD_z/D = 0.120 ± 0.018. With air-filled shrouds there is consequently anisotropy in the neutron diffusion and we have (D_z/D_r)_air = 1.034 ± 0.020.

  11. Fundamentals of magnetism

    CERN Document Server

    Reis, Mario

    2013-01-01

    The Fundamentals of Magnetism is a truly unique reference text that explores the study of magnetism and magnetic behavior with a depth that no other book can provide. It covers the most detailed descriptions of the fundamentals of magnetism, providing an emphasis on statistical mechanics, which is absolutely critical for understanding magnetic behavior. The book covers the classical areas of basic magnetism, including Landau theory and magnetic interactions, but features a more concise and easy-to-read style. Perfect for upper-level graduate students and industry researchers, The Fu

  12. Fundamental Movement Skill Proficiency and Body Composition Measured by Dual Energy X-Ray Absorptiometry in Eight-Year-Old Children

    Science.gov (United States)

    Slotte, Sari; Sääkslahti, Arja; Metsämuuronen, Jari; Rintala, Pauli

    2015-01-01

    Objective: The main aim was to examine the association between fundamental movement skills (FMS) and objectively measured body composition using dual energy X-ray absorptiometry (DXA). Methods: A study of 304 eight-year-old children in Finland. FMS were assessed with the "Test of gross motor development," 2nd ed. Total body fat…

  13. Marketing fundamentals.

    Science.gov (United States)

    Redmond, W H

    2001-01-01

    This chapter outlines current marketing practice from a managerial perspective. The role of marketing within an organization is discussed in relation to efficiency and adaptation to changing environments. Fundamental terms and concepts are presented in an applied context. The implementation of marketing plans is organized around the four P's of marketing: product (or service), promotion (including advertising), place of delivery, and pricing. These are the tools with which marketers seek to better serve their clients and form the basis for competing with other organizations. Basic concepts of strategic relationship management are outlined. Lastly, alternate viewpoints on the role of advertising in healthcare markets are examined.

  14. Construct Validity of Measures of Becker's Side Bet Theory.

    Science.gov (United States)

    Shore, Lynn M.; Tetrick, Lois E.; Shore, Ted H.; Barksdale, Kevin

    2000-01-01

    Becker's side bet theory (remaining in a job because of perceived costs of leaving) was tested using data from 327 working business students. Three factors were most consistent with the theory: bureaucratic organization, nonwork-related concerns, and adjustment to social position. Attachment to the organization was significantly linked to tangible…

  15. DOE Fundamentals Handbook: Electrical Science, Volume 2

    International Nuclear Information System (INIS)

    1992-06-01

    The Electrical Science Fundamentals Handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of electrical theory, terminology, and application. The handbook includes information on alternating current (AC) and direct current (DC) theory, circuits, motors, and generators; AC power and reactive components; batteries; AC and DC voltage regulators; transformers; and electrical test instruments and measuring devices. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility electrical equipment.

  16. Fundamentals of turbomachines

    CERN Document Server

    Dick, Erik

    2015-01-01

    This book explores the working principles of all kinds of turbomachines. The same theoretical framework is used to analyse the different machine types. Fundamentals are first presented and theoretical concepts are then elaborated for particular machine types, starting with the simplest ones. For each machine type, the author strikes a balance between building basic understanding and exploring knowledge of practical aspects. Readers are invited through challenging exercises to consider how the theory applies to particular cases and how it can be generalised. The book is primarily meant as a course book. It teaches fundamentals and explores applications. It will appeal to senior undergraduate and graduate students in mechanical engineering and to professional engineers seeking to understand the operation of turbomachines. Readers will gain a fundamental understanding of turbomachines. They will also be able to make a reasoned choice of turbomachine for a particular application and to understand its operation...

  17. Arguing against fundamentality

    Science.gov (United States)

    McKenzie, Kerry

    This paper aims to open up discussion on the relationship between fundamentality and naturalism, and in particular on the question of whether fundamentality may be denied on naturalistic grounds. A historico-inductive argument for an anti-fundamentalist conclusion, prominent within the contemporary metaphysical literature, is examined; finding it wanting, an alternative 'internal' strategy is proposed. By means of an example from the history of modern physics - namely S-matrix theory - it is demonstrated that (1) this strategy can generate similar (though not identical) anti-fundamentalist conclusions on more defensible naturalistic grounds, and (2) that fundamentality questions can be empirical questions. Some implications and limitations of the proposed approach are discussed.

  18. Fundamental superstrings as holograms

    International Nuclear Information System (INIS)

    Dabholkar, A.; Murthy, S.

    2007-06-01

    The worldsheet of a macroscopic fundamental superstring in the Green-Schwarz light-cone gauge is viewed as a possible boundary hologram of the near horizon region of a small black string. For toroidally compactified strings, the hologram has global symmetries of AdS_3 x S^(d-1) x T^(8-d) (d = 3, ..., 8), only some of which extend to local conformal symmetries. We construct the bulk string theory in detail for the particular case of d = 3. The symmetries of the hologram are correctly reproduced from this exact worldsheet description in the bulk. Moreover, the central charge of the boundary Virasoro algebra obtained from the bulk agrees with the Wald entropy of the associated small black holes. This construction provides an exact CFT description of the near horizon region of small black holes both in Type-II and heterotic string theory arising from multiply wound fundamental superstrings. (author)

  19. Simple atoms: QED tests and fundamental constants

    International Nuclear Information System (INIS)

    Karshenboim, S.G.

    2002-01-01

    Study of simple atoms can be performed theoretically and experimentally with high accuracy, and a comparison of theory and experiment provides us with several high-precision tests of bound state QED. Theory cannot actually lead to a figure to compare with experiment; it can only present some measurable quantities in terms of fundamental and auxiliary constants. That offers an opportunity to obtain new accurate values of some fundamental constants. The theory of simple atoms is based on quantum electrodynamics but also involves an essential part of nuclear and particle physics. A significant part of the experiments is related to high-resolution laser spectroscopy. The present status of the precision physics of simple atoms is presented in detail. We overview a comparison of the theory of such atoms, bound state QED, and the experiment. In particular, we consider the hyperfine structure in light atoms and the g-factor of a bound electron in hydrogen-like ions at low and medium Z. We discuss a project on optical measurement of the 2s hyperfine interval in atomic hydrogen. We also pay attention to the determination of fundamental constants from the study of simple atoms. The constants under consideration include alpha, the electron-to-proton mass ratio and the electron-to-muon mass ratio.

  20. Fundamentals of photonics

    CERN Document Server

    Saleh, Bahaa E A

    2007-01-01

    Now in a new full-color edition, Fundamentals of Photonics, Second Edition is a self-contained and up-to-date introductory-level textbook that thoroughly surveys this rapidly expanding area of engineering and applied physics. Featuring a logical blend of theory and applications, coverage includes detailed accounts of the primary theories of light, including ray optics, wave optics, electromagnetic optics, and photon optics, as well as the interaction of photons and atoms, and semiconductor optics. Presented at increasing levels of complexity, preliminary sections build toward more advan

  1. Fundamental concepts in Particle Physics course

    CERN Multimedia

    CERN. Geneva HR-RFA

    2006-01-01

    The course will provide an introduction to some of the basic theoretical techniques used to describe the fundamental particles and their interactions. Of central importance to our understanding of these forces are the underlying symmetries of nature and I will review the nature of these symmetries and how they are used to build a predictive theory. I discuss how the combination of quantum mechanics and relativity leads to the quantum field theory (QFT) description of the states of matter and their interactions. The Feynman rules used to determine the QFT predictions for experimentally measurable processes are derived and applied to the calculation of decay widths and cross sections.

  2. Fundamental measure theory for the electric double layer : implications for blue-energy harvesting and water desalination

    NARCIS (Netherlands)

    Hartel, Andreas; Janssen, Mathijs; Samin, Sela; van Roij, Rene

    2015-01-01

    Capacitive mixing (CAPMIX) and capacitive deionization (CDI) are promising candidates for harvesting clean, renewable energy and for the energy efficient production of potable water, respectively. Both CAPMIX and CDI involve water-immersed porous carbon (supercapacitors) electrodes at voltages of

  3. Wellness: A Review of Theory and Measurement for Counselors

    Science.gov (United States)

    Roscoe, Lauren J.

    2009-01-01

    Wellness is considered the paradigm of counseling and development (J. E. Myers, 1991, 1992). However, researchers have failed to agree on a definition or on the dimensional structure of wellness. Furthermore, existing quantitative wellness instruments are inadequate for capturing the complexity of wellness. The author reviews wellness theory and…

  4. Loss aversion under prospect theory: A parameter-free measurement

    NARCIS (Netherlands)

    M. Abdellaoui (Mohammed); H. Bleichrodt (Han); C. Paraschiv (Corina)

    2007-01-01

    textabstractA growing body of qualitative evidence shows that loss aversion, a phenomenon formalized in prospect theory, can explain a variety of field and experimental data. Quantifications of loss aversion are, however, hindered by the absence of a general preference-based method to elicit the

  5. A critical analysis of the quantum theory of measurement

    International Nuclear Information System (INIS)

    Fer, F.

    1984-01-01

    Keeping strictly within the positivist and probabilistic, hence Hilbertian, frame of Quantum Mechanics, the author tries to ascertain whether or not Quantum Mechanics, starting from its axioms, reaches the aim of any physical theory, that is, comparison with experiment. The answer is: no, as long as it keeps close to the existing axiomatics, and also to accurate mathematics. (Auth.)

  6. Loss Aversion under Prospect Theory: a Parameter-Free Measurement

    NARCIS (Netherlands)

    H. Bleichrodt (Han); M. Abdellaoui (Mohammed); C. Paraschiv (Corina)

    2007-01-01

    textabstractA growing body of qualitative evidence shows that loss aversion, a phenomenon formalized in prospect theory, can explain a variety of field and experimental data. Quantifications of loss aversion are, however, hindered by the absence of a general preference-based method to elicit the

  7. The theory, practice, and measurement of Music Therapy

    DEFF Research Database (Denmark)

    Moore, Kimberly Sena; Hanson-Abromeit, Deanna; Magee, Wendy L.

    2013-01-01

    Music therapy is a clinical healthcare discipline that draws its evidence base from music neuroscience and psychology to improve the health and well-being of individuals from varied clinical populations. Working with individuals across the lifespan, evidence-based therapeutic methods are developed from an understanding of music perception and cognition. Given the diversity of practice, there are several key challenges for the discipline. One is developing a theory-based clinical and research approach. This supports a deeper understanding of the complex music stimulus and therapeutic interactions ... of interest. This symposium will bring together some of the latest research from the discipline of music therapy relating to the clinical needs of complex neurological and psychiatric populations. The papers offer diverse perspectives reflecting interdisciplinary influences on the theory and practice of music...

  8. Confidence Measurement in the Light of Signal Detection Theory

    Directory of Open Access Journals (Sweden)

    Sébastien eMassoni

    2014-12-01

    We compare three alternative methods for eliciting retrospective confidence in the context of a simple perceptual task: the Simple Confidence Rating (a direct report on a numerical scale), the Quadratic Scoring Rule (a post-wagering procedure) and the Matching Probability (a generalization of the no-loss gambling method). We systematically compare the results obtained with these three rules to the theoretical confidence levels that can be inferred from performance in the perceptual task using Signal Detection Theory. We find that the Matching Probability provides better results in that respect. We conclude that Matching Probability is particularly well suited for studies of confidence that use Signal Detection Theory as a theoretical framework.

  9. Nitric oxide in cerebral vasospasm: theories, measurement, and treatment.

    Science.gov (United States)

    Siuta, Michael; Zuckerman, Scott L; Mocco, J

    2013-01-01

    In recent decades, a large body of research has focused on the role of nitric oxide (NO) in the development of cerebral vasospasm (CV) following subarachnoid hemorrhage (SAH). Literature searches were therefore conducted regarding the role of NO in cerebral vasospasm, specifically focusing on NO donors, reactive nitrogen species, and peroxynitrite in manifestation of vasospasm. Based off the assessment of available evidence, two competing theories are reviewed regarding the role of NO in vasospasm. One school of thought describes a deficiency in NO due to scavenging by hemoglobin in the cisternal space, leading to an NO signaling deficit and vasospastic collapse. A second hypothesis focuses on the dysfunction of nitric oxide synthase, an enzyme that synthesizes NO, and subsequent generation of reactive nitrogen species. Both theories have strong experimental evidence behind them and hold promise for translation into clinical practice. Furthermore, NO donors show definitive promise for preventing vasospasm at the angiographic and clinical level. However, NO augmentation may also cause systemic hypotension and worsen vasospasm due to oxidative distress. Recent evidence indicates that targeting NOS dysfunction, for example, through erythropoietin or statin administration, also shows promise at preventing vasospasm and neurotoxicity. Ultimately, the role of NO in neurovascular disease is complex. Neither of these theories is mutually exclusive, and both should be considered for future research directions and treatment strategies.

  10. Nitric Oxide in Cerebral Vasospasm: Theories, Measurement, and Treatment

    Directory of Open Access Journals (Sweden)

    Michael Siuta

    2013-01-01

    In recent decades, a large body of research has focused on the role of nitric oxide (NO) in the development of cerebral vasospasm (CV) following subarachnoid hemorrhage (SAH). Literature searches were therefore conducted regarding the role of NO in cerebral vasospasm, specifically focusing on NO donors, reactive nitrogen species, and peroxynitrite in manifestation of vasospasm. Based off the assessment of available evidence, two competing theories are reviewed regarding the role of NO in vasospasm. One school of thought describes a deficiency in NO due to scavenging by hemoglobin in the cisternal space, leading to an NO signaling deficit and vasospastic collapse. A second hypothesis focuses on the dysfunction of nitric oxide synthase, an enzyme that synthesizes NO, and subsequent generation of reactive nitrogen species. Both theories have strong experimental evidence behind them and hold promise for translation into clinical practice. Furthermore, NO donors show definitive promise for preventing vasospasm at the angiographic and clinical level. However, NO augmentation may also cause systemic hypotension and worsen vasospasm due to oxidative distress. Recent evidence indicates that targeting NOS dysfunction, for example, through erythropoietin or statin administration, also shows promise at preventing vasospasm and neurotoxicity. Ultimately, the role of NO in neurovascular disease is complex. Neither of these theories is mutually exclusive, and both should be considered for future research directions and treatment strategies.

  11. Protective measures against electric shock. Fundamentals and their practical implementation. 12. new rev. ed.; Schutzmassnahmen gegen elektrischen Schlag. Grundlagen und deren praktische Umsetzung

    Energy Technology Data Exchange (ETDEWEB)

    Luber, Georg; Pelta, Reinhard; Rudnik, Siegfried

    2013-02-01

    The foundation of all requirements for protection against electric shock is knowledge of the effects of electric current passing through the human body. Combined with the determination of the possible contact voltages in the event of a fault, protective measures such as switch-off or potential equalization can be specified. The turn-off times of protective devices depend on the contact voltage as well as on what the human body can tolerate. The book summarizes these facts and explains the established protective measures of the DIN VDE 0100 series of standards.

  12. Quivers, words and fundamentals

    International Nuclear Information System (INIS)

    Mattioli, Paolo; Ramgoolam, Sanjaye

    2015-01-01

    A systematic study of holomorphic gauge invariant operators in general N=1 quiver gauge theories, with unitary gauge groups and bifundamental matter fields, was recently presented in http://dx.doi.org/10.1007/JHEP04(2013)094. For large ranks a simple counting formula in terms of an infinite product was given. We extend this study to quiver gauge theories with fundamental matter fields, deriving an infinite product form for the refined counting in these cases. The infinite products are found to be obtained from substitutions in a simple building block expressed in terms of the weighted adjacency matrix of the quiver. In the case without fundamentals, it is a determinant which itself is found to have a counting interpretation in terms of words formed from partially commuting letters associated with simple closed loops in the quiver. This is a new relation between counting problems in gauge theory and the Cartier-Foata monoid. For finite ranks of the unitary gauge groups, the refined counting is given in terms of expressions involving Littlewood-Richardson coefficients.

  13. Light scattering by nonspherical particles theory, measurements, and applications

    CERN Document Server

    Mishchenko, Michael I; Travis, Larry D

    1999-01-01

    There is hardly a field of science or engineering that does not have some interest in light scattering by small particles. For example, this subject is important to climatology because the energy budget for the Earth's atmosphere is strongly affected by scattering of solar radiation by cloud and aerosol particles, and the whole discipline of remote sensing relies largely on analyzing the parameters of radiation scattered by aerosols, clouds, and precipitation. The scattering of light by spherical particles can be easily computed using the conventional Mie theory. However, most small solid part

  14. Confidence measurement in the light of signal detection theory

    Science.gov (United States)

    Massoni, Sébastien; Gajdos, Thibault; Vergnaud, Jean-Christophe

    2014-01-01

    We compare three alternative methods for eliciting retrospective confidence in the context of a simple perceptual task: the Simple Confidence Rating (a direct report on a numerical scale), the Quadratic Scoring Rule (a post-wagering procedure), and the Matching Probability (MP; a generalization of the no-loss gambling method). We systematically compare the results obtained with these three rules to the theoretical confidence levels that can be inferred from performance in the perceptual task using Signal Detection Theory (SDT). We find that the MP provides better results in that respect. We conclude that MP is particularly well suited for studies of confidence that use SDT as a theoretical framework. PMID:25566135
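
    A minimal numerical illustration of the SDT benchmark used in this comparison (hypothetical numbers, equal-variance Gaussian model, not the authors' analysis code): hit and false-alarm rates give d', and the theoretical confidence for a trial with internal evidence x is the posterior probability that the chosen response is correct.

        import numpy as np
        from scipy.stats import norm

        hit_rate, fa_rate = 0.80, 0.30                    # assumed task performance
        d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)  # sensitivity

        def theoretical_confidence(x, d=d_prime, p_signal=0.5):
            """Posterior probability that the more likely (hence chosen) hypothesis is
            true, given evidence x, with signal ~ N(d, 1) and noise ~ N(0, 1)."""
            like_s = norm.pdf(x, loc=d) * p_signal
            like_n = norm.pdf(x, loc=0.0) * (1 - p_signal)
            p_s = like_s / (like_s + like_n)
            return max(p_s, 1 - p_s)

        print(f"d' = {d_prime:.2f}")
        for x in (-1.0, 0.5, 2.0):
            print(f"evidence {x:+.1f} -> theoretical confidence {theoretical_confidence(x):.2f}")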

  15. Species distributions, quantum theory, and the enhancement of biodiversity measures

    DEFF Research Database (Denmark)

    Real, Raimundo; Barbosa, A. Márcia; Bull, Joseph William

    2017-01-01

    Species distributions are typically represented by records of their observed occurrence at a given spatial and temporal scale. Such records are inevitably incomplete and contingent on the spatial–temporal circumstances under which the observations were made. Moreover, organisms may respond ... “potential biodiversity”. We show how conceptualizing species' distributions in this way could help overcome important weaknesses in current biodiversity metrics, both in theory and by using a worked case study of mammal distributions in Spain over the last decade. We propose that considerable theoretical advances could

  16. Microwave engineering concepts and fundamentals

    CERN Document Server

    Khan, Ahmad Shahid

    2014-01-01

    Detailing the active and passive aspects of microwaves, Microwave Engineering: Concepts and Fundamentals covers everything from wave propagation to reflection and refraction, guided waves, and transmission lines, providing a comprehensive understanding of the underlying principles at the core of microwave engineering. This encyclopedic text not only encompasses nearly all facets of microwave engineering, but also gives all topics—including microwave generation, measurement, and processing—equal emphasis. Packed with illustrations to aid in comprehension, the book: • Describes the mathematical theory of waveguides and ferrite devices, devoting an entire chapter to the Smith chart and its applications • Discusses different types of microwave components, antennas, tubes, transistors, diodes, and parametric devices • Examines various attributes of cavity resonators, semiconductor and RF/microwave devices, and microwave integrated circuits • Addresses scattering parameters and their properties, as well a

  17. Fundamental concepts of mathematics

    CERN Document Server

    Goodstein, R L

    Fundamental Concepts of Mathematics, 2nd Edition provides an account of some basic concepts in modern mathematics. The book is primarily intended for mathematics teachers and lay people who want to improve their skills in mathematics. Among the concepts and problems presented in the book are the determination of which integral polynomials have integral solutions; sentence logic and informal set theory; and why four colors is enough to color a map. Unlike the first edition, the second edition provides detailed solutions to exercises contained in the text. Mathematics teachers and people

  18. Implications Of The Crisis Of Objectivity In Accounting Measurement On The Development Of Finance Theory

    OpenAIRE

    Saratiel Wedzerai Musvoto

    2011-01-01

    Studies in accounting measurement indicate the absence of empirical relational structures that should form the basis for accounting measurement. This suggests a lack of objectivity in accounting information. Landmarks in the development of finance theory indicate the use of accounting measurement information as a basis for its development. This indicates that subjective accounting information is incorporated in finance theory. Consequently, this questions the status of finance as a univer...

  19. Measuring fundamental properties in operating solid oxide electrochemical cells by using in situ X-ray photoelectron spectroscopy

    Science.gov (United States)

    Zhang, Chunjuan; Grass, Michael E.; McDaniel, Anthony H.; Decaluwe, Steven C.; Gabaly, Farid El; Liu, Zhi; McCarty, Kevin F.; Farrow, Roger L.; Linne, Mark A.; Hussain, Zahid; Jackson, Gregory S.; Bluhm, Hendrik; Eichhorn, Bryan W.

    2010-11-01

    Photoelectron spectroscopic measurements have the potential to provide detailed mechanistic insight by resolving chemical states, electrochemically active regions and local potentials or potential losses in operating solid oxide electrochemical cells (SOCs), such as fuel cells. However, high-vacuum requirements have limited X-ray photoelectron spectroscopy (XPS) analysis of electrochemical cells to ex situ investigations. Using a combination of ambient-pressure XPS and CeO2-x/YSZ/Pt single-chamber cells, we carry out in situ spectroscopy to probe oxidation states of all exposed surfaces in operational SOCs at 750°C in 1 mbar reactant gases H2 and H2O. Kinetic energy shifts of core-level photoelectron spectra provide a direct measure of the local surface potentials and a basis for calculating local overpotentials across exposed interfaces. The mixed ionic/electronic conducting CeO2-x electrodes undergo Ce3+/Ce4+ oxidation-reduction changes with applied bias. The simultaneous measurements of local surface Ce oxidation states and electric potentials reveal the active ceria regions during H2 electro-oxidation and H2O electrolysis. The active regions extend ~150 μm from the current collectors and are not limited by the three-phase-boundary interfaces associated with other SOC materials. The persistence of the Ce3+/Ce4+ shifts in the ~150 μm active region suggests that the surface reaction kinetics and lateral electron transport on the thin ceria electrodes are co-limiting processes.

  20. Fundamentals of quantum mechanics

    CERN Document Server

    House, J E

    2017-01-01

    Fundamentals of Quantum Mechanics, Third Edition is a clear and detailed introduction to quantum mechanics and its applications in chemistry and physics. All required math is clearly explained, including intermediate steps in derivations, and a concise review of the math is included in the text at appropriate points. Most of the elementary quantum mechanical models, including particles in boxes, rigid rotor, harmonic oscillator, barrier penetration, and the hydrogen atom, are clearly and completely presented. Applications of these models to selected “real world” topics are also included. This new edition includes many new topics such as band theory and heat capacity of solids, spectroscopy of molecules and complexes (including applications to ligand field theory), and small molecules of astrophysical interest.

  1. Fundamentals of optical waveguides

    CERN Document Server

    Okamoto, Katsunari

    2006-01-01

    Fundamentals of Optical Waveguides is an essential resource for any researcher, professional or student involved in optics and communications engineering. Any reader interested in designing or actively working with optical devices must have a firm grasp of the principles of lightwave propagation. Katsunari Okamoto has presented this difficult technology clearly and concisely with several illustrations and equations. Optical theory encompassed in this reference includes coupled mode theory, nonlinear optical effects, finite element method, beam propagation method, staircase concatenation method, along with several central theorems and formulas. Since the publication of the well-received first edition of this book, planar lightwave circuits and photonic crystal fibers have fully matured. With this second edition the advances of these fibers along with other improvements on existing optical technologies are completely detailed. This comprehensive volume enables readers to fully analyze, design and simulate opti...

  2. Fundamentals of electronic image processing

    CERN Document Server

    Weeks, Arthur R

    1996-01-01

    This book is directed to practicing engineers and scientists who need to understand the fundamentals of image processing theory and algorithms to perform their technical tasks. It is intended to fill the gap between existing high-level texts dedicated to specialists in the field and the need for a more practical, fundamental text on image processing. A variety of example images are used to enhance reader understanding of how particular image processing algorithms work.

  3. Chern-Simons Theories with Fundamental Matter: A Brief Review of Large N Results Including Fermi-Bose Duality and the S-matrix

    Science.gov (United States)

    Wadia, Spenta R.

    We begin with a few words about Salam's contribution to the growth of String Theory in India. In the technical talk we review results in SU(N) Chern-Simons plus vector matter theories in 2+1 dim in the large N limit. The dressing of charged matter by Chern-Simons gauge fields leads to anyons that interpolate between fermions and bosons and lead to a duality symmetry between fermionic and bosonic theories. The S-matrix (defined in the large N limit) besides exhibiting this duality, also exhibits novel properties due to the presence of anyons. The S-matrix is not analytic, like in Aharonov-Bohm scattering, and satisfies modified crossing symmetry relations.

  4. Historical view of the influences of measurement and reading theories on the assessment of reading.

    Science.gov (United States)

    Engelhard, G

    2001-01-01

    The purpose of this study is to briefly explore the interactions among measurement theories, reading theories, and measurement practices from an historical perspective. The assessment of reading provides a useful framework for examining how theories influence, and in some cases fail to influence, the practice of reading assessment as operationalized in reading tests. The first section describes a conceptual framework for examining the assessment of reading. Next I describe the major research traditions in measurement theory that have dominated measurement practice during the 20th century. In the next section I briefly introduce major reading theories. Next, I bring together the previous two sections in order to examine the adequacy of the proposed conceptual framework for examining the assessment of reading. This section includes criticism of measurement theory by selected reading theorists. It also provides a brief history of the use of Rasch measurement theory to calibrate reading tests. Finally, the main points of the study are summarized and discussed. It should be recognized that this study represents a preliminary analysis of these issues.

  5. Coherent versus Measurement Feedback: Linear Systems Theory for Quantum Information

    Directory of Open Access Journals (Sweden)

    Naoki Yamamoto

    2014-11-01

    To control a quantum system via feedback, we generally have two options in choosing a control scheme. One is the coherent feedback, which feeds the output field of the system, through a fully quantum device, back to manipulate the system without involving any measurement process. The other one is measurement-based feedback, which measures the output field and performs a real-time manipulation on the system based on the measurement results. Both schemes have advantages and disadvantages, depending on the system and the control goal; hence, their comparison in several situations is important. This paper considers a general open linear quantum system with the following specific control goals: backaction evasion, generation of a quantum nondemolished variable, and generation of a decoherence-free subsystem, all of which have important roles in quantum information science. Some no-go theorems are proven, clarifying that those goals cannot be achieved by any measurement-based feedback control. On the other hand, it is shown that, for each control goal there exists a coherent feedback controller accomplishing the task. The key idea to obtain all the results is system theoretic characterizations of the above three notions in terms of controllability and observability properties or transfer functions of linear systems, which are consistent with their standard definitions.
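
    The controllability and observability notions invoked here have familiar finite-dimensional analogues; the sketch below shows the classical Kalman rank tests on an ordinary linear-systems toy, not the quantum network calculus of the paper:

        import numpy as np

        def ctrb_rank(A, B):
            """Rank of the controllability matrix [B, AB, ..., A^(n-1) B]."""
            n = A.shape[0]
            return np.linalg.matrix_rank(
                np.hstack([np.linalg.matrix_power(A, i) @ B for i in range(n)]))

        def obsv_rank(A, C):
            """Rank of the observability matrix [C; CA; ...; C A^(n-1)]."""
            n = A.shape[0]
            return np.linalg.matrix_rank(
                np.vstack([C @ np.linalg.matrix_power(A, i) for i in range(n)]))

        # toy two-mode system: controllable and observable iff both ranks equal n = 2
        A = np.array([[0.0, 1.0], [-1.0, -0.1]])
        B = np.array([[0.0], [1.0]])
        C = np.array([[1.0, 0.0]])
        print(ctrb_rank(A, B), obsv_rank(A, C))  # expect: 2 2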

  6. An investigation of emotional intelligence measures using item response theory.

    Science.gov (United States)

    Cho, Seonghee; Drasgow, Fritz; Cao, Mengyang

    2015-12-01

    This study investigated the psychometric properties of 3 frequently administered emotional intelligence (EI) scales (Wong and Law Emotional Intelligence Scale [WLEIS], Schutte Self-Report Emotional Intelligence Test [SEIT], and Trait Emotional Intelligence Questionnaire [TEIQue]), which were developed on the basis of different theoretical frameworks (i.e., ability EI and mixed EI). By conducting item response theory (IRT) analyses, the authors examined the item parameters and compared the fits of 2 response process models (i.e., dominance model and ideal point model) for these scales with data from a sample of 355 undergraduates recruited from the subject pool. Several important findings were obtained. First, the EI scales seem better able to differentiate individuals at low trait levels than high trait levels. Second, a dominance model showed better model fit to the self-report ability EI scale (WLEIS) and also fit better with most subfactors of the SEIT, except for the mood regulation/optimism factor. Both dominance and ideal point models fit a self-report mixed EI scale (TEIQue). Our findings suggest (a) the EI scales should be revised to include more items at moderate and higher trait levels; and (b) the nature of the EI construct should be considered during the process of scale development. (c) 2015 APA, all rights reserved.
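
    The contrast between the two response processes can be pictured with toy item response functions (illustrative forms only, not the exact models fitted in the study): a dominance (2PL) item is monotone in the trait, whereas an ideal-point item peaks near the item's location.

        import numpy as np

        def dominance_2pl(theta, a=1.5, b=0.0):
            """2PL dominance model: endorsement probability rises monotonically with theta."""
            return 1.0 / (1.0 + np.exp(-a * (theta - b)))

        def ideal_point(theta, delta=0.0, tau=1.0):
            """Toy single-peaked model: endorsement is highest when theta is near the item location delta."""
            return np.exp(-((theta - delta) ** 2) / (2 * tau ** 2))

        for theta in (-2.0, 0.0, 2.0):
            print(f"theta={theta:+.0f}: dominance={dominance_2pl(theta):.2f}, ideal point={ideal_point(theta):.2f}")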

  7. Estimation of the limit of detection using information theory measures.

    Science.gov (United States)

    Fonollosa, Jordi; Vergara, Alexander; Huerta, Ramón; Marco, Santiago

    2014-01-31

    Definitions of the limit of detection (LOD) based on the probability of false positive and/or false negative errors have been proposed over the past years. Although such definitions are straightforward and valid for any kind of analytical system, proposed methodologies to estimate the LOD are usually simplified to signals with Gaussian noise. Additionally, there is a general misconception that two systems with the same LOD provide the same amount of information on the source regardless of the prior probability of presenting a blank/analyte sample. Based upon an analogy between an analytical system and a binary communication channel, in this paper we show that the amount of information that can be extracted from an analytical system depends on the probability of presenting the two different possible states. We propose a new definition of LOD utilizing information theory tools that deals with noise of any kind and allows the introduction of prior knowledge easily. Unlike most traditional LOD estimation approaches, the proposed definition is based on the amount of information that the chemical instrumentation system provides on the chemical information source. Our findings indicate that the benchmark of analytical systems based on the ability to provide information about the presence/absence of the analyte (our proposed approach) is a more general and proper framework, while converging to the usual values when dealing with Gaussian noise. Copyright © 2013 Elsevier B.V. All rights reserved.
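
    The channel analogy can be made concrete in a few lines (a sketch with assumed priors and error rates, not the paper's estimator): treat blank/analyte as the channel input, the detect/non-detect decision as its output, and compute the mutual information between them.

        import numpy as np

        def mutual_information(p_analyte, p_false_pos, p_false_neg):
            """Mutual information (bits) between the sample state (blank/analyte) and the
            detector's binary decision, for given prior and error probabilities."""
            p = np.array([
                [(1 - p_analyte) * (1 - p_false_pos), (1 - p_analyte) * p_false_pos],  # blank
                [p_analyte * p_false_neg,             p_analyte * (1 - p_false_neg)],  # analyte
            ])
            px = p.sum(axis=1, keepdims=True)  # marginal over the sample state
            py = p.sum(axis=0, keepdims=True)  # marginal over the decision
            terms = np.where(p > 0, p * np.log2(p / (px * py)), 0.0)
            return terms.sum()

        # the information delivered at a given error level depends on the prior probability of the analyte
        for prior in (0.5, 0.1):
            print(prior, round(mutual_information(prior, p_false_pos=0.05, p_false_neg=0.05), 3))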

  8. Species Distributions, Quantum Theory, and the Enhancement of Biodiversity Measures.

    Science.gov (United States)

    Real, Raimundo; Barbosa, A Márcia; Bull, Joseph W

    2017-05-01

    Species distributions are typically represented by records of their observed occurrence at a given spatial and temporal scale. Such records are inevitably incomplete and contingent on the spatial-temporal circumstances under which the observations were made. Moreover, organisms may respond differently to similar environmental conditions at different places or moments, so their distribution is, in principle, not completely predictable. We argue that this uncertainty exists, and warrants considering species distributions as analogous to coherent quantum objects, whose distributions are better described by a wavefunction than by a set of locations. We use this to extend the existing concept of "dark diversity", which incorporates into biodiversity metrics those species that could, but have not yet been observed to, inhabit a region, thereby developing the idea of "potential biodiversity". We show how conceptualizing species' distributions in this way could help overcome important weaknesses in current biodiversity metrics, both in theory and by using a worked case study of mammal distributions in Spain over the last decade. We propose that considerable theoretical advances could eventually be gained through interdisciplinary collaboration between biogeographers and quantum physicists. [Biogeography; favorability; physics; predictability; probability; species occurrence; uncertainty; wavefunction.] © The Author(s) 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Reliability of the Measure of Acceptance of the Theory of Evolution (MATE) Instrument with University Students

    Science.gov (United States)

    Rutledge, Michael L.; Sadler, Kim C.

    2007-01-01

    The Measure of Acceptance of the Theory of Evolution (MATE) instrument was initially designed to assess high school biology teachers' acceptance of evolutionary theory. To determine if the MATE instrument is reliable with university students, it was administered to students in a non-majors biology course (n = 61) twice over a 3-week period.…

  10. A Fundamental Study of Laser-Induced Breakdown Spectroscopy Using Fiber Optics for Remote Measurements Of Trace Metals

    Energy Technology Data Exchange (ETDEWEB)

    Scott Goode; S. Michael Angel

    2004-01-20

    Develop a fiber-optic imaging probe for microanalysis of solid samples; Design a time-resolved plasma imaging system to measure the development of the LIBS signal; Set up a laboratory system capable of timing two lasers independently, for optimizing and characterizing dual-pulse LIBS; Compare the development of laser-induced plasmas generated with a single laser pulse to the development of laser-induced plasmas generated with a pre-ablation spark prior to sample ablation; Examine the effect of sample matrix on the LIBS signals of elements in different sample matrices; Investigate the effect of excitation wavelength of the ablation beam in pre-ablation spark dual-pulse LIBS experiments; Determine the effect of the physical properties of the sample on the mass of material ablated.

  11. EPR steering and its application to fundamental questions in bell nonlocality

    OpenAIRE

    Bowles, Joseph

    2016-01-01

    Quantum theory is known to lead to correlations between the outcomes of measurements performed on distant systems that are strictly stronger than those obtained by any classical theory. Such correlations, termed nonlocal, highlight the radical departure of quantum theory from classical formulations of physics. Here, we study fundamental questions about the nature of nonlocal correlations in quantum theory, using as a tool the recently developed field of EPR steering. In the second part of the...

  12. Local gauge invariant QED with fundamental length

    International Nuclear Information System (INIS)

    Kadyshevsky, V.G.; Mateev, M.D.

    1981-01-01

    A local gauge theory of electromagnetic interactions with the fundamental length l as a new universal scale is worked out. The Lagrangian contains new extra terms in which the coupling constant is proportional to the fundamental length. The theory has an elegant geometrical basis: in momentum representation one faces de Sitter momentum space with curvature radius 1/l.

  13. On Entropy Production of Repeated Quantum Measurements I. General Theory

    Science.gov (United States)

    Benoist, T.; Jakšić, V.; Pautrat, Y.; Pillet, C.-A.

    2018-01-01

    We study entropy production (EP) in processes involving repeated quantum measurements of finite quantum systems. Adopting a dynamical system approach, we develop a thermodynamic formalism for the EP and study fine aspects of irreversibility related to the hypothesis testing of the arrow of time. Under a suitable chaoticity assumption, we establish a Large Deviation Principle and a Fluctuation Theorem for the EP.
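
    For orientation, and as a generic schematic rather than the precise statement proved in the paper, a Fluctuation Theorem for the entropy production accumulated over a time window T typically takes the form

        \frac{\mathbb{P}\left(\tfrac{1}{T}\,\mathrm{EP}_T \approx -s\right)}{\mathbb{P}\left(\tfrac{1}{T}\,\mathrm{EP}_T \approx s\right)} \;\sim\; e^{-T s}, \qquad s \ge 0,

    with the Large Deviation Principle supplying the rate function that controls both tails; the precise hypotheses are those of the chaoticity assumption mentioned in the abstract.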

  14. The theory and measurement of partial discharge transients

    DEFF Research Database (Denmark)

    Pedersen, Aage; Crichton, George C; McAllister, Iain Wilson

    1991-01-01

    are the charges which, as a result of partial discharge activity, are distributed within the voids of the insulation medium. These charge relationships are derived, and their application to the measured transients associated with the time dependence of the induced charge is presented. The application to multiple...

  15. Fundamentals of functional analysis

    CERN Document Server

    Farenick, Douglas

    2016-01-01

    This book provides a unique path for graduate or advanced undergraduate students to begin studying the rich subject of functional analysis with fewer prerequisites than is normally required. The text begins with a self-contained and highly efficient introduction to topology and measure theory, which focuses on the essential notions required for the study of functional analysis, and which are often buried within full-length overviews of the subjects. This is particularly useful for those in applied mathematics, engineering, or physics who need to have a firm grasp of functional analysis, but not necessarily some of the more abstruse aspects of topology and measure theory normally encountered. The reader is assumed to only have knowledge of basic real analysis, complex analysis, and algebra. The latter part of the text provides an outstanding treatment of Banach space theory and operator theory, covering topics not usually found together in other books on functional analysis. Written in a clear, concise manner,...

  16. Invariant Set Theory: Violating Measurement Independence without Fine Tuning, Conspiracy, Constraints on Free Will or Retrocausality

    Directory of Open Access Journals (Sweden)

    Tim Palmer

    2015-11-01

    Full Text Available Invariant Set (IS) theory is a locally causal ontic theory of physics based on the Cosmological Invariant Set postulate that the universe U can be considered a deterministic dynamical system evolving precisely on a (suitably constructed) fractal dynamically invariant set in U's state space. IS theory violates the Bell inequalities by violating Measurement Independence. Despite this, IS theory is not fine tuned, is not conspiratorial, does not constrain experimenter free will and does not invoke retrocausality. The reasons behind these claims are discussed in this paper. They arise from properties not found in conventional ontic models: the invariant set has zero measure in its Euclidean embedding space, has Cantor-set structure homeomorphic to the p-adic integers (p >> 0) and is non-computable. In particular, it is shown that the p-adic metric encapsulates the physics of the Cosmological Invariant Set postulate and provides the technical means to demonstrate no fine tuning or conspiracy. Quantum theory can be viewed as the singular limit of IS theory when p is set equal to infinity. Since it is based around a top-down constraint from cosmology, IS theory suggests that gravitational and quantum physics will be unified by a gravitational theory of the quantum, rather than a quantum theory of gravity. Some implications arising from such a perspective are discussed.

  17. Surface and interfacial tension measurement, theory, and applications

    CERN Document Server

    Hartland, Stanley

    2004-01-01

    This edited volume offers complete coverage of the latest theoretical, experimental, and computer-based data as summarized by leading international researchers. It promotes full understanding of the physical phenomena and mechanisms at work in surface and interfacial tensions and gradients, their direct impact on interface shape and movement, and their significance to numerous applications. Assessing methods for the accurate measurement of surface tension, interfacial tension, and contact angles, Surface and Interfacial Tension presents modern simulations of complex interfacial motions, such a

  18. Improving the Performance of MEMS GYROS via Redundant Measurements: Theory and Experiments

    Science.gov (United States)

    2014-12-01

    Master's thesis: "Improving the Performance of MEMS Gyros via Redundant Measurements: Theory and Experiments," by Matthew J. Leszczynski, December 2014; thesis advisor: Mark Karpenko. (The source record contains only extraction fragments of the thesis front matter and reference list in place of an abstract.)

  19. Bayesian modeling of measurement error in predictor variables using item response theory

    NARCIS (Netherlands)

    Fox, Gerardus J.A.; Glas, Cornelis A.W.

    2000-01-01

    This paper focuses on handling measurement error in predictor variables using item response theory (IRT). Measurement error is of great importance in the assessment of theoretical constructs, such as intelligence or school climate. Measurement error is modeled by treating the predictors as unobserved

  20. TWO GENERALIZATIONS OF AGGREGATED UNCERTAINTY MEASURE FOR EVALUATION OF DEZERT–SMARANDACHE THEORY

    OpenAIRE

    MAHDI KHODABANDEH; ALIREZA MOHAMMAD-SHAHRI

    2012-01-01

    The generality of the model used in Dezert–Smarandache Theory (DSmT) compared with other fusion algorithms, such as Dempster–Shafer theory, and the capability of DSmT to deal with highly conflicting problems are two main reasons to prefer DSmT for decision-making systems. The aggregated uncertainty measure, called the AU measure, has been introduced for Dempster–Shafer theory as one of the best ways presented so far to quantify the total uncertainty or the ambiguity of a belief function. Since AU can...

  1. Biomedical engineering fundamentals

    CERN Document Server

    Bronzino, Joseph D

    2014-01-01

    Known as the bible of biomedical engineering, The Biomedical Engineering Handbook, Fourth Edition, sets the standard against which all other references of this nature are measured. As such, it has served as a major resource for both skilled professionals and novices to biomedical engineering.Biomedical Engineering Fundamentals, the first volume of the handbook, presents material from respected scientists with diverse backgrounds in physiological systems, biomechanics, biomaterials, bioelectric phenomena, and neuroengineering. More than three dozen specific topics are examined, including cardia

  2. Infrared and Raman Spectroscopy of Eugenol, Isoeugenol, and Methyl Eugenol: Conformational Analysis and Vibrational Assignments from Density Functional Theory Calculations of the Anharmonic Fundamentals.

    Science.gov (United States)

    Chowdhry, Babur Z; Ryall, John P; Dines, Trevor J; Mendham, Andrew P

    2015-11-19

    IR and Raman spectra of eugenol, isoeugenol and methyl eugenol have been obtained in the liquid phase. Vibrational spectroscopic results are discussed in relation to computed structures and spectra of the low energy conformations of these molecules obtained from DFT calculations at the B3LYP/cc-pVTZ level. Although computed differences in vibrational spectra for the different conformers were generally small, close examination, in conjunction with the experimental spectra, enabled conformational analysis of all three molecules. Anharmonic contributions to computed vibrational spectra were obtained from calculations of cubic and quartic force constants at the B3LYP/DZ level. This permitted the determination of the anharmonic fundamentals for comparison with the experimental IR and Raman band positions, leading to an excellent fit between calculated and experimental spectra. Band assignments were obtained in terms of potential energy distributions (ped's).

  3. Comparison of Theory with Rotation Measurements in JET ICRH Plasmas

    International Nuclear Information System (INIS)

    R.V. Budny; C.S. Chang; C. Giroud; R.J. Goldston; D. McCune; J. Ongena; F.W. Perkins; R.B. White; K.-D. Zastrow; and contributors to the EFDA-JET work programme

    2001-01-01

    Plasma rotation appears to improve plasma performance by increasing the E x B flow shearing rate, thus decreasing radial correlations in the microturbulence. Also, plasma rotation can increase the stability to resistive MHD modes. In the Joint European Torus (JET), toroidal rotation rates ω_tor with high Mach numbers are generally measured in NBI-heated plasmas (since the neutral beams aim in the co-plasma current direction). They are considerably lower with only ICRH (and Ohmic) heating, but still surprisingly large considering that ICRH appears to inject relatively small amounts of angular momentum. Either the applied torques are larger than naively expected, or the anomalous transport of angular momentum is smaller than expected. Since ICRH is one of the main candidates for heating next-step tokamaks, and for creating burning plasmas in future tokamak reactors, this paper attempts to understand ICRH-induced plasma rotation.

  4. The elderly and quality of life: current theories and measurements.

    Science.gov (United States)

    Alesii, A; Mazzarella, F; Mastrilli, E; Fini, M

    2006-01-01

    The rapid evolution of biomedical knowledge and techniques has resulted in new life expectations, nourishing hope not only of adding years to life, but also quality of life (QoL) to years. The aim of the present study was to review the national and international literature concerning QoL and the elderly, and to outline the conceptual developments of QoL that have guided the research and development of different measurement instruments used for the assessment of QoL among the elderly population. From the review it emerged that the questionnaires most used to assess QoL in the research on the elderly are: the Short Form 36 (SF36), the Short Form 12 (SF12), the EuroQol (EQ5D), Life-Quality-Gerontology Centre Scale (LGC-Scale), and Quality of Life-Alzheimer's Disease (QoL-AD).

  5. Theory and measurements of emittance preservation in plasma wakefield acceleration

    Energy Technology Data Exchange (ETDEWEB)

    Frederico, Joel

    2016-12-01

    In this dissertation, we examine the preservation and measurement of emittance in the plasma wakefield acceleration blowout regime. Plasma wakefield acceleration (PWFA) is a revolutionary approach to accelerating charged particles that has been demonstrated to have the potential for gradients orders of magnitude greater than traditional approaches. The application of PWFA to the design of a linear collider will make new high energy physics research possible, but the design parameters must first be shown to be competitive with traditional methods. Emittance preservation is necessary in the design of a linear collider in order to maximize luminosity. We examine the conditions necessary for circular symmetry in the PWFA blowout regime, and demonstrate that current proposals meet these bounds. We also present an analysis of beam filamentation, which describes the process of beam-parameter and emittance matching. We show that the emittance growth saturates as a consequence of energy spread in the beam. The initial beam parameters determine the amount of emittance growth, while the contribution of energy spread is negligible. We also present a model for ion motion in the presence of a beam that is much more dense than the plasma. By combining the model of ion motion and emittance growth, we find that the emittance growth due to ion motion is minimal in the case of marginal ion motion. In addition, we present a simulation that validates the ion motion model, which is under further development to examine emittance growth for both marginal and pronounced ion motion. Finally, we present a proof-of-concept of an emittance measurement which may enable the analysis of emittance preservation in future PWFA experiments.

  6. Lead field theory provides a powerful tool for designing microelectrode array impedance measurements for biological cell detection and observation.

    Science.gov (United States)

    Böttrich, Marcel; Tanskanen, Jarno M A; Hyttinen, Jari A K

    2017-06-26

    Our aim is to introduce a method to enhance the design process of microelectrode array (MEA) based electric bioimpedance measurement systems for improved detection and viability assessment of living cells and tissues. We propose the application of electromagnetic lead field theory and reciprocity for MEA design and measurement result interpretation. Further, we simulated impedance spectroscopy (IS) with two- and four-electrode setups and a biological cell to illustrate the tool in the assessment of the capabilities of given MEA electrode constellations for detecting cells on or in the vicinity of the microelectrodes. The results show the power of lead field theory in electromagnetic simulations of cell-microelectrode systems, depicting the fundamental differences between two- and four-electrode IS measurement configurations in detecting cells. Accordingly, the use in MEA system design is demonstrated by assessing the differences between the two- and four-electrode IS configurations. Further, our results show how cells affect the lead fields in these MEA systems, and how we can utilize the differences between the two- and four-electrode setups in cell detection. The COMSOL simulator model is provided freely in the public domain as open source. Lead field theory can be successfully applied in MEA design for the IS-based assessment of biological cells, providing the necessary visualization and insight for MEA design. The proposed method is expected to enhance the design and usability of automated cell and tissue manipulation systems required for bioreactors, which are intended for the automated production of cell and tissue grafts for medical purposes. MEA systems are also intended for toxicology, to assess the effects of chemicals on living cells. Our results demonstrate that the lead field concept can be expected to enhance the development of such methods and devices as well.
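
    As a pointer to the kind of relation the lead-field approach rests on (our summary of the standard reciprocity/sensitivity formulation, not a formula quoted from the paper), a four-electrode transfer impedance can be written as a volume integral over the sample,

        Z \;=\; \int_V \frac{\mathbf{J}_{\mathrm{inj}} \cdot \mathbf{J}_{\mathrm{meas}}}{\sigma}\, dV,

    where J_inj and J_meas are the lead fields (current densities per unit injected current) of the current-carrying and voltage-sensing electrode pairs and σ is the local conductivity. A cell that locally perturbs σ changes Z only where the two lead fields overlap appreciably, which is why visualizing the lead fields reveals the different sensitive regions of two- and four-electrode MEA configurations.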

  7. The Validation by Measurement Theory of Proposed Object-Oriented Software Metrics

    Science.gov (United States)

    Neal, Ralph D.

    1996-01-01

    Moving software development into the engineering arena requires controllability, and to control a process, it must be measurable. Measuring the process does no good if the product is not also measured, i.e., being the best at producing an inferior product does not define a quality process. Also, not every number extracted from software development is a valid measurement. A valid measurement only results when we are able to verify that the number is representative of the attribute that we wish to measure. Many proposed software metrics are used by practitioners without these metrics ever having been validated, leading to costly but often useless calculations. Several researchers have bemoaned the lack of scientific precision in much of the published software measurement work and have called for validation of software metrics by measurement theory. This dissertation applies measurement theory to validate fifty proposed object-oriented software metrics.

  8. Ecological fundamentals of environmental protection

    International Nuclear Information System (INIS)

    Haber, W.

    1993-01-01

    The book reviews the state of the art of ecological knowledge. The emphasis is on ecosystem theory and on the interpretation of our environment with its irreversible anthropogenic changes. It is an important contribution to deeper knowledge about the ecological fundamentals of environmental protection and the factors that constitute nature's potential. (orig./BBR)

  9. Fundamental Approaches to Software Engineering

    NARCIS (Netherlands)

    Gnesi, S.; Rensink, Arend; Unknown, [Unknown

    This volume contains the proceedings of FASE 2014, the 17th International Conference on Fundamental Approaches to Software Engineering, which was held in Grenoble, France, in April 2014 as part of the annual European Joint Conferences on Theory and Practice of Software (ETAPS).

  10. Fundamental limitations in filtering and control

    CERN Document Server

    Seron, Maria M

    1997-01-01

    The issue of fundamental limitations in filtering and control lies at the very heart of any feedback system design, since it reveals what is and is not achievable on the basis of that system's structural and dynamic characteristics. Alongside new succinct treatments of Bode's original results from the 1940s, this book presents a comprehensive analysis of modern results, featuring contemporary developments in multivariable systems, sampled-data, periodic and nonlinear problems. The text gives particular prominence to sensitivity functions which measure the fundamental qualities of the system, including performance and robustness. With extensive appendices covering the necessary background on complex variable theory, this book is an ideal self-contained resource for researchers and practitioners in this field.

  11. Spatial Filtering Velocimetry Fundamentals and Applications

    CERN Document Server

    Aizu, Yoshihisa

    2006-01-01

    The first monograph devoted exclusively to spatial filtering velocimetry, this book includes fundamental theory, imaging optics, signal analysis, spatial filtering devices and systems, plus applications. Also suitable as a tutorial for students and users who are unfamiliar with optics and signal processing, Spatial Filtering Velocimetry treats the principle and properties of this velocimetric technique in a concise and easily readable form, together with full appendices. The book reviews a wide range of systems and applications of the spatial-filtering technique for velocity and related measurements, putting forth examples useful in various fields of science, medicine, and engineering.

  12. Testing Fundamental Gravitation in Space

    Energy Technology Data Exchange (ETDEWEB)

    Turyshev, Slava G.

    2013-10-15

    The general theory of relativity is the standard theory of gravitation; as such, it is used to describe gravity in problems of astronomy, astrophysics, cosmology, and fundamental physics. The theory is also relied upon in many modern applications involving spacecraft navigation, geodesy, and time transfer. Here we review the foundations of general relativity and discuss its current empirical status. We describe both the theoretical motivation and the scientific progress that may result from the new generation of high-precision tests that are anticipated in the near future.

  13. Can theory of mind deficits be measured reliably in people with mild and moderate Alzheimer's dementia?

    Science.gov (United States)

    Choong, Caroline Sm; Doody, Gillian A

    2013-01-01

    Patients suffering from Alzheimer's dementia develop difficulties in social functioning. This has led to an interest in the study of "theory of mind" in this population. However, difficulty has arisen because the associated cognitive demands of traditional short-story theory of mind assessments result in failure per se in this population, making it challenging to test pure theory of mind ability. Simplified, traditional 1st- and 2nd-order theory of mind short-story tasks and a battery of alternative theory of mind cartoon jokes and control slapstick cartoon jokes, without memory components, were administered to 16 participants with mild-moderate Alzheimer's dementia and 11 age-matched healthy controls. No significant differences were detected between participants with Alzheimer's dementia and controls on the 1st- or 2nd-order traditional short-story theory of mind tasks (p = 0.155 and p = 0.154, respectively). However, in the cartoon joke tasks there were significant differences in performance between the Alzheimer participants and the control group; this was evident for both the theory of mind cartoons and the control 'slapstick' jokes. It remains very difficult to assess theory of mind as an isolated phenomenon in populations with global cognitive impairment, such as Alzheimer's dementia, as the tasks used to assess this cognition invariably depend on other cognitive functions. Although a limitation of this study is the small sample size, the results suggest that there is no measurable specific theory of mind deficit in people with Alzheimer's dementia, and that the use of theory of mind representational models to measure social cognitive ability may not be appropriate in this population.

  14. On divergence of finite measures and their applicability in statistics and information theory

    Czech Academy of Sciences Publication Activity Database

    Vajda, Igor; Stummer, W.

    2009-01-01

    Roč. 44, č. 2 (2009), s. 169-187 ISSN 0233-1888 R&D Projects: GA MŠk(CZ) 1M0572; GA ČR(CZ) GA102/07/1131 Institutional research plan: CEZ:AV0Z10750506 Keywords : Local and global divergences of finite measures * Divergences of sigma-finite measures * Statistical censoring * Pinsker's inequality, Ornstein's distance * Differential power entropies Subject RIV: BD - Theory of Information Impact factor: 0.759, year: 2009 http://library.utia.cas.cz/separaty/2009/SI/vajda-on divergence of finite measures and their applicability in statistics and information theory.pdf

  15. Adult Personal Resilience: A New Theory, New Measure, and Practical Implications

    Directory of Open Access Journals (Sweden)

    Robert J. Taormina

    2015-04-01

    Full Text Available This is a new theory of adult personal resilience that can apply in any society. It differs from previous theories, which are post hoc because they traditionally focus on helping victims find ways to live with trauma after the trauma occurs. The present theory is a positive psychology approach because it identifies the aspects of a person that can make him or her stronger, so as to prevent personal problems from occurring as well as to deal with traumas and the various vicissitudes of life in general. Whereas this is a new theory and a complete treatment would require a more comprehensive monograph, this paper focuses on describing its essential features. These are to define adult personal resilience and distinguish it from general concepts of resilience, explain personal resilience as a multidimensional construct by identifying the four dimensions of adult personal resilience (Determination, Endurance, Adaptability, and Recuperability), briefly review the new theory's advantages over previous theories of resilience, describe the new four-subscale measure of adult personal resilience, and discuss implications of the new concept for theory, research, and practice.

  16. Finite-measuring approximation of operators of scattering theory in representation of wave packets

    International Nuclear Information System (INIS)

    Kukulin, V.I.; Rubtsova, O.A.

    2004-01-01

    Several types of packet quantization of the continuous spectrum in quantum scattering problems are considered. Such a quantization leads to a convenient finite-dimensional (i.e. matrix) approximation of the integral operators of scattering theory and makes it possible to reduce the singular integral equations of scattering theory to convenient, purely algebraic equations on an analytical basis, in which all singularities are separated explicitly. The main attention is paid to the practical realization of the method.

  17. Measuring Theory of Mind in Children. Psychometric Properties of the ToM Storybooks

    NARCIS (Netherlands)

    Blijd-Hoogewys, E. M. A.; van Geert, P. L. C.; Serra, M.; Minderaa, R. B.

    2008-01-01

    Although research on Theory-of-Mind (ToM) is often based on single task measurements, more comprehensive instruments result in a better understanding of ToM development. The ToM Storybooks is a new instrument measuring basic ToM-functioning and associated aspects. There are 34 tasks, tapping various

  18. Open problems in Banach spaces and measure theory | Rodríguez ...

    African Journals Online (AJOL)

    We collect several open questions in Banach spaces, mostly related to measure theoretic aspects of the theory. The problems are divided into five categories: miscellaneous problems in Banach spaces (non-separable Lp spaces, compactness in Banach spaces, w*-null sequences in dual spaces), measurability in Banach ...

  19. General theory of three-dimensional radiance measurements with optical microprobes

    DEFF Research Database (Denmark)

    FukshanskyKazarinova, N.; Fukshansky, L.; Kuhl, M.

    1997-01-01

    Measurements of the radiance distribution and fluence rate within turbid samples with fiber-optic radiance microprobes contain a large variable instrumental error caused by the nonuniform directional sensitivity of the microprobes. A general theory of three-dimensional radiance measurements...

  20. Quantum measurement

    CERN Document Server

    Busch, Paul; Pellonpää, Juha-Pekka; Ylinen, Kari

    2016-01-01

    This is a book about the Hilbert space formulation of quantum mechanics and its measurement theory. It contains a synopsis of what became of the Mathematical Foundations of Quantum Mechanics since von Neumann’s classic treatise with this title. Fundamental non-classical features of quantum mechanics—indeterminacy and incompatibility of observables, unavoidable measurement disturbance, entanglement, nonlocality—are explicated and analysed using the tools of operational quantum theory. The book is divided into four parts: 1. Mathematics provides a systematic exposition of the Hilbert space and operator theoretic tools and relevant measure and integration theory leading to the Naimark and Stinespring dilation theorems; 2. Elements develops the basic concepts of quantum mechanics and measurement theory with a focus on the notion of approximate joint measurability; 3. Realisations offers in-depth studies of the fundamental observables of quantum mechanics and some of their measurement implementations; and 4....

  1. Atomic spectroscopy and highly accurate measurement: determination of fundamental constants; Spectroscopie atomique et mesures de grande precision: determination de constantes fondamentales

    Energy Technology Data Exchange (ETDEWEB)

    Schwob, C

    2006-12-15

    This document reviews the theoretical and experimental achievements of the author concerning highly accurate atomic spectroscopy applied to the determination of fundamental constants. A purely optical frequency measurement of the 2S-12D two-photon transitions in atomic hydrogen and deuterium has been performed. The experimental setup is described as well as the data analysis. Optimized values for the Rydberg constant and Lamb shifts have been deduced (R = 109737.31568516(84) cm^-1). An experiment devoted to the determination of the fine structure constant with an aimed relative uncertainty of 10^-9 began in 1999. This experiment is based on the fact that Bloch oscillations in a frequency-chirped optical lattice are a powerful tool to transfer many photon momenta coherently to the atoms. We have used this method to measure accurately the ratio h/m(Rb). The measured value of the fine structure constant is α^-1 = 137.03599884(91) with a relative uncertainty of 6.7x10^-9. The future and perspectives of this experiment are presented. This document, presented before an academic board, will allow its author to supervise research work and particularly to act as a thesis advisor. (A.C.)
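
    For the reader's convenience (a standard relation, in our notation), the measured ratio h/m(Rb) is converted into the fine structure constant through

        \alpha^{2} \;=\; \frac{2 R_{\infty}}{c}\,\frac{m_{\mathrm{Rb}}}{m_{e}}\,\frac{h}{m_{\mathrm{Rb}}}
                   \;=\; \frac{2 R_{\infty}}{c}\,\frac{A_{r}(\mathrm{Rb})}{A_{r}(e)}\,\frac{h}{m_{\mathrm{Rb}}},

    so the Rydberg constant and the relative atomic masses, both known with very small uncertainties, carry the h/m(Rb) measurement over to α.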

  2. Value of Fundamental Science

    Science.gov (United States)

    Burov, Alexey

    Fundamental science is a hard, long-term human adventure that has required high devotion and social support, especially significant in our epoch of Mega-science. The measure of this devotion and this support expresses the real value of fundamental science in public opinion. Why does fundamental science have value? What determines its strength and what endangers it? The dominant answer is that the value of science arises out of curiosity and is supported by technological progress. Is this really a good, astute answer? When trying to attract public support, we talk about the "mystery of the universe". Why do these words sound so attractive? What is implied by, and what is incompatible with, them? More than two centuries ago, Immanuel Kant asserted an inseparable entanglement between ethics and metaphysics. Thus, we may ask: which metaphysics supports the value of scientific cognition, and which does not? Should we continue to neglect the dependence of the value of pure science on metaphysics? If not, how can this issue be addressed in public outreach? Is the public alienated by one or another message coming from the face of science? What does it mean to be politically correct in this sort of discussion?

  3. Critical investigation of Jauch's approach to the quantum theory of measurement

    International Nuclear Information System (INIS)

    Herbut, Fedor

    1986-01-01

    To make Jauch's approach more realistic, his assumptions are modified in two ways: (1) On the quantum system plus the measuring apparatus (S + MA) after the measuring interaction has ceased, one can actually measure only operators of the form given. (2) Measurement is defined in the most general way (including, besides first-kind, also second-kind and third-kind or indirect measurements). It is shown that Jauch's basic result that the microstates (statistical operators) of S + MA before and after the collapse correspond to the same macrostate (belong to the same equivalence class of microstates) remains valid under the above modifications, and that the significance of this result goes beyond measurement theory. On the other hand, it is argued that taking the orthodox (i.e. uncompromisingly quantum) view of quantum mechanics, it is not the collapse, but the Jauch-type macrostates that are spurious in a Jauch-type theory. (author)

  4. Quantifying and handling errors in instrumental measurements using the measurement error theory

    DEFF Research Database (Denmark)

    Andersen, Charlotte Møller; Bro, R.; Brockhoff, P.B.

    2003-01-01

    Measurement error modelling is used for investigating the influence of measurement/sampling error on univariate predictions of water content and water-holding capacity (reference measurement) from nuclear magnetic resonance (NMR) relaxations (instrumental) measured on two gadoid fish species. Thi...

  5. A psychologist's response to the case study: application of theory and measurement.

    Science.gov (United States)

    Canada, Andrea L

    2011-01-01

    This article represents a psychologist's perspective on the case study of Doris, a middle-aged woman with metastatic breast cancer who is initially referred to Chaplain Rhonda for assistance with death anxiety. In the field of psychology, it has long been accepted that good clinical research is informed by theory. As such, Chaplain Rhonda's intervention with Doris will be examined through the lens of object relations theory. Specifically, we will see how Rhonda's relationship and interaction with Doris improves her image of God and, by doing so, decreases her death anxiety. In psychological research, it is also important to accurately measure the effects or outcomes of clinical interventions. In this light, several suggestions are offered for the measurement of constructs relevant to the case of Doris, namely God image and death anxiety. Finally, a simple case study research design, applying the aforementioned theory and measurement, is provided as a suggested starting point for research on the efficacy of chaplaincy interventions.

  6. Measuring the added value of workplace change: Performance measurement in theory and practice

    OpenAIRE

    Riratanaphong, C; van der Voordt, Theo

    2015-01-01

    Purpose: The purpose of this paper is to compare performance measurement systems from the literature with current performance measurement approaches in practice to get a better understanding of the complex relationships between workplace change, added value and organisational performance. To be able to measure the added value of workplace change, a valid and reliable performance measurement system is needed to measure the impact of the work environment on organisational performance before an...

  7. Brief Report: Preliminary Evaluation of the Theory of Mind Inventory and Its Relationship to Measures of Social Skills

    Science.gov (United States)

    Lerner, Matthew D.; Hutchins, Tiffany L.; Prelock, Patricia A.

    2011-01-01

    This study presents updated information on a parent-report measure of Theory of Mind (ToM), formerly called the Perception of Children's Theory of Mind Measure (Hutchins et al., "J Autism Dev Disord" 38:143-155, 2008), renamed the Theory of Mind Inventory (ToMI), for use with parents of children with autism spectrum disorder (ASD). This study…

  8. EFTfitter: a tool for interpreting measurements in the context of effective field theories

    Energy Technology Data Exchange (ETDEWEB)

    Castro, Nuno [Universidade do Minho, Laboratorio de Instrumentacao e Fisica Experimental de Particulas, Departamento de Fisica, Braga (Portugal); Universidade do Porto, Departamento de Fisica e Astronomia, Faculdade de Ciencias, Porto (Portugal); Erdmann, Johannes; Grunwald, Cornelius; Kroeninger, Kevin [TU Dortmund, Lehrstuhl fuer Experimentelle Physik IV, Dortmund (Germany); Rosien, Nils-Arne [Universitaet Goettingen, II. Physikalisches Institut, Goettingen (Germany)

    2016-08-15

    Over the past years, the interpretation of measurements in the context of effective field theories has attracted much attention in the field of particle physics. We present a tool for interpreting sets of measurements in such models using a Bayesian ansatz by calculating the posterior probabilities of the corresponding free parameters numerically. An example is given, in which top-quark measurements are used to constrain anomalous couplings at the Wtb-vertex. (orig.)

  9. Fundamentos epistemológicos da teoria modular da mente de Jerry A. Fodor; Epistemological fundaments of Jerry A. Fodor's modular theory of mind

    Directory of Open Access Journals (Sweden)

    Kleber Bez Birolo Candiotto

    2008-01-01

    Full Text Available The aim of this paper is to present the basic elements regarding the modular theory developed by Jerry A. Fodor and some considerations about its main challenges. Fodor's notion of mind modularity, on the one hand, aims at overcoming the methodological and epistemological gaps of associationism and localizationism concerning the explanations of the structure and functioning of the mind; on the other hand, Fodor's notion stands as an opposition to Vygotsky's culturalist posture, since for the latter the higher functions of the mind, such as cognition, are artificial and cultural products. Chomsky's cognitive psychology has converted this "artificial" product into a "natural" one, postulating the existence of innate modules to perform specific cognitive functions. Based on Chomsky's idea, Fodor describes the mind as a group of modules. However, his main

  10. A signal detection-item response theory model for evaluating neuropsychological measures.

    Science.gov (United States)

    Thomas, Michael L; Brown, Gregory G; Gur, Ruben C; Moore, Tyler M; Patt, Virginie M; Risbrough, Victoria B; Baker, Dewleen G

    2018-02-05

    Models from signal detection theory are commonly used to score neuropsychological test data, especially tests of recognition memory. Here we show that certain item response theory models can be formulated as signal detection theory models, thus linking two complementary but distinct methodologies. We then use the approach to evaluate the validity (construct representation) of commonly used research measures, demonstrate the impact of conditional error on neuropsychological outcomes, and evaluate measurement bias. Signal detection-item response theory (SD-IRT) models were fitted to recognition memory data for words, faces, and objects. The sample consisted of U.S. Infantry Marines and Navy Corpsmen participating in the Marine Resiliency Study. Data comprised item responses to the Penn Face Memory Test (PFMT; N = 1,338), Penn Word Memory Test (PWMT; N = 1,331), and Visual Object Learning Test (VOLT; N = 1,249), and self-report of past head injury with loss of consciousness. SD-IRT models adequately fitted recognition memory item data across all modalities. Error varied systematically with ability estimates, and distributions of residuals from the regression of memory discrimination onto self-report of past head injury were positively skewed towards regions of larger measurement error. Analyses of differential item functioning revealed little evidence of systematic bias by level of education. SD-IRT models benefit from the measurement rigor of item response theory-which permits the modeling of item difficulty and examinee ability-and from signal detection theory-which provides an interpretive framework encompassing the experimentally validated constructs of memory discrimination and response bias. We used this approach to validate the construct representation of commonly used research measures and to demonstrate how nonoptimized item parameters can lead to erroneous conclusions when interpreting neuropsychological test data. Future work might include the
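
    A simple way to see the link asserted above between the two frameworks (our illustration, not the authors' full model) is the equal-variance Gaussian recognition model, in which the hit and false-alarm probabilities

        P(\text{``old''}\mid \text{old}) = \Phi\!\left(\tfrac{d'}{2} - c\right), \qquad
        P(\text{``old''}\mid \text{new}) = \Phi\!\left(-\tfrac{d'}{2} - c\right),

    have the same functional form as a normal-ogive item response model, with memory discrimination d' playing the role of the discrimination structure and the response criterion c acting like a location (bias) parameter.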

  11. Quantum Theory for a Total System with One Internal Measuring Apparatus

    Science.gov (United States)

    Wang, Wen-Ge

    2011-03-01

    We propose a quantum theory for a total system including one internal measuring apparatus. The theory is based on three basic assumptions and a principle termed the principle of compatible description (PCD). The assumptions are: (i) Physical states of the total system can be associated with vectors in the Hilbert space. (ii) Dynamical evolution of a state vector obeys the Schrödinger equation. (iii) For a physical state of the total system described by a pure vector, in which a subsystem may play the role of an internal measuring apparatus, when a certain stability condition is satisfied, the pure-vector description may be given a Born-type ensemble interpretation. The PCD states that different descriptions for the same state of the total system must give consistent predictions for results of measurements performed by the internal measuring apparatus. The proposed theory lies at a meeting point of the Copenhagen, Everett's relative-state, and consistent-histories interpretations of quantum mechanics. At the same time, it provides something new: for example, the PCD imposes a restriction on vectors that can be associated with physical states, which may effectively break the time-reversal symmetry of the Schrödinger equation. As an application of the theory, we derive a condition under which a two-level quantum system may have definite properties, such that it may play the essential role of a measuring apparatus.

  12. Improving measurement of injection drug risk behavior using item response theory.

    Science.gov (United States)

    Janulis, Patrick

    2014-03-01

    Recent research highlights the multiple steps to preparing and injecting drugs and the resultant viral threats faced by drug users. This research suggests that more sensitive measurement of injection drug HIV risk behavior is required. In addition, growing evidence suggests there are gender differences in injection risk behavior. However, the potential for differential item functioning between genders has not been explored. To explore item response theory as an improved measurement modeling technique that provides empirically justified scaling of injection risk behavior and to examine for potential gender-based differential item functioning. Data is used from three studies in the National Institute on Drug Abuse's Criminal Justice Drug Abuse Treatment Studies. A two-parameter item response theory model was used to scale injection risk behavior and logistic regression was used to examine for differential item functioning. Item fit statistics suggest that item response theory can be used to scale injection risk behavior and these models can provide more sensitive estimates of risk behavior. Additionally, gender-based differential item functioning is present in the current data. Improved measurement of injection risk behavior using item response theory should be encouraged as these models provide increased congruence between construct measurement and the complexity of injection-related HIV risk. Suggestions are made to further improve injection risk behavior measurement. Furthermore, results suggest direct comparisons of composite scores between males and females may be misleading and future work should account for differential item functioning before comparing levels of injection risk behavior.
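
    For reference, the two-parameter logistic model referred to above gives the probability that a person with latent risk level θ endorses item i as (standard formulation; the exact parameterization used in the study may differ):

        P_{i}(\theta) \;=\; \frac{1}{1 + \exp\left[-a_{i}\,(\theta - b_{i})\right]},

    where a_i is the item discrimination and b_i the item location (severity). Gender-based differential item functioning corresponds to a_i or b_i differing between men and women who have the same θ.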

  13. Gendered language attitudes: exploring language as a gendered construct using Rasch measurement theory.

    Science.gov (United States)

    Knisely, Kris A; Wind, Stefanie A

    2015-01-01

    Gendered language attitudes (GLAs) are gender-based perceptions of language varieties based on connections between gender-related and linguistic characteristics of individuals, including the perception of language varieties as possessing degrees of masculinity and femininity. This study combines substantive theory about language learning and gender with a model based on Rasch measurement theory to explore the psychometric properties of a new measure of GLAs. Findings suggest that GLAs constitute a unidimensional construct and that the items can be used to describe differences among students in terms of the strength of their GLAs. Implications for research, theory, and practice are discussed. Special emphasis is given to the teaching and learning of languages.

  14. Resource Theory of Superposition.

    Science.gov (United States)

    Theurer, T; Killoran, N; Egloff, D; Plenio, M B

    2017-12-08

    The superposition principle lies at the heart of many nonclassical properties of quantum mechanics. Motivated by this, we introduce a rigorous resource theory framework for the quantification of superposition of a finite number of linearly independent states. This theory is a generalization of resource theories of coherence. We determine the general structure of operations which do not create superposition, find a fundamental connection to unambiguous state discrimination, and propose several quantitative superposition measures. Using this theory, we show that trace-decreasing operations can be completed for free, which, when specialized to the theory of coherence, resolves an outstanding open question and is used to address the free probabilistic transformation between pure states. Finally, we prove that linearly independent superposition is a necessary and sufficient condition for the faithful creation of entanglement in discrete settings, establishing a strong structural connection between our theory of superposition and entanglement theory.

  15. Axioms for quantum theory

    Energy Technology Data Exchange (ETDEWEB)

    Gerlich, G. [Universitaet Carolo-Wilhelmina, Braunschweig (Germany)

    1992-07-01

    The first three of these axioms describe quantum theory and classical mechanics as statistical theories from the very beginning. With these, it can be shown in which sense a probability theory more general than the conventional measure-theoretic one is used in quantum theory. One obtains this generalization by defining transition probabilities on pairs of events (not sets of pairs) as a fundamental, not derived, concept. A comparison with standard theories of stochastic processes gives a very general formulation of the non-existence of quantum theories with hidden variables. The Cartesian product of probability spaces can be given a natural algebraic structure, the structure of an orthocomplemented, orthomodular, quasimodular, non-modular, non-distributive lattice, which can be compared with quantum logic (the lattice of all closed subspaces of an infinite-dimensional Hilbert space). It is shown how the given system of axioms suggests generalized quantum theories, especially Schrödinger equations, for phase space amplitudes. 38 refs., 3 figs., 1 tab.

  16. Fundamental structures of algebra and discrete mathematics

    CERN Document Server

    Foldes, Stephan

    2011-01-01

    Introduces and clarifies the basic theories of 12 structural concepts, offering a fundamental theory of groups, rings and other algebraic structures. Identifies essentials and describes interrelationships between particular theories. Selected classical theorems and results relevant to current research are proved rigorously within the theory of each structure. Throughout the text the reader is frequently prompted to perform integrated exercises of verification and to explore examples.

  17. Fundamental Safety Principles

    International Nuclear Information System (INIS)

    Abdelmalik, W.E.Y.

    2011-01-01

    This work presents a summary of the IAEA Safety Standards Series publication No. SF-1, entitled Fundamental Safety Principles, published in 2006. This publication states the fundamental safety objective and ten associated safety principles, and briefly describes their intent and purposes. Safety measures and security measures have in common the aim of protecting human life and health and the environment. These safety principles are: 1) Responsibility for safety, 2) Role of the government, 3) Leadership and management for safety, 4) Justification of facilities and activities, 5) Optimization of protection, 6) Limitation of risks to individuals, 7) Protection of present and future generations, 8) Prevention of accidents, 9) Emergency preparedness and response and 10) Protective action to reduce existing or unregulated radiation risks. The safety principles concern the security of facilities and activities to the extent that they apply to measures that contribute to both safety and security. Safety measures and security measures must be designed and implemented in an integrated manner so that security measures do not compromise safety and safety measures do not compromise security.

  18. Cognitive load measurement as a means to advance cognitive load theory

    NARCIS (Netherlands)

    Paas, F.; Tuovinen, J.E.; Tabbers, H.; van Gerven, P.W.M.

    2003-01-01

    This paper discusses cognitive load measurement techniques with regard to their contribution to cognitive load theory (CLT). CLT is concerned with the design of instructional methods that efficiently use people's limited cognitive processing capacity to apply acquired knowledge and skills to new

  19. Invariant path integration and the functional measure for Einstein gravitation theory

    International Nuclear Information System (INIS)

    Botelho, L.C.L.

    1985-01-01

    An invariant path integral approach is proposed for Einstein gravitation theory, suitable for the analysis of the associated functional measure problem. The proposed formulation is used to analyse the phenomenon of quantum gravity in two-dimensional space-times. (Author)

  20. Relative Proximity Theory: Measuring the Gap between Actual and Ideal Online Course Delivery

    Science.gov (United States)

    Swart, William; MacLeod, Kenneth; Paul, Ravi; Zhang, Aixiu; Gagulic, Mario

    2014-01-01

    Based on the Theory of Transactional Distance and Needs Assessment, this article reports a procedure for quantitatively measuring how close the actual delivery of a course was to ideal, as perceived by students. It extends Zhang's instrument and prescribes the computational steps to calculate relative proximity at the element and construct…

  1. Clean test of the electroweak theory by measuring weak boson masses

    International Nuclear Information System (INIS)

    Hioki, Zenro

    1985-01-01

    The role of the weak boson masses in studies of electroweak higher-order effects is surveyed. It is shown that precise measurements of these masses give us quite useful information for performing a clean test of the electroweak theory, and for a heavy fermion search. Effects of supersymmetric particles in these studies are also discussed. (author)

  2. Using Item Response Theory to Evaluate Measurement Precision of Selection Tests at the French Pilot Training

    NARCIS (Netherlands)

    Veldhuis, M.|info:eu-repo/dai/nl/338041869; Matton, N.; Vautier, S.

    2012-01-01

    In pilot selection settings, decisions are often based on cutoff scores. In item response theory the measurement precision of a test score can be evaluated by its degree of information. We investigated whether the maximum of test information corresponded to the cutoff zone for 10 cognitive ability

  3. Theory of thermal stresses

    CERN Document Server

    Boley, Bruno A

    1997-01-01

    Highly regarded text presents detailed discussion of fundamental aspects of theory, background, problems with detailed solutions. Basics of thermoelasticity, heat transfer theory, thermal stress analysis, more. 1985 edition.

  4. Theory-guided selection of discrimination measures for racial/ethnic health disparities research among older adults.

    Science.gov (United States)

    Thrasher, Angela D; Clay, Olivio J; Ford, Chandra L; Stewart, Anita L

    2012-09-01

    Discrimination may contribute to health disparities among older adults. Existing measures of perceived discrimination have provided important insights but may have limitations when used in studies of older adults. This article illustrates the process of assessing the appropriateness of existing measures for theory-based research on perceived discrimination and health. First, we describe three theoretical frameworks that are relevant to the study of perceived discrimination and health-stress-process models, life course models, and the Public Health Critical Race (PHCR) praxis. We then review four widely-used measures of discrimination, comparing their content and describing how well they address key aspects of each framework, and discussing potential areas of modification. Using theory to guide measure selection can help improve understanding of how perceived discrimination may contribute to racial/ethnic health disparities among older adults.

  5. Measuring the added value of workplace change: Performance measurement in theory and practice

    NARCIS (Netherlands)

    Riratanaphong, C; van der Voordt, Theo

    2015-01-01

    Purpose: The purpose of this paper is to compare performance measurement systems from the literature with current performance measurement approaches in practice to get a better understanding of the complex relationships between workplace change, added value and organisational performance. To be

  6. On measurements of Effective Residual Ink Concentration (ERIC) of deinked papers using Kubelka-Munk theory

    Science.gov (United States)

    D.W. Vahey; J.Y. Zhu; C.J. Houtman

    2006-01-01

    The measurement of effective residual ink concentration (ERIC) in recycled papers depends on their opacity. For opacity less than 97.0%, the method is based on application of the Kubelka-Munk theory to diffuse reflection from papers measured once with a black backing and again with a thick backing of the same papers. At opacities above 97.0%, the two reflection values...
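
    As background (not taken from the record itself), the single-constant Kubelka-Munk relation that underlies such reflectance measurements links the diffuse reflectance of an optically thick pad, R_inf, to the ratio of the absorption coefficient k to the scattering coefficient s:

        \frac{k}{s} = \frac{(1 - R_\infty)^{2}}{2\,R_\infty}

    ERIC is then obtained, roughly speaking, by attributing the absorption measured in the near infrared (where residual ink absorbs but pulp fibers barely do) to the ink, scaled by the ink's specific absorption; the exact backing-dependent procedure discussed in the record may differ in detail.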

  7. Foam engineering fundamentals and applications

    CERN Document Server

    2012-01-01

    Containing contributions from leading academic and industrial researchers, this book provides a much needed update of foam science research. The first section of the book presents an accessible summary of the theory and fundamentals of foams. This includes chapters on morphology, drainage, Ostwald ripening, coalescence, rheology, and pneumatic foams. The second section demonstrates how this theory is used in a wide range of industrial applications, including foam fractionation, froth flotation and foam mitigation. It includes chapters on suprafroths, flotation of oil sands, foams in enhancing petroleum recovery, Gas-liquid Mass Transfer in foam, foams in glass manufacturing, fire-fighting foam technology and consumer product foams.

  8. On the Nature of Measurement Records in Relativistic Quantum Field Theory

    Science.gov (United States)

    Barrett, Jeffrey A.

    A resolution of the quantum measurement problem would require one to explain how it is that we end up with determinate records at the end of our measurements. Metaphysical commitments typically do real work in such an explanation. Indeed, one should not be satisfied with one's metaphysical commitments unless one can provide some account of determinate measurement records. I will explain some of the problems in getting determinate records in relativistic quantum field theory and pay particular attention to the relationship between the measurement problem and a generalized version of Malament's theorem.

  9. New progress of fundamental aspects in quantum mechanics

    International Nuclear Information System (INIS)

    Sun Changpu

    2001-01-01

    The review recalls the conceptual origins of various interpretations of quantum mechanics. With the focus on quantum measurement problems, new developments of fundamental quantum theory are described in association with recent experiments such as the decoherence process in cavity quantum electrodynamics, 'which-way' detection using the Bragg scattering of cold atoms, and quantum interference using the small quantum system of molecular C60. The fundamental problems include the quantum coherence of a macroscopic object, the von Neumann chain in quantum measurement, the Schroedinger cat paradox, and so on. Many landmark experiments have been accomplished with possible important applications in quantum information. The most recent research on the new quantum theory by G. 't Hooft is reviewed, as well as future prospects of quantum mechanics.

  10. Generalized Galilean transformations and the measurement problem in the entropic dynamics approach to quantum theory

    Science.gov (United States)

    Johnson, David T.

    Quantum mechanics is an extremely successful and accurate physical theory, yet since its inception, it has been afflicted with numerous conceptual difficulties. The primary subject of this thesis is the theory of entropic quantum dynamics (EQD), which seeks to avoid these conceptual problems by interpreting quantum theory from an informational perspective. We begin by reviewing Cox's work in describing probability theory as a means of rationally and consistently quantifying uncertainties. We then discuss how probabilities can be updated according to either Bayes' theorem or the extended method of maximum entropy (ME). After that discussion, we review the work of Caticha and Giffin that shows that Bayes' theorem is a special case of ME. This important result demonstrates that the ME method is the general method for updating probabilities. We then review some motivating difficulties in quantum mechanics before discussing Caticha's work in deriving quantum theory from the approach of entropic dynamics, which concludes our review. After entropic dynamics is introduced, we develop the concepts of symmetries and transformations from an informational perspective. The primary result is the formulation of a symmetry condition that any transformation must satisfy in order to qualify as a symmetry in EQD. We then proceed to apply this condition to the extended Galilean transformation. This transformation is of interest as it exhibits features of both special and general relativity. The transformation yields a gravitational potential that arises from an equivalence of information. We conclude the thesis with a discussion of the measurement problem in quantum mechanics. We discuss the difficulties that arise in the standard quantum mechanical approach to measurement before developing our theory of entropic measurement. In entropic dynamics, position is the only observable. We show how a theory built on this one observable can account for the multitude of measurements present in

  11. Theoretical prediction and impact of fundamental electric dipole moments

    International Nuclear Information System (INIS)

    Ellis, Sebastian A.R.; Kane, Gordon L.

    2016-01-01

    The predicted Standard Model (SM) electric dipole moments (EDMs) of electrons and quarks are tiny, providing an important window to observe new physics. Theories beyond the SM typically allow relatively large EDMs. The EDMs depend on the relative phases of terms in the effective Lagrangian of the extended theory, which are generally unknown. Underlying theories, such as string/M-theories compactified to four dimensions, could predict the phases and thus EDMs in the resulting supersymmetric (SUSY) theory. Earlier one of us, with collaborators, made such a prediction and found, unexpectedly, that the phases were predicted to be zero at tree level in the theory at the unification or string scale ∼O(10^16 GeV). Electroweak (EW) scale EDMs still arise via running from the high scale, and depend only on the SM Yukawa couplings that also give the CKM phase. Here we extend the earlier work by studying the dependence of the low scale EDMs on the constrained but not fully known fundamental Yukawa couplings. The dominant contribution is from two loop diagrams and is not sensitive to the choice of Yukawa texture. The electron EDM should not be found to be larger than about 5×10^−30 e cm, and the neutron EDM should not be larger than about 5×10^−29 e cm. These values are quite a bit smaller than the reported predictions from Split SUSY and typical effective theories, but much larger than the Standard Model prediction. Also, since models with random phases typically give much larger EDMs, it is a significant testable prediction of compactified M-theory that the EDMs should not be above these upper limits. The actual EDMs can be below the limits, so once they are measured they could provide new insight into the fundamental Yukawa couplings of leptons and quarks. We comment also on the role of strong CP violation. EDMs probe fundamental physics near the Planck scale.

  12. Metric-independent measures for supersymmetric extended object theories on curved backgrounds

    International Nuclear Information System (INIS)

    Nishino, Hitoshi; Rajpoot, Subhash

    2014-01-01

    For the Green–Schwarz superstring σ-model on curved backgrounds, we introduce a non-metric measure Φ ≡ ε^{ij} ε_{IJ} (∂_i φ^I)(∂_j φ^J) with two scalars φ^I (I = 1, 2) used in 'Two-Measure Theory' (TMT). As in the flat-background case, the string tension T = (2πα′)^{−1} emerges as an integration constant for the A_i-field equation. This mechanism is further generalized to supermembrane theory, and to super-p-brane theory, both on general curved backgrounds. This shows the universal applicability of the dynamical measure of TMT to general supersymmetric extended objects on general curved backgrounds.

  13. Soft Measurement Modeling Based on Chaos Theory for Biochemical Oxygen Demand (BOD)

    Directory of Open Access Journals (Sweden)

    Junfei Qiao

    2016-12-01

    The precision of soft measurement for biochemical oxygen demand (BOD) is always restricted due to various factors in the wastewater treatment plant (WWTP). To solve this problem, a new soft measurement modeling method based on chaos theory is proposed and is applied to BOD measurement in this paper. Phase space reconstruction (PSR) based on Takens' embedding theorem is used to extract more information from the limited datasets of the chaotic system. The WWTP is first verified as a chaotic system by the correlation dimension (D), the largest Lyapunov exponent (λ1), and the Kolmogorov entropy (K) of the BOD and other water quality parameter time series. A multivariate chaotic time series modeling method with principal component analysis (PCA) and an artificial neural network (ANN) is then adopted to estimate the value of the effluent BOD. Simulation results show that the proposed approach has higher accuracy and better prediction ability than the corresponding modeling approaches not based on chaos theory.
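
    The phase space reconstruction step mentioned in the record follows Takens' delay-embedding theorem. A minimal sketch of that step is given below; the embedding dimension, delay and the synthetic input series are illustrative choices, not values from the paper:

    import numpy as np

    def delay_embed(x, dim=3, tau=5):
        # Time-delay embedding (Takens): each row is a reconstructed state vector.
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

    # Synthetic series standing in for a BOD-related water-quality signal.
    t = np.linspace(0, 50, 1000)
    series = np.sin(t) + 0.1 * np.random.randn(t.size)
    X = delay_embed(series, dim=3, tau=5)
    print(X.shape)  # (990, 3): 990 reconstructed state vectors in a 3-dimensional phase space

    In the paper's pipeline these reconstructed vectors would then be reduced with PCA and fed to an ANN; that part is omitted here.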

  14. Study and Application on Stability Classification of Tunnel Surrounding Rock Based on Uncertainty Measure Theory

    Directory of Open Access Journals (Sweden)

    Hujun He

    2014-01-01

    Based on uncertainty measure theory, a stability classification and order-arranging model of surrounding rock was established. Considering the practical engineering geologic conditions, 5 factors that influence surrounding rock stability were taken into account and the uncertainty measure function was obtained based on in situ data. In this model, uncertainty influence factors were analyzed quantitatively and qualitatively based on the real situation; the weight of each index was given based on information entropy theory; the surrounding rock stability level was judged based on the credible degree recognition criterion; and the surrounding rock was ordered based on the order-arranging criterion. Furthermore, this model was employed to evaluate the surrounding rock of 5 sections in the Dongshan tunnel of Huainan. The results show that the uncertainty measure method is reasonable and can be significant for surrounding rock stability evaluation in the future.
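
    The information-entropy weighting referred to in the record can be sketched as follows; the score matrix and the normalization below are illustrative assumptions, not the paper's in situ data:

    import numpy as np

    def entropy_weights(data):
        # Entropy-based index weights: more discriminating (lower-entropy) indices get larger weight.
        p = data / data.sum(axis=0, keepdims=True)          # normalize each column (index)
        n = data.shape[0]
        # Shannon entropy per index, with 0*log(0) treated as 0, normalized by log(n).
        h = -np.sum(p * np.log(np.where(p > 0, p, 1)), axis=0) / np.log(n)
        d = 1.0 - h                                          # degree of divergence of each index
        return d / d.sum()

    # Five rock sections (rows) scored on five influence factors (columns); values are hypothetical.
    scores = np.array([[0.6, 0.8, 0.3, 0.5, 0.7],
                       [0.4, 0.7, 0.5, 0.6, 0.2],
                       [0.9, 0.3, 0.6, 0.4, 0.5],
                       [0.5, 0.5, 0.8, 0.7, 0.6],
                       [0.7, 0.6, 0.4, 0.3, 0.9]])
    print(entropy_weights(scores))  # weights sum to 1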

  15. Measuring belief in conspiracy theories: The Generic Conspiracist Beliefs scale (GCB)

    Directory of Open Access Journals (Sweden)

    Robert Brotherton

    2013-05-01

    The psychology of conspiracy theory beliefs is not yet well understood, although research indicates that there are stable individual differences in conspiracist ideation – individuals' general tendency to engage with conspiracy theories. Researchers have created several short self-report measures of conspiracist ideation. These measures largely consist of items referring to an assortment of prominent conspiracy theories regarding specific real-world events. However, these instruments have not been psychometrically validated, and this assessment approach suffers from practical and theoretical limitations. Therefore, we present the Generic Conspiracist Beliefs (GCB) scale: a novel measure of individual differences in generic conspiracist ideation. The scale was developed and validated across four studies. In Study 1, exploratory factor analysis of a novel 75-item measure of non-event-based conspiracist beliefs identified five conspiracist facets. The 15-item GCB scale was developed to sample from each of these themes. Studies 2, 3 and 4 examined the structure and validity of the GCB, demonstrating internal reliability, content, criterion-related, convergent and discriminant validity, and good test-retest reliability. In sum, this research indicates that the GCB is a psychometrically sound and practically useful measure of conspiracist ideation, and the findings add to our theoretical understanding of conspiracist ideation as a monological belief system underpinned by a relatively small number of generic assumptions about the typicality of conspiratorial activity in the world.

  16. Comparisons of some scattering theories with recent scatterometer measurements. [sea roughness radar model

    Science.gov (United States)

    Fung, A. K.; Dome, G.; Moore, R. K.

    1977-01-01

    The paper compares the predictions of two different types of sea scatter theories with recent scatterometer measurements which indicate the variations of the backscattering coefficient with polarization, incident angle, wind speed, and azimuth angle. Wright's theory (1968) differs from that of Chan and Fung (1977) in two major aspects: (1) Wright uses Phillips' sea spectrum (1966) while Chan and Fung use that of Mitsuyasu and Honda, and (2) Wright uses a modified slick sea slope distribution by Cox and Munk (1954) while Chan and Fung use the slick sea slope distribution of Cox and Munk defined with respect to the plane perpendicular to the look direction. Satisfactory agreements between theory and experimental data are obtained when Chan and Fung's model is used to explain the wind and azimuthal dependence of the scattering coefficient.

  17. Exchange Rates and Fundamentals.

    Science.gov (United States)

    Engel, Charles; West, Kenneth D.

    2005-01-01

    We show analytically that in a rational expectations present-value model, an asset price manifests near-random walk behavior if fundamentals are I (1) and the factor for discounting future fundamentals is near one. We argue that this result helps explain the well-known puzzle that fundamental variables such as relative money supplies, outputs,…

  18. Merging Psychophysical and Psychometric Theory to Estimate Global Visual State Measures from Forced-Choices

    International Nuclear Information System (INIS)

    Massof, Robert W; Schmidt, Karen M; Laby, Daniel M; Kirschen, David; Meadows, David

    2013-01-01

    Visual acuity, a forced-choice psychophysical measure of visual spatial resolution, is the sine qua non of clinical visual impairment testing in ophthalmology and optometry patients with visual system disorders ranging from refractive error to retinal, optic nerve, or central visual system pathology. Visual acuity measures are standardized against a norm, but it is well known that visual acuity depends on a variety of stimulus parameters, including contrast and exposure duration. This paper asks if it is possible to estimate a single global visual state measure from visual acuity measures as a function of stimulus parameters that can represent the patient's overall visual health state with a single variable. Psychophysical theory (at the sensory level) and psychometric theory (at the decision level) are merged to identify the conditions that must be satisfied to derive a global visual state measure from parameterised visual acuity measures. A global visual state measurement model is developed and tested with forced-choice visual acuity measures from 116 subjects with no visual impairments and 560 subjects with uncorrected refractive error. The results are in agreement with the expectations of the model

  19. Merging Psychophysical and Psychometric Theory to Estimate Global Visual State Measures from Forced-Choices

    Science.gov (United States)

    Massof, Robert W.; Schmidt, Karen M.; Laby, Daniel M.; Kirschen, David; Meadows, David

    2013-09-01

    Visual acuity, a forced-choice psychophysical measure of visual spatial resolution, is the sine qua non of clinical visual impairment testing in ophthalmology and optometry patients with visual system disorders ranging from refractive error to retinal, optic nerve, or central visual system pathology. Visual acuity measures are standardized against a norm, but it is well known that visual acuity depends on a variety of stimulus parameters, including contrast and exposure duration. This paper asks if it is possible to estimate a single global visual state measure from visual acuity measures as a function of stimulus parameters that can represent the patient's overall visual health state with a single variable. Psychophysical theory (at the sensory level) and psychometric theory (at the decision level) are merged to identify the conditions that must be satisfied to derive a global visual state measure from parameterised visual acuity measures. A global visual state measurement model is developed and tested with forced-choice visual acuity measures from 116 subjects with no visual impairments and 560 subjects with uncorrected refractive error. The results are in agreement with the expectations of the model.

  20. Free release measurement of radioactive waste on the basis of the Bayes theory

    International Nuclear Information System (INIS)

    Sokcic-Kostic, M.; Langer, F.; Schultheis, R.

    2013-01-01

    The application of Bayesian theory in the evaluation of free release measurements requires complex co-ordination between experiment and analysis. The algorithms are more complex than those used in frequentist data analysis and, in part, than those of Monte Carlo methods. The user obtains an objective treatment of the parameters of the measurement error and - as a result - a reliable indication of confidence intervals. For release measurement, the upper limit of the confidence interval must be compared with the limit given by the Radiation Protection Regulations (StrlSchV) to decide on a possible release of the material under test. (orig.)
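
    The decision step described here (compare the upper limit of the Bayesian credible interval with the regulatory limit) can be illustrated with a toy counting example; the prior, the gamma approximation and all numerical values are placeholders, not the paper's procedure or the actual StrlSchV limits:

    from scipy import stats

    counts, background, efficiency, time_s = 145, 120.0, 0.2, 600.0

    # Approximate posterior for the net signal counts with a flat prior (illustrative only).
    net = max(counts - background, 0.0)
    posterior = stats.gamma(a=net + 1, scale=1.0)
    upper_counts = posterior.ppf(0.95)                      # 95% upper credible limit on net counts
    activity_upper = upper_counts / (efficiency * time_s)   # toy conversion to an activity value

    release_limit = 0.5                                      # placeholder limit, not the StrlSchV value
    print("release allowed:", activity_upper < release_limit)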

  1. Fundamentals of set and number theory

    CERN Document Server

    Rodionov, Timofey V

    2018-01-01

    The series is devoted to the publication of monographs and high-level textbooks in mathematics, mathematical methods and their applications. Apart from covering important areas of current interest, a major aim is to make topics of an interdisciplinary nature accessible to the non-specialist. The works in this series are addressed to advanced students and researchers in mathematics and theoretical physics. In addition, it can serve as a guide for lectures and seminars on a graduate level. The series de Gruyter Studies in Mathematics was founded ca. 30 years ago by the late Professor Heinz Bauer and Professor Peter Gabriel with the aim to establish a series of monographs and textbooks of high standard, written by scholars with an international reputation presenting current fields of research in pure and applied mathematics. While the editorial board of the Studies has changed with the years, the aspirations of the Studies are unchanged. In times of rapid growth of mathematical knowledge carefully written monogr...

  2. Fundamentals of the theory of plasticity

    CERN Document Server

    Kachanov, L M

    2004-01-01

    Intended for use by advanced engineering students and professionals, this volume focuses on plastic deformation of metals at normal temperatures, as applied to strength of machines and structures. 1971 edition.

  3. Fundamental tests of galaxy formation theory

    Science.gov (United States)

    Silk, J.

    1982-01-01

    The structure of the universe as an environment where traces exist of the seed fluctuations from which galaxies formed is studied. The evolution of the density fluctuation modes that led to the eventual formation of matter inhomogeneities is reviewed. How the resulting clumps developed into galaxies and galaxy clusters, acquiring characteristic masses, velocity dispersions, and metallicities, is discussed. Tests are described that utilize the large scale structure of the universe, including the dynamics of the local supercluster, the large scale matter distribution, and the anisotropy of the cosmic background radiation, to probe the earliest accessible stages of evolution. Finally, the role of particle physics is described with regard to its observable implications for galaxy formation.

  4. Duality and free measures in vector spaces, the spectral theory of actions of non-locally compact groups

    OpenAIRE

    Vershik, A.

    2017-01-01

    The paper presents a general duality theory for vector measure spaces taking its origin in the author's papers written in the 1960s. The main result establishes a direct correspondence between the geometry of a measure in a vector space and the properties of the space of measurable linear functionals on this space regarded as closed subspaces of an abstract space of measurable functions. An example of useful new features of this theory is the notion of a free measure and its applications.

  5. Robust Measurement via A Fused Latent and Graphical Item Response Theory Model.

    Science.gov (United States)

    Chen, Yunxiao; Li, Xiaoou; Liu, Jingchen; Ying, Zhiliang

    2018-03-12

    Item response theory (IRT) plays an important role in psychological and educational measurement. Unlike the classical testing theory, IRT models aggregate the item level information, yielding more accurate measurements. Most IRT models assume local independence, an assumption not likely to be satisfied in practice, especially when the number of items is large. Results in the literature and simulation studies in this paper reveal that misspecifying the local independence assumption may result in inaccurate measurements and differential item functioning. To provide more robust measurements, we propose an integrated approach by adding a graphical component to a multidimensional IRT model that can offset the effect of unknown local dependence. The new model contains a confirmatory latent variable component, which measures the targeted latent traits, and a graphical component, which captures the local dependence. An efficient proximal algorithm is proposed for the parameter estimation and structure learning of the local dependence. This approach can substantially improve the measurement, given no prior information on the local dependence structure. The model can be applied to measure both a unidimensional latent trait and multidimensional latent traits.

  6. Testing Our Fundamental Assumptions

    Science.gov (United States)

    Kohler, Susanna

    2016-06-01

    Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests! [Section: Explaining Different Arrival Times. Artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region. NASA/Swift/Mary Pat Hrybyk-Keith and John Jones] Suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics: an intrinsic delay (the photons may simply have been emitted at two different times by the astrophysical source); a delay due to Lorentz invariance violation (perhaps the assumption that all massless particles, even two photons with different energies, move at exactly the same velocity in a vacuum is incorrect); a special-relativistic delay (maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong, which would also make photon velocities energy-dependent); and a delay due to the gravitational potential (perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, also causing different flight times for photons of different energies; this would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect). If we now turn this problem around, then by measuring the arrival-time delay between photons of different energies from various astrophysical sources (the further away, the better) we can provide constraints on these

  7. A game theory-based trust measurement model for social networks.

    Science.gov (United States)

    Wang, Yingjie; Cai, Zhipeng; Yin, Guisheng; Gao, Yang; Tong, Xiangrong; Han, Qilong

    2016-01-01

    In social networks, trust is a complex social relationship. Participants in online social networks want to share information and experiences with as many reliable users as possible. However, the modeling of trust is complicated and application dependent; it needs to consider interaction history, recommendations, user behaviors and so on. Therefore, modeling trust is an important focus for online social networks. We propose a game theory-based trust measurement model for social networks. The trust degree is calculated from three aspects (service reliability, feedback effectiveness, and recommendation credibility) to get a more accurate result. In addition, to alleviate the free-riding problem, we propose a game theory-based punishment mechanism for specific trust and global trust, respectively. We prove that the proposed trust measurement model is effective, and the free-riding problem can be resolved effectively by adding the proposed punishment mechanism.
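
    The record aggregates three trust aspects into a single trust degree. A minimal sketch of one such aggregation with a simple free-riding penalty is shown below; the weights and the penalty rule are illustrative assumptions, not the paper's actual game-theoretic formulas:

    def trust_degree(service_reliability, feedback_effectiveness, recommendation_credibility,
                     weights=(0.4, 0.3, 0.3), free_riding_penalty=0.0):
        # Weighted aggregation of the three trust aspects into one score in [0, 1], then a penalty.
        aspects = (service_reliability, feedback_effectiveness, recommendation_credibility)
        score = sum(w * a for w, a in zip(weights, aspects))
        return max(0.0, score - free_riding_penalty)

    # A participant with good service but a weak feedback history, penalized for free riding.
    print(trust_degree(0.9, 0.4, 0.7, free_riding_penalty=0.1))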

  8. Molecular imaging. Fundamentals and applications

    International Nuclear Information System (INIS)

    Tian, Jie

    2013-01-01

    Covers a wide range of new theory, new techniques and new applications. Contributed by many experts in China. The editor has obtained the National Science and Technology Progress Award twice. ''Molecular Imaging: Fundamentals and Applications'' is a comprehensive monograph which describes not only the theory of the underlying algorithms and key technologies but also introduces a prototype system and its applications, bringing together theory, technology and applications. By explaining the basic concepts and principles of molecular imaging, imaging techniques, as well as research and applications in detail, the book provides both detailed theoretical background information and technical methods for researchers working in medical imaging and the life sciences. Clinical doctors and graduate students will also benefit from this book.

  9. In search of the "lost capital". A theory for valuation, investment decisions, performance measurement

    OpenAIRE

    Magni, Carlo Alberto

    2007-01-01

    This paper presents a theoretical framework for valuation, investment decisions, and performance measurement based on a nonstandard theory of residual income. It is derived from the notion of "unrecovered" capital, which is here named "lost" capital because it represents the capital foregone by the investors. Its theoretical strength and meaningfulness are shown by deriving it from four main perspectives: financial, microeconomic, axiomatic, accounting. Implications for asset valuation, cap...

  10. Further test of internal-conversion theory with a measurement in Pt197

    Science.gov (United States)

    Nica, N.; Hardy, J. C.; Iacob, V. E.; Goodwin, J.; Balonek, C.; Hernberg, M.; Nolan, J.; Trzhaskovskaya, M. B.

    2009-12-01

    We have measured the K-shell internal conversion coefficient, αK, for the 346.5-keV M4 transition in Pt197 to be 4.23(7). This result differs from a previous value, which disagreed significantly with theory. Our new value agrees well with Dirac-Fock calculations and removes the earlier discrepancy as a source of concern.

  11. A grounded theory analysis of the pre-measurement phase for the accounting recognition of assets

    OpenAIRE

    El Tawy, Nevine Abdel Halim

    2010-01-01

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. This thesis induces a theory for the pre-measurement phase of the asset recognition process in the financial reporting domain centred upon the use of the induced artefact-based asset recognition criteria which are applicable to all assets. In common with standard-setting bodies, such as the International Accounting Standards Board (IASB), I adopt a social constructionist stance (Miller, 1994)...

  12. Critical Investigation of Jauch's Approach to the Quantum Theory of Measurement

    Science.gov (United States)

    Herbut, Fedor

    1986-08-01

    To make Jauch's approach more realistic, his assumptions are modified in two ways: (1) On the quantum system plus the measuring apparatus (S+MA) after the measuring interaction has ceased, one can actually measure only operators of the form A ⊗ ∑_k b_k Q_k, where A is any Hermitian operator for S, the resolution of the identity ∑_k Q_k = 1 defines MA as a classical system (following von Neumann), and the b_k are real numbers (S and MA are distant). (2) Measurement is defined in the most general way (including, besides first-kind, also second-kind and third-kind or indirect measurements). It is shown that Jauch's basic result that the microstates (statistical operators) of S+MA before and after the collapse correspond to the same macrostate (belong to the same equivalence class of microstates) remains valid under the above modifications, and that the significance of this result goes beyond measurement theory. On the other hand, it is argued that, taking the orthodox (i.e. uncompromisingly quantum) view of quantum mechanics, it is not the collapse but the Jauch-type macrostates that are spurious in a Jauch-type theory.
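
    The restricted class of observables in item (1) can be written down concretely. A small numpy sketch constructing A ⊗ ∑_k b_k Q_k for a two-level system S and a two-outcome pointer follows; the dimensions and the numbers b_k are chosen purely for illustration:

    import numpy as np

    # System observable A (Pauli-z) and apparatus pointer projectors Q1, Q2 with Q1 + Q2 = identity.
    A = np.diag([1.0, -1.0])
    Q1 = np.diag([1.0, 0.0])
    Q2 = np.diag([0.0, 1.0])
    b = [2.5, -0.7]                       # arbitrary real pointer readings

    pointer = b[0] * Q1 + b[1] * Q2       # classical apparatus observable, sum_k b_k Q_k
    observable = np.kron(A, pointer)      # A tensor (sum_k b_k Q_k), acting on S + MA
    print(np.allclose(observable, observable.conj().T))   # Hermitian, as required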

  13. A new measurement for the revised reinforcement sensitivity theory: psychometric criteria and genetic validation

    Directory of Open Access Journals (Sweden)

    Martin Reuter

    2015-03-01

    Jeffrey Gray's Reinforcement Sensitivity Theory (RST) represents one of the most influential biologically-based personality theories describing individual differences in approach and avoidance tendencies. The most prominent self-report inventory to measure individual differences in approach and avoidance behavior to date is the BIS/BAS scale by Carver & White (1994). As Gray & McNaughton (2000) revised the RST after its initial formulation in the 1970/80s, and given the Carver & White measure is based on the initial conceptualization of RST, there is a growing need for self-report inventories measuring individual differences in the revised behavioral inhibition system (BIS), behavioral activation system (BAS) and the fight, flight, freezing system (FFFS). Therefore, in this paper we present a new questionnaire measuring individual differences in the revised constructs of the BIS, BAS and FFFS in N = 1814 participants (German sample). An English translated version of the new measure is also presented and tested in N = 299 English language participants. A large number of German participants (N = 1090) also filled in the BIS/BAS scales by Carver & White (1994) and the correlations between these measures are presented. Finally, this same subgroup of participants provided buccal swabs for the investigation of the arginine vasopressin receptor 1a (AVPR1a) gene. Here, a functional genetic polymorphism (rs11174811) on the AVPR1a gene was shown to be associated with individual differences in both the revised BIS and classic BIS dimensions.

  14. A new measure for the revised reinforcement sensitivity theory: psychometric criteria and genetic validation.

    Science.gov (United States)

    Reuter, Martin; Cooper, Andrew J; Smillie, Luke D; Markett, Sebastian; Montag, Christian

    2015-01-01

    Jeffrey Gray's Reinforcement Sensitivity Theory (RST) represents one of the most influential biologically-based personality theories describing individual differences in approach and avoidance tendencies. The most prominent self-report inventory to measure individual differences in approach and avoidance behavior to date is the BIS/BAS scale by Carver and White (1994). As Gray and McNaughton (2000) revised the RST after its initial formulation in the 1970/80s, and given the Carver and White measure is based on the initial conceptualization of RST, there is a growing need for self-report inventories measuring individual differences in the revised behavioral inhibition system (BIS), behavioral activation system (BAS) and the fight, flight, freezing system (FFFS). Therefore, in this paper we present a new questionnaire measuring individual differences in the revised constructs of the BIS, BAS and FFFS in N = 1814 participants (German sample). An English translated version of the new measure is also presented and tested in N = 299 English language participants. A large number of German participants (N = 1090) also filled in the BIS/BAS scales by Carver and White (1994) and the correlations between these measures are presented. Finally, this same subgroup of participants provided buccal swabs for the investigation of the arginine vasopressin receptor 1a (AVPR1a) gene. Here, a functional genetic polymorphism (rs11174811) on the AVPR1a gene was shown to be associated with individual differences in both the revised BIS and classic BIS dimensions.

  15. Conductive shield for ultra-low-field magnetic resonance imaging: Theory and measurements of eddy currents.

    Science.gov (United States)

    Zevenhoven, Koos C J; Busch, Sarah; Hatridge, Michael; Oisjöen, Fredrik; Ilmoniemi, Risto J; Clarke, John

    2014-03-14

    Eddy currents induced by applied magnetic-field pulses have been a common issue in ultra-low-field magnetic resonance imaging. In particular, a relatively large prepolarizing field, applied before each signal acquisition sequence to increase the signal, induces currents in the walls of the surrounding conductive shielded room. The magnetic-field transient generated by the eddy currents may cause severe image distortions and signal loss, especially with the large prepolarizing coils designed for in vivo imaging. We derive a theory of eddy currents in thin conducting structures and enclosures to provide intuitive understanding and efficient computations. We present detailed measurements of the eddy-current patterns and their time evolution in a previous-generation shielded room. The analysis led to the design and construction of a new shielded room with symmetrically placed 1.6-mm-thick aluminum sheets that were weakly coupled electrically. The currents flowing around the entire room were heavily damped, resulting in a decay time constant of about 6 ms for both the measured and computed field transients. The measured eddy-current vector maps were in excellent agreement with predictions based on the theory, suggesting that both the experimental methods and the theory were successful and could be applied to a wide variety of thin conducting structures.

  16. Reliability and validity of advanced theory-of-mind measures in middle childhood and adolescence.

    Science.gov (United States)

    Hayward, Elizabeth O; Homer, Bruce D

    2017-09-01

    Although theory-of-mind (ToM) development is well documented for early childhood, there is increasing research investigating changes in ToM reasoning in middle childhood and adolescence. However, the psychometric properties of most advanced ToM measures for use with older children and adolescents have not been firmly established. We report on the reliability and validity of widely used, conventional measures of advanced ToM with this age group. Notable issues with both reliability and validity of several of the measures were evident in the findings. With regard to construct validity, results do not reveal a clear empirical commonality between tasks, and, after accounting for comprehension, developmental trends were evident in only one of the tasks investigated. Statement of contribution: What is already known on this subject? Second-order false belief tasks have acceptable internal consistency. The Eyes Test has poor internal consistency. Validity of advanced theory-of-mind tasks is often based on the ability to distinguish clinical from typical groups. What does this study add? This study examines internal consistency across six widely used advanced theory-of-mind tasks. It investigates validity of tasks based on comprehension of items by typically developing individuals. It further assesses construct validity, or commonality between tasks.

  17. A pilot study to validate measures of the theory of reasoned action for organ donation behavior.

    Science.gov (United States)

    Wong, Shui Hung; Chow, Amy Yin Man

    2018-04-01

    The present study aimed at taking the first attempt in validating the measures generated based on the theory of reasoned action (TRA). A total of 211 university students participated in the study, 95 were included in the exploratory factor analysis and 116 were included in the confirmatory factor analysis. The TRA measurements were established with adequate psychometric properties, internal consistency, and construct validity. Findings also suggested that attitude toward organ donation has both a cognitive and affective nature, while the subjective norm of the family seems to be important to students' views on organ donation.

  18. Dielectric properties of agricultural products – fundamental principles, influencing factors, and measurement techniques. Chapter 4. Electrotechnologies for Food Processing: Book Series. Volume 3. Radio-Frequency Heating

    Science.gov (United States)

    In this chapter, definitions of dielectric properties, or permittivity, of materials and a brief discussion of the fundamental principles governing their behavior with respect to influencing factors are presented. The basic physics of the influence of frequency of the electric fields and temperatur...

  19. Mercury in Environmental and Biological Samples Using Online Combustion with Sequential Atomic Absorption and Fluorescence Measurements: A Direct Comparison of Two Fundamental Techniques in Spectrometry

    Science.gov (United States)

    Cizdziel, James V.

    2011-01-01

    In this laboratory experiment, students quantitatively determine the concentration of an element (mercury) in an environmental or biological sample while comparing and contrasting the fundamental techniques of atomic absorption spectrometry (AAS) and atomic fluorescence spectrometry (AFS). A mercury analyzer based on sample combustion,…

  20. Quantitative Analysis of Situation Awareness (QASA): modelling and measuring situation awareness using signal detection theory.

    Science.gov (United States)

    Edgar, Graham K; Catherwood, Di; Baker, Steven; Sallis, Geoff; Bertels, Michael; Edgar, Helen E; Nikolla, Dritan; Buckle, Susanna; Goodwin, Charlotte; Whelan, Allana

    2017-12-29

    This paper presents a model of situation awareness (SA) that emphasises that SA is necessarily built using a subset of available information. A technique (Quantitative Analysis of Situation Awareness - QASA), based around signal detection theory, has been developed from this model that provides separate measures of actual SA (ASA) and perceived SA (PSA), together with a feature unique to QASA, a measure of bias (information acceptance). These measures allow the exploration of the relationship between actual SA, perceived SA and information acceptance. QASA can also be used for the measurement of dynamic ASA, PSA and bias. Example studies are presented and full details of the implementation of the QASA technique are provided. Practitioner Summary: This paper presents a new model of situation awareness (SA) together with an associated tool (Quantitative Analysis of Situation Awareness - QASA) that employs signal detection theory to measure several aspects of SA, including actual and perceived SA and information acceptance. Full details are given of the implementation of the tool.
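
    QASA rests on signal detection theory, which separates sensitivity from response bias. The sketch below computes the standard d′ (sensitivity) and criterion c (bias) from hit and false-alarm counts; the paper's specific ASA, PSA and information-acceptance quantities are defined there and may be computed differently:

    from scipy.stats import norm

    def sdt_measures(hits, misses, false_alarms, correct_rejections):
        # Standard signal-detection sensitivity (d') and bias (criterion c) from response counts.
        # Log-linear correction so rates of 0 or 1 do not produce infinite z-scores.
        hit_rate = (hits + 0.5) / (hits + misses + 1)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
        d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
        criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))
        return d_prime, criterion

    # True/false probe statements accepted or rejected by a participant (hypothetical counts).
    print(sdt_measures(hits=18, misses=2, false_alarms=6, correct_rejections=14))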

  1. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    Energy Technology Data Exchange (ETDEWEB)

    McDonnell, J. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schunck, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Higdon, D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sarich, J. [Argonne National Lab. (ANL), Argonne, IL (United States); Wild, S. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Nazarewicz, W. [Michigan State Univ., East Lansing, MI (United States); Oak Ridge National Lab., Oak Ridge, TN (United States); Univ. of Warsaw, Warsaw (Poland)

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
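
    The propagation step described here, in which posterior samples of model parameters are pushed through a Gaussian-process emulator of the expensive model, can be sketched in a few lines; the model, training design and "posterior" below are placeholders, not the Skyrme-functional setup of the paper:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)

    # Emulate an expensive model y = f(theta) from a handful of training runs.
    theta_train = rng.uniform(-1, 1, size=(30, 2))
    y_train = np.sin(theta_train[:, 0]) + 0.5 * theta_train[:, 1] ** 2   # placeholder "model"
    emulator = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(theta_train, y_train)

    # Propagate samples from a (placeholder) posterior over theta through the emulator.
    theta_post = rng.normal(loc=[0.1, -0.2], scale=0.1, size=(5000, 2))
    y_pred = emulator.predict(theta_post)
    print("predictive mean %.3f, std %.3f" % (y_pred.mean(), y_pred.std()))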

  2. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    Science.gov (United States)

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  3. Fundamentals of gas dynamics

    CERN Document Server

    Babu, V

    2014-01-01

    Fundamentals of Gas Dynamics, Second Edition is a comprehensively updated new edition and now includes a chapter on the gas dynamics of steam. It covers the fundamental concepts and governing equations of different flows, and includes end-of-chapter exercises based on practical applications. A number of useful tables on the thermodynamic properties of steam are also included. Fundamentals of Gas Dynamics, Second Edition begins with an introduction to compressible and incompressible flows before covering the fundamentals of one dimensional flows and normal shock wav

  4. Revisiting energy efficiency fundamentals

    Energy Technology Data Exchange (ETDEWEB)

    Perez-Lombard, L.; Velazquez, D. [Grupo de Termotecnia, Escuela Superior de Ingenieros, Universidad de Sevilla, Camino de los Descubrimientos s/n, 41092 Seville (Spain); Ortiz, J. [Building Research Establishment (BRE), Garston, Watford, WD25 9XX (United Kingdom)

    2013-05-15

    Energy efficiency is a central target for energy policy and a keystone to mitigate climate change and to achieve sustainable development. Although great efforts have been made during the last four decades to investigate the issue, focusing on measuring energy efficiency, understanding its trends and impacts on energy consumption, and designing effective energy efficiency policies, many energy efficiency-related concepts, some methodological problems for the construction of energy efficiency indicators (EEI) and even some of the potential energy efficiency gains are often ignored or misunderstood, causing no little confusion and controversy not only for laymen but even for specialists. This paper aims to revisit, analyse and discuss some fundamental efficiency topics that could improve the understanding and critical judgement of efficiency stakeholders and that could help in avoiding unfounded judgements and misleading statements. Firstly, we address the problem of measuring energy efficiency both in qualitative and quantitative terms. Secondly, the main methodological problems standing in the way of the construction of EEI are discussed, and a sequence of actions is proposed to tackle them in an ordered fashion. Finally, two key topics are discussed in detail: the links between energy efficiency and energy savings, and the border between energy efficiency improvement and renewable sources promotion.

  5. Fundamentals of estuarine physical oceanography

    CERN Document Server

    Bruner de Miranda, Luiz; Kjerfve, Björn; Castro Filho, Belmiro Mendes de

    2017-01-01

    This book provides an introduction to the complex system functions, variability and human interference in ecosystem between the continent and the ocean. It focuses on circulation, transport and mixing of estuarine and coastal water masses, which is ultimately related to an understanding of the hydrographic and hydrodynamic characteristics (salinity, temperature, density and circulation), mixing processes (advection and diffusion), transport timescales such as the residence time and the exposure time. In the area of physical oceanography, experiments using these water bodies as a natural laboratory and interpreting their circulation and mixing processes using theoretical and semi-theoretical knowledge are of fundamental importance. Small-scale physical models may also be used together with analytical and numerical models. The book highlights the fact that research and theory are interactive, and the results provide the fundamentals for the development of the estuarine research.

  6. Experimental test of proximity effect theories by surface impedance measurements on the Pb-Sn system

    International Nuclear Information System (INIS)

    Hook, J.R.; Battilana, J.A.

    1976-01-01

    The proximity effect in the Pb-Sn system in zero magnetic field has been studied by measuring the surface impedance at 3 GHz of a thin film of tin evaporated on to a bulk lead substrate. The results are compared with the predictions of theories of the proximity effect. It is found that good agreement can be obtained by using a theory due to Hook and Waldram for the spatial variation of the superconducting order parameter Δ inside each metal, together with suitable boundary conditions on Δ at the interface between the metals. The required boundary conditions are a generalization, to the case of non-zero electron reflection at the interface, of the boundary conditions given by Zaitsev for the Ginzburg-Landau equation. (author)

  7. Chronometric Geodesy and Fundamental Physics

    Science.gov (United States)

    Delva, P.; Puchades, N.; Lodewyck, J.

    2016-12-01

    Atomic clocks are today essential for several daily life applications, such as the building of the International Atomic Time (TAI) or Global Navigation Satellite Systems (GNSS). With the new generation of optical clocks, they reach such accuracy and stability that they are now considered in practical applications for the measurement of gravitational potential differences, thanks to the Einstein effect, or gravitational redshift. Several projects have explored the possibilities of using clocks in geodesy or geophysical applications and research. This context offers a fantastic opportunity to use atomic clocks to test fundamental physics. In this talk I will present two such studies for testing the gravitational redshift and Lorentz invariance. The first project is the "Galileo gravitational Redshift test with Eccentric sATellites" (GREAT), funded by the European Space Agency (ESA). Here we use the on-board atomic clocks of the Galileo satellites 5 and 6 to look for violations of general relativity theory. These two satellites were launched on August 30th, 2014 and, because of a technical problem, the launcher brought them onto an elliptic orbit. An elliptic orbit induces a periodic modulation of the gravitational redshift, while the good stability of recent GNSS clocks allows this periodic modulation to be tested to a very good level of accuracy. The Galileo 5 and 6 satellites, with their large eccentricity and on-board H-maser clocks, are hence perfect candidates to perform this test. In the second study we propose a test of special relativity theory using a network of distant optical lattice clocks located in France, Germany and Great Britain. By exploiting the difference between the velocities of each clock in the inertial geocentric frame, due to their different positions on the surface of the Earth, we can test the time dilation effect. The connection between these clocks, achieved with phase-compensated optical fibers, allows for an unprecedented level of statistical
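
    The gravitational redshift test at the heart of the GREAT project is conventionally parameterised by a violation parameter α (standard notation assumed here; the abstract itself gives no formulas):

        \frac{\Delta\nu}{\nu} = (1 + \alpha)\,\frac{\Delta U}{c^{2}},

    where ΔU is the gravitational potential difference between the emitting and receiving clocks. General relativity corresponds to α = 0; the eccentricity of the Galileo orbits modulates ΔU periodically, so fitting the periodic part of the clock residuals bounds α.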

  8. Fundamental neutron physics

    International Nuclear Information System (INIS)

    Deslattes, R.; Dombeck, T.; Greene, G.; Ramsey, N.; Rauch, H.; Werner, S.

    1984-01-01

    Fundamental physics experiments of merit can be conducted at the proposed intense neutron sources. Areas of interest include: neutron particle properties, neutron wave properties, and fundamental physics utilizing reactor produced γ-rays. Such experiments require intense, full-time utilization of a beam station for periods ranging from several months to a year or more

  9. Non-Newtonian Gravity and New Weak Forces: an Index of Measurements and Theory

    Science.gov (United States)

    Fischbach, E.; Gillies, G. T.; Krause, D. E.; Schwan, J. G.; Talmadge, C.

    1992-01-01

    The precise measurement of weak effects plays a pivotal role in metrology and in the determination of the fundamental constants. Hence, the possibility of new weak forces, and the related question of non-Newtonian behaviour of the gravitational force, have been of special interest to both measurement scientists and those involved in precise tests of physical laws. To date there is no compelling evidence for any deviations from the predictions of Newtonian gravity in the nonrelativistic weak-field regime. A significant literature on this question has developed over the past few years, and a host of experiments and theoretical scenarios have been discussed. Moreover, a very close relationship exists between the experimental methodologies used to determine the absolute value of the Newtonian gravitational constant G, and those employed in searches for new weak forces and for breakdowns in the inverse-square law of gravity. We have therefore prepared a new index of measurements of such effects, using the original bibliographic work of Gillies as a starting point, but also including citations to the appropriate theoretical papers in the field. The focus of the present version of the index is then studies of the "fifth force", measurements of gravitational effects on antimatter, searches for a spin-component in the gravitational force, and related phenomena.

  10. Fundamentals and advanced techniques in derivatives hedging

    CERN Document Server

    Bouchard, Bruno

    2016-01-01

    This book covers the theory of derivatives pricing and hedging as well as techniques used in mathematical finance. The authors use a top-down approach, starting with fundamentals before moving to applications, and present theoretical developments alongside various exercises, providing many examples of practical interest. A large spectrum of concepts and mathematical tools that are usually found in separate monographs are presented here. In addition to the no-arbitrage theory in full generality, this book also explores models and practical hedging and pricing issues. Fundamentals and Advanced Techniques in Derivatives Hedging further introduces advanced methods in probability and analysis, including Malliavin calculus and the theory of viscosity solutions, as well as the recent theory of stochastic targets and its use in risk management, making it the first textbook covering this topic. Graduate students in applied mathematics with an understanding of probability theory and stochastic calculus will find this b...

  11. Betting on the outcomes of measurements: a Bayesian theory of quantum probability

    Science.gov (United States)

    Pitowsky, Itamar

    We develop a systematic approach to quantum probability as a theory of rational betting in quantum gambles. In these games of chance, the agent is betting in advance on the outcomes of several (finitely many) incompatible measurements. One of the measurements is subsequently chosen and performed and the money placed on the other measurements is returned to the agent. We show how the rules of rational betting imply all the interesting features of quantum probability, even in such finite gambles. These include the uncertainty principle and the violation of Bell's inequality among others. Quantum gambles are closely related to quantum logic and provide a new semantics for it. We conclude with a philosophical discussion on the interpretation of quantum mechanics.

  12. Measuring organizational effectiveness in information and communication technology companies using item response theory.

    Science.gov (United States)

    Trierweiller, Andréa Cristina; Peixe, Blênio César Severo; Tezza, Rafael; Pereira, Vera Lúcia Duarte do Valle; Pacheco, Waldemar; Bornia, Antonio Cezar; de Andrade, Dalton Francisco

    2012-01-01

    The aim of this paper is to measure the effectiveness of Information and Communication Technology (ICT) organizations from the point of view of the manager, using Item Response Theory (IRT). There is a need to verify the effectiveness of these organizations, which are normally associated with complex, dynamic, and competitive environments. In the academic literature, there is disagreement surrounding the concept of organizational effectiveness and its measurement. A construct was elaborated based on dimensions of effectiveness for the construction of the items of the questionnaire, which was submitted to specialists for evaluation. The approach proved viable for measuring the organizational effectiveness of ICT companies from the point of view of a manager using the Two-Parameter Logistic Model (2PLM) of IRT. This modeling permits us to evaluate the quality and properties of each item and to place items and respondents on a single scale, which is not possible when using other similar tools.
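
    For reference, the Two-Parameter Logistic Model (2PLM) used in the record gives the probability that respondent j endorses item i as a function of the latent trait θ_j (standard IRT notation, not specific to this paper):

        P(X_{ij} = 1 \mid \theta_j) = \frac{1}{1 + \exp\left[-a_i(\theta_j - b_i)\right]},

    where a_i is the discrimination of item i and b_i is its difficulty (location) on the same scale as θ, which is what allows items and respondents to be placed on a single scale.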

  13. Measuring implementation intentions in the context of the theory of planned behavior.

    Science.gov (United States)

    Rise, Jostein; Thompson, Marianne; Verplanken, Bas

    2003-04-01

    The usefulness of measuring implementation intentions in the context of the theory of planned behavior (TPB) was explored among 112 Norwegian college students. They responded to a questionnaire measuring past behavior, perceived behavioral control, behavioral intentions, implementation intentions, and actual performance of regular exercising and recycling of drinking cartons. Implementation intentions were measured using five items relating to recycling and four items relating to exercise, which showed satisfactory internal consistencies. Consistent with the main prediction, the presence of implementation intentions was related to performing the two behaviors, although behavioral intentions were the strongest determinant for both behaviors. The results suggest that the TPB may benefit from inclusion of the concept of implementation intentions to provide a more complete understanding of the psychological process in which motivation is translated into action.

  14. A new measure of skill mismatch: theory and evidence from PIAAC

    Directory of Open Access Journals (Sweden)

    Michele Pellizzari

    2017-01-01

    Full Text Available Abstract This paper proposes a new measure of skill mismatch to be applied to the recent OECD Survey of Adult Skills (PIAAC). The measure is derived from a formal theory and combines information about skill proficiency, self-reported mismatch and skill use. The theoretical foundations underlying this measure allow minimum and maximum skill requirements to be identified for each occupation and workers to be classified into three groups: the well-matched, the under-skilled and the over-skilled. The availability of skill use data further permits the computation of the degree of under- and over-usage of skills in the economy. The empirical analysis is carried out using the first round of the PIAAC data, allowing comparisons across skill domains, labour market statuses and countries.

  15. Measuring the jitter of ring oscillators by means of information theory quantifiers

    Science.gov (United States)

    Antonelli, M.; De Micco, L.; Larrondo, H. A.

    2017-02-01

    Ring oscillators (ROs) are elementary blocks widely used in digital design. Jitter is unavoidable in ROs, and its presence is undesired in many applications, such as clock generators. On the other hand, jitter may be used as the noise source in RO-based true random number generators (TRNGs). Consequently, measuring jitter is a relevant issue in characterizing an RO, and it is the subject of this paper. The main contribution is the use of Information Theory Quantifiers (ITQ) as measures of RO jitter. It is shown that, among the several ITQ evaluated, two emerge as good measures because they are independent of the parameters used in their statistical determination. They turned out to be robust and may be implemented experimentally. We found that a dual entropy plane allows a visual comparison of results.
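
    One common family of information theory quantifiers for time series is built on the Bandt-Pompe symbolization; the sketch below computes a normalized permutation entropy of a simulated jitter (period) series. It is a generic illustration, not necessarily one of the two quantifiers singled out in the paper.

```python
import math
from itertools import permutations

import numpy as np

def permutation_entropy(x, order=3):
    """Normalized Bandt-Pompe permutation entropy of a 1-D series (0 to 1)."""
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - order + 1):
        counts[tuple(np.argsort(x[i:i + order]))] += 1
    freq = np.array([c for c in counts.values() if c > 0], dtype=float)
    probs = freq / freq.sum()
    return float(-np.sum(probs * np.log(probs)) / math.log(math.factorial(order)))

# Example: periods of a noisy ring oscillator (nominal 10 ns plus Gaussian jitter).
rng = np.random.default_rng(0)
periods = 10e-9 + 5e-12 * rng.standard_normal(5000)
print(permutation_entropy(periods, order=4))  # close to 1 for white-noise-like jitter
```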

  16. Dynamic Team Theory of Stochastic Differential Decision Systems with Decentralized Noisy Information Structures via Girsanov's Measure Transformation

    OpenAIRE

    Charalambous, Charalambos D.; Ahmed, Nasir U.

    2013-01-01

    In this paper, we present two methods which generalize static team theory to dynamic team theory, in the context of continuous-time stochastic nonlinear differential decentralized decision systems, with relaxed strategies, which are measurable to different noisy information structures. For both methods we apply Girsanov's measure transformation to obtain an equivalent dynamic team problem under a reference probability measure, so that the observations and information structures available for ...

  17. Roughness in Surface Force Measurements: Extension of DLVO Theory To Describe the Forces between Hafnia Surfaces.

    Science.gov (United States)

    Eom, Namsoon; Parsons, Drew F; Craig, Vincent S J

    2017-07-06

    The interaction between colloidal particles is commonly viewed through the lens of DLVO theory, whereby the interaction is described as the sum of the electrostatic and dispersion forces. For similar materials acting across a medium at pH values remote from the isoelectric point the theory typically involves an electrostatic repulsion that is overcome by dispersion forces at very small separations. However, the dominance of the dispersion forces at short separations is generally not seen in force measurements, with the exception of the interaction between mica surfaces. The discrepancy for silica surfaces has been attributed to hydration forces, but this does not explain the situation for titania surfaces where the dispersion forces are very much larger. Here, the interaction forces between very smooth hafnia surfaces have been measured using the colloid probe technique and the forces evaluated within the DLVO framework, including both hydration forces and the influence of roughness. The measured forces across a wide range of pH at different salt concentrations are well described with a single parameter for the surface roughness. These findings show that even small degrees of surface roughness significantly alter the form of the interaction force and therefore indicate that surface roughness needs to be included in the evaluation of surface forces between all surfaces that are not ideally smooth.
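
    To illustrate the DLVO decomposition referred to above, the following sketch evaluates a textbook sphere-plate DLVO energy as a screened electrostatic double-layer repulsion plus a van der Waals attraction. The constants are generic illustrative values, not the fitted hafnia parameters, and the hydration and roughness corrections discussed in the paper are omitted.

```python
import numpy as np

# Illustrative constants (SI units); not the values fitted in the cited study.
kT = 4.11e-21           # thermal energy at 298 K, J
eps = 78.5 * 8.854e-12  # permittivity of water, F/m
A_H = 1.0e-20           # Hamaker constant, J
radius = 1.0e-6         # colloid-probe radius, m
debye = 9.6e-9          # Debye length for ~1 mM 1:1 salt, m
psi = 0.025             # surface potential, V

def dlvo_energy(D):
    """Sphere-plate DLVO interaction energy (J) at separation D (m):
    screened double-layer repulsion plus van der Waals attraction."""
    electrostatic = 2.0 * np.pi * eps * radius * psi**2 * np.exp(-D / debye)
    van_der_waals = -A_H * radius / (6.0 * D)
    return electrostatic + van_der_waals

for D in (2e-9, 5e-9, 10e-9, 20e-9):
    print(f"D = {D * 1e9:4.0f} nm   V = {dlvo_energy(D) / kT:8.1f} kT")
```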

  18. Linear algebraic theory of partial coherence: discrete fields and measures of partial coherence.

    Science.gov (United States)

    Ozaktas, Haldun M; Yüksel, Serdar; Kutay, M Alper

    2002-08-01

    A linear algebraic theory of partial coherence is presented that allows precise mathematical definitions of concepts such as coherence and incoherence. This not only provides new perspectives and insights but also allows us to employ the conceptual and algebraic tools of linear algebra in applications. We define several scalar measures of the degree of partial coherence of an optical field that are zero for full incoherence and unity for full coherence. The mathematical definitions are related to our physical understanding of the corresponding concepts by considering them in the context of Young's experiment.
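
    As an example of the kind of scalar measure described above, the sketch below computes a degree of coherence from the eigenvalue spread of an N x N mutual intensity matrix J, giving 0 for a fully incoherent field and 1 for a fully coherent one. This is an illustrative construction, not necessarily the exact definition adopted in the paper.

```python
import numpy as np

def degree_of_coherence(J):
    """Scalar degree of coherence of an N-dimensional discrete field with mutual
    intensity matrix J: 0 for full incoherence (J proportional to the identity),
    1 for full coherence (rank-1 J)."""
    J = np.asarray(J, dtype=complex)
    n = J.shape[0]
    purity = np.trace(J @ J).real / np.trace(J).real ** 2   # between 1/n and 1
    return (n * purity - 1.0) / (n - 1.0)

# Fully coherent field: J = v v^H  ->  1.0
v = np.array([1.0, 1.0j, 0.5])
print(degree_of_coherence(np.outer(v, v.conj())))
# Fully incoherent field: J = identity  ->  0.0
print(degree_of_coherence(np.eye(3)))
```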

  19. A geometric formulation of Higgs Effective Field Theory: Measuring the curvature of scalar field space

    Science.gov (United States)

    Alonso, Rodrigo; Jenkins, Elizabeth E.; Manohar, Aneesh V.

    2016-03-01

    A geometric formulation of Higgs Effective Field Theory (HEFT) is presented. Experimental observables are given in terms of geometric invariants of the scalar sigma model sector such as the curvature of the scalar field manifold M. We show how the curvature can be measured experimentally via Higgs cross-sections, WL scattering, and the S parameter. The one-loop action of HEFT is given in terms of geometric invariants of M. The distinction between the Standard Model (SM) and HEFT is whether M is flat or curved, and the curvature is a signal of the scale of new physics.

  20. A geometric formulation of Higgs Effective Field Theory: Measuring the curvature of scalar field space

    Directory of Open Access Journals (Sweden)

    Rodrigo Alonso

    2016-03-01

    Full Text Available A geometric formulation of Higgs Effective Field Theory (HEFT) is presented. Experimental observables are given in terms of geometric invariants of the scalar sigma model sector such as the curvature of the scalar field manifold M. We show how the curvature can be measured experimentally via Higgs cross-sections, WL scattering, and the S parameter. The one-loop action of HEFT is given in terms of geometric invariants of M. The distinction between the Standard Model (SM) and HEFT is whether M is flat or curved, and the curvature is a signal of the scale of new physics.

  1. A Geometric Formulation of Higgs Effective Field Theory: Measuring the Curvature of Scalar Field Space

    CERN Document Server

    Alonso, Rodrigo; Manohar, Aneesh V

    2016-01-01

    A geometric formulation of Higgs Effective Field Theory (HEFT) is presented. Experimental observables are given in terms of geometric invariants of the scalar sigma model sector such as the curvature of the scalar field manifold $\mathcal M$. We show how the curvature can be measured experimentally via Higgs cross-sections, $W_L$ scattering, and the $S$ parameter. The one-loop action of HEFT is given in terms of geometric invariants of $\mathcal M$. The distinction between the Standard Model (SM) and HEFT is whether $\mathcal M$ is flat or curved, not whether the scalars transform linearly or non-linearly under the electroweak group.

  2. Edge theory approach to topological entanglement entropy and other entanglement measures of (2+1) dimensional Chern-Simons theories on a general manifold

    Science.gov (United States)

    Wen, Xueda; Matsuura, Shunji; Ryu, Shinsei

    Topological entanglement entropy of (2+1) dimensional Chern-Simons gauge theories on a general manifold is usually calculated with Witten's method of surgeries and replica trick, in which the spacetime manifold under consideration is very complicated. In this work, we develop an edge theory approach, which greatly simplifies the calculation of topological entanglement entropy of a Chern-Simons theory. Our approach applies to a general manifold with arbitrary genus. The effect of braiding and fusion of Wilson lines can be straightforwardly calculated within our framework. In addition, our method can be generalized to the study of other entanglement measures such as mutual information and entanglement negativity of a topological quantum field theory on a general manifold.

  3. Multiphase flow dynamics 1 fundamentals

    CERN Document Server

    Kolev, Nikolay Ivanov

    2004-01-01

    Multi-phase flows are part of our natural environment such as tornadoes, typhoons, air and water pollution and volcanic activities as well as part of industrial technology such as power plants, combustion engines, propulsion systems, or chemical and biological industry. The industrial use of multi-phase systems requires analytical and numerical strategies for predicting their behavior. In its third extended edition this monograph contains theory, methods and practical experience for describing complex transient multi-phase processes in arbitrary geometrical configurations, providing a systematic presentation of the theory and practice of numerical multi-phase fluid dynamics. In the present first volume the fundamentals of multiphase dynamics are provided. This third edition includes various updates, extensions and improvements in all book chapters.

  4. Multiphase flow dynamics 1 fundamentals

    CERN Document Server

    Kolev, Nikolay Ivanov

    2007-01-01

    Multi-phase flows are part of our natural environment such as tornadoes, typhoons, air and water pollution and volcanic activities as well as part of industrial technology such as power plants, combustion engines, propulsion systems, or chemical and biological industry. The industrial use of multi-phase systems requires analytical and numerical strategies for predicting their behavior. In its third extended edition this monograph contains theory, methods and practical experience for describing complex transient multi-phase processes in arbitrary geometrical configurations, providing a systematic presentation of the theory and practice of numerical multi-phase fluid dynamics. In the present first volume the fundamentals of multiphase dynamics are provided. This third edition includes various updates, extensions and improvements in all book chapters.

  5. How Unstable Are Fundamental Quantum Supermembranes?

    OpenAIRE

    Kaku, Michio

    1996-01-01

    String duality requires the presence of solitonic $p$-branes. By contrast, the existence of fundamental supermembranes is problematic, since they are probably unstable. In this paper, we re-examine the quantum stability of fundamental supermembranes in 11 dimensions. Previously, supermembranes were shown to be unstable by approximating them with SU(n) super Yang-Mills fields as $n \rightarrow \infty$. We show that this instability persists even if we quantize the continuum theory from the ver...

  6. Einstein Gravity Explorer–a medium-class fundamental physics mission

    NARCIS (Netherlands)

    Schiller, S.; Tino, G.M.; Gill, E.

    2008-01-01

    The Einstein Gravity Explorer mission (EGE) is devoted to a precise measurement of the properties of space-time using atomic clocks. It tests one of the most fundamental predictions of Einstein’s Theory of General Relativity, the gravitational redshift, and thereby searches for hints of quantum

  7. Mass anomalous dimension and running of the coupling in SU(2) with six fundamental fermions

    DEFF Research Database (Denmark)

    Bursa, Francis; Del Debbio, Luigi; Keegan, Liam

    2010-01-01

    We simulate SU(2) gauge theory with six massless fundamental Dirac fermions. By using the Schrödinger Functional method we measure the running of the coupling and the fermion mass over a wide range of length scales. We observe very slow running of the coupling and construct an estimator...

  8. Quantum dissipation theory and applications to quantum transport and quantum measurement in mesoscopic systems

    Science.gov (United States)

    Cui, Ping

    The thesis comprises two major themes of quantum statistical dynamics. One is the development of quantum dissipation theory (QDT). It covers the establishment of some basic relations of quantum statistical dynamics, the construction of several nonequivalent complete second-order formulations, and the development of exact QDT. Another is related to the applications of quantum statistical dynamics to a variety of research fields. In particular, unconventional but novel theories of the electron transfer in Debye solvents, quantum transport, and quantum measurement are developed on the basis of QDT formulations. The thesis is organized as follows. In Chapter 1, we present some background knowledge in relation to the aforementioned two themes of this thesis. The key quantity in QDT is the reduced density operator ρ(t) ≡ tr_B ρ_T(t), i.e., the partial trace of the total system-and-bath composite ρ_T(t) over the bath degrees of freedom. QDT governs the evolution of the reduced density operator, where the effects of the bath are treated in a quantum statistical manner. In principle, the reduced density operator contains all dynamics information of interest. However, the conventional quantum transport theory is formulated in terms of nonequilibrium Green's function. The newly emerging field of quantum measurement in relation to quantum information and quantum computing does exploit a sort of QDT formalism. Besides the background of the relevant theoretical development, some representative experiments on molecular nanojunctions are also briefly discussed. In chapter 2, we outline some basic (including new) relations that highlight several important issues on QDT. The content includes the background of nonequilibrium quantum statistical mechanics, the general description of the total composite Hamiltonian with stochastic system-bath interaction, a novel parameterization scheme for bath correlation functions, a newly developed exact theory of driven Brownian oscillator (DBO

  9. Fundamentals of electronics

    CERN Document Server

    Schubert, Thomas F

    2015-01-01

    This book, Electronic Devices and Circuit Application, is the first of four books of a larger work, Fundamentals of Electronics. It is comprised of four chapters describing the basic operation of each of the four fundamental building blocks of modern electronics: operational amplifiers, semiconductor diodes, bipolar junction transistors, and field effect transistors. Attention is focused on the reader obtaining a clear understanding of each of the devices when it is operated in equilibrium. Ideas fundamental to the study of electronic circuits are also developed in the book at a basic level to

  10. Fundamental principles, measurement techniques and data analysis in a ion accelerator; Principios fundamentales, tecnicas de medicion y analisis de datos en un acelerador de iones

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez M, O. [Facultad de Ciencias, UNAM, Ciudad Universitaria, 04510 Mexico D. F. (Mexico); Gleason, C. [Facultad de Ciencias, Universidad Autonoma del Estado de Morelos, Cuernavaca, Morelos (Mexico); Hinojosa, G. [Instituto de Ciencias Fisicas, UNAM, Ciudad Universitaria, 04510 Mexico D. F. (Mexico)]. e-mail: hinojosa@fis.unam.mx

    2008-07-01

    The present work is intended to be a general reference for students and professionals interested in the field. Here, we present an introduction to the analysis techniques and fundamental principles for data processing and operation of a typical ion accelerator that operates in the low energy range. We also present a detailed description of the apparatus and propose new analysis methods for the results. In addition, we introduce illustrative simulations of the ion's trajectories in the different components of the apparatus performed with specialized software and, a new computer data acquisition and control interface. (Author)

  11. Theory of the double-edge molecular technique for Doppler lidar wind measurement.

    Science.gov (United States)

    Flesia, C; Korb, C L

    1999-01-20

    The theory of the double-edge lidar technique for measuring the wind with molecular backscatter is described. Two high-spectral-resolution edge filters are located in the wings of the Rayleigh-Brillouin profile. This doubles the signal change per unit Doppler shift, the sensitivity, and improves measurement accuracy relative to the single-edge technique by nearly a factor of 2. The use of a crossover region where the sensitivity of a molecular- and an aerosol-based measurement is equal is described. Use of this region desensitizes the molecular measurement to the effects of aerosol scattering over a velocity range of +/-100 m/s. We give methods for correcting short-term, shot-to-shot, frequency jitter and drift with a laser reference frequency measurement and methods for long-term frequency correction with a servo control system. The effects of Rayleigh-Brillouin scattering on the measurement are shown to be significant and are included in the analysis. Simulations for a conical scanning satellite-based lidar at 355 nm show an accuracy of 2-3 m/s for altitudes of 2-15 km for a 1-km vertical resolution, a satellite altitude of 400 km, and a 200 km x 200 km spatial resolution.
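
    The doubling of sensitivity arises because the two edge filters sit on opposite wings of the backscatter spectrum, so a Doppler shift raises the transmission through one filter while lowering it through the other. The sketch below is a schematic numerical illustration with Gaussian filters and a Gaussian molecular profile; all widths and offsets are made-up values, not the instrument parameters of the cited design.

```python
import numpy as np

lam = 355e-9        # laser wavelength, m
sigma_ray = 1.6e9   # Rayleigh-Brillouin spectral half-width, Hz (illustrative)
sigma_f = 0.9e9     # edge-filter width, Hz (illustrative)
f_edge = 1.7e9      # filter centres at +/- f_edge from the laser frequency, Hz

def filter_T(nu, centre):
    """Gaussian edge-filter transmission versus optical frequency offset nu."""
    return np.exp(-0.5 * ((nu - centre) / sigma_f) ** 2)

def response(v):
    """Normalized double-edge response for a line-of-sight velocity v (m/s)."""
    shift = 2.0 * v / lam                       # Doppler shift of the backscatter
    nu = np.linspace(-8e9, 8e9, 4001)
    spectrum = np.exp(-0.5 * ((nu - shift) / sigma_ray) ** 2)
    s1 = np.trapz(spectrum * filter_T(nu, +f_edge), nu)
    s2 = np.trapz(spectrum * filter_T(nu, -f_edge), nu)
    return (s1 - s2) / (s1 + s2)                # antisymmetric in v, ~linear near v = 0

for v in (-50.0, 0.0, 50.0):
    print(v, round(response(v), 4))
```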

  12. Theory of the Double-Edge Molecular Technique for Doppler Lidar Wind Measurement

    Science.gov (United States)

    Flesia, Cristina; Korb, C. Laurence

    1999-01-01

    The theory of the double-edge lidar technique for measuring the wind with molecular backscatter is described. Two high-spectral-resolution edge filters are located in the wings of the Rayleigh-Brillouin profile. This doubles the signal change per unit Doppler shift, the sensitivity, and improves measurement accuracy relative to the single-edge technique by nearly a factor of 2. The use of a crossover region where the sensitivity of a molecular- and an aerosol-based measurement is equal is described. Use of this region desensitizes the molecular measurement to the effects of aerosol scattering over a velocity range of +/-100 m/s. We give methods for correcting short-term, shot-to-shot, frequency jitter and drift with a laser reference frequency measurement and methods for long-term frequency correction with a servo control system. The effects of Rayleigh-Brillouin scattering on the measurement are shown to be significant and are included in the analysis. Simulations for a conical scanning satellite-based lidar at 355 nm show an accuracy of 2-3 m/s for altitudes of 2-15 km for a 1-km vertical resolution, a satellite altitude of 400 km, and a 200 km x 200 km spatial resolution.

  13. The numeracy understanding in medicine instrument: a measure of health numeracy developed using item response theory.

    Science.gov (United States)

    Schapira, Marilyn M; Walker, Cindy M; Cappaert, Kevin J; Ganschow, Pamela S; Fletcher, Kathlyn E; McGinley, Emily L; Del Pozo, Sam; Schauer, Carrie; Tarima, Sergey; Jacobs, Elizabeth A

    2012-01-01

    Health numeracy can be defined as the ability to understand and apply information conveyed with numbers, tables and graphs, probabilities, and statistics to effectively communicate with health care providers, take care of one's health, and participate in medical decisions. To develop the Numeracy Understanding in Medicine Instrument (NUMi) using item response theory scaling methods. A 20-item test was formed drawing from an item bank of numeracy questions. Items were calibrated using responses from 1000 participants and a 2-parameter item response theory model. Construct validity was assessed by comparing scores on the NUMi to established measures of print and numeric health literacy, mathematic achievement, and cognitive aptitude. Community and clinical populations in the Milwaukee and Chicago metropolitan areas. Twenty-nine percent of the 1000 respondents were Hispanic, 24% were non-Hispanic white, and 42% were non-Hispanic black. Forty-one percent had no more than a high school education. The mean score on the NUMi was 13.2 (s = 4.6) with a Cronbach α of 0.86. Difficulty and discrimination item response theory parameters of the 20 items ranged from -1.70 to 1.45 and 0.39 to 1.98, respectively. Performance on the NUMi was strongly correlated with the Wide Range Achievement Test-Arithmetic (0.73, P < 0.001), the Lipkus Expanded Numeracy Scale (0.69, P < 0.001), the Medical Data Interpretation Test (0.75, P < 0.001), and the Wonderlic Cognitive Ability Test (0.82, P < 0.001). Performance was moderately correlated to the Short Test of Functional Health Literacy (0.43, P < 0.001). The NUMi was found to be most discriminating among respondents with a lower-than-average level of health numeracy. The NUMi can be applied in research and clinical settings as a robust measure of the health numeracy construct.

  14. arXiv Minimal Fundamental Partial Compositeness

    CERN Document Server

    Cacciapaglia, Giacomo; Sannino, Francesco; Thomsen, Anders Eller

    Building upon the fundamental partial compositeness framework we provide consistent and complete composite extensions of the standard model. These are used to determine the effective operators emerging at the electroweak scale in terms of the standard model fields upon consistently integrating out the heavy composite dynamics. We exhibit the first effective field theories matching these complete composite theories of flavour and analyse their physical consequences for the third generation quarks. Relations with other approaches, ranging from effective analyses for partial compositeness to extra dimensions as well as purely fermionic extensions, are briefly discussed. Our methodology is applicable to any composite theory of dynamical electroweak symmetry breaking featuring a complete theory of flavour.

  15. Fundamentals of crystallography

    CERN Document Server

    2011-01-01

    Crystallography is a basic tool for scientists in many diverse disciplines. This text offers a clear description of fundamentals and of modern applications. It supports curricula in crystallography at undergraduate level.

  16. Fundamentals of electrochemical science

    CERN Document Server

    Oldham, Keith

    1993-01-01

    Key Features* Deals comprehensively with the basic science of electrochemistry* Treats electrochemistry as a discipline in its own right and not as a branch of physical or analytical chemistry* Provides a thorough and quantitative description of electrochemical fundamentals

  17. Measuring theory of mind in children. Psychometric properties of the ToM Storybooks.

    Science.gov (United States)

    Blijd-Hoogewys, E M A; van Geert, P L C; Serra, M; Minderaa, R B

    2008-11-01

    Although research on Theory-of-Mind (ToM) is often based on single task measurements, more comprehensive instruments result in a better understanding of ToM development. The ToM Storybooks is a new instrument measuring basic ToM-functioning and associated aspects. There are 34 tasks, tapping various emotions, beliefs, desires and mental-physical distinctions. Four studies on the validity and reliability of the test are presented, in typically developing children (n = 324, 3-12 years) and children with PDD-NOS (n = 30). The ToM Storybooks have good psychometric qualities. A component analysis reveals five components corresponding with the underlying theoretical constructs. The internal consistency, test-retest reliability, inter-rater reliability, construct validity and convergent validity are good. The ToM Storybooks can be used in research as well as in clinical settings.

  18. Aligning physical elements with persons' attitude: an approach using Rasch measurement theory

    International Nuclear Information System (INIS)

    Camargo, F R; Henson, B

    2013-01-01

    Affective engineering uses mathematical models to convert information about persons' attitudes towards physical elements into an ergonomic design. However, applications in the domain have in many cases not met measurement assumptions. This paper proposes a novel approach based on Rasch measurement theory to overcome the problem. The research demonstrates that if the data fit the model, further variables can be added to a scale. An empirical study was designed to determine the range of compliance within which consumers could obtain an impression of a moisturizer cream when touching product containers. Persons, variables and stimulus objects were parameterised independently on a linear continuum. The results showed that a calibrated scale preserves comparability while incorporating further variables

  19. Information security fundamentals

    CERN Document Server

    Peltier, Thomas R

    2013-01-01

    Developing an information security program that adheres to the principle of security as a business enabler must be the first step in an enterprise's effort to build an effective security program. Following in the footsteps of its bestselling predecessor, Information Security Fundamentals, Second Edition provides information security professionals with a clear understanding of the fundamentals of security required to address the range of issues they will experience in the field.The book examines the elements of computer security, employee roles and r

  20. Religious fundamentalism and conflict

    OpenAIRE

    Muzaffer Ercan Yılmaz

    2006-01-01

    This study provides an analytical discussion of the issue of religious fundamentalism and its relevance to conflict, in its broader sense. It is stressed that religious fundamentalism manifests itself in two ways: nonviolent intolerance and violent intolerance. The sources of both types of intolerance and their connection to conflict are addressed and discussed in detail. Further research is also suggested on conditions connecting religion to nonviolent intolerance so as to cope with the problem...

  1. A laboratory scale fundamental time?

    International Nuclear Information System (INIS)

    Mendes, R.V.

    2012-01-01

    The existence of a fundamental time (or fundamental length) has been conjectured in many contexts. However, the ''stability of physical theories principle'' seems to be the one that provides, through the tools of algebraic deformation theory, an unambiguous derivation of the stable structures that Nature might have chosen for its algebraic framework. It is well-known that c and ℎ are the deformation parameters that stabilize the Galilean and the Poisson algebra. When the stability principle is applied to the Poincare-Heisenberg algebra, two deformation parameters emerge which define two time (or length) scales. In addition there are, for each of them, a plus or minus sign possibility in the relevant commutators. One of the deformation length scales, related to non-commutativity of momenta, is probably related to the Planck length scale but the other might be much larger and already detectable in laboratory experiments. In this paper, this is used as a working hypothesis to look for physical effects that might settle this question. Phase-space modifications, resonances, interference, electron spin resonance and non-commutative QED are considered. (orig.)

  2. String theory or field theory?

    International Nuclear Information System (INIS)

    Marshakov, A.V.

    2002-01-01

    The status of string theory is reviewed, and major recent developments - especially those in going beyond perturbation theory in the string theory and quantum field theory frameworks - are analyzed. This analysis helps better understand the role and place of string theory in the modern picture of the physical world. Even though quantum field theory describes a wide range of experimental phenomena, it is emphasized that there are some insurmountable problems inherent in it - notably the impossibility to formulate the quantum theory of gravity on its basis - which prevent it from being a fundamental physical theory of the world of microscopic distances. It is this task, the creation of such a theory, which string theory, currently far from completion, is expected to solve. In spite of its somewhat vague current form, string theory has already led to a number of serious results and greatly contributed to progress in the understanding of quantum field theory. It is these developments that are our concern in this review [ru

  3. An information theory approach for evaluating earth radiation budget (ERB) measurements - Nonuniform sampling of reflected shortwave radiation

    Science.gov (United States)

    Barkstrom, Bruce R.; Direskeneli, Haldun; Halyo, Nesim

    1992-01-01

    An information theory approach for examining the temporal nonuniform sampling characteristics of shortwave (SW) flux for earth radiation budget (ERB) measurements is suggested. The information gain is obtained by computing the information content before and after the measurements. A stochastic diurnal model for the SW flux is developed, and measurements for different orbital parameters are examined. The methodology is applied to specific NASA Polar platform and Tropical Rainfall Measuring Mission (TRMM) orbital parameters. The information theory approach, coupled with the developed SW diurnal model, is found to be promising for measurements involving nonuniform orbital sampling characteristics.
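
    In the Gaussian case, comparing the information content before and after the measurements reduces to comparing prior and posterior variances. The toy calculation below illustrates that bookkeeping for a single flux estimate; the numbers are hypothetical and the snippet is only a stand-in for the stochastic diurnal model developed in the paper.

```python
import math

def information_gain_bits(sigma_prior, sigma_post):
    """Shannon information gain (bits) when a Gaussian uncertainty shrinks
    from sigma_prior to sigma_post."""
    return 0.5 * math.log2(sigma_prior ** 2 / sigma_post ** 2)

# Hypothetical numbers: prior SW-flux uncertainty 30 W/m^2; after sampling, 12 W/m^2.
print(information_gain_bits(30.0, 12.0))   # about 1.3 bits gained by the measurements
```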

  4. Is PT -symmetric quantum theory false as a fundamental theory?

    Czech Academy of Sciences Publication Activity Database

    Znojil, Miloslav

    2016-01-01

    Roč. 56, č. 3 (2016), s. 254-257 ISSN 1210-2709 R&D Projects: GA ČR GA16-22945S Institutional support: RVO:61389005 Keywords: quantum mechanics * PT-symmetric representations of observables * measurement outcomes Subject RIV: BE - Theoretical Physics

  5. Development of the effectiveness measure for an advanced alarm system using signal detection theory

    International Nuclear Information System (INIS)

    Park, J.K.; Choi, S.S.; Hong, J.H.; Chang, S.H.

    1997-01-01

    Since many alarms activated during major process deviations or accidents in nuclear power plants can have negative effects on operators, various types of advanced alarm systems that select important alarms for the identification of a process deviation have been developed to reduce the operator's workload. However, an inappropriate selection of important alarms could distract the operator from correctly identifying the process deviation. Therefore, to evaluate the effectiveness of an advanced alarm system, a tradeoff between its alarm reduction rate (how many alarms are reduced?) and its informativeness (how many important alarms that are conducive to identifying the process deviation are provided?) should be considered. In this paper, a new measure is proposed to evaluate the effectiveness of an advanced alarm system with regard to the identification of process deviations. The effectiveness measure combines the informativeness measure and the reduction rate; the informativeness measure represents the information-processing capability of the advanced alarm system, including wrong rejections and wrong acceptances, and can be calculated using signal detection theory (SDT). The effectiveness of the prototype alarm system was evaluated using a loss of coolant accident (LOCA) scenario, and the validity of the effectiveness measure was investigated using two types of operator response: identification accuracy and the operator's preference for the identification of LOCA
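
    Signal detection theory separates how often the system keeps truly important alarms (hits) from how often it keeps unimportant ones (wrong acceptances). The sketch below computes the standard sensitivity index d' from those two rates; the rates are hypothetical, and the exact way the paper combines informativeness with the reduction rate is not reproduced here.

```python
from statistics import NormalDist

def d_prime(hit_rate, false_accept_rate, eps=1e-6):
    """Sensitivity index d' from signal detection theory."""
    z = NormalDist().inv_cdf
    h = min(max(hit_rate, eps), 1.0 - eps)            # clamp to avoid infinite z-scores
    f = min(max(false_accept_rate, eps), 1.0 - eps)
    return z(h) - z(f)

# Hypothetical performance of an alarm-reduction scheme during a transient:
# 90% of important alarms retained, 15% of unimportant alarms wrongly retained.
print(d_prime(0.90, 0.15))   # about 2.3; higher means more informative selection
```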

  6. Game Theory

    Indian Academy of Sciences (India)

    This article tries to outline what game theory is all about. It illustrates game theory's fundamental solution concept viz., Nash equilibrium, using various examples. The Genesis. In the late thirties, the mathematician John von Neumann turned his prodigious innovative talents towards economics. This brief encounter of his with ...

  7. Measuring the Acceptance of Evolutionary Theory in Texas 2-Year Colleges

    Science.gov (United States)

    Brown, Jack; Scott, Joyce A.

    2016-01-01

    Evolutionary theory is the central unifying theory of the life sciences. However, acceptance and understanding of the theory have been found to be lacking in the general public, high school, and university populations. Prior research has linked low acceptance of the theory to a poor knowledge base in evolution, to the nature of science, and to…

  8. Measuring the quality of life in hypertension according to Item Response Theory

    Directory of Open Access Journals (Sweden)

    José Wicto Pereira Borges

    Full Text Available ABSTRACT OBJECTIVE To analyze the Miniquestionário de Qualidade de Vida em Hipertensão Arterial (MINICHAL – Mini-questionnaire of Quality of Life in Hypertension) using Item Response Theory. METHODS This is an analytical study conducted with 712 persons with hypertension treated in thirteen primary health care units of Fortaleza, State of Ceará, Brazil, in 2015. The steps of the Item Response Theory analysis were: evaluation of dimensionality, estimation of item parameters, and construction of the scale. The study of dimensionality was carried out using the polychoric correlation matrix and confirmatory factor analysis. To estimate the item parameters, we used Samejima's Graded Response Model. The analyses were conducted using the free software R with the aid of the psych and mirt packages. RESULTS The analysis allowed the visualization of the item parameters and their individual contributions to the measurement of the latent trait, generating more information and allowing the construction of a scale with an interpretative model that describes the worsening of quality of life across five levels. Regarding the item parameters, the items related to the somatic state performed well, as they presented better power to discriminate individuals with worse quality of life. The items related to the mental state contributed the least psychometric information to the MINICHAL. CONCLUSIONS We conclude that the instrument is suitable for identifying the worsening of quality of life in hypertension. The analysis of the MINICHAL using Item Response Theory has allowed us to identify new facets of this instrument that had not been addressed in previous studies.

  9. Harmonizing Measures of Cognitive Performance Across International Surveys of Aging Using Item Response Theory.

    Science.gov (United States)

    Chan, Kitty S; Gross, Alden L; Pezzin, Liliana E; Brandt, Jason; Kasper, Judith D

    2015-12-01

    To harmonize measures of cognitive performance using item response theory (IRT) across two international aging studies. Data for persons ≥65 years from the Health and Retirement Study (HRS, N = 9,471) and the English Longitudinal Study of Aging (ELSA, N = 5,444). Cognitive performance measures varied (HRS fielded 25, ELSA 13); 9 were in common. Measurement precision was examined for IRT scores based on (a) common items, (b) common items adjusted for differential item functioning (DIF), and (c) DIF-adjusted all items. Three common items (day of date, immediate word recall, and delayed word recall) demonstrated DIF by survey. Adding survey-specific items improved precision but mainly for HRS respondents at lower cognitive levels. IRT offers a feasible strategy for harmonizing cognitive performance measures across other surveys and for other multi-item constructs of interest in studies of aging. Practical implications depend on sample distribution and the difficulty mix of in-common and survey-specific items. © The Author(s) 2015.

  10. DOE Fundamentals Handbook: Classical Physics

    International Nuclear Information System (INIS)

    1992-06-01

    The Classical Physics Fundamentals Handbook was developed to assist nuclear facility operating contractors provide operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of physical forces and their properties. The handbook includes information on the units used to measure physical properties; vectors, and how they are used to show the net effect of various forces; Newton's Laws of motion, and how to use these laws in force and motion applications; and the concepts of energy, work, and power, and how to measure and calculate the energy involved in various applications. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility systems and equipment

  11. Designing a performance measurement system for supply chain using balanced scorecard, path analysis, cooperative game theory and evolutionary game theory: A Case Study

    Directory of Open Access Journals (Sweden)

    Seyed Hootan Eskafi

    2015-04-01

    Full Text Available In recent years, supply chain management has come to be known as a key factor for achieving competitive advantage. Better customer service, revenue improvement and cost reduction are the results of this philosophy. Organizations can manage the performance of their firms through appropriate goal setting, identification of criteria and continuous performance measurement, which creates a clear view of the business circumstances. Developing and defining appropriate indicators at different levels of the chain is necessary for implementing a performance measurement system. In this study, we propose a new method to determine the measurement indicators and strategies of a company in terms of the balanced scorecard. The study combines the balanced scorecard, path analysis, evolutionary game theory and cooperative game theory for strategic planning. The study offers an appropriate program for the future activities of organizations and determines the present status of the firm. The implementation of the proposed method is illustrated for a food producer and the results are analyzed.

  12. Measurement incompatibility and Schrödinger-Einstein-Podolsky-Rosen steering in a class of probabilistic theories

    International Nuclear Information System (INIS)

    Banik, Manik

    2015-01-01

    Steering is one of the most counterintuitive non-classical features of bipartite quantum systems, first noticed by Schrödinger in the early days of quantum theory. On the other hand, measurement incompatibility is another non-classical feature of quantum theory, initially pointed out by Bohr. Recently, Quintino et al. [Phys. Rev. Lett. 113, 160402 (2014)] and Uola et al. [Phys. Rev. Lett. 113, 160403 (2014)] have investigated the relation between these two distinct non-classical features. They have shown that a set of measurements is not jointly measurable (i.e., incompatible) if and only if it can be used for demonstrating Schrödinger-Einstein-Podolsky-Rosen steering. The concept of steering has been generalized to more general abstract tensor product theories rather than just Hilbert space quantum mechanics. In this article, we discuss how the notion of measurement incompatibility can be extended to general probability theories. Further, we show that the connection between steering and measurement incompatibility holds in a broader class of tensor product theories rather than just quantum theory

  13. Use of measurement theory for operationalization and quantification of psychological constructs in systems dynamics modelling

    Science.gov (United States)

    Fitkov-Norris, Elena; Yeghiazarian, Ara

    2016-11-01

    The analytical tools available to social scientists have traditionally been adapted from tools originally designed for the analysis of natural science phenomena. This article discusses the applicability of systems dynamics - a qualitative-based modelling approach - as a possible analysis and simulation tool that bridges the gap between the social and natural sciences. After a brief overview of the systems dynamics modelling methodology, the advantages as well as the limiting factors of systems dynamics for potential applications in the field of social sciences and human interactions are discussed. Issues arise with regard to the operationalization and quantification of latent constructs at the simulation-building stage of the systems dynamics methodology, and measurement theory is proposed as a ready and waiting solution to the problem of dynamic model calibration, with a view to improving simulation model reliability and validity and encouraging the development of standardised, modular system dynamics models that can be used in social science research.

  14. Development of a measure of work motivation for a meta-theory of motivation.

    Science.gov (United States)

    Ryan, James C

    2011-06-01

    This study presents a measure of work motivation designed to assess the motivational concepts of the meta-theory of motivation. These concepts include intrinsic process motivation, goal internalization motivation, instrumental motivation, external self-concept motivation, and internal self-concept motivation. Following a process of statement development and identification, six statements for each concept were presented to a sample of working professionals (N = 330) via a paper-and-pencil questionnaire. Parallel analysis supported a 5-factor solution, with a varimax rotation identifying 5 factors accounting for 48.9% of total variance. All 5 scales had Cronbach alpha coefficients above .70. Limitations of the newly proposed questionnaire and suggestions for its further development and use are discussed.

  15. Usability of a theory of visual attention (TVA) for parameter-based measurement of attention I

    DEFF Research Database (Denmark)

    Finke, Kathrin; Bublak, Peter; Krummenacher, Joseph

    2005-01-01

    The present study investigated the usability of whole and partial report of briefly displayed letter arrays as a diagnostic tool for the assessment of attentional functions. The tool is based on Bundesen’s (1990, 1998, 2002; Bundesen et al., 2005) theory of visual attention (TVA), which assumes...... four separable attentional components: processing speed, working memory storage capacity, spatial distribution of attention, and top-down control. A number of studies (Duncan et al., 1999; Habekost & Bundesen, 2003; Peers et al., 2005) have already demonstrated the clinical relevance...... clinical tests measuring similar constructs. The empirical independence of the four TVA parameters is suggested by nonsignificant or, in the case of processing speed and working memory storage capacity, only modest correlations between the parameter values....

  16. Measuring the World: How theory follows observation (Alexander von Humboldt Medal)

    Science.gov (United States)

    Savenije, Hubert H. G.

    2015-04-01

    I started my professional career as a hydrologist working for the government of Mozambique. I was responsible for overseeing the hydrological network, the operational hydrology and answering specific questions related to water resources availability and the occurrence of floods. In the late 1970s and early 1980s, the use of telecommunication and computers was still very limited. We had to work with handbooks, lecture notes and consultancy reports, but mostly with our brains. The key to answering a specific question was to go into the field and observe. We measured as much as we could to understand the processes that we observed. I didn't know it at the time, but this perfectly fits in the tradition of Von Humboldt. During my time in Mozambique I surveyed during and after extreme floods, such as the 1984 flood caused by the tropical cyclone Demoina. I surveyed the geometry, hydraulics and salt intrusion of 4 major Mozambican estuaries. And I measured the quality and the quantity of the flows draining onto these estuaries. Having only limited access to the literature, it was a survey without much theoretical guidance. This maybe slowed us down a bit, and sometimes led to inefficient approaches, but scientifically it was a gold mine. Not being biased by established theories is a great advantage. One does not follow onto the well-trodden, but sometimes erroneous, paths of others. After working for 6 years in Mozambique I joined an international consultant, for whom I worked for 6 years in many different countries in Asia, Africa and South America. Although the access to literature and other people's experience was better, I continued the practice of observing before believing. These 12 years of doing hydrology in practice formed the basis for the development of my own theories on hydrological processes, salt intrusion in estuaries, tidal hydraulics and even atmospheric moisture recycling. So when I started on my PhD at the age of 38, I made a completely different start

  17. Homeschooling and religious fundamentalism

    Directory of Open Access Journals (Sweden)

    Robert Kunzman

    2010-10-01

    Full Text Available This article considers the relationship between homeschooling and religious fundamentalism by focusing on their intersection in the philosophies and practices of conservative Christian homeschoolers in the United States. Homeschooling provides an ideal educational setting to support several core fundamentalist principles: resistance to contemporary culture; suspicion of institutional authority and professional expertise; parental control and centrality of the family; and interweaving of faith and academics. It is important to recognize, however, that fundamentalism exists on a continuum; conservative religious homeschoolers resist liberal democratic values to varying degrees, and efforts to foster dialogue and accommodation with religious homeschoolers can ultimately help strengthen the broader civic fabric.

  18. Fundamentals of nonlinear optics

    CERN Document Server

    Powers, Peter E

    2011-01-01

    Peter Powers's rigorous but simple description of a difficult field keeps the reader's attention throughout. … All chapters contain a list of references and large numbers of practice examples to be worked through. … By carefully working through the proposed problems, students will develop a sound understanding of the fundamental principles and applications. … the book serves perfectly for an introductory-level course for second- and third-order nonlinear optical phenomena. The author's writing style is refreshing and original. I expect that Fundamentals of Nonlinear Optics will fast become pop

  19. Fundamentals of piping design

    CERN Document Server

    Smith, Peter

    2013-01-01

    Written for the piping engineer and designer in the field, this two-part series helps to fill a void in piping literature, since the Rip Weaver books of the '90s were taken out of print at the advent of the Computer Aided Design (CAD) era. Technology may have changed; however, the fundamentals of piping rules still apply in the digital representation of process piping systems. The Fundamentals of Piping Design is an introduction to the design of piping systems, various processes and the layout of pipework connecting the major items of equipment for the new hire, the engineering student and the vetera

  20. Pragmatic electrical engineering fundamentals

    CERN Document Server

    Eccles, William

    2011-01-01

    Pragmatic Electrical Engineering: Fundamentals introduces the fundamentals of the energy-delivery part of electrical systems. It begins with a study of basic electrical circuits and then focuses on electrical power. Three-phase power systems, transformers, induction motors, and magnetics are the major topics.All of the material in the text is illustrated with completely-worked examples to guide the student to a better understanding of the topics. This short lecture book will be of use at any level of engineering, not just electrical. Its goal is to provide the practicing engineer with a practi

  1. Fundamentals of continuum mechanics

    CERN Document Server

    Rudnicki, John W

    2014-01-01

    A concise introductory course text on continuum mechanics Fundamentals of Continuum Mechanics focuses on the fundamentals of the subject and provides the background for formulation of numerical methods for large deformations and a wide range of material behaviours. It aims to provide the foundations for further study, not just of these subjects, but also the formulations for much more complex material behaviour and their implementation computationally.  This book is divided into 5 parts, covering mathematical preliminaries, stress, motion and deformation, balance of mass, momentum and energ

  2. Fundamentals of reactor chemistry

    International Nuclear Information System (INIS)

    Akatsu, Eiko

    1981-12-01

    In the Nuclear Engineering School of JAERI, many courses are presented for the people working in and around the nuclear reactors. The curricula of the courses contain also the subject material of chemistry. With reference to the foreign curricula, a plan of educational subject material of chemistry in the Nuclear Engineering School of JAERI was considered, and the fundamental part of reactor chemistry was reviewed in this report. Since the students of the Nuclear Engineering School are not chemists, the knowledge necessary in and around the nuclear reactors was emphasized in order to familiarize the students with the reactor chemistry. The teaching experience of the fundamentals of reactor chemistry is also given. (author)

  3. Fundamentals of fluid lubrication

    Science.gov (United States)

    Hamrock, Bernard J.

    1991-01-01

    The aim is to coordinate the topics of design, engineering dynamics, and fluid dynamics in order to aid researchers in the area of fluid film lubrication. The lubrication principles that are covered can serve as a basis for the engineering design of machine elements. The fundamentals of fluid film lubrication are presented clearly so that students that use the book will have confidence in their ability to apply these principles to a wide range of lubrication situations. Some guidance on applying these fundamentals to the solution of engineering problems is also provided.

  4. Formulando uma Psicopatologia Fundamental

    OpenAIRE

    Pereira, Mario Eduardo Costa

    1998-01-01

    This paper seeks to situate Fundamental Psychopathology in relation to the current context of psychopathology and to delimit its scientific scope in terms of what it brings that is original to the psychopathological debate. Initially, the field of psychopathology is examined in relation to the formalization proposed by Karl Jaspers in terms of a general psychopathology. Next, the specific bearing of psychoanalysis on this debate is discussed. It is proposed that the task of fundamental psychopathology has three fron...

  5. Fundamentals and Optimal Institutions

    DEFF Research Database (Denmark)

    Gonzalez-Eiras, Martin; Harmon, Nikolaj Arpe; Rossi, Martín

    2016-01-01

    To shed light on the relation between fundamentals and adopted institutions we examine institutional choice across the ``Big Four'' US sports leagues. Despite having very similar business models and facing the same economic and legal environment, these leagues exhibit large differences in their use...... of regulatory institutions such as revenue sharing, salary caps or luxury taxes. We show, theoretically and empirically, that these large differences in adopted institutions can be rationalized as optimal responses to differences in the fundamental characteristics of the sports being played. This provides...

  6. Infosec management fundamentals

    CERN Document Server

    Dalziel, Henry

    2015-01-01

    Infosec Management Fundamentals is a concise overview of the Information Security management concepts and techniques, providing a foundational template for both experienced professionals and those new to the industry. This brief volume will also appeal to business executives and managers outside of infosec who want to understand the fundamental concepts of Information Security and how it impacts their business decisions and daily activities. Teaches ISO/IEC 27000 best practices on information security management Discusses risks and controls within the context of an overall information securi

  7. Electrostatic Induction Theory Based Spatial Filtering Method for Solid Particle Velocity Measurement

    Science.gov (United States)

    Xu, Chuanlong; Tang, Guanghua; Zhou, Bin; Yang, Daoye; Zhang, Jianyong; Wang, Shimin

    2007-06-01

    The electrostatic-induction-based spatial filtering method for particle velocity measurement has the advantages of a simple measurement system and convenient data processing. In this paper, the relationship between the solid particle velocity and the power spectrum of the output signal of the electrostatic sensor was derived theoretically. The effects of the electrode length and of the thickness and length of the dielectric pipe on the spatial filtering characteristics of the electrostatic sensor were investigated numerically using the finite element method. Additionally, because the power spectrum curve of the output signal is rough and its peak frequency fmax is difficult to determine, a wavelet-analysis-based filtering method was adopted to smooth the curve, allowing the peak frequency fmax to be determined accurately. Finally, the velocity measurement method was applied to a dense-phase pneumatic conveying system under high pressure, and the experimental results show that the system repeatability is within ±4% over the gas superficial velocity range of 8.63-18.62 m/s for a particle concentration range of 0.067-0.130 m3/m3.
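
    In spatial filtering velocimetry the particle velocity follows from the peak frequency of the sensor signal's power spectrum and the spatial period of the electrode arrangement, roughly v = fmax * w. The sketch below estimates fmax from a simulated signal using a plain FFT and a moving-average smoother standing in for the wavelet-based smoothing described above; the electrode period and signal parameters are assumed values.

```python
import numpy as np

fs = 10_000.0    # sampling rate, Hz
w = 0.02         # spatial period of the electrode arrangement, m (assumed)
true_v = 12.0    # particle velocity to recover, m/s

# Simulated electrostatic sensor output: a tone at f = v / w buried in noise.
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * (true_v / w) * t) + 0.8 * rng.standard_normal(t.size)

# Power spectrum, then a simple moving-average smoother (the paper uses wavelet smoothing).
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
smooth = np.convolve(spectrum, np.ones(9) / 9.0, mode="same")

f_max = freqs[np.argmax(smooth[1:]) + 1]   # skip the DC bin
print("estimated velocity:", f_max * w, "m/s")
```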

  8. Electron energy and charge albedos - calorimetric measurement vs Monte Carlo theory

    International Nuclear Information System (INIS)

    Lockwood, G.J.; Ruggles, L.E.; Miller, G.H.; Halbleib, J.A.

    1981-11-01

    A new calorimetric method has been employed to obtain saturated electron energy albedos for Be, C, Al, Ti, Mo, Ta, U, and UO2 over the range of incident energies from 0.1 to 1.0 MeV. The technique was so designed to permit the simultaneous measurement of saturated charge albedos. In the cases of C, Al, Ta, and U the measurements were extended down to about 0.025 MeV. The angle of incidence was varied from 0° (normal) to 75° in steps of 15°, with selected measurements at 82.5° in Be and C. In each case, state-of-the-art predictions were obtained from a Monte Carlo model. The generally good agreement between theory and experiment over this extensive parameter space represents a strong validation of both the theoretical model and the new experimental method. Nevertheless, certain discrepancies at low incident energies, especially in high-atomic-number materials, and at all energies in the case of the U energy albedos are not completely understood

  9. Model representations of kerogen structures: An insight from density functional theory calculations and spectroscopic measurements.

    Science.gov (United States)

    Weck, Philippe F; Kim, Eunja; Wang, Yifeng; Kruichak, Jessica N; Mills, Melissa M; Matteo, Edward N; Pellenq, Roland J-M

    2017-08-01

    Molecular structures of kerogen control hydrocarbon production in unconventional reservoirs. Significant progress has been made in developing model representations of various kerogen structures. These models have been widely used for the prediction of gas adsorption and migration in shale matrix. However, using density functional perturbation theory (DFPT) calculations and vibrational spectroscopic measurements, we here show that a large gap may still remain between the existing model representations and actual kerogen structures, therefore calling for new model development. Using DFPT, we calculated Fourier transform infrared (FTIR) spectra for six most widely used kerogen structure models. The computed spectra were then systematically compared to the FTIR absorption spectra collected for kerogen samples isolated from Mancos, Woodford and Marcellus formations representing a wide range of kerogen origin and maturation conditions. Limited agreement between the model predictions and the measurements highlights that the existing kerogen models may still miss some key features in structural representation. A combination of DFPT calculations with spectroscopic measurements may provide a useful diagnostic tool for assessing the adequacy of a proposed structural model as well as for future model development. This approach may eventually help develop comprehensive infrared (IR)-fingerprints for tracing kerogen evolution.

  10. M/L, Hα Rotation Curves, and H I Gas Measurements for 329 Nearby Cluster and Field Spirals. III. Evolution in Fundamental Galaxy Parameters

    Science.gov (United States)

    Vogt, Nicole P.; Haynes, Martha P.; Giovanelli, Riccardo; Herter, Terry

    2004-06-01

    We have conducted a study of optical and H I properties of spiral galaxies (size, luminosity, Hα flux distribution, circular velocity, H I gas mass) to investigate causes (e.g., nature vs. nurture) for variation within the cluster environment. We find H I-deficient cluster galaxies to be offset in fundamental plane space, with disk scale lengths decreased by a factor of 25%. This may be a relic of early galaxy formation, caused by the disk coalescing out of a smaller, denser halo (e.g., higher concentration index) or by truncation of the hot gas envelope due to the enhanced local density of neighbors, although we cannot completely rule out the effect of the gas stripping process. The spatial extent of Hα flux and the B-band radius also decreases, but only in early-type spirals, suggesting that gas removal is less efficient within steeper potential wells (or that stripped late-type spirals are quickly rendered unrecognizable). We find no significant trend in stellar mass-to-light ratios or circular velocities with H I gas content, morphological type, or clustercentric radius, for star-forming spiral galaxies throughout the clusters. These data support the findings of a companion paper that gas stripping promotes a rapid truncation of star formation across the disk and could be interpreted as weak support for dark matter domination over baryons in the inner regions of spiral galaxies.

  11. Safety analysis fundamentals

    International Nuclear Information System (INIS)

    Wright, A.C.D.

    2002-01-01

    This paper discusses the fundamentals of safety analysis in reactor design. Safety analysis is performed to show that the consequences of postulated accidents are acceptable. It is also used to set the design of special safety systems and includes design-assist analysis to support conceptual design. Safety analysis is necessary for licensing a reactor, for maintaining an operating license, and for supporting changes in plant operations.

  12. Fundamentals of Diesel Engines.

    Science.gov (United States)

    Marine Corps Inst., Washington, DC.

    This student guide, one of a series of correspondence training courses designed to improve the job performance of members of the Marine Corps, deals with the fundamentals of diesel engine mechanics. Addressed in the three individual units of the course are the following topics: basic principles of diesel mechanics; principles, mechanics, and…

  13. Introduction and fundamentals

    International Nuclear Information System (INIS)

    Thomas, R.H.

    1980-01-01

    This introduction discusses advances in the fundamental sciences which underlie the applied science of health physics and radiation protection. Risk assessments in nuclear medicine are made by defining the conditions of exposure, identifying adverse effects, relating exposure to effect, and estimating the overall risk from ionizing radiations.

  14. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual

  15. Fundamentals of astrodynamics

    NARCIS (Netherlands)

    Wakker, K.F.

    2015-01-01

    This book deals with the motion of the center of mass of a spacecraft; this discipline is generally called astrodynamics. The book focuses on an analytical treatment of the motion of spacecraft and provides insight into the fundamentals of spacecraft orbit dynamics. A large number of topics are

  16. Fundamental partial compositeness

    DEFF Research Database (Denmark)

    Sannino, Francesco; Strumia, Alessandro; Tesi, Andrea

    2016-01-01

    We construct renormalizable Standard Model extensions, valid up to the Planck scale, that give a composite Higgs from a new fundamental strong force acting on fermions and scalars. Yukawa interactions of these particles with Standard Model fermions realize the partial compositeness scenario. Unde...

  17. Fundamental research data base

    Science.gov (United States)

    1983-01-01

    A fundamental research data base containing ground truth, image, and Badhwar profile feature data for 17 North Dakota, South Dakota, and Minnesota agricultural sites is described. Image data was provided for a minimum of four acquisition dates for each site and all four images were registered to one another.

  18. Fast fundamental frequency estimation

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Jensen, Tobias Lindstrøm; Jensen, Jesper Rindom

    2017-01-01

    Modelling signals as being periodic is common in many applications. Such periodic signals can be represented by a weighted sum of sinusoids with frequencies being an integer multiple of the fundamental frequency. Due to its widespread use, numerous methods have been proposed to estimate the funda...
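
    The record above is truncated, but the harmonic model it describes (a periodic signal as a weighted sum of sinusoids at integer multiples of the fundamental frequency) admits a very simple estimator. The sketch below is a generic harmonic-summation search, not the authors' method, and all signal parameters are invented.

```python
# Generic harmonic-summation f0 estimator on a synthetic harmonic signal.
import numpy as np

fs = 8000.0
f0_true = 220.0
t = np.arange(0, 0.5, 1 / fs)
x = sum(np.sin(2 * np.pi * k * f0_true * t) / k for k in range(1, 5))
x += 0.1 * np.random.randn(t.size)

spec = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def harmonic_sum(f0, n_harm=4):
    # sum the spectral magnitude at integer multiples of the candidate f0
    idx = [np.argmin(np.abs(freqs - k * f0)) for k in range(1, n_harm + 1)]
    return spec[idx].sum()

candidates = np.arange(80.0, 800.0, 1.0)
f0_est = candidates[np.argmax([harmonic_sum(f) for f in candidates])]
print(f"estimated f0: {f0_est:.1f} Hz")
```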

  19. Fundamental Fluid Mechanics

    Indian Academy of Sciences (India)

    BOOK REVIEW. Fundamental Fluid Mechanics. Good Text Book Material. V H Arakeri. Fluid Mechanics for Engineers. P N Chatterjee. MacMillan India Limited. Vol. 1, pp. 367, Rs. 143. Vol. 2, pp. 306, Rs. 130. Fluid Mechanics for Engineers in two volumes by P N Chatterjee contains standard material for a first level ...

  20. Fundamental Metallurgy of Solidification

    DEFF Research Database (Denmark)

    Tiedje, Niels

    2004-01-01

    The text takes the reader through some fundamental aspects of solidification, with focus on understanding the basic physics that govern solidification in casting and welding. It is described how the first solid is formed and which factors affect nucleation. It is described how crystals grow from ...

  1. On the conception of fundamental time asymmetries in physics

    Energy Technology Data Exchange (ETDEWEB)

    Wohlfarth, Daniel

    2013-02-05

    The investigation is divided into 7 chapters and aims to argue for the realizability of a new conception of 'fundamental time asymmetries' in physics. After an introduction (chapter 1) to the field of interest, the investigation continues by developing a conception of fundamentality for time asymmetries in chapter 2. Chapter 3 shows that this conception is realized in classical cosmology, and chapter 4 demonstrates, by taking into account the result from chapter 3, that classical electrodynamics is understandable as a time-asymmetric theory. Chapter 5 focuses on time asymmetries in quantum cosmology as well as quantum thermodynamics and demonstrates - as in the classical case - that a fundamental time asymmetry is embedded in those fields. The considerations contained in chapter 6 are focused on non-relativistic quantum mechanics (NRQM). Here the main aim is to demonstrate that NRQM can be understood as a time-asymmetric theory - even without using the measurement process for that purpose. Chapter 7 summarizes the main arguments and conclusions.

  2. On the conception of fundamental time asymmetries in physics

    International Nuclear Information System (INIS)

    Wohlfarth, Daniel

    2013-01-01

    The investigation is divided into 7 chapters and aims to argue for the realizability of a new conception of 'fundamental time asymmetries' in physics. After an introduction (chapter 1) to the field of interest, the investigation continues by developing a conception of fundamentality for time asymmetries in chapter 2. Chapter 3 shows that this conception is realized in classical cosmology, and chapter 4 demonstrates, by taking into account the result from chapter 3, that classical electrodynamics is understandable as a time-asymmetric theory. Chapter 5 focuses on time asymmetries in quantum cosmology as well as quantum thermodynamics and demonstrates - as in the classical case - that a fundamental time asymmetry is embedded in those fields. The considerations contained in chapter 6 are focused on non-relativistic quantum mechanics (NRQM). Here the main aim is to demonstrate that NRQM can be understood as a time-asymmetric theory - even without using the measurement process for that purpose. Chapter 7 summarizes the main arguments and conclusions.

  3. Fundamentals of nanoscaled field effect transistors

    CERN Document Server

    Chaudhry, Amit

    2013-01-01

    Fundamentals of Nanoscaled Field Effect Transistors gives comprehensive coverage of the fundamental physical principles and theory behind nanoscale transistors. The specific issues that arise for nanoscale MOSFETs, such as quantum mechanical tunneling and inversion layer quantization, are fully explored. The solutions to these issues, such as high-κ technology, strained-Si technology, alternative device structures and graphene technology, are also given. Some case studies regarding the above issues and solutions are also given in the book. In summary, this book: covers the fundamental principles behind nanoelectronics/microelectronics; includes chapters devoted to solutions tackling the quantum mechanical effects occurring at the nanoscale; and provides some case studies to understand the issues mathematically. Fundamentals of Nanoscaled Field Effect Transistors is an ideal book for researchers and undergraduate and graduate students in the field of microelectronics, nanoelectronics, and electronics.

  4. Cyberspace Assurance Metrics: Utilizing Models of Networks, Complex Systems Theory, Multidimensional Wavelet Analysis, and Generalized Entropy Measures

    National Research Council Canada - National Science Library

    Johnson, Joseph E; Gudkov, Vladimir

    2005-01-01

    ... as continuous group theory and Markov processes. Based upon this research he has proposed that entropy metrics, and the associated cluster analysis of the network so measured by these metrics, can be useful indicators of aberrant processes and behavior. Other team members have obtained important connections using higher order Renyi entropy metrics, and complexity theory to both monitor real networks and to study networks by simulation.

  5. Materials Fundamentals of Gate Dielectrics

    CERN Document Server

    Demkov, Alexander A

    2006-01-01

    This book presents the materials fundamentals of novel gate dielectrics that are being introduced into semiconductor manufacturing to ensure the continuous scaling of CMOS devices. This is a very fast evolving field of research, so we choose to focus on the basic understanding of the structure, thermodynamics, and electronic properties of these materials that determine their performance in device applications. Most of these materials are transition metal oxides. Ironically, the d-orbitals responsible for the high dielectric constant cause severe integration difficulties, thus intrinsically limiting high-k dielectrics. Though new in the electronics industry, many of these materials are well known in the field of ceramics, and we describe this unique connection. The complexity of the structure-property relations in TM oxides makes the use of state-of-the-art first-principles calculations necessary. Several chapters give a detailed description of the modern theory of polarization, and heterojunction band discont...

  6. Rayleigh Scattering Density Measurements, Cluster Theory, and Nucleation Calculations at Mach 10

    Science.gov (United States)

    Balla, R. Jeffrey; Everhart, Joel L.

    2012-01-01

    In an exploratory investigation, quantitative unclustered laser Rayleigh scattering measurements of density were performed in air in the NASA Langley Research Center's 31 in. Mach 10 wind tunnel. A review of 20 previous years of data in supersonic and Mach 6 hypersonic flows is presented, where clustered signals typically overwhelmed molecular signals. A review of nucleation theory and accompanying nucleation calculations are also provided to interpret the currently observed lack of clustering. Data were acquired at a fixed stagnation temperature near 990 K at five stagnation pressures spanning 2.41 to 10.0 MPa (350 to 1454 psi) using a pulsed argon fluoride excimer laser and a double-intensified charge-coupled device camera. Data averaged over 371 images and 210 pixels along a 36.7 mm line measured freestream densities that agree with computed isentropic-expansion densities to less than 2% and less than 6% at the highest and lowest densities, respectively. Cluster-free Mach 10 results are compared with previous clustered Mach 6 and condensation-free Mach 14 results. Evidence is presented indicating vibrationally excited oxygen and nitrogen molecules are absorbed as the clusters form, release their excess energy, and inhibit or possibly reverse the clustering process. Implications for delaying clustering and condensation onset in hypersonic and hypervelocity facilities are discussed.

  7. Fundamental enabling issues in nanotechnology :

    Energy Technology Data Exchange (ETDEWEB)

    Floro, Jerrold Anthony [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Foiles, Stephen Martin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hearne, Sean Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoyt, Jeffrey John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Seel, Steven Craig [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Webb III, Edmund Blackburn [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Morales, Alfredo Martin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zimmerman, Jonathan A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2007-10-01

    To effectively integrate nanotechnology into functional devices, fundamental aspects of material behavior at the nanometer scale must be understood. Stresses generated during thin film growth strongly influence component lifetime and performance; stress has also been proposed as a mechanism for stabilizing supported nanoscale structures. Yet the intrinsic connections between the evolving morphology of supported nanostructures and stress generation are still a matter of debate. This report presents results from a combined experiment and modeling approach to study stress evolution during thin film growth. Fully atomistic simulations are presented predicting stress generation mechanisms and magnitudes during all growth stages, from island nucleation to coalescence and film thickening. Simulations are validated by electrodeposition growth experiments, which establish the dependence of microstructure and growth stresses on process conditions and deposition geometry. Sandia is one of the few facilities with the resources to combine experiments and modeling/theory in this close a fashion. Experiments predicted an ongoing coalescence process that generates significant tensile stress. Data from deposition experiments also support the existence of a kinetically limited compressive stress generation mechanism. Atomistic simulations explored island coalescence and deposition onto surfaces intersected by grain boundary structures to permit investigation of stress evolution during later growth stages, e.g. continual island coalescence and adatom incorporation into grain boundaries. The predictive capabilities of simulation permit direct determination of fundamental processes active in stress generation at the nanometer scale while connecting those processes, via new theory, to continuum models for much larger island and film structures. Our combined experiment and simulation results reveal the necessary materials science to tailor stress, and therefore performance, in

  8. Establishing score equivalence of the Functional Independence Measure motor scale and the Barthel Index, utilising the International Classification of Functioning, Disability and Health and Rasch measurement theory

    OpenAIRE

    Prodinger, B; O'Connor, RJ; Stucki, G; Tennant, A

    2017-01-01

    Introduction: Two widely used outcome measures to assess functioning in neurological rehabilitation are the Functional Independence Measure (FIM™) and the Barthel Index. The current study aims to establish the equivalence of the total score of the FIM™ motor scale and the Barthel Index through the application of the International Classification of Functioning, Disability and Health, and Rasch measurement theory. Methods: Secondary analysis of a large sample of patients with stroke, spinal cor...

  9. Biological data analysis as an information theory problem: multivariable dependence measures and the shadows algorithm.

    Science.gov (United States)

    Sakhanenko, Nikita A; Galas, David J

    2015-11-01

    Information theory is valuable in multiple-variable analysis for being model-free and nonparametric, and for its modest sensitivity to undersampling. We previously introduced a general approach to finding multiple dependencies that provides accurate measures of levels of dependency for subsets of variables in a data set, which is significantly nonzero only if the subset of variables is collectively dependent. This is useful, however, only if we can avoid a combinatorial explosion of calculations for increasing numbers of variables. The proposed dependence measure for a subset of variables, τ, the differential interaction information Δ(τ), has the property that some of the factors of Δ(τ) for subsets of τ are significantly nonzero when the full dependence includes more variables. We use this property to suppress the combinatorial explosion by following the "shadows" of multivariable dependency on smaller subsets. Rather than calculating the marginal entropies of all subsets at each degree level, we need to consider only calculations for subsets of variables with appropriate "shadows." The number of calculations for n variables at a degree level of d therefore grows at a much smaller rate than the binomial coefficient (n, d), but depends on the parameters of the "shadows" calculation. This approach, avoiding a combinatorial explosion, enables the use of our multivariable measures on very large data sets. We demonstrate this method on simulated data sets, and characterize the effects of noise and sample numbers. In addition, we analyze a data set of a few thousand mutant yeast strains interacting with a few thousand chemical compounds.
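
    As context for the entropy bookkeeping described above, the sketch below computes the classical three-variable interaction information from plug-in entropy estimates. It is a simplified stand-in for, not an implementation of, the paper's differential interaction information Δ(τ), and sign conventions for this quantity vary in the literature.

```python
# Classical three-variable interaction information from plug-in entropies.
import numpy as np
from collections import Counter

def entropy(columns):
    # joint Shannon entropy (bits) of the given list of discrete columns
    counts = Counter(zip(*columns))
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return -(p * np.log2(p)).sum()

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 1000)
y = rng.integers(0, 2, 1000)
z = x ^ y                        # z depends on x and y only jointly (XOR)

ii = (entropy([x]) + entropy([y]) + entropy([z])
      - entropy([x, y]) - entropy([x, z]) - entropy([y, z])
      + entropy([x, y, z]))
print(f"interaction information: {ii:.3f} bits")   # ~ -1 for XOR with this convention
```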

  10. What is Fundamental?

    CERN Multimedia

    2004-01-01

    Discussing what is fundamental in a variety of fields, biologist Richard Dawkins, physicist Gerardus 't Hooft, and mathematician Alain Connes spoke to a packed Main Auditorium at CERN 15 October. Dawkins, Professor of the Public Understanding of Science at Oxford University, explained simply the logic behind Darwinian natural selection, and how it would seem to apply anywhere in the universe that had the right conditions. 't Hooft, winner of the 1999 Physics Nobel Prize, outlined some of the main problems in physics today, and said he thinks physics is so fundamental that even alien scientists from another planet would likely come up with the same basic principles, such as relativity and quantum mechanics. Connes, winner of the 1982 Fields Medal (often called the Nobel Prize of Mathematics), explained how physics is different from mathematics, which he described as a "factory for concepts," unfettered by connection to the physical world. On 16 October, anthropologist Sharon Traweek shared anecdotes from her ...

  11. Fundamental composite electroweak dynamics

    DEFF Research Database (Denmark)

    Arbey, Alexandre; Cacciapaglia, Giacomo; Cai, Haiying

    2017-01-01

    Using the recent joint results from the ATLAS and CMS collaborations on the Higgs boson, we determine the current status of composite electroweak dynamics models based on the expected scalar sector. Our analysis can be used as a minimal template for a wider class of models between the two limiting...... cases of composite Goldstone Higgs and Technicolor-like ones. This is possible due to the existence of a unified description, both at the effective and fundamental Lagrangian levels, of models of composite Higgs dynamics where the Higgs boson itself can emerge, depending on the way the electroweak...... space at the effective Lagrangian level. We show that a wide class of models of fundamental composite electroweak dynamics are still compatible with the present constraints. The results are relevant for the ongoing and future searches at the Large Hadron Collider....

  12. Fundamentals of nuclear physics

    CERN Document Server

    Takigawa, Noboru

    2017-01-01

    This book introduces the current understanding of the fundamentals of nuclear physics by referring to key experimental data and by providing a theoretical understanding of principal nuclear properties. It primarily covers the structure of nuclei at low excitation in detail. It also examines nuclear forces and decay properties. In addition to fundamentals, the book treats several new research areas such as non-relativistic as well as relativistic Hartree–Fock calculations, the synthesis of super-heavy elements, the quantum chromodynamics phase diagram, and nucleosynthesis in stars, to convey to readers the flavor of current research frontiers in nuclear physics. The authors explain semi-classical arguments and derivation of its formulae. In these ways an intuitive understanding of complex nuclear phenomena is provided. The book is aimed at graduate school students as well as junior and senior undergraduate students and postdoctoral fellows. It is also useful for researchers to update their knowledge of diver...

  13. Frontiers of Fundamental Physics

    CERN Document Server

    2014-01-01

    The 14th annual international symposium “Frontiers of Fundamental Physics” (FFP14) was organized by the OCEVU Labex. It was held in Marseille, on the Saint-Charles Campus of Aix Marseille University (AMU) and had over 280 participants coming from all over the world. FFP Symposium began in India in 1997 and it became itinerant in 2004, through Europe, Canada and Australia. It covers topics in fundamental physics with the objective to enable scholars working in related areas to meet on a single platform and exchange ideas. In addition to highlighting the progress in these areas, the symposium invites the top researchers to reflect on the educational aspects of our discipline. Moreover, the scientific concepts are also discussed through philosophical and epistemological viewpoints. Several eminent scientists, such as the laureates of prestigious awards (Nobel Prize, Fields Medal,…), have already participated in these meetings. The FFP14 Symposium developed around seven main themes, namely: Astroparticle Ph...

  14. Fundamental physics constants

    International Nuclear Information System (INIS)

    Cohen, E.R.; Taylor, B.N.

    1995-01-01

    Present technological applications require the values used for the fundamental physical and chemical constants to be more and more precise and at the same time coherent. Great importance is therefore attached to the task of coordinating and comparing the most recent experimental data and extracting from them as a whole, by means of a least-squares fit, a set of values for the fundamental constants that is as precise and coherent as possible. The set of values currently in use derives from a fit performed in 1986, but new experimental results already promise a large reduction in the uncertainties of various constants. A new global fit that will implement such reductions is scheduled for completion in 1995 or 1996.

  15. Fundamental principles of a new EM tool for in-situ resistivity measurement. 2; Denji yudoho ni yoru gen`ichi hiteiko sokutei sochi no kento. 2

    Energy Technology Data Exchange (ETDEWEB)

    Noguchi, K.; Aoki, H. [Waseda University, Tokyo (Japan). School of Science and Engineering; Saito, A. [Mitsui Mineral Development Engineering Co. Ltd., Tokyo (Japan)

    1997-10-22

    In-situ resistivity measuring devices are tested for performance in relation to the principle of focusing. After numerical calculation, it is shown that in the absence of focusing the primary magnetic field will prevail and that changes in the separate-mode component will be difficult to detect in actual measurement because the in-phase component assumes a value far larger than the out-of-phase component. Concerning the transmission loop radius, the study reveals that a larger radius will yield a stronger response and that such will remove the influence of near-surface layers. Two types of devices are constructed, one applying the principle of focusing and the other not, and both are activated to measure the response from a saline solution medium. The results are compared and it is found that focusing eliminates the influence of the primary magnetic field and that it enables the measurement of changes in resistivity of the medium which cannot be detected in the absence of focusing. 3 refs., 9 figs.
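
    As a back-of-the-envelope illustration of the focusing principle mentioned above (not a description of the actual instrument), the field of an ideal circular loop of radius r carrying current I at its own centre is B = μ0 I / (2r), so driving the outer loop with an opposing current scaled by the radius ratio nulls the primary field at a receiver placed at the common centre. All values below are invented.

```python
# Cancelling the primary field of two concentric loops at their common centre.
import numpy as np

mu0 = 4e-7 * np.pi
r_inner, r_outer = 0.5, 1.0              # loop radii, m (illustrative)
i_inner = 1.0                            # current in the inner loop, A
i_outer = -i_inner * r_outer / r_inner   # opposing current in the outer loop

b_centre = mu0 * i_inner / (2 * r_inner) + mu0 * i_outer / (2 * r_outer)
print(f"primary field at receiver: {b_centre:.3e} T (~0, i.e. focused)")
```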

  16. Fundamental concepts on energy

    International Nuclear Information System (INIS)

    Rodriguez, M.H.

    1998-01-01

    The fundamental concepts of energy and the different forms in which it is manifested are presented. Since it is possible to transform energy from one form into another, the laws that govern these transformations are discussed. Energy transformation processes are an essential component of humanity's capacity to survive and develop. The use of energy involves important economic, technical and political aspects. Because of this, any decision on how to manage an energy system will be key to our future life.

  17. Fundamentals of Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-16

    This is a powerpoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.
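
    The simple example named in the outline (estimating π) can be written in a few lines; the sketch below is a generic rejection-sampling version, not taken from the presentation itself.

```python
# Estimating pi by rejection sampling of points in the unit square.
import random

def estimate_pi(n_samples=1_000_000, seed=0):
    rng = random.Random(seed)
    inside = sum(1 for _ in range(n_samples)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * inside / n_samples   # area ratio of quarter circle to unit square

print(estimate_pi())   # converges to pi as n grows (Law of Large Numbers)
```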

  18. Fundamentals of gas counters

    International Nuclear Information System (INIS)

    Bateman, J.E.

    1994-01-01

    The operation of gas counters used for detecting radiation is explained in terms of the four fundamental physical processes which govern their operation. These are 1) conversion of neutral radiation into charged particles, 2) ionization of the host gas by a fast charged particle, 3) transport of the gas ions to the electrodes, and 4) amplification of the electrons in a region of enhanced electric field. Practical implications of these are illustrated. (UK)

  19. Fundamentals of linear algebra

    CERN Document Server

    Dash, Rajani Ballav

    2008-01-01

    FUNDAMENTALS OF LINEAR ALGEBRA is a comprehensive textbook which can be used by students and teachers of all Indian universities. The text has an easy, understandable form and covers all topics of the UGC curriculum. There are lots of worked-out examples which help the students in solving the problems without anybody's help. The problem sets have been designed keeping in view the questions asked in different examinations.

  20. Fundamentals of radiological protection

    International Nuclear Information System (INIS)

    Wells, J.; Mill, A.J.; Charles, M.W.

    1978-05-01

    The basic processes of living cells which are relevant to an understanding of the interaction of ionizing radiation with man are described. Particular reference is made to cell death, cancer induction and genetic effects. This is the second of a series of reports which present the fundamentals necessary for an understanding of the bases of regulatory criteria such as those recommended by the International Commission on Radiological Protection (ICRP). Others consider basic radiation physics and the biological effects of ionizing radiation. (author)

  1. Fundamentals of Fire Phenomena

    DEFF Research Database (Denmark)

    Quintiere, James

    analyses. Fire phenomena encompass everything about the scientific principles behind fire behaviour. Combining the principles of chemistry, physics, heat and mass transfer, and fluid dynamics necessary to understand the fundamentals of fire phenomena, this book integrates the subject into a clear...... as a visiting professor at BYG.DTU financed by the Larsen and Nielsen Foundation, and is entered to the research database by Kristian Hertz responsible for the visiting professorship....

  2. Remarks on “A new non-specificity measure in evidence theory based on belief intervals”

    Directory of Open Access Journals (Sweden)

    Joaquín ABELLÁN

    2018-03-01

    Two types of uncertainty co-exist in the theory of evidence: discord and non-specificity. Since the 90s, many mathematical expressions have arisen to quantify these two parts of an evidence. An important aspect of each measure presented is the verification of a coherent set of properties. Regarding non-specificity, so far only one measure verifies an important set of those properties. Very recently, a new measure of non-specificity based on belief intervals has been presented as an alternative measure that quantifies a similar set of properties (Yang et al., 2016). It is shown that the new measure really does not verify two of those important properties. Some errors have been found in the corresponding proofs in the original publication. Keywords: Additivity, Imprecise probabilities, Non-specificity, Subadditivity, Theory of evidence, Uncertainty measures
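
    For readers unfamiliar with non-specificity measures, the sketch below evaluates the classical generalized Hartley measure N(m) = Σ m(A) log2 |A| for a small basic probability assignment. This is the long-standing measure alluded to above, not the belief-interval measure under discussion, and the example masses are invented.

```python
# Generalized Hartley non-specificity of a basic probability assignment.
import math

def nonspecificity(bpa):
    # bpa maps focal elements (frozensets) to masses summing to 1
    return sum(mass * math.log2(len(focal)) for focal, mass in bpa.items() if mass > 0)

m = {frozenset({'a'}): 0.5,
     frozenset({'a', 'b'}): 0.3,
     frozenset({'a', 'b', 'c'}): 0.2}
print(nonspecificity(m))   # 0.3*log2(2) + 0.2*log2(3) ≈ 0.617
```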

  3. Fundamentals of Structural Geology

    Science.gov (United States)

    Pollard, David D.; Fletcher, Raymond C.

    2005-09-01

    Fundamentals of Structural Geology provides a new framework for the investigation of geological structures by integrating field mapping and mechanical analysis. Assuming a basic knowledge of physical geology, introductory calculus and physics, it emphasizes the observational data, modern mapping technology, principles of continuum mechanics, and the mathematical and computational skills necessary to quantitatively map, describe, model, and explain deformation in Earth's lithosphere. By starting from the fundamental conservation laws of mass and momentum, the constitutive laws of material behavior, and the kinematic relationships for strain and rate of deformation, the authors demonstrate the relevance of solid and fluid mechanics to structural geology. This book offers a modern quantitative approach to structural geology for advanced students and researchers in structural geology and tectonics. It is supported by a website hosting images from the book, additional colour images, student exercises and MATLAB scripts; solutions to the exercises are available to instructors. The book integrates field mapping using modern technology with the analysis of structures based on a complete mechanics treatment. MATLAB is used to visualize physical fields and analytical results, and MATLAB scripts can be downloaded from the website to recreate textbook graphics and enable students to explore their choice of parameters and boundary conditions. The supplementary website hosts color images of outcrop photographs used in the text, supplementary color images, and images of textbook figures for classroom presentations. The textbook website also includes student exercises designed to instill the fundamental relationships and to encourage the visualization of the evolution of geological structures; solutions are available to instructors.

  4. Theory of calorimetry

    CERN Document Server

    Zielenkiewicz, Wojciech

    2004-01-01

    The purpose of this book is to give a comprehensive description of the theoretical fundamentals of calorimetry. The considerations are based on the relations deduced from the laws and general equations of heat exchange theory and steering theory.

  5. Islamic Fundamentalism as a Signifier of the Sixth Phase of Globalization

    OpenAIRE

    Basri, Mohammad Hasan

    2011-01-01

    This article examines Islamic fundamentalism through a historical perspective on globalization offered by Roland Robertson, the theory of Islamic fundamentalism, and its relation to the involvement of youth in Indonesia. In theory, Islamic fundamentalism is divided into two classifications. The first is "continuity and change": the development of fundamentalism in Islam is both continuity and change in Islamic history. The second theory is "challenges and opportunities". The basic assumpti...

  6. Technical prerequisites for efficient drive systems - Fundamentals for SwissEnergy measures; Technische Grundlagen effizienter Antriebssysteme. Grundlagen fuer Aktionen (Massnahmen) von Energieschweiz

    Energy Technology Data Exchange (ETDEWEB)

    Schnyder, G.; Ritz, Ch.

    2007-03-15

    This final report for the Swiss Federal Office of Energy (SFOE) reports on the technical prerequisites necessary for the implementation of various measures that are to be taken to promote efficient electrical drive systems. The document defines the approach taken and describes the methodologies to be used, including market analysis, the collection of basic data, the definition of measures and the acquisition of partners. The potential for making savings is estimated. Eight areas of action are defined, including the organisation of tutorials, exchange of experience, knowledge transfer, basic consulting services, the deployment of consultants, the setting-up of an Internet portal, information transfer in conferences and the optimisation of auxiliaries in domestic installations. A comprehensive annex completes the report.

  7. Dynamic vibration measurements at the foundations of wind power plants to reduce maintenance costs; Dynamische Schwingungsmessungen an WEA-Fundamenten zur Kostenreduzierung bei der Instandhaltung

    Energy Technology Data Exchange (ETDEWEB)

    Deininger, Klaus [KTW Umweltschutztechnik GmbH, Mellingen (Germany)

    2013-06-01

    The author of the contribution under consideration reports on dynamic vibration measurements at the foundations of wind power plants. Typical damage at these foundations as well as various sealing options are described. The author recommends the installation of condition monitoring systems which promptly indicate critical states of wind power plants, using the global positioning system or direct integration into the overall data of the wind power plant.

  8. Fundamental principles of a new EM tool for in-situ resistivity measurement; Denji yudoho ni yoru gen`ichi hiteiko sokutei sochi no kento

    Energy Technology Data Exchange (ETDEWEB)

    Noguchi, K.; Aoki, H. [Waseda University, Tokyo (Japan). School of Science and Engineering; Saito, A. [Mitsui Mineral Development Engineering Co. Ltd., Tokyo (Japan)

    1996-05-01

    For the purpose of measuring in-situ resistivity without contact with the rock, a study was made of a measuring device using electromagnetic induction. This measuring device has two concentric transmission loops and a receiving point at the center of the loops, and performs focusing by canceling the primary magnetic field at the receiving point. Using this device, a trial was made to eliminate the influence of surface undulation. In the model calculation, the response was calculated after the structure with a heavily undulated ground surface was replaced by a two-layer structure whose first layer has a higher resistivity. In the model, the first layer had a resistivity of 10000 Ohm m, and the second layer 1000 Ohm m. Using the ratio between the transmission loop radii as a parameter, the relationship with the thickness of the first layer was studied, and it was found that the sensitivity to the second-layer resistivity increases when the inner and outer loops are closer to each other in radius, and that this eliminates the influence of the near-surface layer. The radius ratio must nevertheless be chosen within a range that assures good reception, because the response intensity decreases as the ratio between the transmission loop radii approaches 1. 3 refs., 11 figs.

  9. Fundamental studies of fusion plasmas

    Science.gov (United States)

    Aamodt, R. E.; Catto, P. J.; Dippolito, D. A.; Myra, J. R.; Russell, D. A.

    1992-05-01

    The major portion of this program is devoted to critical ICH phenomena. The topics include edge physics, fast wave propagation, ICH induced high frequency instabilities, and a preliminary antenna design for Ignitor. This research was strongly coordinated with the world's experimental and design teams at JET, Culham, ORNL, and Ignitor. The results have been widely publicized at both general scientific meetings and topical workshops including the specialty workshop on ICRF design and physics sponsored by Lodestar in April 1992. The combination of theory, empirical modeling, and engineering design in this program makes this research particularly important for the design of future devices and for the understanding and performance projections of present tokamak devices. Additionally, the development of a diagnostic of runaway electrons on TEXT has proven particularly useful for the fundamental understanding of energetic electron confinement. This work has led to a better quantitative basis for quasilinear theory and the role of magnetic vs. electrostatic field fluctuations on electron transport. An APS invited talk was given on this subject and collaboration with PPPL personnel was also initiated. Ongoing research on these topics will continue for the remainder of the contract period and the strong collaborations are expected to continue, enhancing both the relevance of the work and its immediate impact on areas needing critical understanding.

  10. Ultrasonic attenuation measurements at very high SNR: Correlation, information theory and performance

    International Nuclear Information System (INIS)

    Challis, Richard; Ivchenko, Vladimir; Al-Lashi, Raied

    2013-01-01

    This paper describes a system for ultrasonic wave attenuation measurements which is based on pseudo-random binary codes as transmission signals combined with on-the-fly correlation for received signal detection. The apparatus can receive signals in the nanovolt range against a noise background in the order of hundreds of microvolts and an analogue to digital convertor (ADC) bit-step also in the order of hundreds of microvolts. Very high signal to noise ratios (SNRs) are achieved without recourse to coherent averaging with its associated requirement for high sampling times. The system works by a process of dithering – in which very low amplitude received signals enter the dynamic range of the ADC by 'riding' on electronic noise at the system input. The amplitude of this 'useful noise' has to be chosen with care for an optimised design. The process of optimisation is explained on the basis of classical information theory and is achieved through a simple noise model. The performance of the system is examined for different transmitted code lengths and gain settings in the receiver chain. Experimental results are shown to verify the expected operation when the system is applied to a very highly attenuating material – an aerated slurry
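
    The detection principle (correlation against a known pseudo-random code pulling a signal out of noise much larger than itself) can be demonstrated in a few lines. The code length, amplitudes and delay below are arbitrary, and the snippet is not a model of the actual apparatus.

```python
# Toy demonstration: recover the arrival time of a weak PRBS echo by correlation.
import numpy as np
from scipy.signal import max_len_seq

code = 2.0 * max_len_seq(10)[0] - 1.0       # 1023-chip bipolar pseudo-random code
delay, amplitude = 500, 0.3                 # "received" echo: weak and delayed
received = np.zeros(4096)
received[delay:delay + code.size] += amplitude * code
received += np.random.default_rng(1).standard_normal(received.size)  # noise >> signal

corr = np.correlate(received, code, mode='valid')   # stand-in for on-the-fly correlation
print("estimated delay:", int(np.argmax(corr)))     # ~ 500
```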

  11. Occupational hazard evaluation model underground coal mine based on unascertained measurement theory

    Science.gov (United States)

    Deng, Quanlong; Jiang, Zhongan; Sun, Yaru; Peng, Ya

    2017-05-01

    In order to study how to comprehensively evaluate the influence of several occupational hazards on miners' physical and mental health, an occupational hazard evaluation indicator system based on unascertained measurement theory was established to allow quantitative and qualitative analysis. The weight of each indicator was determined by information entropy and the occupational hazard level was estimated by the credible-degree recognition criterion. The evaluation model was programmed in Visual Basic and applied to the comprehensive occupational hazard evaluation of six posts in an underground coal mine, and the occupational hazard degree was graded; the evaluation results are consistent with the actual situation. The results show that dust and noise are the most significant of the coal mine occupational hazard factors. Excavation face support workers are the most affected, followed by heading machine drivers, coal cutter drivers and coalface move support workers; the occupational hazard degree of these four types of workers is level II (mild). The occupational hazard degree of ventilation workers and safety inspection workers is level I. The evaluation model can evaluate an underground coal mine objectively and accurately, and can be employed in actual engineering.
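
    The entropy-weighting step mentioned above is standard enough to sketch generically; the indicator matrix and values below are made up, and the credible-degree recognition step is omitted.

```python
# Indicator weights from information entropy (entropy-weight method).
import numpy as np

# rows = evaluated posts, columns = hazard indicators (e.g. dust, noise, ...)
x = np.array([[0.8, 0.3, 0.5],
              [0.6, 0.7, 0.2],
              [0.9, 0.4, 0.6],
              [0.4, 0.9, 0.3]])

p = x / x.sum(axis=0)                          # normalise each indicator column
k = 1.0 / np.log(x.shape[0])
ent = -k * (p * np.log(p)).sum(axis=0)         # entropy per indicator (zeros would need masking)
weights = (1 - ent) / (1 - ent).sum()          # low entropy -> high discriminating power -> high weight
print(weights.round(3))
```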

  12. On the use of the Webb-Pearman-Leuning theory for closed-path eddy correlation measurements

    DEFF Research Database (Denmark)

    Ibrom, Andreas; Dellwik, Ebba; Larsen, Søren Ejling

    2007-01-01

    We consider an imperfection of real closed-path eddy correlation systems - the decoupling of the water vapour and CO2 concentrations - with respect to the application of the Webb-Pearman-Leuning (WPL) theory. It is described why and how the current application of the WPL theory needs to be adapted... to the processes in closed-path sensors. We show the quantitative effects of applying the WPL theory in different ways using CO2 flux measurements taken above the Danish beech forest CarboEurope site near Soro, Zealand. Using the WPL theory in closed-path sensors without taking amplitude damping and decoupling... not apply to open-path sensors.

  13. Quantum mechanics. A basic course of non-relativistic quantum theory. 2. rev. and upd. ed.

    International Nuclear Information System (INIS)

    Straumann, Norbert

    2013-01-01

    The following topics are dealt with: Matter waves and Schroedinger equation, the statistical interpretation of the wave function, uncertainty relations and measuring process, the formal principles of quantum mechanics, angular momentum and particles with spin, perturbation theory and applications, many-electron systems, scattering theory, quantum chemistry, time dependent perturbation theory, fundamental problems of quantum mechanics. (HSI)

  14. Social Class and Work-Related Decisions: Measurement, Theory, and Social Mobility

    Science.gov (United States)

    Fouad, Nadya A.; Fitzpatrick, Mary E.

    2009-01-01

    In this reaction to Diemer and Ali's article, "Integrating Social Class Into Vocational Psychology: Theory and Practice Implications," the authors point out concerns with binary schema of social class, highlight the contribution of social class to the social cognitive career theory, argue for a more nuanced look at ways that work…

  15. Cognitive Load Theory: Advances in Research on Worked Examples, Animations, and Cognitive Load Measurement

    NARCIS (Netherlands)

    T.A.J.M. van Gog (Tamara); G.W.C. Paas (Fred); J. Sweller (John)

    2010-01-01

    textabstractThe contributions to this special issue document some recent advances of cognitive load theory, and are based on contributions to the Third International Cognitive Load Theory Conference (2009), Heerlen, The Netherlands. The contributions focus on developments in example-based learning,

  16. Laser Resonators and Beam Propagation Fundamentals, Advanced Concepts and Applications

    CERN Document Server

    Hodgson, Norman

    2005-01-01

    Optical Resonators provides a detailed discussion of the properties of optical resonators for lasers, from basic theory to recent research. In addition to describing the fundamental theories of resonators, such as geometrical optics, diffraction, and polarisation, the characteristics of all important resonator schemes and their calculation are presented. Experimental examples, practical problems and a collection of measurement techniques support the comprehensive treatment of the subject. Optical Resonators is the only book currently available that provides a comprehensive overview of the subject. Combined with the structure of the text and the autonomous nature of the chapters, this work will be as suitable for those new to the field as it will be invaluable to specialists conducting research. This second edition has been enlarged by new sections on Q-switching and resonators with internal phase/amplitude control. In addition, the whole book has been brought up to date.
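
    As a flavour of the beam-propagation fundamentals such a text covers, the sketch below evaluates the standard TEM00 Gaussian beam-radius formula w(z) = w0·sqrt(1 + (z/zR)²); the wavelength and waist are arbitrary illustrative values, not taken from the book.

```python
# Gaussian beam radius versus distance from the waist.
import numpy as np

wavelength = 1.064e-6                   # m (Nd:YAG, as an example)
w0 = 0.5e-3                             # waist radius, m (assumed)
z_r = np.pi * w0**2 / wavelength        # Rayleigh range

z = np.linspace(0.0, 2.0, 5)            # propagation distance, m
w = w0 * np.sqrt(1.0 + (z / z_r)**2)    # beam radius w(z)
for zi, wi in zip(z, w):
    print(f"z = {zi:4.1f} m  ->  w = {wi*1e3:.2f} mm")
```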

  17. Photovoltaics fundamentals, technology and practice

    CERN Document Server

    Mertens, Konrad

    2013-01-01

    Concise introduction to the basic principles of solar energy, photovoltaic systems, photovoltaic cells, photovoltaic measurement techniques, and grid-connected systems, overviewing the potential of photovoltaic electricity for students and engineers new to the topic. After a brief introduction to the history of photovoltaics and the most important facts, Chapter 1 presents the subject of radiation, covering properties of solar radiation, radiation offer, and world energy consumption. Chapter 2 looks at the fundamentals of semiconductor physics. It discusses the build-up of semiconducto

  18. Fundamental Physics with Space Experiments

    Science.gov (United States)

    Vitale, S.

    I review a category of experiments in fundamental physics that need space as a laboratory. All these experiments have in common the need for a very low gravity environment to achieve as ideal a free fall as possible: LISA, the gravitational wave observatory, and its technology demonstrator SMART-2; the satellite tests of the equivalence principle, Microscope, and the ultimate-sensitivity one, STEP, with its close heritage from GP-B, the experiment to measure the gravito-magnetic field of the Earth; and finally the entirely new field of cold atoms in space, with its promise to produce the next generation of gravitational and inertial sensors for general relativity experiments.

  19. On how access to an insurance market affects investments in safety measures, based on the expected utility theory

    International Nuclear Information System (INIS)

    Bjorheim Abrahamsen, Eirik; Asche, Frank

    2011-01-01

    This paper focuses on how access to an insurance market should influence investments in safety measures in accordance with the ruling paradigm for decision-making under uncertainty, the expected utility theory. We show that access to an insurance market will in most situations influence investments in safety measures. For an expected utility maximizer, an overinvestment in safety measures is likely if access to an insurance market is ignored, while an underinvestment in safety measures is likely if insurance is purchased without paying attention to the possibility of reducing the probability and/or consequences of an accidental event by safety measures.
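
    A stylised numerical illustration of this point follows; the wealth, loss, probability and cost figures are invented, the utility function is an arbitrary risk-averse choice, and the insurance is assumed actuarially fair.

```python
# How access to fair insurance can change whether a safety measure is worthwhile.
import math

wealth, loss, p_loss = 100.0, 60.0, 0.10
safety_cost, p_loss_with_safety = 3.5, 0.05     # safety halves the accident probability

def u(x):                        # risk-averse (log) utility, chosen for illustration
    return math.log(x)

def expected_u(w, p, cost=0.0, insured=False):
    premium = p * loss if insured else 0.0       # actuarially fair premium
    if insured:                                  # loss fully reimbursed
        return u(w - cost - premium)
    return (1 - p) * u(w - cost) + p * u(w - cost - loss)

print("no insurance : safety pays?",
      expected_u(wealth, p_loss_with_safety, safety_cost) > expected_u(wealth, p_loss))
print("with insurance: safety pays?",
      expected_u(wealth, p_loss_with_safety, safety_cost, True) > expected_u(wealth, p_loss, 0.0, True))
```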

  20. FUNDAMENTALS OF BIOMECHANICS

    Directory of Open Access Journals (Sweden)

    Duane Knudson

    2007-09-01

    DESCRIPTION This book provides a broad and in-depth theoretical and practical description of the fundamental concepts in understanding biomechanics in the qualitative analysis of human movement. PURPOSE The aim is to bring together up-to-date biomechanical knowledge with expert application knowledge. Extensive referencing for students is also provided. FEATURES This textbook is divided into 12 chapters within four parts, including a lab activities section at the end. The division is as follows: Part 1 Introduction: 1. Introduction to biomechanics of human movement; 2. Fundamentals of biomechanics and qualitative analysis; Part 2 Biological/Structural Bases: 3. Anatomical description and its limitations; 4. Mechanics of the musculoskeletal system; Part 3 Mechanical Bases: 5. Linear and angular kinematics; 6. Linear kinetics; 7. Angular kinetics; 8. Fluid mechanics; Part 4 Application of Biomechanics in Qualitative Analysis: 9. Applying biomechanics in physical education; 10. Applying biomechanics in coaching; 11. Applying biomechanics in strength and conditioning; 12. Applying biomechanics in sports medicine and rehabilitation. AUDIENCE This is important reading for both students and educators in the medicine, sport and exercise-related fields. For the researcher and lecturer it is a helpful guide for planning and preparing more detailed experimental designs or lecture and/or laboratory classes in exercise and sport biomechanics. ASSESSMENT The text provides a constructive fundamental resource for biomechanics, exercise and sport-related students, teachers and researchers as well as anyone interested in understanding motion. It is also very useful since it is clearly written and presents several examples of the application of biomechanics to help teach and apply biomechanical variables and concepts, including sport-related ones.

  1. A fundamental study on measurement of internal residual stress of sintered Fe-Cr/TiN composite material with neutron diffraction

    International Nuclear Information System (INIS)

    Takago, Shigeki; Sasaki, Toshihiko; Hirose, Yukio; Minakawa, Nobuaki; Morii, Yukio

    2001-01-01

    The neutron diffraction technique was applied to the internal stress measurement of a composite material consisting of a chromium alloy and titanium nitride manufactured by powder metallurgy. The material has been developed for the valve seat inserts of diesel engines in automobiles, because it has high wear resistance and heat resistance. In this study, the influence of the titanium nitride on the stresses in each phase was investigated. The Fe-Cr 200 diffraction peak at 2θ=93.4 deg and the TiN 311 diffraction peak at 2θ=109.5 deg are available for the measurements. Neutron diffraction data obtained from both phases were compared to the micromechanics model based on Eshelby's approach and the Mori-Tanaka theorem. It was found that the experimental phase stresses agree well with the theoretical estimation. It has been shown that the neutron diffraction method is suitable for determining the residual stress of composite materials. (author)
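
    A schematic illustration of the basic data-reduction step behind such measurements follows: lattice strain from a Bragg-peak shift via the differentiated Bragg law, then a naive uniaxial Hooke's-law conversion. The measured peak shift and elastic constant are invented, and a real phase-stress analysis would use diffraction elastic constants and the micromechanics model cited above.

```python
# Lattice strain and a naive stress estimate from a Bragg-peak shift.
import math

two_theta_ref = 93.4          # stress-free Fe-Cr 200 peak position, degrees
two_theta_meas = 93.45        # measured peak position, degrees (hypothetical)

theta_ref = math.radians(two_theta_ref / 2.0)
d_theta = math.radians((two_theta_meas - two_theta_ref) / 2.0)
strain = -d_theta / math.tan(theta_ref)       # from Bragg's law: dd/d = -cot(theta) * dtheta

E = 210e9                                     # Young's modulus of the Fe-Cr matrix, Pa (assumed)
print(f"strain = {strain:.2e}, naive stress = {E * strain / 1e6:.0f} MPa")
```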

  2. Fundamentals of radiological protection

    International Nuclear Information System (INIS)

    Charles, M.W.; Wells, J.; Mill, A.J.

    1978-04-01

    A brief review is presented of the early and late effects of ionising radiation on man, with particular emphasis on those aspects of importance in radiological protection. The terminology and dose response curves are explained. Early effects on cells, tissues and whole organs are discussed. Late somatic effects considered include cancer and life-span shortening. Genetic effects are examined. The review is the third of a series of reports which present the fundamentals necessary for an understanding of the basis of regulatory criteria, such as those of the ICRP. (UK)

  3. Fundamentals of microwave photonics

    CERN Document Server

    Urick, V J; McKinney , Jason D

    2015-01-01

    A comprehensive resource to designing and constructing analog photonic links capable of high RF performance. Fundamentals of Microwave Photonics provides a comprehensive description of analog optical links from basic principles to applications. The book is organized into four parts. The first begins with a historical perspective of microwave photonics, listing the advantages of fiber optic links and delineating analog vs. digital links. The second section covers basic principles associated with microwave photonics in both the RF and optical domains. The third focuses on analog modulation formats-starti

  4. Fundamental of biomedical engineering

    CERN Document Server

    Sawhney, GS

    2007-01-01

    About the Book: A well set out textbook explains the fundamentals of biomedical engineering in the areas of biomechanics, biofluid flow, biomaterials, bioinstrumentation and use of computing in biomedical engineering. All these subjects form a basic part of an engineer's education. The text is admirably suited to meet the needs of the students of mechanical engineering, opting for the elective of Biomedical Engineering. Coverage of bioinstrumentation, biomaterials and computing for biomedical engineers can meet the needs of the students of Electronic & Communication, Electronic & Instrumenta

  5. Nanomachines fundamentals and applications

    CERN Document Server

    Wang, Joseph

    2013-01-01

    This first-hand account by one of the pioneers of nanobiotechnology brings together a wealth of valuable material in a single source. It allows fascinating insights into motion at the nanoscale, showing how the proven principles of biological nanomotors are being transferred to artificial nanodevices.As such, the author provides engineers and scientists with the fundamental knowledge surrounding the design and operation of biological and synthetic nanomotors and the latest advances in nanomachines. He addresses such topics as nanoscale propulsions, natural biomotors, molecular-scale machin

  6. Fundamentals of semiconductor devices

    CERN Document Server

    Lindmayer, Joseph

    1965-01-01

    Semiconductor properties ; semiconductor junctions or diodes ; transistor fundamentals ; inhomogeneous impurity distributions, drift or graded-base transistors ; high-frequency properties of transistors ; band structure of semiconductors ; high current densities and mechanisms of carrier transport ; transistor transient response and recombination processes ; surfaces, field-effect transistors, and composite junctions ; additional semiconductor characteristics ; additional semiconductor devices and microcircuits ; more metal, insulator, and semiconductor combinations for devices ; four-pole parameters and configuration rotation ; four-poles of combined networks and devices ; equivalent circuits ; the error function and its properties ; Fermi-Dirac statistics ; useful physical constants.

  7. Fundamentals of Project Management

    CERN Document Server

    Heagney, Joseph

    2011-01-01

    With sales of more than 160,000 copies, Fundamentals of Project Management has helped generations of project managers navigate the ins and outs of every aspect of this complex discipline. Using a simple step-by-step approach, the book is the perfect introduction to project management tools, techniques, and concepts. Readers will learn how to: • Develop a mission statement, vision, goals, and objectives • Plan the project • Create the work breakdown structure • Produce a workable schedule • Understand earned value analysis • Manage a project team • Control and evaluate progress at every stage.

  8. DOE fundamentals handbook: Chemistry

    International Nuclear Information System (INIS)

    1993-01-01

    This handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of chemistry. This volume contains the following modules: reactor water chemistry (effects of radiation on water chemistry, chemistry parameters), principles of water treatment (purpose; treatment processes [ion exchange]; dissolved gases, suspended solids, and pH control; water purity), and hazards of chemicals and gases (corrosives [acids, alkalies], toxic compounds, compressed gases, flammable/combustible liquids)

  9. Fundamentals of Cavitation

    CERN Document Server

    Franc, Jean-Pierre

    2005-01-01

    The present book is aimed at providing a comprehensive presentation of cavitation phenomena in liquid flows. It is further backed up by the experience, both experimental and theoretical, of the authors whose expertise has been internationally recognized. A special effort is made to place the various methods of investigation in strong relation with the fundamental physics of cavitation, enabling the reader to treat specific problems independently. Furthermore, it is hoped that a better knowledge of the cavitation phenomenon will allow engineers to create systems using it positively. Examples in the literature show the feasibility of this approach.

  10. Fundamentals of calculus

    CERN Document Server

    Morris, Carla C

    2015-01-01

    Fundamentals of Calculus encourages students to use power, quotient, and product rules for solutions as well as stresses the importance of modeling skills. In addition to core integral and differential calculus coverage, the book features finite calculus, which lends itself to modeling and spreadsheets. Specifically, finite calculus is applied to marginal economic analysis, finance, growth, and decay. Includes: Linear Equations and Functions; The Derivative; Using the Derivative; Exponential and Logarithmic Functions; Techniques of Differentiation; Integral Calculus; Integration Techniques; Functions

  11. Fundamentals of magnetism

    CERN Document Server

    Getzlaff, Mathias

    2007-01-01

    In the last decade, tremendous progress has been made in understanding the basis of magnetism, especially in reduced dimensions. The first part conveys the fundamentals of magnetism for atoms and bulk-like solid-state systems, providing a basis for understanding new phenomena that occur exclusively in low-dimensional systems, such as giant magnetoresistance. This wide field is discussed in the second part and illustrated by copious examples. This textbook is particularly suitable for graduate students in the physical and materials sciences. It includes numerous examples, exercises, and references.

  12. DOE fundamentals handbook: Chemistry

    International Nuclear Information System (INIS)

    1993-01-01

    The Chemistry Handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of chemistry. The handbook includes information on the atomic structure of matter; chemical bonding; chemical equations; chemical interactions involved with corrosion processes; water chemistry control, including the principles of water treatment; the hazards of chemicals and gases, and basic gaseous diffusion processes. This information will provide personnel with a foundation for understanding the chemical properties of materials and the way these properties can impose limitations on the operation of equipment and systems

  13. Electronic circuits fundamentals & applications

    CERN Document Server

    Tooley, Mike

    2015-01-01

    Electronics explained in one volume, using both theoretical and practical applications. New chapter on Raspberry Pi. Companion website contains free electronic tools to aid learning for students and a question bank for lecturers. Practical investigations and questions within each chapter help reinforce learning. Mike Tooley provides all the information required to get to grips with the fundamentals of electronics, detailing the underpinning knowledge necessary to appreciate the operation of a wide range of electronic circuits, including amplifiers, logic circuits, power supplies and oscillators. The

  14. El grupo fundamental

    Directory of Open Access Journals (Sweden)

    Carlos A. Robles Corbalá

    2015-12-01

    Full Text Available This article addresses a classical problem: how to detect whether two topological spaces are homeomorphic. To this end, each topological space is assigned an algebraic group, in such a way that if the spaces are homeomorphic, then the associated groups are isomorphic. A construction of the fundamental group of a topological space is presented, and the focus is on proving that it is indeed a group.
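
    For orientation only (this is the standard statement that the abstract summarizes, not material taken from the article itself): the fundamental group consists of homotopy classes of loops based at a point, with concatenation as the group operation, and a homeomorphism induces an isomorphism between fundamental groups:

        \pi_1(X, x_0) = \{\gamma : [0,1] \to X \mid \gamma(0) = \gamma(1) = x_0\} \,/\, \simeq \ (\text{homotopy rel endpoints}), \qquad [\gamma][\delta] = [\gamma * \delta],

        f : X \to Y \ \text{a homeomorphism} \ \Longrightarrow\ f_* : \pi_1(X, x_0) \to \pi_1\big(Y, f(x_0)\big), \quad f_*[\gamma] = [f \circ \gamma] \ \text{is an isomorphism}.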

  15. Measuring the quality of life in hypertension according to Item Response Theory.

    Science.gov (United States)

    Borges, José Wicto Pereira; Moreira, Thereza Maria Magalhães; Schmitt, Jeovani; Andrade, Dalton Francisco de; Barbetta, Pedro Alberto; Souza, Ana Célia Caetano de; Lima, Daniele Braz da Silva; Carvalho, Irialda Saboia

    2017-05-04

    To analyze the Miniquestionário de Qualidade de Vida em Hipertensão Arterial (MINICHAL - Mini-questionnaire of Quality of Life in Hypertension) using Item Response Theory. This is an analytical study conducted with 712 persons with hypertension treated in thirteen primary health care units of Fortaleza, State of Ceará, Brazil, in 2015. The steps of the Item Response Theory analysis were: evaluation of dimensionality, estimation of item parameters, and construction of a scale. The study of dimensionality was carried out on the polychoric correlation matrix together with confirmatory factor analysis. To estimate the item parameters, we used Samejima's Graded Response Model. The analyses were conducted using the free software R with the aid of the psych and mirt packages. The analysis allowed visualization of the item parameters and their individual contributions to the measurement of the latent trait, generating more information and allowing the construction of a scale with an interpretative model that shows the worsening of quality of life across five levels. Regarding the item parameters, the items related to the somatic state performed well, as they showed better power to discriminate individuals with worse quality of life. The items related to the mental state contributed the least psychometric information to the MINICHAL. We conclude that the instrument is suitable for identifying the worsening of quality of life in hypertension. The analysis of the MINICHAL using Item Response Theory has allowed us to identify new aspects of this instrument that have not been addressed in previous studies.
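
    For illustration only: the record states that item parameters were estimated with Samejima's Graded Response Model in R (psych and mirt packages). The short Python sketch below is not the study's code; it merely evaluates the model's category probabilities for one hypothetical item, with made-up discrimination and threshold values.

        import numpy as np

        def grm_category_probs(theta, a, b):
            """Category probabilities under Samejima's Graded Response Model.

            theta : latent trait value (here, a quality-of-life level)
            a     : item discrimination
            b     : ordered category thresholds b_1 < ... < b_{K-1}
            Returns an array of K probabilities, one per response category.
            """
            b = np.asarray(b, dtype=float)
            # Cumulative "boundary" curves P*(X >= k | theta) for k = 1..K-1
            p_star = 1.0 / (1.0 + np.exp(-a * (theta - b)))
            # Pad with P*(X >= 0) = 1 and P*(X >= K) = 0, then take differences
            bounds = np.concatenate(([1.0], p_star, [0.0]))
            return bounds[:-1] - bounds[1:]

        # Hypothetical somatic-state item with 4 response categories
        probs = grm_category_probs(theta=0.5, a=1.8, b=[-1.0, 0.2, 1.3])
        print(probs, probs.sum())  # the four probabilities sum to 1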

  16. Fundamental volatility and stock returns : does fundamental volatility explain stock returns?

    OpenAIRE

    Selboe, Guner K.; Virdee, Jaspal Singh

    2017-01-01

    In this thesis, we investigate whether fundamental uncertainty can explain the cross-section of stock returns. To measure fundamental uncertainty, we estimate rolling standard deviations and accounting betas of four different fundamentals: revenues, gross profit, earnings, and cash flows. The standard deviation and the beta of revenues significantly explain returns in the Fama-MacBeth procedure, but only appear significant among smaller stocks in the portfolio formation ...
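
    For illustration, a common way to compute such a rolling fundamental-volatility measure is a per-firm rolling standard deviation. The sketch below uses pandas with made-up quarterly revenues and an assumed 4-quarter window; the thesis's actual data layout, window length, and scaling are not specified here.

        import pandas as pd

        # Hypothetical quarterly fundamentals for one firm; column names are
        # assumptions for illustration, not the thesis's actual data layout.
        df = pd.DataFrame({
            "firm": ["A"] * 8,
            "quarter": pd.period_range("2015Q1", periods=8, freq="Q"),
            "revenue": [10.0, 11.2, 10.8, 12.1, 12.5, 13.0, 12.2, 13.8],
        })

        # Rolling standard deviation of revenues over a 4-quarter window,
        # one simple proxy for fundamental volatility.
        df["rev_volatility"] = (
            df.groupby("firm")["revenue"]
              .transform(lambda s: s.rolling(window=4, min_periods=4).std())
        )
        print(df)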

  17. Fundamentals of QCD

    International Nuclear Information System (INIS)

    Taylor, J.C.

    1983-01-01

    The author introduces quantum chromodynamics as an SU(3) Yang-Mills theory describing the interactions between the quarks. After a general introduction, the Feynman rules are discussed, followed by the Ward identity and renormalization. Finally, the beta function and asymptotic freedom are considered. (HSI)
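
    For context (a standard textbook result, not quoted from these lectures): at one loop the QCD beta function is negative for any realistic number of quark flavours, which is the statement of asymptotic freedom,

        \mu^2 \frac{d\alpha_s}{d\mu^2} = \beta(\alpha_s) = -\frac{\beta_0}{4\pi}\,\alpha_s^2 + O(\alpha_s^3), \qquad \beta_0 = 11 - \frac{2}{3}\,n_f > 0 \ \text{for}\ n_f \le 16,

    so the coupling \alpha_s decreases as the energy scale \mu grows.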

  18. Development of Measurement Facility for Sound Speed of Hydrogen with a Spherical Acoustic Resonator ~Theory of Measurement and Calibration for

    OpenAIRE

    山口, 朝彦; 桃木, 悟; ジャンバル, オダゲレル; 松崎, 勇人; 上滝, 祐介; 今道, 統也; 金丸, 邦康

    2011-01-01

    Thermophysical properties of hydrogen are needed to design equipment associated with hydrogen. The speed of sound is a thermophysical property in its own right and contains important information about other thermophysical properties, such as the ideal-gas specific heat. In this paper, the measurement facility we built for the speed of sound in hydrogen, based on a spherical acoustic resonator, and the theory of the acoustic measurement are explained. We have calibrated the radius of the spherical cavity in this ap...
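
    For illustration of the basic resonator relation only (the paper's actual calibration must account for corrections such as the thermal boundary layer, which are ignored here): in an ideal rigid spherical cavity of radius a, the radially symmetric modes resonate at f = z c / (2 π a), where z is a nonzero root of the derivative of the spherical Bessel function j0, so a single measured resonance frequency yields the sound speed. The Python sketch below uses hypothetical numbers.

        import numpy as np

        # First nonzero root of d/dz j0(z) = 0, i.e. of tan(z) = z;
        # standard eigenvalue for the lowest radial (0,1) mode.
        Z_01 = 4.493409

        def sound_speed(resonance_freq_hz, cavity_radius_m, eigenvalue=Z_01):
            """Ideal-resonator estimate of sound speed from one radial resonance.

            Neglects boundary-layer and shell corrections that a real
            calibration would include.
            """
            return 2.0 * np.pi * cavity_radius_m * resonance_freq_hz / eigenvalue

        # Hypothetical values: 50 mm radius cavity, lowest radial mode at 18.6 kHz
        print(sound_speed(18600.0, 0.050))  # ~1300 m/s, roughly hydrogen near room temperature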

  19. Graph theory

    CERN Document Server

    Gould, Ronald

    2012-01-01

    This introduction to graph theory focuses on well-established topics, covering primary techniques and including both algorithmic and theoretical problems. The algorithms are presented with a minimum of advanced data structures and programming details. This thoroughly corrected 1988 edition provides insights to computer scientists as well as advanced undergraduates and graduate students of topology, algebra, and matrix theory. Fundamental concepts and notation and elementary properties and operations are the first subjects, followed by examinations of paths and searching, trees, and networks. S

  20. Quantum Theory and Beyond

    Science.gov (United States)

    Bastin, Ted

    2009-07-01

    List of participants; Preface; Part I. Introduction: 1. The function of the colloquium - editorial; 2. The conceptual problem of quantum theory from the experimentalist's point of view O. R. Frisch; Part II. Niels Bohr and Complementarity: The Place of the Classical Language: 3. The Copenhagen interpretation C. F. von Weizsäcker; 4. On Bohr's views concerning the quantum theory D. Bohm; Part III. The Measurement Problem: 5. Quantal observation in statistical interpretation H. J. Groenewold; 6. Macroscopic physics, quantum mechanics and quantum theory of measurement G. M. Prosperi; 7. Comment on the Daneri-Loinger-Prosperi quantum theory of measurement Jeffrey Bub; 8. The phenomenology of observation and explanation in quantum theory J. H. M. Whiteman; 9. Measurement theory and complex systems M. A. Garstens; Part IV. New Directions within Quantum Theory: What does the Quantum Theoretical Formalism Really Tell Us?: 10. On the role of hidden variables in the fundamental structure of physics D. Bohm; 11. Beyond what? Discussion: space-time order within existing quantum theory C. W. Kilmister; 12. Definability and measurability in quantum theory Yakir Aharonov and Aage Petersen; 13. The bootstrap idea and the foundations of quantum theory Geoffrey F. Chew; Part V. A Fresh Start?: 14. Angular momentum: an approach to combinatorial space-time Roger Penrose; 15. A note on discreteness, phase space and cohomology theory B. J. Hiley; 16. Cohomology of observations R. H. Atkin; 17. The origin of half-integral spin in a discrete physical space Ted Bastin; Part VI. Philosophical Papers: 18. The unity of physics C. F. von Weizsäcker; 19. A philosophical obstacle to the rise of new theories in microphysics Mario Bunge; 20. The incompleteness of quantum mechanics or the emperor's missing clothes H. R. Post; 21. How does a particle get from A to B?; Ted Bastin; 22. Informational generalization of entropy in physics Jerome Rothstein; 23. Can life explain quantum mechanics? H. H