International Nuclear Information System (INIS)
Walker, T.P.
1993-01-01
The standard model of the hot big bang assumes a homogeneous and isotropic Universe, with gravity described by General Relativity and the strong and electroweak interactions described by the Standard Model of particle physics. The hot big bang model makes the unavoidable prediction that the production of the primordial elements occurred about one minute after the big bang (referred to as big bang nucleosynthesis, BBN, or primordial nucleosynthesis). This review concerns the range of the primordial abundance of 4He predicted by standard BBN (i.e., primordial nucleosynthesis assuming a homogeneous distribution of baryons). In it the author discusses: (1) uncertainties in the calculation of Y_p (the mass fraction of primordial 4He), (2) the expected range of Y_p, (3) how the predictions stack up against the latest observations, and (4) the latest BBN bounds on Ω_B h^2 and N_ν. 13 refs., 2 figs.
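The back-of-the-envelope estimate behind the Y_p prediction can be sketched as follows. The freeze-out temperature, nucleosynthesis delay, and neutron lifetime used here are illustrative round numbers, not the carefully propagated inputs whose uncertainties the review discusses.

```python
import math

# Textbook estimate of the primordial 4He mass fraction Y_p (a sketch,
# not the detailed reaction-network calculation reviewed above).
def helium_mass_fraction(T_freeze_MeV=0.7, t_nuc_s=180.0, tau_n_s=880.0):
    delta_m = 1.293  # neutron-proton mass difference in MeV
    n_over_p = math.exp(-delta_m / T_freeze_MeV)  # ratio at weak freeze-out
    n_over_p *= math.exp(-t_nuc_s / tau_n_s)      # free-neutron decay until BBN
    # Essentially all surviving neutrons are locked into 4He:
    return 2.0 * n_over_p / (1.0 + n_over_p)

print(f"Y_p ~ {helium_mass_fraction():.3f}")
```

This crude estimate already lands within a few percent of the observed value near 0.25, which is why Y_p is such a sharp test of the standard scenario.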
Ultra-cold WIMPs relics of non-standard pre-BBN cosmologies
Gelmini, Graciela B
2008-01-01
We point out that in scenarios in which the Universe evolves in a non-standard manner during and after the kinetic decoupling of weakly interacting massive particles (WIMPs), these relics can be much colder than in standard cosmological scenarios (i.e. can be ultra-cold), possibly leading to the formation of smaller first objects in hierarchical structure formation scenarios.
Some nuclear physics aspects of BBN
Coc, Alain
2017-09-01
Primordial or big bang nucleosynthesis (BBN) is now a parameter-free theory whose predictions are in good overall agreement with observations. However, the calculated 7Li abundance is significantly higher than the one deduced from spectroscopic observations. Nuclear physics solutions to this lithium problem have been investigated by experimental means. Other solutions that were considered involve exotic sources of extra neutrons, which inevitably lead to an increase of the deuterium abundance; this now seems excluded by recent deuterium observations.
BBN based Quantitative Assessment of Software Design Specification
International Nuclear Information System (INIS)
Eom, Heung-Seop; Park, Gee-Yong; Kang, Hyun-Gook; Kwon, Kee-Choon; Chang, Seung-Cheol
2007-01-01
Probabilistic Safety Assessment (PSA), one of the important methods for assessing the overall safety of a nuclear power plant (NPP), requires quantitative reliability information for safety-critical software, but conventional reliability assessment methods cannot provide enough information for the PSA of an NPP. Therefore, current PSA that includes safety-critical software usually either does not consider the reliability of the software or uses arbitrary values for it. To remedy this situation, this paper proposes a method that can produce quantitative reliability information on safety-critical software for PSA by making use of Bayesian Belief Networks (BBN). BBN has generally been used to model uncertain systems in many research fields, including the safety assessment of software. The proposed method was constructed by utilizing a BBN that can combine the qualitative and the quantitative evidence relevant to the reliability of safety-critical software. The constructed BBN model can infer a conclusion in a formal and quantitative way. A case study was carried out with the proposed method to assess the quality of the software design specification (SDS) of safety-critical software that will be embedded in a reactor protection system. The intermediate V&V results of the software design specification were used as inputs to the BBN model.
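As a rough illustration of how a BBN combines disparate evidence into a quantitative conclusion, here is a minimal hand-rolled two-evidence network. The node names, prior, and conditional probability values are invented for this sketch; they are not taken from the paper's model.

```python
# Minimal Bayesian belief network sketch (illustrative only; the paper's
# actual BBN structure and CPT values are not reproduced here).
# Node "SDS quality" (good/poor) with two conditionally independent
# evidence nodes, e.g. a V&V review result and a defect-density measure.

prior = {"good": 0.7, "poor": 0.3}                 # assumed prior on quality
# P(evidence observed positive | quality), one assumed CPT per evidence source:
cpt = {
    "review_passed": {"good": 0.9, "poor": 0.3},
    "low_defects":   {"good": 0.8, "poor": 0.2},
}

def posterior(observed):
    """P(quality | observed evidence) via Bayes' rule over the tiny network."""
    unnorm = {}
    for quality, p in prior.items():
        likelihood = 1.0
        for evidence, seen in observed.items():
            p_pos = cpt[evidence][quality]
            likelihood *= p_pos if seen else (1.0 - p_pos)
        unnorm[quality] = p * likelihood
    z = sum(unnorm.values())
    return {q: v / z for q, v in unnorm.items()}

print(posterior({"review_passed": True, "low_defects": True}))
```

Real BBN tools add directed-graph structure, many-valued nodes, and efficient inference, but the qualitative-plus-quantitative fusion is the same Bayes update shown here.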
Introduction to the theory of standard monomials
Seshadri, C S
2016-01-01
The book is a reproduction of a course of lectures delivered by the author in 1983-84, which appeared in the Brandeis Lecture Notes series. The aim of this course was to give an introduction to the series of papers by concentrating on the case of the full linear group. In recent years, there has been great progress in standard monomial theory due to the work of Peter Littelmann. The author's lectures (reproduced in this book) remain an excellent introduction to standard monomial theory. Standard monomial theory deals with the construction of nice bases of finite-dimensional irreducible representations of semi-simple algebraic groups or, in geometric terms, nice bases of coordinate rings of flag varieties (and their Schubert subvarieties) associated with these groups. Besides its intrinsic interest, standard monomial theory has applications to the study of the geometry of Schubert varieties.
Dynamical 3-Space Predicts Hotter Early Universe: Resolves CMB-BBN 7-Li and 4-He Abundance Anomalies
Directory of Open Access Journals (Sweden)
Cahill R. T.
2010-01-01
Full Text Available The observed abundances of 7Li and 4He are significantly inconsistent with the predictions from Big Bang Nucleosynthesis (BBN) when using the ΛCDM cosmological model together with the value Ω_B h^2 = 0.0224 ± 0.0009 from WMAP CMB fluctuations, with the value from BBN required to fit observed abundances being 0.009 < Ω_B h^2 < 0.013. The dynamical 3-space theory is shown to predict a 20% hotter universe in the radiation-dominated epoch, which then results in a remarkable parameter-free agreement between the BBN and the WMAP value for Ω_B h^2. The dynamical 3-space also gives a parameter-free fit to the supernova redshift data, and predicts that the flawed ΛCDM model would require Ω_Λ = 0.73 and Ω_M = 0.27 to fit the 3-space dynamics Hubble expansion, independently of the supernova data. These results amount to the discovery of new physics for the early universe that is matched by numerous other successful observational and experimental tests.
The cosmic 6Li and 7Li problems and BBN with long-lived charged massive particles
International Nuclear Information System (INIS)
Karsten, Jedamzik
2007-01-01
Charged massive particles (CHAMPs), when present during the Big Bang nucleosynthesis (BBN) era, may significantly alter the synthesis of light elements compared to a standard BBN scenario, due to the formation of bound states with nuclei. This paper presents a detailed numerical and analytical analysis of such CHAMP BBN. All reactions important for predicting light-element yields are calculated within the Born approximation. Three previously neglected effects are treated in detail: (a) photodestruction of bound states due to electromagnetic cascades induced by the CHAMP decay, (b) late-time efficient destruction/production of 2H, 6Li, and 7Li due to reactions on charge Z = 1 nuclei bound to CHAMPs, and (c) CHAMP exchange between nuclei. Each of these effects may induce orders-of-magnitude changes in the final abundance yields. The study focuses on the impact of CHAMPs on a possible simultaneous solution of the 6Li and 7Li problems. It is shown that a previously suggested simultaneous solution of the 6Li and 7Li problems for a relic decaying at τ_x ∼ 1000 s depends only very weakly on the relic being neutral or charged, unless its hadronic branching ratio B_h ≲ 10^-4 is very small. By use of a Monte Carlo analysis it is shown that within CHAMP BBN the existence of further parameter space for a simultaneous solution of the 6Li and 7Li problems for long decay times τ_x ≥ 10^6 s seems possible but fairly unlikely. (author)
Hyperfinite and standard unifications for physical theories
Directory of Open Access Journals (Sweden)
Robert A. Herrmann
2001-01-01
Full Text Available A set of physical theories is represented by a nonempty subset {S_N^j V | j ∈ ℕ} of the lattice of consequence operators defined on a language Λ. It is established that there exists a unifying injection defined on the nonempty set of significant representations for natural systems M ⊂ Λ. If W ∈ M, then W is a hyperfinite ultralogic and ⋃{S_N^j V(W) | j ∈ ℕ} = W(*W) ∩ Λ. A product hyperfinite ultralogic Π is defined on internal subsets of the product set *Λ^m and is shown to represent the application of Π to {W_1, …, W_m} ⊂ M. There also exists a standard unifying injection S_W such that W(*W) ⊂ *S_W(*W).
Constraining f(T) teleparallel gravity by big bang nucleosynthesis. f(T) cosmology and BBN
Energy Technology Data Exchange (ETDEWEB)
Capozziello, S. [Università di Napoli "Federico II", Complesso Universitario di Monte Sant'Angelo, Dipartimento di Fisica "E. Pancini", Napoli (Italy); Istituto Nazionale di Fisica Nucleare (INFN), Sezione di Napoli (Italy); Gran Sasso Science Institute, L'Aquila (Italy)]; Lambiase, G. [University of Salerno, Dipartimento di Fisica E.R. Caianiello, Fisciano (Italy); INFN, Gruppo Collegato di Salerno, Sezione di Napoli, Fisciano (Italy)]; Saridakis, E.N. [National Technical University of Athens, Department of Physics, Athens (Greece); Baylor University, CASPER, Physics Department, Waco, TX (United States)]
2017-09-15
We use Big Bang Nucleosynthesis (BBN) observational data on the primordial abundance of light elements to constrain f(T) gravity. The three most studied viable f(T) models, namely the power law, the exponential, and the square-root exponential, are considered, and the BBN bounds are adopted in order to extract constraints on their free parameters. For the power-law model, we find that the constraints are in agreement with those obtained using late-time cosmological data. For the exponential and the square-root exponential models, we show that for reliable regions of parameter space they always satisfy the BBN bounds. We conclude that viable f(T) models can successfully satisfy the BBN constraints. (orig.)
Electroweak theory and the Standard Model
CERN. Geneva; Giudice, Gian Francesco
2004-01-01
The theory of the ElectroWeak (EW) interactions splits naturally into four sectors, at rather different levels of development and experimental test. Accordingly, the five lectures are organized as follows, with an eye to the future: Lecture 1: The basic structure of the theory; Lecture 2: The gauge sector; Lecture 3: The flavor sector; Lecture 4: The neutrino sector; Lecture 5: The EW symmetry breaking sector.
The Standard Model is Natural as Magnetic Gauge Theory
DEFF Research Database (Denmark)
Sannino, Francesco
2011-01-01
We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam-like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem also leads to a new insight on the mystery of the observed number of fundamental fermion generations.
Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report
Energy Technology Data Exchange (ETDEWEB)
Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K. [Lloyd's Register Consulting AB, Sundbyberg (Sweden)]
2013-10-15
The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radionuclides). The output is a set of possible source terms with associated probabilities. One major issue addressed has been the integration of probabilistic and deterministic analyses, dealing with the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast-running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method by which experts' beliefs can be included in a systematic way when defining the conditional probability tables (CPTs) of the BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)
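The core idea of mapping BBN end states to pre-calculated source terms can be caricatured in a few lines. The end states, observation likelihoods, and release fractions below are invented for this sketch and bear no relation to the project's actual plant models.

```python
# Illustrative sketch of the RASTEP idea (all states, probabilities, and
# source terms here are invented, not taken from the project).
# Each BBN end state is an accident-progression class; each end state
# maps to a pre-calculated source term (here, a single release fraction).

states = {"intact": 0.90, "filtered_release": 0.08, "bypass": 0.02}
# P(observation seen | end state) for two plant signals (assumed values):
p_obs = {
    "high_containment_pressure": {"intact": 0.05, "filtered_release": 0.9, "bypass": 0.7},
    "radiation_in_aux_building": {"intact": 0.01, "filtered_release": 0.1, "bypass": 0.9},
}
source_terms = {  # pre-calculated release fractions (illustrative)
    "intact": 1e-6, "filtered_release": 1e-3, "bypass": 0.3,
}

def source_term_distribution(observations):
    """Posterior over end states given observations, paired with source terms."""
    unnorm = dict(states)
    for obs in observations:
        for s in unnorm:
            unnorm[s] *= p_obs[obs][s]
    z = sum(unnorm.values())
    return {s: (source_terms[s], p / z) for s, p in unnorm.items()}

for state, (release, prob) in source_term_distribution(
        ["high_containment_pressure", "radiation_in_aux_building"]).items():
    print(f"{state}: P={prob:.3f}, release fraction={release:g}")
```

The output is exactly the shape the report describes: a set of possible source terms with associated probabilities, updated as plant observations arrive.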
Supersymmetry and String Theory: Beyond the Standard Model
International Nuclear Information System (INIS)
Rocek, Martin
2007-01-01
When I was asked to review Michael Dine's new book, 'Supersymmetry and String Theory', I was pleased to have a chance to read a book by such an established authority on how string theory might become testable. The book is most useful as a list of current topics of interest in modern theoretical physics. It gives a succinct summary of a huge variety of subjects, including the standard model, symmetry, Yang-Mills theory, quantization of gauge theories, the phenomenology of the standard model, the renormalization group, lattice gauge theory, effective field theories, anomalies, instantons, solitons, monopoles, dualities, technicolor, supersymmetry, the minimal supersymmetric standard model, dynamical supersymmetry breaking, extended supersymmetry, Seiberg-Witten theory, general relativity, cosmology, inflation, bosonic string theory, the superstring, the heterotic string, string compactifications, the quintic, string dualities, large extra dimensions, and, in the appendices, Goldstone's theorem, path integrals, and exact beta-functions in supersymmetric gauge theories. Its breadth is both its strength and its weakness: it is not (and could not possibly be) either a definitive reference for experts, where the details of thorny technical issues are carefully explored, or a textbook for graduate students, with detailed pedagogical expositions. As such, it complements rather than replaces the much narrower and more focused String Theory I and II volumes by Polchinski, with their deep insights, as well as the two older volumes by Green, Schwarz, and Witten, which develop string theory pedagogically. (book review)
Bootstrap Estimates of Standard Errors in Generalizability Theory
Tong, Ye; Brennan, Robert L.
2007-01-01
Estimating standard errors of estimated variance components has long been a challenging task in generalizability theory. Researchers have speculated about the potential applicability of the bootstrap for obtaining such estimates, but they have identified problems (especially bias) in using the bootstrap. Using Brennan's bias-correcting procedures…
Introduction to Educational Administration: Standards, Theories, and Practice. Second Edition
Fiore, Douglas J.
2009-01-01
Organized around the ISLLC standards, this text introduces students to the concepts and theories of educational leadership. The new edition adds coverage of such topics as data usage, ethics, innovative hiring practices, and student discipline. Appearing in the second edition are chapter-ending sections called "Point-Counterpoint" which prompt…
Consistent constraints on the Standard Model Effective Field Theory
Energy Technology Data Exchange (ETDEWEB)
Berthier, Laure; Trott, Michael [Niels Bohr International Academy, University of Copenhagen,Blegdamsvej 17, DK-2100 Copenhagen (Denmark)
2016-02-10
We develop the global constraint picture in the (linear) effective field theory generalisation of the Standard Model, incorporating data from detectors that operated at PEP, PETRA, TRISTAN, SpS, Tevatron, SLAC, LEP I and LEP II, as well as low-energy precision data. We fit one hundred and three observables. We develop a theory error metric for this effective field theory, which is required when constraints on parameters at leading order in the power counting are to be pushed to the percent level or beyond, unless the cut-off scale is assumed to be large, Λ ≳ 3 TeV. We incorporate theoretical errors more consistently in this work, avoiding this assumption, and as a direct consequence bounds on some leading parameters are relaxed. We show how an S,T analysis is modified by the theory errors we include, as an illustrative example.
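The effect of a theory error metric on a fit can be illustrated with a toy two-observable chi-square. The numbers and the linear predictions are invented for this sketch; they are not the paper's 103-observable fit.

```python
import numpy as np

# Sketch of the idea behind a "theory error metric": add a theoretical
# uncertainty in quadrature to the experimental covariance before
# forming the chi-square, which loosens the resulting bounds.

def chi_square(c, obs, pred_fn, sigma_exp, sigma_th):
    """chi^2 for Wilson coefficients c with combined exp+theory errors."""
    residual = obs - pred_fn(c)
    cov = np.diag(sigma_exp**2 + sigma_th**2)  # uncorrelated, for simplicity
    return residual @ np.linalg.solve(cov, residual)

# Toy linear predictions: each observable shifts linearly in one coefficient.
obs = np.array([0.02, -0.01])
pred = lambda c: np.array([1.0 * c[0], 0.5 * c[1]])
s_exp = np.array([0.01, 0.01])
s_th = np.array([0.02, 0.0])   # theory error assigned to the first observable

chi2_no_th = chi_square(np.zeros(2), obs, pred, s_exp, np.zeros(2))
chi2_with_th = chi_square(np.zeros(2), obs, pred, s_exp, s_th)
print(chi2_no_th, chi2_with_th)
```

With the theory error included, the SM point (c = 0) sits at a lower chi-square, so the allowed region for the affected coefficient widens; this is the mechanism by which the paper's bounds are relaxed.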
Theory construction based on standards of care: a proposed theory of the peaceful end of life.
Ruland, C M; Moore, S M
1998-01-01
The contribution of developing a theory from this standard of care is that it can express a new unifying idea about the phenomenon of peaceful end of life for terminally ill patients. It allows for generating and testing hypotheses that can provide new insights into the nature of this phenomenon and can contribute to increased knowledge about nursing interventions that help patients toward a peaceful end of life. The process of theory development from standards of care as described in this article also can be applied to other phenomena. Clinical practice abounds with opportunities for theory development, yet nurses often do not use theories to guide their practice. Until now, little guidance has been provided to tap the richness of clinical knowledge for the development of middle-range theories. While the method described in this article may still be further refined, it offers a promising approach for the development of theories that are applicable to practice and move beyond the scope of grand theories. Thus deriving theories from standards of care can offer an important contribution to the development of the discipline's scientific knowledge base and enhanced practice.
Lattice Gauge Theories Within and Beyond the Standard Model
Energy Technology Data Exchange (ETDEWEB)
Gelzer, Zechariah John [Iowa U.
2017-01-01
The Standard Model of particle physics has been very successful in describing fundamental interactions up to the highest energies currently probed in particle accelerator experiments. However, the Standard Model is incomplete and currently exhibits tension with experimental data for interactions involving $B$~mesons. Consequently, $B$-meson physics is of great interest to both experimentalists and theorists. Experimentalists worldwide are studying the decay and mixing processes of $B$~mesons in particle accelerators. Theorists are working to understand the data by employing lattice gauge theories within and beyond the Standard Model. This work addresses the theoretical effort and is divided into two main parts. In the first part, I present a lattice-QCD calculation of form factors for exclusive semileptonic decays of $B$~mesons that are mediated by both charged currents ($B \\to \\pi \\ell \
Standard Model in multiscale theories and observational constraints
Calcagni, Gianluca; Nardelli, Giuseppe; Rodríguez-Fernández, David
2016-08-01
We construct and analyze the Standard Model of electroweak and strong interactions in multiscale spacetimes with (i) weighted derivatives and (ii) q -derivatives. Both theories can be formulated in two different frames, called fractional and integer picture. By definition, the fractional picture is where physical predictions should be made. (i) In the theory with weighted derivatives, it is shown that gauge invariance and the requirement of having constant masses in all reference frames make the Standard Model in the integer picture indistinguishable from the ordinary one. Experiments involving only weak and strong forces are insensitive to a change of spacetime dimensionality also in the fractional picture, and only the electromagnetic and gravitational sectors can break the degeneracy. For the simplest multiscale measures with only one characteristic time, length and energy scale t*, ℓ* and E*, we compute the Lamb shift in the hydrogen atom and constrain the multiscale correction to the ordinary result, getting the absolute upper bound t*28 TeV . Stronger bounds are obtained from the measurement of the fine-structure constant. (ii) In the theory with q -derivatives, considering the muon decay rate and the Lamb shift in light atoms, we obtain the independent absolute upper bounds t*35 MeV . For α0=1 /2 , the Lamb shift alone yields t*450 GeV .
DsixTools: the standard model effective field theory toolkit
Energy Technology Data Exchange (ETDEWEB)
Celis, Alejandro [Ludwig-Maximilians-Universitaet Muenchen, Fakultaet fuer Physik, Arnold Sommerfeld Center for Theoretical Physics, Munich (Germany); Fuentes-Martin, Javier; Vicente, Avelino [Universitat de Valencia-CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Virto, Javier [University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland)
2017-06-15
We present DsixTools, a Mathematica package for the handling of the dimension-six standard model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools also contains modules devoted to the matching to the ΔB = ΔS = 1, 2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and their QCD and QED Renormalization group evolution below the electroweak scale. (orig.)
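The kind of one-loop renormalization group evolution that SMEFTrunner automates can be caricatured with a toy two-operator system. The 2x2 anomalous-dimension matrix below is invented for illustration; it is not the Warsaw-basis matrix implemented in DsixTools, and the simple Euler integration stands in for the package's machinery.

```python
import numpy as np

# Toy sketch of one-loop Wilson-coefficient running:
#   dC/dln(mu) = (1/16 pi^2) * gamma^T C
# integrated with Euler steps from a high scale down to the weak scale.

def run_wilson(C_high, gamma, mu_high, mu_low, steps=2000):
    """Evolve Wilson coefficients from mu_high down to mu_low."""
    dt = np.log(mu_low / mu_high) / steps      # negative: running down
    C = np.array(C_high, dtype=float)
    for _ in range(steps):
        C = C + (gamma.T @ C) * dt / (16 * np.pi**2)
    return C

gamma = np.array([[4.0, 0.5],
                  [0.5, -2.0]])                # assumed anomalous dimensions
C_at_cutoff = [1.0, 0.0]                       # only C1 generated at 1 TeV
C_at_mZ = run_wilson(C_at_cutoff, gamma, 1000.0, 91.2)
print(C_at_mZ)  # operator mixing generates a small C2 at the weak scale
```

Even in this toy, the key physical feature survives: running down from the cutoff both rescales the coefficient that was generated and, through operator mixing, switches on one that was zero at matching.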
Updated BBN bounds on the cosmological lepton asymmetry for non-zero θ13
International Nuclear Information System (INIS)
Mangano, Gianpiero; Miele, Gennaro; Pastor, Sergio; Pisanti, Ofelia; Sarikas, Srdjan
2012-01-01
We discuss the bounds on the cosmological lepton number from Big Bang Nucleosynthesis (BBN), in light of recent evidence for a large value of the neutrino mixing angle θ_13, sin^2 θ_13 ≥ 0.01 at 2σ. The largest asymmetries for electron and μ, τ neutrinos compatible with the 4He and 2H primordial yields are computed versus the neutrino mass hierarchy and mixing angles. The flavour oscillation dynamics is traced until the beginning of BBN, and the neutrino distributions after decoupling are computed numerically. The latter contain, in general, non-thermal distortions due to the onset of flavour oscillations driven by the solar squared mass difference in the temperature range where neutrino scatterings become inefficient at enforcing thermodynamical equilibrium. Depending on the value of θ_13, this translates into a larger value for the effective number of neutrinos, N_eff. Upper bounds on this parameter are discussed for both neutrino mass hierarchies. Values of N_eff large enough to be detectable by the Planck experiment are found only for the (presently disfavoured) range sin^2 θ_13 ≤ 0.01.
How to use the Standard Model effective field theory
Energy Technology Data Exchange (ETDEWEB)
Henning, Brian; Lu, Xiaochuan [Department of Physics, University of California, Berkeley,Berkeley, California 94720 (United States); Theoretical Physics Group, Lawrence Berkeley National Laboratory,Berkeley, California 94720 (United States); Murayama, Hitoshi [Department of Physics, University of California, Berkeley,Berkeley, California 94720 (United States); Theoretical Physics Group, Lawrence Berkeley National Laboratory,Berkeley, California 94720 (United States); Kavli Institute for the Physics and Mathematics of the Universe (WPI),Todai Institutes for Advanced Study, University of Tokyo,Kashiwa 277-8583 (Japan)
2016-01-05
We present a practical three-step procedure of using the Standard Model effective field theory (SM EFT) to connect ultraviolet (UV) models of new physics with weak scale precision observables. With this procedure, one can interpret precision measurements as constraints on a given UV model. We give a detailed explanation for calculating the effective action up to one-loop order in a manifestly gauge covariant fashion. This covariant derivative expansion method dramatically simplifies the process of matching a UV model with the SM EFT, and also makes available a universal formalism that is easy to use for a variety of UV models. A few general aspects of RG running effects and choosing operator bases are discussed. Finally, we provide mapping results between the bosonic sector of the SM EFT and a complete set of precision electroweak and Higgs observables to which present and near future experiments are sensitive. Many results and tools which should prove useful to those wishing to use the SM EFT are detailed in several appendices.
Theories of Practice: Raising the Standard of Early Childhood Education
Mooney, Carol Garhart
2015-01-01
As an educator, you care deeply about working with young children and strive for quality in your program. This book explains why learning about foundational theory supports the ways you care for and teach children. With stories, anecdotes, and a discussion about the strong connection between theory and best practices, this guide will help you…
Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1
Energy Technology Data Exchange (ETDEWEB)
Knochenhauer, M.; Swaling, V.H.; Alfheim, P. [Scandpower AB, Sundbyberg (Sweden)
2012-09-15
The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBNs) to model severe accident progression in a nuclear power plant, in combination with pre-calculated source terms (i.e., the amount, timing, and pathway of released radionuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed, including issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP be connected to a fast-running source term prediction code, e.g., MARS, with the possibility of updating source terms based on real-time observations. (Author)
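The single inference step at the heart of such a BBN can be illustrated with a minimal, hand-rolled sketch in Python. The plant states, the sensor model, and every probability below are invented for illustration; they are not taken from the actual RASTEP networks:

```python
# Toy Bayesian update over hypothetical plant damage states, given one
# piece of evidence (a containment-pressure sensor reading "high").
# All states and numbers are invented for this sketch.

prior = {"intact_containment": 0.7, "early_release": 0.1, "late_release": 0.2}

# P(pressure reads "high" | state), one entry per state.
likelihood_high = {
    "intact_containment": 0.05,
    "early_release": 0.90,
    "late_release": 0.60,
}

def posterior(prior, likelihood):
    """Bayes' rule: P(state | evidence) is proportional to
    P(evidence | state) * P(state), normalized over all states."""
    unnorm = {s: prior[s] * likelihood[s] for s in prior}
    z = sum(unnorm.values())
    return {s: p / z for s, p in unnorm.items()}

post = posterior(prior, likelihood_high)
for state, p in sorted(post.items(), key=lambda kv: -kv[1]):
    print(f"{state}: {p:.3f}")
```

With these invented numbers the evidence moves most of the probability mass onto the two release states; a tool like RASTEP chains many such conditional updates through a network of interdependent nodes and attaches a pre-calculated source term to each end state.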
BBN-Based Portfolio Risk Assessment for NASA Technology R&D Outcome
Geuther, Steven C.; Shih, Ann T.
2016-01-01
The NASA Aeronautics Research Mission Directorate (ARMD) vision falls into six strategic thrusts that are aimed to support the challenges of the Next Generation Air Transportation System (NextGen). In order to achieve the goals of the ARMD vision, the Airspace Operations and Safety Program (AOSP) is committed to developing and delivering new technologies. To meet the dual challenges of constrained resources and timely technology delivery, program portfolio risk assessment is critical for communication and decision-making. This paper describes how Bayesian Belief Network (BBN) is applied to assess the probability of a technology meeting the expected outcome. The network takes into account the different risk factors of technology development and implementation phases. The use of BBNs allows for all technologies of projects in a program portfolio to be separately examined and compared. In addition, the technology interaction effects are modeled through the application of object-oriented BBNs. The paper discusses the development of simplified project risk BBNs and presents various risk results. The results presented include the probability of project risks not meeting success criteria, the risk drivers under uncertainty via sensitivity analysis, and what-if analysis. Finally, the paper shows how program portfolio risk can be assessed using risk results from BBNs of projects in the portfolio.
Supersymmetry and string theory beyond the standard model
Dine, Michael
2015-01-01
The past decade has witnessed dramatic developments in the fields of experimental and theoretical particle physics and cosmology. This fully updated second edition is a comprehensive introduction to these recent developments and brings this self-contained textbook right up to date. Brand new material for this edition includes the groundbreaking Higgs discovery and the results of the WMAP and Planck experiments. Extensive discussion of theories of dynamical electroweak symmetry breaking, a new chapter on the landscape, and a completely rewritten coda on future directions give readers a modern perspective on this developing field. A focus on three principal areas: supersymmetry, string theory, and astrophysics and cosmology provides the structure for this book, which will be of great interest to graduates and researchers in the fields of particle theory, string theory, astrophysics and cosmology. The book contains several problems, and password-protected solutions will be available to lecturers at www.cambrid...
Detailed examination of 'standard elementary particle theories' based on measurement with Tristan
International Nuclear Information System (INIS)
Kamae, Tsuneyoshi
1989-01-01
The report discusses possible approaches to detailed analysis of 'standard elementary particle theories' on the basis of measurements made with Tristan. The first section of the report addresses the major elementary particles involved in the 'standard theories'. The nature of the gauge particles, leptons, quarks and Higgs particle is briefly outlined. The Higgs particle and top quark have not been discovered, though the Higgs particle is essential in the Weinberg-Salam theory. Another important issue in this field is the origin of CP-symmetry violation. The second section deals with problems which arise in universalizing the concept of the 'standard theories'. Solving these problems requires the discovery of supersymmetric particles, the discovery of conflicts in the 'standard theories', and accurate determination of the fundamental constants used in the 'standard theories' by various different methods. The third and fourth sections address the Weinberg-Salam theory and quantum chromodynamics (QCD). There are four essential parameters for the 'standard theories', three of which are associated with the W-S theory. The mass of the W and Z bosons measured in proton-antiproton collision experiments is compared with that determined by applying the W-S theory to electron-positron experiments. For QCD, it is essential to determine the lambda constant. (N.K.)
How Rhetorical Theories of Genre Address Common Core Writing Standards
Collin, Ross
2013-01-01
This article begins with a review of the forms of writing promoted in the Common Core State Standards. Across content areas, Common Core encourages teachers to attune students' writing to rhetorical concerns of audience, purpose, task, and disciplinary thinking. To address these concerns, teachers might take a rhetorical approach to the study…
Symmetry Breaking, Unification, and Theories Beyond the Standard Model
Energy Technology Data Exchange (ETDEWEB)
Nomura, Yasunori
2009-07-31
A model was constructed in which the supersymmetric fine-tuning problem is solved without extending the Higgs sector at the weak scale. We have demonstrated that the model can avoid all the phenomenological constraints, while avoiding excessive fine-tuning. We have also studied the implications of the model for dark matter physics and collider physics. I have proposed an extremely simple construction for models of gauge mediation. We found that the {mu} problem can be simply and elegantly solved in a class of models where the Higgs fields couple directly to the supersymmetry breaking sector. We proposed a new way of addressing the flavor problem of supersymmetric theories. We have proposed a new framework for constructing theories of grand unification. We constructed a simple and elegant model of dark matter which explains the excess flux of electrons/positrons. We constructed a model of dark energy in which evolving quintessence-type dark energy is naturally obtained. We studied whether we can find evidence of the multiverse.
Z2 monopoles in the standard SU(2) lattice gauge theory model
International Nuclear Information System (INIS)
Mack, G.; Petkova, V.B.
1979-04-01
The standard SU(2) lattice gauge theory model without fermions may be considered as a Z 2 model with monopoles and fluctuating coupling constants. At low temperatures β -1 (= small bare coupling constant) the monopoles are confined. (orig.) [de
68Ga-BBN-RGD PET/CT for GRPR and Integrin αvβ3 Imaging in Patients with Breast Cancer.
Zhang, Jingjing; Mao, Feng; Niu, Gang; Peng, Li; Lang, Lixin; Li, Fang; Ying, Hongyan; Wu, Huanwen; Pan, Boju; Zhu, Zhaohui; Chen, Xiaoyuan
2018-01-01
Purpose: This study aimed to assess a gastrin-releasing peptide receptor (GRPR) and integrin αvβ3 dual targeting tracer 68 Ga-BBN-RGD for positron emission tomography (PET)/computed tomography (CT) imaging of breast cancer and metastasis. Materials and Methods: Twenty-two female patients were recruited who either had suspected breast cancer on screening mammography (n = 16) or had undergone radical mastectomy for breast cancer (n = 6). All 22 patients underwent PET/CT at 30-45 min after intravenous injection of 68 Ga-BBN-RGD. Eleven of the 22 patients also underwent 68 Ga-BBN PET/CT within 2 weeks for comparison. A final diagnosis was made based on the histopathologic examination of surgical excision or biopsy. Results: Both the primary cancer and metastases showed positive 68 Ga-BBN-RGD accumulation. The T/B ratios of 68 Ga-BBN-RGD accumulation were 2.10 to 9.44 in primary cancer, 1.10 to 3.71 in axillary lymph node metastasis, 3.80 to 10.7 in distant lymph nodes, 2.70 to 5.35 in lung metastasis and 3.17 to 22.8 in bone metastasis, respectively. For primary lesions, the SUVmax from 68 Ga-BBN-RGD PET in the ER positive group was higher than that in the ER negative group. 68 Ga-BBN-RGD PET/CT may be of great value in discerning primary breast cancers, axillary lymph node metastases, and distant metastases.
New extended standard model, dark matters and relativity theory
Hwang, Jae-Kwang
2016-03-01
A three-dimensional quantized space model is newly introduced as the extended standard model. Four three-dimensional quantized spaces, with 12 dimensions in total, are used to explain the universes including ours. Electric (EC), lepton (LC) and color (CC) charges are defined to be the charges of the x1x2x3, x4x5x6 and x7x8x9 warped spaces, respectively. Then, the lepton is the xi(EC) - xj(LC) correlated state, which makes 3x3 = 9 leptons, and the quark is the xi(EC) - xj(LC) - xk(CC) correlated state, which makes 3x3x3 = 27 quarks. Three new bastons with the xi(EC) state are also proposed as the dark matter seen in the x1x2x3 space. The matter universe question, the three generations of the leptons and quarks, dark matter and dark energy, hadronization, the big bang, quantum entanglement, quantum mechanics and general relativity are briefly discussed in terms of this new model. The details can be found in the article titled "Journey into the universe; three-dimensional quantized spaces, elementary particles and quantum mechanics" at https://www.researchgate.net/profile/J_Hwang2.
Higgs particles in the standard model and supersymmetric theories
International Nuclear Information System (INIS)
Muehlleitner, M.M.
2000-08-01
This thesis presents a theoretical analysis of the properties of the Higgs bosons in the standard model (SM) and the minimal supersymmetric extension (MSSM), which can be investigated at the LHC and e + e - linear colliders. The final goal is the reconstruction of the Higgs potential and thus the verification of the Higgs mechanism. MSSM Higgs boson production processes at future γγ colliders are calculated in several decay channels. Heavy scalar and pseudoscalar Higgs bosons can be discovered in the bb final state in the investigated mass range 200 to 800 GeV for moderate and large values of tanβ. The τ + τ - channel provides a heavy Higgs boson discovery potential for large values of tanβ. Several mechanisms that can be exploited at e + e - linear colliders for the measurement of the lifetime of a SM Higgs boson in the intermediate mass range are analysed. In the WW mode, the lifetime of Higgs scalars with masses below ∼160 GeV can be determined with an error of less than 10%. The reconstruction of the Higgs potential requires the measurement of the Higgs self-couplings. The SM and MSSM trilinear Higgs self-couplings are accessible in double and triple Higgs production. A theoretical analysis is presented in the relevant channels at the LHC and e + e - linear colliders. For high luminosities, the SM trilinear Higgs self-coupling can be measured with an accuracy of 20% at a 500 GeV e + e - linear collider. The MSSM coupling among three light Higgs bosons has to be extracted from continuum production. The other trilinear Higgs couplings are measurable in a restricted range of the MSSM parameter space. At the LHC, the Hhh coupling can be probed in resonant decays. (orig.)
Academic Training: An Introduction to the Standard Theory of Electroweak Interactions
PH Department
2011-01-01
27, 28 and 29 April 2011 An introduction to the standard theory of electroweak interactions by Giovanni Ridolfi (INFN, Genova) 27, 28 and 29 April from 11:00 to 12:00, 28 April from 14:30 to 15:30 at CERN ( 222-R-001 - Filtration Plant ) The construction and experimental foundations of the unified theory of weak and electromagnetic interactions will be reviewed. Special attention will be given to the Standard Model symmetry properties and how symmetries must be broken in order to obtain a realistic theory for the observed pattern of masses and mixing among generations and to accommodate longitudinal degrees of freedom for the vector bosons. A careful discussion of the Higgs sector, both in the perturbative and in the strongly interacting regime, will be presented. Finally, the motivations towards extensions of the standard model will be discussed.
An Introduction to the Standard Theory of Electroweak Interactions (1/4)
CERN. Geneva
2011-01-01
The construction and experimental foundations of the unified theory of weak and electromagnetic interactions will be reviewed. Special attention will be given to the Standard Model symmetry properties and how symmetries must be broken in order to obtain a realistic theory for the observed pattern of masses and mixing among generations and to accommodate longitudinal degrees of freedom for the vector bosons. A careful discussion of the Higgs sector, both in the perturbative and in the strongly interacting regime, will be presented. Finally, the motivations towards extensions of the standard model will be discussed.
A CVAR scenario for a standard monetary model using theory-consistent expectations
DEFF Research Database (Denmark)
Juselius, Katarina
2017-01-01
A theory-consistent CVAR scenario describes a set of testable regularities capturing basic assumptions of the theoretical model. Using this concept, the paper considers a standard model for exchange rate determination and shows that all assumptions about the model's shock structure and steady...
Chaos and Complexities Theories. Superposition and Standardized Testing: Are We Coming or Going?
Erwin, Susan
2005-01-01
The purpose of this paper is to explore the possibility of using the principle of "superposition of states" (commonly illustrated by Schrödinger's Cat thought experiment) to understand the process of using standardized testing to measure a student's learning. Comparisons from literature, neuroscience, and Schema Theory will be used to expound upon the…
Higgs Decay to Two Photons at One Loop in the Standard Model Effective Field Theory.
Hartmann, Christine; Trott, Michael
2015-11-06
We present the calculation of the CP conserving contributions to Γ(h→γγ), from dimension six operators at one-loop order, in the linear standard model effective field theory. We discuss the impact of these corrections on interpreting current and future experimental bounds on this decay.
Using Modern Test Theory to Maintain Standards in Public Qualifications in England
Wheadon, Christopher
2013-01-01
This paper describes how item response theory (IRT) methods of test-equating could be applied to the maintenance of public examination standards in England. IRT methods of test-equating have been sparingly applied to the main public examinations in England, namely the General Certificate of Secondary Education (GCSE), the equivalent of a school…
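A minimal sketch of the IRT idea referred to above: under the Rasch model (the simplest IRT model), test-equating with common anchor items can be reduced to estimating a constant shift between two calibrations. The item difficulties below are invented for illustration:

```python
import math

def rasch_p(theta, b):
    """Rasch model: probability that a person of ability theta answers an
    item of difficulty b correctly (both on the same logit scale)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Hypothetical difficulties of three anchor items, estimated separately
# on two examination forms (values invented for this sketch).
form_a = [-0.5, 0.0, 0.8]   # calibration on form A's scale
form_b = [-0.2, 0.3, 1.1]   # same items, calibration on form B's scale

# Mean-shift equating: the average difference between the two calibrations
# of the common items places form B's parameters on form A's scale.
shift = sum(a - b for a, b in zip(form_a, form_b)) / len(form_a)
form_b_equated = [b + shift for b in form_b]

print(shift)            # the scale constant separating the two forms
print(form_b_equated)   # form B difficulties on form A's scale
```

Once both forms sit on a common scale, a pass mark set on one form can be carried over to the other, which is the sense in which IRT equating maintains standards across examination sessions.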
Complexity in quantum field theory and physics beyond the standard model
International Nuclear Information System (INIS)
Goldfain, Ervin
2006-01-01
Complex quantum field theory (abbreviated c-QFT) is introduced in this paper as an alternative framework for the description of physics beyond the energy range of the standard model. The mathematics of c-QFT is based on fractal differential operators that generalize the momentum operators of conventional quantum field theory (QFT). The underlying premise of our approach is that c-QFT contains the right analytical tools for dealing with the asymptotic regime of QFT. Canonical quantization of c-QFT leads to the following findings: (i) the Fock space of c-QFT includes fractional numbers of particles and antiparticles per state, (ii) c-QFT represents a generalization of topological field theory, and (iii) the classical limit of c-QFT is equivalent to field theory in curved space-time. The first finding provides a field-theoretic motivation for the transfinite discretization approach of El-Naschie's ε (∞) theory. The second and third findings suggest the dynamic unification of boson and fermion fields as particles with fractional spin, as well as the close connection between spin and space-time topology beyond the conventional physics of the standard model.
Renormalization Group Equations of d=6 Operators in the Standard Model Effective Field Theory
CERN. Geneva
2015-01-01
The one-loop renormalization group equations for the Standard Model (SM) Effective Field Theory (EFT) including dimension-six operators are calculated. The complete 2499 × 2499 one-loop anomalous dimension matrix of the d=6 Lagrangian is obtained, as well as the contribution of d=6 operators to the running of the parameters of the renormalizable SM Lagrangian. The presence of higher-dimension operators has implications for the flavor problem of the SM. An approximate holomorphy of the one-loop anomalous dimension matrix is found, even though the SM EFT is not a supersymmetric theory.
Lepton number violation in theories with a large number of standard model copies
International Nuclear Information System (INIS)
Kovalenko, Sergey; Schmidt, Ivan; Paes, Heinrich
2011-01-01
We examine lepton number violation (LNV) in theories with a saturated black hole bound on a large number of species. Such theories have been advocated recently as a possible solution to the hierarchy problem and an explanation of the smallness of neutrino masses. On the other hand, lepton number violation is a potential phenomenological problem of this N-copy extension of the standard model, since, owing to the low quantum gravity scale, black holes may induce TeV scale LNV operators generating unacceptably large rates of LNV processes. We show, however, that this issue can be avoided by introducing a spontaneously broken U 1(B-L) . Then, due to the existence of a specific compensation mechanism between the contributions of different Majorana neutrino states, LNV processes in the standard model copy become extremely suppressed, with rates far beyond experimental reach.
Energy Technology Data Exchange (ETDEWEB)
Degrande, Celine [CERN, Theory Division, Geneva 23 (Switzerland); Fuks, Benjamin [Sorbonne Universites, UPMC Univ. Paris 06, Paris (France); CNRS, Paris (France); Mawatari, Kentarou [Universite Grenoble-Alpes, Laboratoire de Physique Subatomique et de Cosmologie, Grenoble (France); Vrije Universiteit Brussel, Theoretische Natuurkunde and IIHE/ELEM, International Solvay Institutes, Brussels (Belgium); Mimasu, Ken [University of Sussex, Department of Physics and Astronomy, Brighton (United Kingdom); Universite catholique de Louvain, Centre for Cosmology, Particle Physics and Phenomenology (CP3), Louvain-la-Neuve (Belgium); Sanz, Veronica [University of Sussex, Department of Physics and Astronomy, Brighton (United Kingdom)
2017-04-15
We study the impact of dimension-six operators of the standard model effective field theory relevant for vector-boson fusion and associated Higgs boson production at the LHC. We present predictions at the next-to-leading order accuracy in QCD that include matching to parton showers and that rely on fully automated simulations. We show the importance of the subsequent reduction of the theoretical uncertainties in improving the possible discrimination between effective field theory and standard model results, and we demonstrate that the range of the Wilson coefficient values allowed by a global fit to LEP and LHC Run I data can be further constrained by LHC Run II future results. (orig.)
Degrande, Celine; Mawatari, Kentarou; Mimasu, Ken; Sanz, Veronica
2017-04-25
We study the impact of dimension-six operators of the standard model effective field theory relevant for vector-boson fusion and associated Higgs boson production at the LHC. We present predictions at the next-to-leading order accuracy in QCD that include matching to parton showers and that rely on fully automated simulations. We show the importance of the subsequent reduction of the theoretical uncertainties in improving the possible discrimination between effective field theory and standard model results, and we demonstrate that the range of the Wilson coefficient values allowed by a global fit to LEP and LHC Run I data can be further constrained by LHC Run II future results.
International Nuclear Information System (INIS)
Castro, Carlos
2006-01-01
We construct the Clifford-space tensorial-gauge field generalizations of Yang-Mills theories and the Standard Model that allow us to predict the existence of new particles (bosons, fermions) and tensor-gauge fields of higher spins in the 10 TeV regime. We proceed with a detailed discussion of the unique D 4 - D 5 - E 6 - E 7 - E 8 model of Smith, based on the underlying Clifford algebraic structures in D = 8, which furnishes all the properties of the Standard Model and Gravity in four dimensions at low energies. A generalization and extension of Smith's model to the full Clifford space is presented when we write explicitly all the terms of the extended Clifford-space Lagrangian. We conclude by explaining the relevance of multiple foldings of D = 8 dimensions related to the modulo 8 periodicity of the real Clifford algebras, and display the interplay among Clifford, Division, Jordan, and Exceptional algebras within the context of D = 26, 27, 28 dimensions, corresponding to bosonic string, M and F theory, respectively, advanced earlier by Smith. Finally, we describe explicitly how the E 8 x E 8 Yang-Mills theory can be obtained from a Gauge Theory based on the Clifford (16) group.
The New BBN Model with the Photon Cooling, X Particle, and the Primordial Magnetic Field
Yamazaki, Dai G.; Kusakabe, Motohiko; Kajino, Toshitaka; Mathews, Grant J.; Cheoun, Myung-Ki
Big bang nucleosynthesis theory accurately reproduces the abundances of the light elements in the Universe, except for the 7Li abundance. The 7Li abundance calculated with the baryon-to-photon ratio fixed by observations of the cosmic microwave background (CMB) is inconsistent with the 7Li abundance observed on the surface of metal-poor halo stars; this is the "Li problem". Previously proposed solutions to this 7Li problem include photon cooling (possibly via the Bose-Einstein condensation of a scalar particle), the decay of a long-lived X particle (possibly the next-to-lightest supersymmetric particle), and the energy density of a primordial magnetic field (PMF). We discuss analyses of these solutions, both separately and in concert, and the constraints on the X-particle and PMF parameters derived from the observed light element abundances via likelihood analysis. We identify parameter ranges of the X particles that can solve the Li problem and constrain the energy density of the PMF.
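The kind of likelihood analysis mentioned above can be caricatured with a one-parameter scan. The function linking the parameter x (standing in, say, for a PMF energy-density fraction) to the predicted 7Li abundance is invented purely for this sketch; the real analysis uses a full BBN reaction network:

```python
# Toy chi-square scan for one parameter against one observed abundance.
# The observed value, error, and model response are all invented.

obs, sigma = 1.6e-10, 0.3e-10      # illustrative 7Li/H value and 1-sigma error

def predicted_li7(x):
    """Hypothetical monotone response of 7Li/H to the parameter x."""
    return 5.0e-10 * (1.0 - 0.5 * x)

def chi2(x):
    return ((predicted_li7(x) - obs) / sigma) ** 2

# Keep the parameter values with chi2 <= 4, i.e. within about 2 sigma.
grid = [i / 100.0 for i in range(201)]        # x from 0.00 to 2.00
allowed = [x for x in grid if chi2(x) <= 4.0]
print(min(allowed), max(allowed))             # the allowed parameter range
```

The actual analyses combine several abundances (D, 4He, 7Li) into a joint likelihood, but the logic of turning observed abundances into allowed parameter ranges is the same.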
Variation of fundamental constants and the triple-alpha reaction in Population III stars and BBN
International Nuclear Information System (INIS)
Coc, Alain
2012-01-01
The effect of variations of the fundamental constants on the thermonuclear rate of the triple alpha reaction, 4 He(αα, γ) 12 C, that bridges the gap between 4 He and 12 C is investigated. We have followed the evolution of 15 and 60 M sun zero metallicity stellar models, up to the end of core helium burning. The calculated oxygen and carbon abundances resulting from helium burning can then be used to constrain the variation of the fundamental constants. To investigate the effect of an enhanced triple alpha reaction rate in Big-Bang Nucleosynthesis, we first evaluated Standard Big-Bang Nucleosynthesis CNO production with a network of more than 400 reactions using the TALYS code to calculate missing rates.
New Constraints on Dark Matter Effective Theories from Standard Model Loops
Crivellin, Andreas; Procura, Massimiliano
2014-01-01
We consider an effective field theory for a gauge singlet Dirac dark matter (DM) particle interacting with the Standard Model (SM) fields via effective operators suppressed by the scale $\Lambda \gtrsim 1$ TeV. We perform a systematic analysis of the leading loop contributions to spin-independent (SI) DM-nucleon scattering using renormalization group evolution between $\Lambda$ and the low-energy scale probed by direct detection experiments. We find that electroweak interactions induce operator mixings such that operators that are naively velocity-suppressed and spin-dependent can actually contribute to SI scattering. This allows us to put novel constraints on Wilson coefficients that were so far poorly bounded by direct detection. Constraints from current searches are comparable to LHC bounds, and will significantly improve in the near future. Interestingly, the loop contribution we find is maximally isospin violating even if the underlying theory is isospin conserving.
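The operator-mixing effect this abstract emphasizes can be sketched with a toy two-operator renormalization group evolution: a Wilson coefficient that vanishes at the high scale is generated at the low scale by an off-diagonal anomalous dimension. The matrix G below is invented for the sketch; in the paper it follows from Standard Model loop diagrams:

```python
import math

# dC_i/dln(mu) = sum_j G[i][j] * C_j, with an invented off-diagonal entry
# G[1][0] that mixes operator 1 into operator 2 as the scale is lowered.
G = [[0.02, 0.00],
     [0.01, 0.03]]

def rg_run(c, ln_ratio, steps=20000):
    """Euler integration of the linear RGE, running down from the high
    scale by a factor exp(ln_ratio) in mu."""
    dt = -ln_ratio / steps            # ln(mu) decreases toward low energies
    c = list(c)
    for _ in range(steps):
        c = [c[i] + dt * sum(G[i][j] * c[j] for j in range(2))
             for i in range(2)]
    return c

c_high = [1.0, 0.0]                   # only operator 1 present at Lambda
c_low = rg_run(c_high, math.log(1000.0 / 2.0))  # Lambda = 1 TeV -> mu = 2 GeV
print(c_low)                          # operator 2 is now nonzero
```

This is how a coupling that looks irrelevant for spin-independent scattering at the scale Λ can nevertheless feed into it at the low scale, which is the mechanism behind the novel direct-detection constraints.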
From basic survival analytic theory to a non-standard application
Zimmermann, Georg
2017-01-01
Georg Zimmermann provides a mathematically rigorous treatment of basic survival analytic methods. His emphasis is also placed on various questions and problems, especially with regard to life expectancy calculations arising from a particular real-life dataset on patients with epilepsy. The author shows both the step-by-step analyses of that dataset and the theory the analyses are based on. He demonstrates that one may face serious and sometimes unexpected problems, even when conducting very basic analyses. Moreover, the reader learns that a practically relevant research question may look rather simple at first sight. Nevertheless, compared to standard textbooks, a more detailed account of the theory underlying life expectancy calculations is needed in order to provide a mathematically rigorous framework. Contents: Regression Models for Survival Data; Model Checking Procedures; Life Expectancy. Target Groups: Researchers, lecturers, and students in the fields of mathematics and statistics; academics and experts work...
The standard theory of particle physics Essays to celebrate CERN’s 60th anniversary
Maiani, Luciano
2016-01-01
The book gives a fairly complete and up-to-date picture of the Standard Theory from a historical perspective, through a collection of articles written by some of the protagonists of present-day particle physics. The theoretical developments are described together with the most up-to-date experimental tests, including the discovery of the Higgs boson and the measurement of its mass, as well as the most precise measurements of the top mass, giving the reader a complete description of our present understanding of particle physics.
The standard model from a gauge theory in ten dimensions via CSDR
International Nuclear Information System (INIS)
Farakos, K.; Kapetanakis, D.; Koutsoumbas, G.; Zoupanos, G.
1988-01-01
We present a gauge theory in ten dimensions based on the gauge group E 8 which is dimensionally reduced, according to the coset space dimensional reduction (CSDR) scheme, to the standard model SU 3c xSU 2L xU 1 , which breaks further to SU 3c xU 1em . We use the coset space Sp 4 /(SU 2 xU 1 )xZ 2 . The model gives similar predictions for sin 2 θ w and proton decay as the minimal SU 5 GUT. Natural choices of parameters suggest that the Higgs masses are as predicted by the Coleman-Weinberg radiative mechanism. (orig.)
Mathematical gauge theory with applications to the standard model of particle physics
Hamilton, Mark J D
2017-01-01
The Standard Model is the foundation of modern particle and high energy physics. This book explains the mathematical background behind the Standard Model, translating ideas from physics into a mathematical language and vice versa. The first part of the book covers the mathematical theory of Lie groups and Lie algebras, fibre bundles, connections, curvature and spinors. The second part then gives a detailed exposition of how these concepts are applied in physics, concerning topics such as the Lagrangians of gauge and matter fields, spontaneous symmetry breaking, the Higgs boson and mass generation of gauge bosons and fermions. The book also contains a chapter on advanced and modern topics in particle physics, such as neutrino masses, CP violation and Grand Unification. This carefully written textbook is aimed at graduate students of mathematics and physics. It contains numerous examples and more than 150 exercises, making it suitable for self-study and use alongside lecture courses. Only a basic knowledge of d...
Van Lange, Paul A M
2013-02-01
The construction and development of theory is one of the central routes to scientific progress. But what exactly constitutes a good theory? What is it that people might expect from an ideal theory? This article advances a new model, which delineates truth, abstraction, progress, and applicability as standards (TAPAS) for a good theory. After providing the rationale for TAPAS, this article evaluates several social-psychological theories, especially classic ones, in terms of TAPAS, and illustrates its utility with some more recent theoretical contributions of social psychology. The article concludes by outlining recommendations for effective theory construction and development, such as the utility of meta-analytic approaches for pursuing truth, of theory-oriented courses and journals for pursuing abstraction, of adversarial collaboration for pursuing progress, and of reaching out to major personal or societal issues for pursuing applicability.
2004-2005 Academic Training Programme: Electroweak Theory and the Standard Model
Françoise Benz
2004-01-01
6, 7, 8, 9 and 10 December LECTURE SERIES 6, 7, 8, 9, 10 December from 11:00 to 12:00 - Main Auditorium, bldg. 500 on 6, 7, 8, 10 December, TH Auditorium, bldg. 4 3-006 on 9 December Electroweak Theory and the Standard Model R. BARBIERI / CERN-PH-TH There is a natural splitting in four sectors of the theory of the ElectroWeak (EW) Interactions, at pretty different levels of development /test. Accordingly, the 5 lectures are organized as follows, with an eye to the future: Lecture 1: The basic structure of the theory; Lecture 2: The gauge sector; Lecture 3: The flavor sector; Lecture 4: The neutrino sector; Lecture 5: The EW symmetry breaking sector. Transparencies available at: http://agenda.cern.ch/fullAgenda.php?ida=a042577 ENSEIGNEMENT ACADEMIQUE ACADEMIC TRAINING Françoise Benz 73127 academic.training@cern.ch If you wish to participate in one of the following courses, please discuss with your supervisor and apply electronically directly from the course description pages that can ...
2004-2005 Academic Training Programme: Electroweak Theory and the Standard Model
Françoise Benz
2004-01-01
6, 7, 8, 9 and 10 December LECTURE SERIES 6, 7, 8, 9, 10 December from 11:00 to 12:00 - Main Auditorium, bldg. 500 on 6, 7, 8, 10 December, TH Auditorium, bldg. 4 3-006 on 9 December Electroweak Theory and the Standard Model R. BARBIERI / CERN-PH-TH There is a natural splitting in four sectors of the theory of the ElectroWeak (EW) Interactions, at pretty different levels of development /test. Accordingly, the 5 lectures are organized as follows, with an eye to the future: Lecture 1: The basic structure of the theory; Lecture 2: The gauge sector; Lecture 3: The flavor sector; Lecture 4: The neutrino sector; Lecture 5: The EW symmetry breaking sector. ENSEIGNEMENT ACADEMIQUE ACADEMIC TRAINING Françoise Benz 73127 academic.training@cern.ch If you wish to participate in one of the following courses, please discuss with your supervisor and enrol electronically directly from the course description pages on the Web, which you can find ...
Feynman rules for the Standard Model Effective Field Theory in R_ξ-gauges
Dedes, A.; Materkowska, W.; Paraskevas, M.; Rosiek, J.; Suxho, K.
2017-06-01
We assume that New Physics effects are parametrized within the Standard Model Effective Field Theory (SMEFT) written in a complete basis of gauge-invariant operators up to dimension 6, commonly referred to as the "Warsaw basis". We discuss all steps necessary to obtain a consistent transition to the spontaneously broken theory and several other important aspects, including the BRST invariance of the SMEFT action for linear R_ξ-gauges. The final theory is expressed in a basis characterized by SM-like propagators for all physical and unphysical fields. The effect of the non-renormalizable operators appears explicitly in triple or higher-multiplicity vertices. In this mass basis we derive the complete set of Feynman rules, without resorting to any simplifying assumptions such as baryon-number, lepton-number, or CP conservation. As it turns out, for most SMEFT vertices the expressions are reasonably short, with the notable exception of those involving 4, 5 and 6 gluons. We have also supplemented our set of Feynman rules, given in an appendix here, with a publicly available Mathematica code working with the FeynRules package and producing output which can be integrated with other symbolic algebra or numerical codes for automatic SMEFT amplitude calculations.
Constraining the top-Higgs sector of the standard model effective field theory
Cirigliano, V.; Dekens, W.; de Vries, J.; Mereghetti, E.
2016-08-01
Working in the framework of the Standard Model effective field theory, we study chirality-flipping couplings of the top quark to Higgs and gauge bosons. We discuss in detail the renormalization-group evolution to lower energies and investigate direct and indirect contributions to high- and low-energy CP-conserving and CP-violating observables. Our analysis includes constraints from collider observables, precision electroweak tests, flavor physics, and electric dipole moments. We find that indirect probes are competitive or dominant for both CP-even and CP-odd observables, even after accounting for uncertainties associated with hadronic and nuclear matrix elements, illustrating the importance of including operator mixing in constraining the Standard Model effective field theory. We also study scenarios where multiple anomalous top couplings are generated at the high scale, showing that while the bounds on individual couplings relax, strong correlations among couplings survive. Finally, we find that enforcing minimal flavor violation does not significantly affect the bounds on the top couplings.
Impact of flavor and Higgs physics on theories beyond the standard model
Energy Technology Data Exchange (ETDEWEB)
Casagrande, Sandro
2013-02-13
Quantum effects of physics beyond the Standard Model receive strong indirect constraints from precisely measured collider observables. In the conceptual part of this thesis, we apply the generic relations between particle interactions in perturbatively unitary theories to calculate one-loop amplitudes for flavor physics. We provide template results applicable to any model of this class. We also investigate example models that are only partly perturbatively unitary, and some that are not: the Littlest Higgs model and Randall-Sundrum models. The latter have a unique coupling structure, which we cover exhaustively. We find strong constraints on the Randall-Sundrum models and numerically compare those from flavor, electroweak precision, and Higgs physics by performing detailed parameter scans. We observe interesting correlations between flavor observables, and we find that constraints from Higgs production and decays are already competitive.
Standard model from a gauge theory in ten dimensions via CSDR
Energy Technology Data Exchange (ETDEWEB)
Farakos, K.; Kapetanakis, D.; Koutsoumbas, G.; Zoupanos, G.
1988-09-01
We present a gauge theory in ten dimensions based on the gauge group E_8 which is dimensionally reduced, according to the coset space dimensional reduction (CSDR) scheme, to the standard model SU(3)_c × SU(2)_L × U(1), which breaks further to SU(3)_c × U(1)_em. We use the coset space Sp(4)/(SU(2) × U(1)) × Z_2. The model gives similar predictions for sin²θ_W and proton decay as the minimal SU(5) GUT. Natural choices of parameters suggest that the Higgs masses are as predicted by the Coleman-Weinberg radiative mechanism.
Cardiac response to low-energy field pacing challenges the standard theory of defibrillation.
Caldwell, Bryan J; Trew, Mark L; Pertsov, Arkady M
2015-06-01
The electric response of myocardial tissue to periodic field stimuli has attracted significant attention as the basis for low-energy antifibrillation pacing, potentially more effective than traditional single high-energy shocks. In conventional models, an electric field produces a highly nonuniform response of the myocardial wall, with discrete excitations, or hot spots (HS), occurring at cathodal tissue surfaces or large coronary vessels. We test this prediction using novel 3-dimensional tomographic optical imaging. Experiments were performed in isolated coronary perfused pig ventricular wall preparations stained with near-infrared voltage-sensitive fluorescent dye DI-4-ANBDQBS. The 3-dimensional coordinates of HS were determined using alternating transillumination. To relate HS formation with myocardial structures, we used ultradeep confocal imaging (interrogation depths, >4 mm). The peak HS distribution is located deep inside the heart wall, and the depth is not significantly affected by field polarity. We did not observe the strong colocalization of HS with major coronary vessels anticipated from theory. Yet, we observed considerable lateral displacement of HS with field polarity reversal. Models that de-emphasized lateral intracellular coupling and accounted for resistive heterogeneity in the extracellular space showed similar HS distributions to the experimental observations. The HS distributions within the myocardial wall and the significant lateral displacements with field polarity reversal are inconsistent with standard theories of defibrillation. Extended theories based on enhanced descriptions of cellular scale electric mechanisms may be necessary. The considerable lateral displacement of HS with field polarity reversal supports the hypothesis of biphasic stimuli in low-energy antifibrillation pacing being advantageous. © 2015 American Heart Association, Inc.
The role of framing effect in assessment of quality of life according to standard gambling theory
Directory of Open Access Journals (Sweden)
Songul Cinaroglu
2015-08-01
Full Text Available Measuring health outcomes involves risk and uncertainty. Quality of life assessments in health care have two main properties: they are personal, and they reflect personal preferences. Because patient preferences involve risk and uncertainty, standard gambling theory, one of the quantitative techniques for assessing patient preferences, is used. The framing effect, a concept from social psychology, shows that positively and negatively framed information affects decision making. For this reason, in this study we discuss the role of the framing effect in quality of life assessments when standard gambling theory is used. The results of this study show that, compared to the traditional framing effect, the medical framework reveals the opposite pattern: patients show risk-seeking behavior under a positive frame and risk-averse behavior under a negative frame. We think the results of this study provide useful information for understanding how framing biases the assessment of patient preferences. [TAF Prev Med Bull 2015; 14(4): 346-352]
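The standard gamble described in this abstract can be made concrete with a small sketch. Under the usual expected-utility assumptions with u(perfect health) = 1 and u(death) = 0, the utility of a health state equals the indifference probability; the helper names and framing strings below are invented for illustration.

```python
def standard_gamble_utility(p_indifference: float) -> float:
    """If the patient is indifferent between the sure health state and a
    gamble giving perfect health with probability p (else death), then
    u(state) = p * u(perfect health) + (1 - p) * u(death) = p."""
    if not 0.0 <= p_indifference <= 1.0:
        raise ValueError("probability must be in [0, 1]")
    return p_indifference

def framed(p_success: float, positive: bool) -> str:
    """The same gamble described in a positive (survival) or negative
    (mortality) frame -- numerically identical, but framing research
    shows the two descriptions elicit different choices."""
    if positive:
        return f"{p_success:.0%} chance of full recovery"
    return f"{1 - p_success:.0%} chance of death"

u = standard_gamble_utility(0.85)  # patient indifferent at p = 0.85
```

The framing bias discussed in the study amounts to the fact that responses to `framed(p, True)` and `framed(p, False)` differ even though both strings describe one and the same lottery.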
Liao, Yi; Ma, Xiao-Dong
2018-03-01
We study two aspects of higher dimensional operators in standard model effective field theory. We first introduce a perturbative power counting rule for the entries in the anomalous dimension matrix of operators with equal mass dimension. The power counting is determined by the number of loops and the difference of the indices of the two operators involved, which in turn is defined by assuming that all terms in the standard model Lagrangian have an equal perturbative power. Then we show that the operators with the lowest index are unique at each mass dimension d, i.e., (H†H)^{d/2} for even d ≥ 4, and (L^T ε H) C (L^T ε H)^T (H†H)^{(d-5)/2} for odd d ≥ 5. Here H, L are the Higgs and lepton doublets, and ε, C are the rank-two antisymmetric matrix and the charge conjugation matrix, respectively. The renormalization group running of these operators can be studied separately from other operators of equal mass dimension at the leading order in power counting. We compute their anomalous dimensions at one loop for general d and find that they are enhanced quadratically in d due to combinatorics. We also make connections with the classification of operators in terms of their holomorphic and anti-holomorphic weights. Supported by the National Natural Science Foundation of China under Grant Nos. 11025525, 11575089, and by the CAS Center for Excellence in Particle Physics (CCEPP)
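For orientation, at the lowest odd dimension, d = 5, the lepton-number-violating family of lowest-index operators quoted in the abstract reduces to the familiar Weinberg operator, and at d = 7 it acquires one extra Higgs-bilinear factor:

```latex
\mathcal{O}_{d=5} \;=\; (L^{T}\epsilon H)\,C\,(L^{T}\epsilon H)^{T},
\qquad
\mathcal{O}_{d=7} \;=\; (L^{T}\epsilon H)\,C\,(L^{T}\epsilon H)^{T}\,(H^{\dagger}H).
```

After electroweak symmetry breaking the d = 5 operator generates Majorana neutrino masses of order v²/Λ, where v is the Higgs vacuum expectation value and Λ the new-physics scale.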
Statistical approach to Higgs boson couplings in the standard model effective field theory
Murphy, Christopher W.
2018-01-01
We perform a parameter fit in the standard model effective field theory (SMEFT) with an emphasis on using regularized linear regression to tackle the issue of the large number of parameters in the SMEFT. In regularized linear regression, a positive definite function of the parameters of interest is added to the usual cost function. A cross-validation is performed to try to determine the optimal value of the regularization parameter to use, but it selects the standard model (SM) as the best model to explain the measurements. Nevertheless, as a proof of principle of this technique, we apply it to fitting Higgs boson signal strengths in the SMEFT, including the latest Run-2 results. Results are presented in terms of the eigensystem of the covariance matrix of the least squares estimators, as it has a degree of model independence to it. We find several results in this initial work: the SMEFT predicts the total width of the Higgs boson to be consistent with the SM prediction; the ATLAS and CMS experiments at the LHC are currently sensitive to non-resonant double Higgs boson production. Constraints are derived on the viable parameter space for electroweak baryogenesis in the SMEFT, reinforcing the notion that a first-order phase transition requires fairly low-scale beyond-the-SM physics. Finally, we study which future experimental measurements would give the most improvement on the global constraints on the Higgs sector of the SMEFT.
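The technique named in this abstract, regularized linear regression with cross-validated selection of the regularization strength, can be sketched with plain numpy. The toy data, the "Wilson coefficient" setup, and the grid of regularization strengths below are invented for illustration and are not the paper's actual fit inputs.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge estimator: minimizes ||y - Xb||^2 + lam * ||b||^2."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def cv_select_lambda(X, y, lambdas, k=5, seed=0):
    """Pick the regularization strength with the lowest k-fold CV error."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    errs = []
    for lam in lambdas:
        err = 0.0
        for f in folds:
            train = np.setdiff1d(idx, f)   # hold out fold f, fit on the rest
            b = ridge_fit(X[train], y[train], lam)
            err += np.mean((y[f] - X[f] @ b) ** 2)
        errs.append(err / k)
    return lambdas[int(np.argmin(errs))]

# Toy "measurements": 200 observables depending on 10 coefficients,
# only 2 of which are actually nonzero.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
beta_true = np.zeros(10)
beta_true[:2] = [1.0, -0.5]
y = X @ beta_true + 0.1 * rng.normal(size=200)

best = cv_select_lambda(X, y, [0.01, 0.1, 1.0, 10.0, 100.0])
b = ridge_fit(X, y, best)
```

The regularizer here is the quadratic penalty lam * ||b||^2 (a positive definite function of the parameters, as in the abstract); the cross-validation loop then trades fit quality against shrinkage toward the null (SM-like) point b = 0.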
Energy Technology Data Exchange (ETDEWEB)
Ahn, C.
1989-08-01
We study two aspects of one-loop structures in quantum field theories which describe two different areas of particle physics: the one-loop unitarity behavior of the Standard Model of electroweak interactions and modular invariance of string model theory. The loop expansion is important in that it contains quantum fluctuations due to all physical states in the theory. Therefore, by studying the various models to one loop, we can understand how the contents of the theory contribute to physically measurable quantities and, as well, how consistency at the quantum level restricts the physical states of the theory. In the first half of the thesis, we study one-loop corrections to the process e⁺e⁻ → W⁺W⁻. In this process, there is a delicate unitarity-saving cancellation between s-channel and t-channel tree-level Feynman diagrams. If the one-loop contribution due to heavy particles corrects the channels asymmetrically, the cancellation, and hence unitarity, will be delayed up to the mass scale of these heavy particles. We refer to this phenomenon as the unitarity delay effect. Due to this effect, cross sections below these mass scales can have significant radiative corrections which may provide an appropriate window through which we can see the high-energy structure of the Standard Model from relatively low-energy experiments. In the second half, we show how quantum consistency can restrict the physical states in string theory. 53 refs., 13 figs.
Troyan, Francis J.
2014-01-01
Recent educational standards have refocused the goals of foreign language (FL) instruction on "the purpose of communication" (ACTFL, 2012, p. 1) across the three modes of communication (interpersonal, interpretive, and presentational). To this end, this article considers a linguistically based genre theory as a means of enhancing…
Wieseman, Katherine Claire
1998-08-01
The purpose of this study was: (1) to contribute to current understandings of teachers' perspectives of standards-based science education reform, and (2) to describe and document the relationships between the personal theories about science teaching and learning held by three "unique" experienced primary teachers and the teachers' understandings of a state-level standards-based vision of school science. These teachers were unique because they had been members of a two-year-long district-level standards-based science curriculum reform initiative. This study was framed by the conception of personal theories as a representation of teacher beliefs. This qualitative case study extended from the summer of 1996 through the spring of 1998. Methods of data collection were: interviewing and informal conversations, participant observation, video and audio taping, generation of field notes, and examination of documents and artifacts. Data analysis involved constant comparison of data, generation of categories, description of properties of and relationships between categories, and theorizing about the data. The results of this qualitative case study revealed that, overall, direct involvement in the science curriculum reform effort was a positive and meaningful professional development experience providing opportunities for personal and professional growth. For two of the teachers, the experience signified a turning point in the history of their personal theories about science teaching and learning. Although the reform effort was philosophically grounded in a state-level standards-based vision of school science, implementing the new hands-on science program, adopted as a dimension of the work of the reform effort, was at the forefront of teachers' thoughts. A standards vision played a secondary role as a referent for thinking about the teaching of science. The findings suggest that teachers' personal theories could act as mediating forces in teachers' endeavors to understand
International Nuclear Information System (INIS)
Lane, Stephanie R.; Nanda, Prasanta; Rold, Tammy L.; Sieckman, Gary L.; Figueroa, Said D.; Hoffman, Timothy J.; Jurisson, Silvia S.; Smith, Charles J.
2010-01-01
Gastrin-releasing peptide receptors (GRPr) are a member of the bombesin (BBN) receptor family. GRPr are expressed in high numbers on specific human cancers, including human prostate cancer. Therefore, copper-64 ( 64 Cu) radiolabeled BBN(7-14)NH 2 conjugates could have potential for diagnosis of human prostate cancer via positron-emission tomography (PET). The aim of this study was to produce [ 64 Cu-NO2A-(X)-BBN(7-14)NH 2 ] conjugates for prostate cancer imaging, where X=pharmacokinetic modifier (beta-alanine, 5-aminovaleric acid, 6-aminohexanoic acid, 8-aminooctanoic acid, 9-aminonanoic acid or para-aminobenzoic acid) and NO2A=1,4,7-triazacyclononane-1,4-diacetic acid [a derivative of NOTA (1,4,7-triazacyclononane-1,4,7-triacetic acid)]. Methods: [(X)-BBN(7-14)NH 2 ] conjugates were synthesized by solid-phase peptide synthesis (SPPS), after which NOTA was added via manual conjugation. The new peptide conjugates were radiolabeled with the 64 Cu radionuclide. The receptor-binding affinity was determined in human prostate PC-3 cells, and tumor-targeting efficacy was determined in PC-3 tumor-bearing severe combined immunodeficient (SCID) mice. Whole-body maximum intensity microPET/CT images of PC-3 tumor-bearing SCID mice were obtained 18 h postinjection (pi). Results: Competitive binding assays in PC-3 cells indicated high receptor-binding affinity for the [NO2A-(X)-BBN(7-14)NH 2 ] and [ nat Cu-NO2A-(X)-BBN(7-14)NH 2 ] conjugates. In vivo biodistribution studies of the [ 64 Cu-NO2A-(X)-BBN(7-14)NH 2 ] conjugates at 1, 4 and 24 h pi showed very high uptake of the tracer in GRPr-positive tissue with little accumulation and retention in nontarget tissues. High-quality, high-contrast microPET images were obtained, with xenografted tumors being clearly visible at 18 h pi. Conclusions: The NO2A chelator sufficiently stabilizes the copper(II) radiometal under in vivo conditions, producing conjugates with very high uptake and retention in targeted GRPr. Preclinical evaluation of these
A Unitary and Renormalizable Theory of the Standard Model in Ghost-Free Light-Cone Gauge
Energy Technology Data Exchange (ETDEWEB)
Brodsky, Stanley J.
2002-02-15
Light-front (LF) quantization in light-cone (LC) gauge is used to construct a unitary and simultaneously renormalizable theory of the Standard Model. The framework derived earlier for QCD is extended to the Glashow, Weinberg, and Salam (GWS) model of electroweak interaction theory. The Lorentz condition is automatically satisfied in LF-quantized QCD in the LC gauge for the free massless gauge field. In the GWS model, with the spontaneous symmetry breaking present, we find that the 't Hooft condition accompanies the LC gauge condition corresponding to the massive vector boson. The two transverse polarization vectors for the massive vector boson may be chosen to be the same as found in QCD. The non-transverse and linearly independent third polarization vector is found to be parallel to the gauge direction. The corresponding sum over polarizations in the Standard Model, indicated by K_{μν}(k), has several simplifying properties similar to the polarization sum D_{μν}(k) in QCD. The framework is ghost-free, and the interaction Hamiltonian of electroweak theory can be expressed in a form resembling that of covariant theory, except for a few additional instantaneous interactions which can be treated systematically. The LF formulation also provides a transparent discussion of the Goldstone Boson (or Electroweak) Equivalence Theorem, as the illustrations show.
Kumar, Coleen P
2007-01-01
This paper aims to illustrate the process of theory-based nursing practice by presenting a case study of a clinical nurse specialist's assessment and care of a woman with type 2 diabetes. Orem's self-care deficit theory and standardized nursing language, NANDA, NIC (Nursing Interventions Classification), and NOC (Nursing Outcomes Classification), guided assessment and the identification of outcomes and interventions related to the client's management of diabetes. Theory-based nursing care and standardized nursing language enhanced the client's ability to self-manage the chronic illness: diabetes. Nursing theory and standardized nursing language enhance communication among nurses and support a client's ability to self-manage a chronic illness.
Directory of Open Access Journals (Sweden)
Conceição Aparecida Dornelas
2012-02-01
Full Text Available PURPOSE: To determine the effects of green propolis extracted in L-lysine (WSDP) and of L-lysine, administered for 40 weeks, on induced rat bladder carcinogenesis. METHODS: The animals (groups I, II, III, IV, V and VI) received BBN during 14 weeks. Group I was treated with propolis for 30 days prior to receiving BBN, and then these animals were treated daily with propolis; Groups II and III were treated with subcutaneous and oral propolis (respectively) concurrently with BBN. The animals of Group IV were treated with L-lysine; Group V received subcutaneous water; and Group VI received only BBN. Among the animals not submitted to carcinogenesis induction, Group VII received propolis, Group VIII received L-lysine and Group IX received water. RESULTS: The carcinoma incidence in Group I was lower than that of the control (Group VI). The carcinoma multiplicity in Group IV was greater than in Group VI. All animals treated with L-lysine developed carcinomas, and these were also more invasive in Group IV than in controls. On the other hand, Group VIII showed no bladder lesions. CONCLUSION: WSDP is chemopreventive against rat bladder carcinogenesis if administered 30 days prior to BBN, and L-lysine promotes bladder carcinogenesis.
Color-Blind Leadership: A Critical Race Theory Analysis of the ISLLC and ELCC Standards
Davis, Bradley W.; Gooden, Mark A.; Micheaux, Donna J.
2015-01-01
Purpose: Working from the driving research question--"is the explicit consideration of race present in the ISLLC and ELCC standards?"--this article explores the implications of a school leadership landscape reliant on a collection of color-blind leadership standards to guide the preparation and practice of school leaders. In doing so, we…
Integrating the Demonstration Orientation and Standards-Based Models of Achievement Goal Theory
Wynne, Heather Marie
2014-01-01
Achievement goal theory and thus, the empirical measures stemming from the research, are currently divided on two conceptual approaches, namely the reason versus aims-based models of achievement goals. The factor structure and predictive utility of goal constructs from the Patterns of Adaptive Learning Strategies (PALS) and the latest two versions…
The standards and norms for the Slovenian academic libraries between theory and practice
Directory of Open Access Journals (Sweden)
Mojca Dolgan-Petrič
1998-01-01
Full Text Available Based on statistical data, contemporary trends within academic libraries in the Republic of Slovenia are presented. The substantially slow increase in library materials, along with an even smaller increase in staff, contradicts the growing number of library visits and circulation. The paper elaborates on outstanding differences regarding personnel, financial, and spatial issues of the libraries concerned. A comparison between the professional and governmental standards and norms is presented. At a time when academic libraries face growing demand from users alongside budgetary constraints, there is an urgent need to develop a list of performance indicators and to modernize professional standards and norms for academic libraries. The elaboration of new standards must be based on empirical studies. Only realistic and clearly defined norms will enable the implementation of standards and improve the quality and efficiency of academic libraries.
Energy Technology Data Exchange (ETDEWEB)
Kneur, J.L
2006-06-15
This document is divided into 2 parts. The first part describes a particular re-summation technique for perturbative series that can give non-perturbative results in some cases. We detail some applications in field theory and in condensed matter, like the calculation of the effective temperature of Bose-Einstein condensates. The second part deals with the minimal supersymmetric standard model. We present an accurate calculation of the mass spectrum of supersymmetric particles, a calculation of the relic density of supersymmetric dark matter, and the constraints that we can infer from models.
Barrett-Tatum, Jennifer
2015-01-01
The English Language Arts Common Core State Standards and corresponding assessments brought about many changes for educators, their literacy instruction, and the literacy learning of their students. This study examined the day-to-day literacy instruction of two primary grade teachers during their first year of full CCSS implementation. Engeström's…
On the Failure of Standard Completeness in PiMTL for Infinite Theories
Czech Academy of Sciences Publication Activity Database
Horčík, Rostislav
2007-01-01
Vol. 158, No. 6 (2007), pp. 619-624. ISSN 0165-0114. Source of funding: V - other public sources. Keywords: strong standard completeness * monoidal t-norm based logic (MTL) * basic fuzzy logic (BL) * product logic * Lukasiewicz logic * PiMTL * IMTL. Subject RIV: BA - General Mathematics. Impact factor: 1.373, year: 2007
The standard model of particle physics: an introduction to the theory
Fawzi, B
2002-01-01
The key concepts of gauge invariance and spontaneous symmetry breaking that helped build the Standard Model of particle physics are introduced. A short description of radiative corrections that have made the model pass all precision tests, in particular those from LEP, is presented. (authors)
Group theory for the standard model of particle physics and beyond
Barnes, Ken J
2010-01-01
Contents: Symmetries and Conservation Laws; Lagrangian and Hamiltonian Mechanics; Quantum Mechanics; Coupled Oscillators: Normal Modes; One-Dimensional Fields: Waves; The Final Step: Lagrange-Hamilton Quantum Field Theory; Quantum Angular Momentum; Index Notation; Quantum Angular Momentum Result; Matrix Representations; Spin 1/2; Addition of Angular Momenta; Clebsch-Gordan Coefficients; Matrix Representation of Direct (Outer, Kronecker) Products; Change of Basis; Tensors and Tensor Operators; Scalars; Scalar Fields; Invariant Functions; Contravariant Vectors (index at top); Covariant Vectors (co = goes below); Notes; Tensors; Rotatio…
Probing CP-violating Higgs and gauge-boson couplings in the Standard Model effective field theory
Energy Technology Data Exchange (ETDEWEB)
Ferreira, Felipe [University of Sussex, Department of Physics and Astronomy, Brighton (United Kingdom); Universidade Federal da Paraiba, Departamento de Fisica, Joao Pessoa, Paraiba (Brazil); Fuks, Benjamin [Sorbonne Universites, Universite Pierre et Marie Curie (Paris 06), UMR 7589, LPTHE, Paris (France); CNRS, UMR 7589, LPTHE, Paris (France); Institut Universitaire de France, Paris (France); Sanz, Veronica [University of Sussex, Department of Physics and Astronomy, Brighton (United Kingdom); Sengupta, Dipan [Universite Grenoble-Alpes, CNRS/IN2P3, Laboratoire de Physique Subatomique et de Cosmologie, Grenoble (France); Michigan State University, Department of Physics and Astronomy, East Lansing (United States)
2017-10-15
We study the phenomenological consequences of several CP-violating structures that could arise in the Standard Model effective field theory framework. Focusing on operators involving electroweak gauge and/or Higgs bosons, we derive constraints originating from Run I LHC data. We then study the capabilities of the present and future LHC runs at higher energies to further probe associated CP-violating phenomena and we demonstrate how differential information can play a key role. We consider both traditional four-lepton probes of CP-violation in the Higgs sector and novel new physics handles based on varied angular and non-angular observables. (orig.)
Shirazi, Mandana; Emami, Amir Hosein; Mirmoosavi, Seyed Jamal; Alavinia, Seyed Mohammad; Zamanian, Hadi; Fathollahbeigi, Faezeh; Masiello, Italo
2014-01-01
Effective leadership is of prime importance in any organization, and it goes through changes based on accepted health promotion and behavior change theory. Although there are many leadership styles, transformational leadership, which emphasizes supportive leadership behaviors, seems to be an appropriate style in many settings, particularly in the health care and educational sectors, which are pressured by high turnover and safety demands. Iran has been moving rapidly forward, and its authorities have understood and recognized the importance of matching leadership styles with effective and competent care for success in health care organizations. This study aimed to develop the Supportive Leadership Behaviors Scale based on accepted health and educational theories and to psychometrically test it in the Iranian context. The instrument was based on items from established questionnaires. A pilot study validated the instrument, which was also cross-validated via re-translation. After validation, 731 participants answered the questionnaire. The instrument was finalized and resulted in a 20-item questionnaire using exploratory factor analysis, which yielded four factors, support for development, integrity, sincerity and recognition, explaining the supportive leadership behaviors (all above 0.6). Mapping these four measures of leadership behaviors can be beneficial to determine whether effective leadership could support innovation and improvements in medical education and health care organizations on the national level. The reliability measured as Cronbach's alpha was 0.84. This new instrument yielded four factors, support for development, integrity, sincerity and recognition, explaining the supportive leadership behaviors, which are applicable in health and educational settings and are helpful in improving self-efficacy among health and academic staff.
Energy Technology Data Exchange (ETDEWEB)
Walker-Loud, Andre [College of William and Mary, Williamsburg, VA (United States)
2016-10-14
The research supported by this grant is aimed at probing the limits of the Standard Model through precision low-energy nuclear physics. The work of the PI (AWL) and additional personnel is to provide the theory input needed for a number of potentially high-impact experiments, notably hadronic parity violation, dark matter direct detection, and searches for permanent electric dipole moments (EDMs) in nucleons and nuclei. In all these examples, a quantitative understanding of low-energy nuclear physics from the fundamental theory of the strong interactions, Quantum Chromodynamics (QCD), is necessary to interpret the experimental results. The main theoretical tools used and developed in this work are the numerical solution of QCD known as lattice QCD (LQCD) and effective field theory (EFT). This grant supports a new research program for the PI, which therefore needed to be developed from the ground up. The first fiscal year of the grant, 08/01/2014-07/31/2015, was thus spent predominantly establishing this new research effort. Very good progress has been made, although, at this time, there are not many publications to show for the effort. After one year, the PI accepted a job at Lawrence Berkeley National Laboratory, so this final report covers just a single year of the five-year grant.
International Nuclear Information System (INIS)
Yang, Jie; Fan, Aiwu; Liu, Wei; Jacobi, Anthony M.
2014-01-01
Highlights: • A design method of heat exchangers motivated by constructal theory is proposed. • A genetic algorithm is applied and the TEMA standards are rigorously followed. • Three cases are studied to illustrate the advantage of the proposed design method. • The design method will reduce the total cost compared to two other methods. - Abstract: A modified optimization design approach motivated by constructal theory is proposed for shell-and-tube heat exchangers in the present paper. In this method, a shell-and-tube heat exchanger is divided into several in-series heat exchangers. The Tubular Exchanger Manufacturers Association (TEMA) standards are rigorously followed for all design parameters. The total cost of the whole shell-and-tube heat exchanger is set as the objective function, including the investment cost for initial manufacture and the operational cost involving the power consumption to overcome the frictional pressure loss. A genetic algorithm is applied to minimize the cost function by adjusting parameters such as the tube and shell diameters, tube length and tube arrangement. Three case studies indicate that the modified design approach can significantly reduce the total cost compared to the original design method and the traditional genetic algorithm design method.
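The optimization loop described above, a genetic algorithm searching discrete design parameters to minimize a total cost (initial investment plus pumping power to overcome pressure loss), can be sketched as follows. The cost correlations, parameter catalogs, and GA settings here are placeholders, not the paper's TEMA-constrained formulation:

```python
import random

# Candidate design: (tube_diameter [m], tube_length [m], n_tubes) drawn from
# discrete catalogs, standing in for TEMA-standard values (illustrative).
DIAMETERS = [0.016, 0.019, 0.025, 0.032]
LENGTHS = [2.44, 3.05, 3.66, 4.88, 6.10]
N_TUBES = list(range(50, 501, 50))

def total_cost(d, L, n):
    """Toy objective: investment cost grows with heat-transfer area,
    operating cost (pressure-loss pumping power) falls with flow area."""
    area = 3.1416 * d * L * n
    investment = 8000 + 260 * area**0.91   # placeholder cost correlation
    operating = 1.5e4 / (n * d**2)         # placeholder pressure-loss term
    return investment + operating

def evolve(pop_size=40, generations=60, mut_rate=0.2, seed=1):
    rng = random.Random(seed)
    pop = [(rng.choice(DIAMETERS), rng.choice(LENGTHS), rng.choice(N_TUBES))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: total_cost(*g))      # rank by cost
        survivors = pop[:pop_size // 2]             # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            child = tuple(rng.choice(pair) for pair in zip(a, b))  # crossover
            if rng.random() < mut_rate:                            # mutation
                child = (rng.choice(DIAMETERS), child[1], child[2])
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda g: total_cost(*g))

best = evolve()
print(best, round(total_cost(*best), 1))
```

A real implementation would replace the toy cost terms with the investment and pumping-power correlations of the paper and restrict each gene to TEMA-standard values.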
Precision Higgs Physics, Effective Field Theory, and Dark Matter
Henning, Brian Quinn
The recent discovery of the Higgs boson calls for detailed studies of its properties. As precision measurements are indirect probes of new physics, the appropriate theoretical framework is effective field theory. In the first part of this thesis, we present a practical three-step procedure of using the Standard Model effective field theory (SM EFT) to connect ultraviolet (UV) models of new physics with weak scale precision observables. With this procedure, one can interpret precision measurements as constraints on the UV model concerned. We give a detailed explanation for calculating the effective action up to one-loop order in a manifestly gauge covariant fashion. The covariant derivative expansion dramatically simplifies the process of matching a UV model with the SM EFT, and also makes available a universal formalism that is easy to use for a variety of UV models. A few general aspects of renormalization group running effects and choosing operator bases are discussed. Finally, we provide mapping results between the bosonic sector of the SM EFT and a complete set of precision electroweak and Higgs observables to which present and near future experiments are sensitive. With a detailed understanding of how to use the SM EFT, we then turn to applications and study in detail two well-motivated test cases. The first is a singlet scalar field that enables a first-order electroweak phase transition for baryogenesis; the second example is due to scalar tops in the MSSM. We find both Higgs and electroweak measurements are sensitive probes of these cases. The second part of this thesis centers around dark matter, and consists of two studies. In the first, we examine the effects of relic dark matter annihilations on big bang nucleosynthesis (BBN). The magnitude of these effects scales simply with the dark matter mass and annihilation cross-section, which we derive. Estimates based on these scaling behaviors indicate that BBN severely constrains hadronic and radiative dark
The standard model as a low-energy effective theory. What is triggering the Higgs mechanism?
Energy Technology Data Exchange (ETDEWEB)
Jegerlehner, Fred [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Humboldt Univ., Berlin (Germany). Inst. fuer Physik
2013-04-15
The discovery of the Higgs by ATLAS and CMS at the LHC not only provided the last missing building block of the electroweak Standard Model; the mass of the Higgs was also found to have a very peculiar value of about 125 GeV, which is such that vacuum stability extends up to the Planck scale. This may have much deeper consequences than anticipated so far. The impact on the running of the SM gauge, Yukawa and Higgs couplings up to the Planck scale has been discussed in several recent articles. Here we consider the impact on the running masses and we discuss the role of quadratic divergences within the Standard Model. The change of sign of the coefficient of the quadratically divergent terms, showing up at about μ₀ ≈ 7 × 10¹⁶ GeV, may be understood as a first-order phase transition restoring the symmetric phase, while its large negative values at lower scales trigger the Higgs mechanism; the running parameters evolve in such a way that the symmetry is restored two orders of magnitude below the Planck scale. Thus, the electroweak phase transition takes place at the scale μ₀ and not at the electroweak scale v ≈ 250 GeV. The SM Higgs system and its phase transition could play a key role in the inflation of the early universe. Baryogenesis also has to be reconsidered in light of the fact that perturbative arguments surprisingly work up to the Planck scale.
Afify, Mohammed Kamal
2018-01-01
The present study aims to identify standards for the design of interactive digital concept maps and their measurement indicators as a tool to develop, organize and administer e-learning content in the light of Meaningful Learning Theory and Constructivist Learning Theory. To achieve the objective of the research, the author prepared a list of E-learning…
Clifford, Scott; Iyengar, Vijeth; Cabeza, Roberto; Sinnott-Armstrong, Walter
2015-12-01
Research on the emotional, cognitive, and social determinants of moral judgment has surged in recent years. The development of moral foundations theory (MFT) has played an important role, demonstrating the breadth of morality. Moral psychology has responded by investigating how different domains of moral judgment are shaped by a variety of psychological factors. Yet the discipline lacks a validated set of moral violations that span the moral domain, creating a barrier to investigating influences on judgment and how their neural bases might vary across the moral domain. In this paper, we aim to fill this gap by developing and validating a large set of moral foundations vignettes (MFVs). Each vignette depicts a behavior violating a particular moral foundation and not others. The vignettes are controlled on many dimensions, including syntactic structure and complexity, making them suitable for neuroimaging research. We demonstrate the validity of our vignettes by examining respondents' classifications of moral violations, conducting exploratory and confirmatory factor analysis, and demonstrating the correspondence between the extracted factors and existing measures of the moral foundations. We expect that the MFVs will be beneficial for a wide variety of behavioral and neuroimaging investigations of moral cognition.
Alternative theory to overcome drawbacks of the standard inversion of seismic data
Smaglichenko, Tatyana A.; Horiuchi, Shigeki; Nikolaev, Alexey V.; Jacoby, Wolfgang R.
2010-05-01
The standard least squares method makes it possible to cope with overdetermined, inconsistent systems of linear equations. It is commonly accepted that the more measurements participate in the overall inversion, the better the solution. However, seismic data are a special case of geophysical observations, which in practice exhibit a large range of measurement values for similar seismic traces. Processing such contradictory observations together can lead to the loss of an adequate solution. To avoid this problem, we use a differentiated approach that subdivides the measurements into sets (sub-systems) formed from clusters of seismic activity and the different registration stations. This means that the initial matrix is divided into cells, each of which is a non-sparse matrix of lower order. The inversion of each sub-system is performed using methods distinct from the least-squares technique. In the case of a sufficient number of independent observations, we use a traditional algebraic method to build the basic minor of the matrix, which ensures the consistency of the sub-system. In the case where the seismic observations repeat each other (the sub-system becomes underdetermined), we apply the previously developed CSSA technique. The stable solution is found by comparing the outcomes of the sub-systems for a given block. Testing results show that the differentiated approach is more reliable when an overdetermined, inconsistent system is solved. When dealing with an underdetermined system, and consequently with the problem of non-uniqueness of solutions, we can obtain a solution that differs from the standard one. However, our solution accurately satisfies the observation data and thus also becomes a candidate for the correct inversion result. The effectiveness of the new approach compared with the standard methods has been confirmed by applying both approaches to the synthetic and the real
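The subdivision idea, solving each station-wise sub-system separately rather than one inconsistent global least-squares problem, can be illustrated schematically. The small matrices and the simple averaging of sub-solutions below are illustrative stand-ins; they are not the authors' CSSA algorithm:

```python
import numpy as np

# Global overdetermined system A x = b whose rows come from two "stations"
# measuring the same model vector x inconsistently (illustrative).
A1 = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # station 1 rows
b1 = np.array([2.0, 3.0, 5.0])                       # consistent: x = (2, 3)
A2 = np.array([[1.0, 0.0], [0.0, 1.0]])              # station 2 rows
b2 = np.array([2.6, 2.4])                            # contradictory data

# Standard approach: one global least-squares fit over all rows at once.
A = np.vstack([A1, A2])
b = np.concatenate([b1, b2])
x_global, *_ = np.linalg.lstsq(A, b, rcond=None)

# Differentiated approach: invert each sub-system separately, then compare
# (here: simply average) the sub-solutions for the block.
x1, *_ = np.linalg.lstsq(A1, b1, rcond=None)
x2, *_ = np.linalg.lstsq(A2, b2, rcond=None)
x_diff = 0.5 * (x1 + x2)

print(x_global.round(3), x1.round(3), x2.round(3), x_diff.round(3))
```

Keeping the sub-solutions separate makes the contradiction between stations visible (x1 versus x2), whereas the global fit silently blends it.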
Wu, Yue-Liang
2014-04-01
To better understand the quantum structure of field theory and the standard model of particle physics, it is necessary to investigate carefully the divergence structure of quantum field theories (QFTs) and work out a consistent framework for avoiding infinities. Divergences have caused trouble since the development of quantum electrodynamics in the 1930s. Their treatment via the renormalization scheme has not satisfied all physicists; Dirac and Feynman, among others, made serious criticisms. Renormalization group analysis reveals that QFTs can in general be defined fundamentally with a meaningful energy scale that has physical significance, which motivates the development of a new symmetry-preserving and infinity-free regularization scheme called loop regularization (LORE). A simple regularization prescription in LORE is realized based on the manifest postulate that a loop divergence with a power-counting dimension larger than or equal to the space-time dimension must vanish. The LORE method is achieved without modifying the original theory; it renders the divergent Feynman loop integrals well defined, maintaining the divergence structure while preserving the basic symmetries of the original theory. The crucial point in LORE is the presence of two intrinsic energy scales, which play the roles of an ultraviolet cutoff Mc and an infrared cutoff μs, to avoid infinities. As Mc can be made finite when taking both the primary regulator mass and number appropriately to infinity to recover the original integrals, the two energy scales Mc and μs in LORE become physically meaningful as the characteristic energy scale and the sliding energy scale, respectively. The key concept in LORE is the introduction of irreducible loop integrals (ILIs), on which the regularization prescription acts, leading to a set of gauge-invariance consistency conditions between the regularized tensor-type and scalar-type ILIs. An interesting observation in LORE is that the evaluation of ILIs with ultraviolet
Standardization in clinical enzymology: a challenge for the theory of metrological traceability.
Infusino, Ilenia; Schumann, Gerhard; Ceriotti, Ferruccio; Panteghini, Mauro
2010-03-01
The goal of standardization for measurement of the catalytic concentration of enzymes is to achieve comparable results in human samples, independent of the reagent kits, instruments, and laboratory where the assay is performed. To pursue this objective, the IFCC has established reference systems for the most important clinical enzymes. These systems are based on the following requirements: a) reference methods, well described and evaluated extensively; b) suitable reference materials; and c) reference laboratories operating in a highly controlled manner. When these reference systems are used appropriately, the diagnostic industry can assign traceable values to commercial calibrators. Clinical laboratories that use procedures with validated calibrators to measure human specimens can now obtain values that are traceable to higher-order reference procedures. These reference systems constitute the structure of the traceability chain to which the routine methods can be linked via an appropriate calibration process, provided that they have a comparable specificity (i.e., they are measuring the same catalytic quantity).
Preliminary results on 3D channel modeling: From theory to standardization
Kammoun, Abla
2014-06-01
Three dimensional (3D) beamforming (also called elevation beamforming) is now gaining interest among researchers in wireless communication. The reason can be attributed to its potential for enabling a variety of strategies such as sector- or user-specific elevation beamforming and cell splitting. Since these techniques cannot be directly supported by current LTE releases, the 3GPP is now working on defining the required technical specifications. In particular, a large effort is currently being made to obtain accurate 3D channel models that support the elevation dimension. This step is necessary, as it will evaluate the potential of 3D and full-dimensional (FD) beamforming techniques to benefit from the richness of real channels. This work aims at presenting the ongoing 3GPP study item 'Study on 3D-channel model for elevation beamforming and FD-MIMO studies for LTE' and positioning it with respect to previous standardization works. © 2014 IEEE.
Directory of Open Access Journals (Sweden)
Alexandre Gonzales
2013-05-01
Accounting has been undergoing significant changes, among them the creation of a standard for accounting in small and medium enterprises, in line with international accounting standards for companies of this size. This rule arose from a pronouncement developed by the Accounting Pronouncements Committee (CPC), which was subsequently approved by the Federal Accounting Council (CFC) through specific resolutions. The present study aimed to analyze the tendencies of accounting professionals and business managers concerning the adoption of the Technical Pronouncement issued by the Accounting Pronouncements Committee for Small and Medium Enterprises. Considering the level of complexity of the operations performed by companies of this size, the lack of oversight by specific entities, and the question of enforcement of these pronouncements, game theory was used to determine possible strategies adopted by accountants and business managers regarding the effective adoption of the pronouncement for SMEs. The study is characterized as descriptive research, using bibliographical and field research. Surveys were used to identify the perceptions of accounting professionals regarding adoption of the pronouncement. It was found that this pronouncement constitutes a valid legal standard, endowed with legal effectiveness and technical efficiency, but lacking social effectiveness due to the low level of effort toward its adoption, both by accounting professionals and by firms.
The standard model and some new directions. [for scientific theory of Active Galactic Nuclei
Blandford, R. D.; Rees, M. J.
1992-01-01
A 'standard' model of Active Galactic Nuclei (AGN), based upon a massive black hole surrounded by a thin accretion disk, is defined. It is argued that, although there is good evidence for the presence of black holes and orbiting gas, most of the details of this model are either inadequate or controversial. Magnetic field may be responsible for the confinement of continuum and line-emitting gas, for the dynamical evolution of accretion disks and for the formation of jets. It is further argued that gaseous fuel is supplied in molecular form and that this is responsible for thermal re-radiation, equatorial obscuration and, perhaps, the broad line gas clouds. Stars may also supply gas close to the black hole, especially in low power AGN and they may be observable in discrete orbits as probes of the gravitational field. Recent observations suggest that magnetic field, stars, dusty molecular gas and orientation effects must be essential components of a complete description of AGN. The discovery of quasars with redshifts approaching 5 is an important clue to the mechanism of galaxy formation.
Energy Technology Data Exchange (ETDEWEB)
Brower, Richard C. [Boston Univ., MA (United States). Physics and ECE Depts.
2016-11-08
This proposal is to develop the software and algorithmic infrastructure needed for the numerical study of quantum chromodynamics (QCD), and of theories that have been proposed to describe physics beyond the Standard Model (BSM) of high energy physics, on current and future computers. This infrastructure will enable users (1) to improve the accuracy of QCD calculations to the point where they no longer limit what can be learned from high-precision experiments that seek to test the Standard Model, and (2) to determine the predictions of BSM theories in order to understand which of them are consistent with the data that will soon be available from the LHC. Work will include the extension and optimizations of community codes for the next generation of leadership class computers, the IBM Blue Gene/Q and the Cray XE/XK, and for the dedicated hardware funded for our field by the Department of Energy. Members of our collaboration at Brookhaven National Laboratory and Columbia University worked on the design of the Blue Gene/Q, and have begun to develop software for it. Under this grant we will build upon their experience to produce high-efficiency production codes for this machine. Cray XE/XK computers with many thousands of GPU accelerators will soon be available, and the dedicated commodity clusters we obtain with DOE funding include growing numbers of GPUs. We will work with our partners in NVIDIA's Emerging Technology group to scale our existing software to thousands of GPUs, and to produce highly efficient production codes for these machines. Work under this grant will also include the development of new algorithms for the effective use of heterogeneous computers, and their integration into our codes. It will include improvements of Krylov solvers and the development of new multigrid methods in collaboration with members of the FASTMath SciDAC Institute, using their HYPRE framework, as well as work on improved symplectic integrators.
Energy Technology Data Exchange (ETDEWEB)
Mankoc Borstnik, N.S. [University of Ljubljana (Slovenia); Nielsen, H.B.F. [Niels Bohr Institute, Copenhagen (Denmark)
2017-12-15
The standard model has for massless quarks and leptons "miraculously" no triangle anomalies, due to the fact that the sum of all possible traces Tr[τ^{Ai} τ^{Bj} τ^{Ck}] (where τ^{Ai}, τ^{Bj} and τ^{Ck} are the generators of one, of two or of three of the groups SU(3), SU(2) and U(1)) over the representations of one family of the left-handed fermions and anti-fermions (and separately of the right-handed fermions and anti-fermions) contributing to the triangle currents, is equal to zero.[1-4] It is demonstrated in this paper that this cancellation of the standard model triangle anomaly follows straightforwardly if SO(3,1), SU(2), U(1) and SU(3) are subgroups of the orthogonal group SO(13,1), as they are in the spin-charge-family theory.[5-22] We comment on the SO(10) anomaly cancellation, which works if handedness and charges are related "by hand". (copyright 2017 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
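The anomaly cancellation can be checked directly, for example for the [U(1)]³ triangle, by summing Y³ over the left-handed Weyl fermions of one family with the standard hypercharge assignments (convention Q = T₃ + Y). This is a textbook consistency check, not a computation from the paper:

```python
from fractions import Fraction as F

# (hypercharge Y, multiplicity) for the left-handed Weyl fermions of one
# SM family, in the convention Q = T3 + Y:
fermions = [
    (F(1, 6), 6),    # quark doublet Q: 3 colors x 2 isospin states
    (F(-2, 3), 3),   # up anti-quark u^c: 3 colors
    (F(1, 3), 3),    # down anti-quark d^c: 3 colors
    (F(-1, 2), 2),   # lepton doublet L: 2 isospin states
    (F(1, 1), 1),    # positron e^c
]

# [U(1)]^3 triangle anomaly: sum of Y^3 over all states
anomaly_yyy = sum(mult * y**3 for y, mult in fermions)
# mixed gravitational-U(1) anomaly: sum of Y
anomaly_y = sum(mult * y for y, mult in fermions)

print(anomaly_yyy, anomaly_y)  # both exactly 0
```

Exact rational arithmetic makes the cancellation manifest: 1/36 - 8/9 + 1/9 - 1/4 + 1 = 0.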
Bounds on Scalar Masses in Theories of Moduli Stabilization
Acharya, Bobby Samir; Kuflik, Eric
2014-01-01
In recent years it has been realised that pre-BBN decays of moduli can be a significant source of dark matter production, giving a 'non-thermal WIMP miracle' and substantially reduced fine-tuning in cosmological axion physics. We study moduli masses and sharpen the claim that moduli dominated the pre-BBN Universe. We conjecture that in any string theory with stabilized moduli there will be at least one modulus field whose mass is of order (or less than) the gravitino mass, and we prove this for a large class of models based on Calabi-Yau extra dimensions. Cosmology then generically requires that the gravitino mass not be less than about 30 TeV and that the cosmological history of the Universe be non-thermal prior to BBN. Stable LSPs produced in these decays can account for the observed dark matter if they are 'wino-like', which is consistent with the PAMELA data for positrons and antiprotons. With WIMP dark matter, there is an upper limit on the gravitino mass of order 250 TeV. We briefly consider implications for the ...
Energy Technology Data Exchange (ETDEWEB)
Wang, Chao; Xu, Zhijie; Lai, Canhai; Sun, Xin
2018-07-01
The standard two-film theory (STFT) is a diffusion-based mechanism that can be used to describe gas mass transfer across liquid film. Fundamental assumptions of the STFT impose serious limitations on its ability to predict mass transfer coefficients. To better understand gas absorption across liquid film in practical situations, a multiphase computational fluid dynamics (CFD) model fully equipped with mass transport and chemistry capabilities has been developed for solvent-based carbon dioxide (CO2) capture to predict the CO2 mass transfer coefficient in a wetted wall column. The hydrodynamics is modeled using a volume of fluid method, and the diffusive and reactive mass transfer between the two phases is modeled by adopting a one-fluid formulation. We demonstrate that the proposed CFD model can naturally account for the influence of many important factors on the overall mass transfer that cannot be quantitatively explained by the STFT, such as the local variation in fluid velocities and properties, flow instabilities, and complex geometries. The CFD model also can predict the local mass transfer coefficient variation along the column height, which the STFT typically does not consider.
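For reference, the STFT baseline the CFD model is compared against reduces to a one-line formula: steady diffusion across a stagnant film of thickness δ gives k_L = D/δ, enhanced by chemical reaction in the fast pseudo-first-order regime. All numerical values below are order-of-magnitude assumptions, not results from the paper:

```python
import math

# Standard two-film theory (STFT) sketch for reactive CO2 absorption.
D = 1.9e-9      # CO2 diffusivity in solvent [m^2/s] (typical order)
delta = 5.0e-5  # assumed stagnant-film thickness [m]
k1 = 8.0e3      # pseudo-first-order rate constant [1/s] (assumed)

k_L = D / delta               # physical mass transfer coefficient [m/s]
Ha = math.sqrt(k1 * D) / k_L  # Hatta number
E = Ha / math.tanh(Ha)        # enhancement factor (fast-reaction regime)
k_eff = E * k_L               # chemically enhanced coefficient

print(f"k_L = {k_L:.2e} m/s, Ha = {Ha:.1f}, k_eff = {k_eff:.2e} m/s")
```

Because δ, and hence k_L, must be assumed rather than computed, the STFT cannot resolve local hydrodynamic effects; that is exactly the gap the CFD model in the abstract is built to fill.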
van Lange, P.A.M.
2013-01-01
The construction and development of theory is one of the central routes to scientific progress. But what exactly constitutes a good theory? What is it that people might expect from an ideal theory? This article advances a new model, which delineates truth, abstraction, progress, and applicability as
Moradi, Bonnie; Dirks, Danielle; Matteson, Alicia V.
2005-01-01
This study extends the literature on eating disorder symptomatology by testing, based on extant literature on objectification theory (B. L. Fredrickson & T. Roberts, 1997) and the role of sociocultural standards of beauty (e.g., L. J. Heinberg, J. K. Thompson, & S. Stormer, 1995), a model that examines (a) links of reported sexual objectification…
Chonody, Jill M; Teater, Barbra
2016-01-01
Outward appearance is one of the means by which age is determined, and fear of looking old may stem from fears about social identity and death. This study explored how social identity theory and terror management theory may help to explain the dread of looking old. University students from the United States, England, and Australia (N = 1,042) completed a questionnaire regarding their attitudes about aging and older adults. Results indicated that sex, age, beliefs about personal aging, and death anxiety explained 30.4% of the variance for participants' dread of looking old. Theoretical hypotheses were supported by our findings.
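The variance-explained figure (30.4%) is the R² of a multiple regression of dread of looking old on the four predictors. The synthetic data below merely illustrate how such an R² is computed and bear no relation to the study's sample:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic predictors: sex, age, beliefs about personal aging, death anxiety
X = np.column_stack([
    rng.integers(0, 2, n),   # sex (0/1)
    rng.uniform(18, 40, n),  # age
    rng.normal(0, 1, n),     # aging beliefs (standardized)
    rng.normal(0, 1, n),     # death anxiety (standardized)
])
# Synthetic outcome with noise, so R^2 stays well below 1
y = 0.8 * X[:, 2] + 0.6 * X[:, 3] + rng.normal(0, 1.5, n)

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ beta
r2 = 1 - resid.var() / y.var()  # proportion of variance explained
print(round(r2, 3))
```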
Zylstra, A B; Herrmann, H W; Johnson, M Gatu; Kim, Y H; Frenje, J A; Hale, G; Li, C K; Rubery, M; Paris, M; Bacher, A; Brune, C R; Forrest, C; Glebov, V Yu; Janezic, R; McNabb, D; Nikroo, A; Pino, J; Sangster, T C; Séguin, F H; Seka, W; Sio, H; Stoeckl, C; Petrasso, R D
2016-07-15
Light nuclei were created during big-bang nucleosynthesis (BBN). Standard BBN theory, using rates inferred from accelerator-beam data, cannot explain high levels of ^{6}Li in low-metallicity stars. Using high-energy-density plasmas we measure the T(^{3}He,γ)^{6}Li reaction rate, a candidate for anomalously high ^{6}Li production; we find that the rate is too low to explain the observations, and different than values used in common BBN models. This is the first data directly relevant to BBN, and also the first use of laboratory plasmas, at comparable conditions to astrophysical systems, to address a problem in nuclear astrophysics.
International Nuclear Information System (INIS)
Genova, R.T.; Martin, I.R.; Rodriguez-Mendoza, U.R.; Lahoz, F.; Lozano-Gorrin, A.D.; Nunez, P.; Gonzalez-Platas, J.; Lavin, V.
2004-01-01
The optical characterisation of Pr³⁺ ions in transparent SiO₂-Al₂O₃-CdF₂-PbF₂-YF₃-based glass and glass-ceramic has been performed. From absorption and emission spectra, the oscillator strengths of the 4f²-4f² electronic transitions have been obtained. The intensity parameters have been calculated using both the Judd-Ofelt theory and the modified theory developed by Kornienko, Kaminskii and Dunina. A comparison of the experimental oscillator strengths, the spontaneous emission probabilities and the lifetimes of the ³P₀ level with those calculated using the above theoretical procedures has been performed for both samples. The root mean square deviation found using the standard Judd-Ofelt theory is larger than the value obtained with the modified treatment.
Procacci, Piero; Chelli, Riccardo
2017-05-09
The present paper is intended to be a comprehensive assessment and rationalization, from a statistical mechanics perspective, of existing alchemical theories for binding free energy calculations of ligand-receptor systems. In detail, the statistical mechanics foundation of noncovalent interactions in ligand-receptor systems is revisited, providing a unifying treatment that encompasses the most important variants in the alchemical approaches, from the seminal double annihilation method [Jorgensen et al., J. Chem. Phys. 1988, 89, 3742] to the double decoupling method [Gilson et al., Biophys. J. 1997, 72, 1047] and the Deng and Roux alchemical theory [Deng and Roux, J. Chem. Theory Comput. 2006, 2, 1255]. Connections and differences between the various alchemical approaches are highlighted and discussed.
Richter Lagha, Regina A; Boscardin, Christy K; May, Win; Fung, Cha-Chi
2012-08-01
Scoring clinical assessments in a reliable and valid manner using criterion-referenced standards remains an important issue and directly affects decisions made regarding examinee proficiency. This generalizability study of students' clinical performance examination (CPX) scores examines the reliability of those scores and of their interpretation, particularly according to a newly introduced, "critical actions" criterion-referenced standard and scoring approach. The authors applied a generalizability framework to the performance scores of 477 third-year students attending three different medical schools in 2008. The norm-referenced standard included all station checklist items. The criterion-referenced standard included only those items deemed critical to patient care by a faculty panel. The authors calculated and compared variance components and generalizability coefficients for each standard across six common stations. Norm-referenced scores had moderate generalizability (ρ = 0.51), whereas criterion-referenced scores showed low dependability (φ = 0.20). The estimated 63% of measurement error associated with the person-by-station interaction suggests case specificity. Increasing the number of stations on the CPX from 6 to 24, an impractical solution both for cost and time, would still yield only moderate dependability (φ = 0.50). Though the performance assessment of complex skills, like clinical competence, seems intrinsically valid, careful consideration of the scoring standard and approach is needed to avoid misinterpretation of proficiency. Further study is needed to determine how best to improve the reliability of criterion-referenced scores, by implementing changes to the examination structure, the process of standard-setting, or both.
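The dependability projection quoted above (φ = 0.20 with 6 stations rising only to φ = 0.50 with 24) follows from the standard G-theory formula φ = σ²_person / (σ²_person + σ²_error / n_stations). The variance components below are chosen to reproduce the reported coefficients and are illustrative, not the study's estimates:

```python
def phi(var_person: float, var_error: float, n_stations: int) -> float:
    """Index of dependability for absolute decisions in G-theory:
    phi = sigma^2_person / (sigma^2_person + sigma^2_error / n)."""
    return var_person / (var_person + var_error / n_stations)

# Illustrative components scaled so phi(6) matches the reported 0.20; the
# error term lumps station and person-by-station (case-specificity) variance.
var_p = 1.0
var_e = 24.0

print(round(phi(var_p, var_e, 6), 2))   # 0.2 with the 6-station exam
print(round(phi(var_p, var_e, 24), 2))  # only 0.5 even with 24 stations
```

The formula makes the abstract's point concrete: when person-by-station error dominates, quadrupling the number of stations still leaves half the score variance as noise.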
MARMOSET: The Path from LHC Data to the New Standard Model via On-Shell Effective Theories
Energy Technology Data Exchange (ETDEWEB)
Arkani-Hamed, Nima; Schuster, Philip; Toro, Natalia; /Harvard U., Phys. Dept.; Thaler, Jesse; /UC, Berkeley /LBL, Berkeley; Wang, Lian-Tao; /Princeton U.; Knuteson, Bruce; /MIT, LNS; Mrenna, Stephen; /Fermilab
2007-03-01
We describe a coherent strategy and set of tools for reconstructing the fundamental theory of the TeV scale from LHC data. We show that On-Shell Effective Theories (OSETs) effectively characterize hadron collider data in terms of masses, production cross sections, and decay modes of candidate new particles. An OSET description of the data strongly constrains the underlying new physics, and sharply motivates the construction of its Lagrangian. Simulating OSETs allows efficient analysis of new-physics signals, especially when they arise from complicated production and decay topologies. To this end, we present MARMOSET, a Monte Carlo tool for simulating the OSET version of essentially any new-physics model. MARMOSET enables rapid testing of theoretical hypotheses suggested by both data and model-building intuition, which together chart a path to the underlying theory. We illustrate this process by working through a number of data challenges, where the most important features of TeV-scale physics are reconstructed with as little as 5 fb⁻¹ of simulated LHC signals.
Egeberg, Helen M.; McConney, Andrew; Price, Anne
2016-01-01
This article reviews the conceptual and empirical research on classroom management to ascertain the extent to which there is consistency between the "advice" found in the research literature and the professional standards for teachers and initial teacher education, in regards to knowledge and perspectives about effective classroom…
Weck, Philippe F; Kim, Eunja; Jové-Colón, Carlos F
2015-07-28
The structural, mechanical and thermodynamic properties of 1 : 1 layered dioctahedral kaolinite clay, with ideal Al2Si2O5(OH)4 stoichiometry, were investigated using density functional theory corrected for dispersion interactions (DFT-D2). The bulk moduli of 56.2 and 56.0 GPa predicted at 298 K using the Vinet and Birch-Murnaghan equations of state, respectively, are in good agreement with the recent experimental value of 59.7 GPa reported for well-crystallized samples. The isobaric heat capacity computed for uniaxial deformation of kaolinite along the stacking direction reproduces calorimetric data within 0.7-3.0% from room temperature up to its thermal stability limit.
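The Birch-Murnaghan fit mentioned above has a simple closed form. A sketch of the third-order pressure-volume relation, using the bulk modulus quoted in the abstract; the pressure derivative B0' and the volumes are assumed illustrative values, not reported results:

```python
def birch_murnaghan_pressure(v, v0, b0, b0_prime):
    """Third-order Birch-Murnaghan equation of state: pressure (GPa) at
    volume v, given equilibrium volume v0, bulk modulus b0 (GPa), and its
    dimensionless pressure derivative b0_prime."""
    eta = (v0 / v) ** (1.0 / 3.0)
    return 1.5 * b0 * (eta ** 7 - eta ** 5) * (
        1.0 + 0.75 * (b0_prime - 4.0) * (eta ** 2 - 1.0))

b0 = 56.0   # GPa, close to the DFT-D2 value quoted above
b0p = 4.0   # assumed pressure derivative (typical, illustrative)
v0 = 1.0    # volumes measured in units of the equilibrium volume

print(birch_murnaghan_pressure(v0, v0, b0, b0p))         # 0.0 GPa at equilibrium
print(birch_murnaghan_pressure(0.95 * v0, v0, b0, b0p))  # compression -> positive P
```

In practice one fits (v0, b0, b0_prime) to a set of computed or measured energy-volume or pressure-volume points; the function above is the model being fitted.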
AUTHOR|(INSPIRE)INSPIRE-00345539
A search for a heavy right-handed $W_{R}$ boson and heavy right-handed neutrinos $N_{\ell}$ ($\ell = e, \mu$) performed by the CMS experiment is summarized here. Using the 2.6 fb$^{-1}$ of integrated luminosity recorded by the CMS experiment in 2015 at a center-of-mass energy of 13 TeV, this search seeks evidence of a $W_{R}$ boson and $N_{\ell}$ neutrinos in events with two leptons and two jets. The data do not significantly exceed expected backgrounds, and are consistent with expected results of the Standard Theory given uncertainties. For Standard Theory extensions with strict left-right symmetry, and assuming only one $N_{\ell}$ flavor contributes significantly to the $W_{R}$ decay width, mass limits are set in the two-dimensional $(M_{W_{R}}, M_{N_{\ell}})$ plane at 95% confidence level. The limits extend to a $W_{R}$ mass of 3.3 TeV in the electron channel and 3.5 TeV in the muon channel, and span a wide range of $M_{N_{\ell}}$ masses below $M_{W_{R}}$.
Loring, FH
2014-01-01
Summarising the most novel facts and theories which were coming into prominence at the time, particularly those which had not yet been incorporated into standard textbooks, this important work was first published in 1921. The subjects treated cover a wide range of research that was being conducted into the atom, and include Quantum Theory, the Bohr Theory, the Sommerfield extension of Bohr's work, the Octet Theory and Isotopes, as well as Ionisation Potentials and Solar Phenomena. Because much of the material of Atomic Theories lies on the boundary between experimentally verified fact and spec
Invariant Theory (IT) & Standard Monomial Theory (SMT)
Indian Academy of Sciences (India)
2013-07-06
Jul 6, 2013 ... Imagine several points in the (usual 2-dimensional) plane, say 4 of them, coloured red, blue, green, and yellow, scattered around a given central point. Introducing co-ordinate axes in the usual fashion (with origin at the central point), we can represent the points by ordered pairs (x_red, y_red), (x_blue, y_blue), ...
Invariant Theory (IT) & Standard Monomial Theory (SMT)
Indian Academy of Sciences (India)
2013-07-06
Jul 6, 2013 ... Introducing co-ordinate axes in the usual fashion (with origin at the central point), we can represent the points by ordered pairs (x_red, y_red), (x_blue, y_blue), (x_green, y_green), (x_yellow, y_yellow). ... What are the (polynomial) invariants in this case? The dot products x_red² + y_red², ... (of every point with itself) ...
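The claim that dot products are invariants can be checked numerically: rotating all the points about the origin leaves every pairwise dot product, including each point's squared length x² + y², unchanged. A small sketch with arbitrary illustrative coordinates (not taken from the article):

```python
import math

def rotate(p, theta):
    """Rotate the point p = (x, y) about the origin by angle theta."""
    x, y = p
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

def dot(p, q):
    return p[0] * q[0] + p[1] * q[1]

# Four coloured points scattered around the origin (illustrative values).
points = [(1.0, 2.0), (-0.5, 1.5), (2.0, -1.0), (-1.0, -2.0)]
rotated = [rotate(p, 0.7) for p in points]  # same rotation applied to all

# Every pairwise dot product -- including dot(p, p), the squared length
# x**2 + y**2 named in the article -- survives the rotation unchanged.
for i in range(4):
    for j in range(4):
        assert abs(dot(points[i], points[j]) - dot(rotated[i], rotated[j])) < 1e-9
```

This is the numerical face of the first fundamental theorem for the orthogonal group: the dot products generate all polynomial invariants of point configurations under rotation.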
Challenges to the standard model of Big Bang nucleosynthesis
International Nuclear Information System (INIS)
Steigman, G.
1993-01-01
Big Bang nucleosynthesis provides a unique probe of the early evolution of the Universe and a crucial test of the consistency of the standard hot Big Bang cosmological model. Although the primordial abundances of 2 H, 3 He, 4 He, and 7 Li inferred from current observational data are in agreement with those predicted by Big Bang nucleosynthesis, recent analysis has severely restricted the consistent range for the nucleon-to-photon ratio: 3.7 ≤ η 10 ≤ 4.0. Increased accuracy in the estimate of primordial 4 He and observations of Be and B in Pop II stars are offering new challenges to the standard model and suggest that no new light particles may be allowed (N ν BBN ≤ 3.0, where N ν is the number of equivalent light neutrinos). 23 refs
The BBN Knowledge Acquisition Project: Phase 2
1990-11-01
S Varadhan, S R
2001-01-01
This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent rando
International Nuclear Information System (INIS)
Quigg, C.
1984-09-01
The SU(3) c ⊗ SU(2) L ⊗ U(1) Y gauge theory of interactions among quarks and leptons is briefly described, and some recent notable successes of the theory are mentioned. Some shortcomings in our ability to apply the theory are noted, and the incompleteness of the standard model is exhibited. Experimental hints that Nature may be richer in structure than the minimal theory are discussed. 23 references
International Nuclear Information System (INIS)
Jarlskog, C.
An introduction to the unified gauge theories of weak and electromagnetic interactions is given. The ingredients of gauge theories and symmetries and conservation laws lead to discussion of local gauge invariance and QED, followed by weak interactions and quantum flavor dynamics. The construction of the standard SU(2)xU(1) model precedes discussion of the unification of weak and electromagnetic interactions and weak neutral current couplings in this model. Presentation of spontaneous symmetry breaking and spontaneous breaking of a local symmetry leads to a spontaneous breaking scheme for the standard SU(2)xU(1) model. Consideration of quarks, leptons, masses and the Cabibbo angles, of the four quark and six quark models and CP violation leads finally to grand unification, followed by discussion of mixing angles in the Georgi-Glashow model, the Higgses of the SU(5) model and proton/neutron decay in SU(5). (JIW)
Hashiguchi, Koichi
2014-01-01
This book was written to serve as the standard textbook of elastoplasticity for students, engineers and researchers in the field of applied mechanics. The present second edition is improved thoroughly from the first edition by selecting the standard theories from various formulations and models, which are required to study the essentials of elastoplasticity steadily and effectively and will remain universally in the history of elastoplasticity. It opens with an explanation of vector-tensor analysis and continuum mechanics as a foundation to study elastoplasticity theory, extending over various strain and stress tensors and their rates. Subsequently, constitutive equations of elastoplastic and viscoplastic deformations for monotonic, cyclic and non-proportional loading behavior in a general rate and their applications to metals and soils are described in detail, and constitutive equations of friction behavior between solids and its application to the prediction of stick-slip phenomena are delineated. In additi...
CERN. Geneva HR-RFA
2006-01-01
Suggested Readings: Aspects of Quantum Chromodynamics/A Pich, arXiv:hep-ph/0001118. - The Standard Model of Electroweak Interactions/A Pich, arXiv:hep-ph/0502010. - The Standard Model of Particle Physics/A Pich. The Standard Model of Elementary Particle Physics will be described. A detailed discussion of the particle content, structure and symmetries of the theory will be given, together with an overview of the most important experimental facts which have established this theoretical framework as the Standard Theory of particle interactions.
International Nuclear Information System (INIS)
Gaillard, M.K.
1990-04-01
The unresolved issues of the standard model are reviewed, with emphasis on the gauge hierarchy problem. A possible mechanism for generating a hierarchy in the context of superstring theory is described. 24 refs
String theory and particle physics
International Nuclear Information System (INIS)
Uranga, Angel
2006-01-01
I will provide a basic introduction to string theory as a unified theory of gravitational and gauge interactions. I will review recent constructions of string theory models leading at low energies to the Standard Model of particle interactions, and which include interesting new phenomenology beyond the standard model, like supersymmetry, branes, and (possibly large) extra dimensions
Diestel, Reinhard
2017-01-01
This standard textbook of modern graph theory, now in its fifth edition, combines the authority of a classic with the engaging freshness of style that is the hallmark of active mathematics. It covers the core material of the subject with concise yet reliably complete proofs, while offering glimpses of more advanced methods in each field by one or two deeper results, again with proofs given in full detail. The book can be used as a reliable text for an introductory course, as a graduate text, and for self-study. From the reviews: “This outstanding book cannot be substituted with any other book on the present textbook market. It has every chance of becoming the standard textbook for graph theory.” Acta Scientiarum Mathematicarum “Deep, clear, wonderful. This is a serious book about the heart of graph theory. It has depth and integrity.” Persi Diaconis & Ron Graham, SIAM Review “The book has received a very enthusiastic reception, which it amply deserves. A masterly elucidation of modern graph theo...
American Society for Testing and Materials. Philadelphia
2005-01-01
1.1 This test method covers the calculation from heat transfer theory of the stagnation enthalpy from experimental measurements of the stagnation-point heat transfer and stagnation pressure. 1.2 Advantages 1.2.1 A value of stagnation enthalpy can be obtained at the location in the stream where the model is tested. This value gives a consistent set of data, along with heat transfer and stagnation pressure, for ablation computations. 1.2.2 This computation of stagnation enthalpy does not require the measurement of any arc heater parameters. 1.3 Limitations and Considerations: There are many factors that may contribute to an error using this type of approach to calculate stagnation enthalpy, including: 1.3.1 Turbulence: The turbulence generated by adding energy to the stream may cause deviation from the laminar equilibrium heat transfer theory. 1.3.2 Equilibrium, Nonequilibrium, or Frozen State of Gas: The reaction rates and expansions may be such that the gas is far from thermodynamic equilibrium. 1.3.3 Noncat...
Altarelli, Guido
1999-01-01
Introduction. Structure of gauge theories. The QED and QCD examples. Chiral theories. The electroweak theory. Spontaneous symmetry breaking. The Higgs mechanism. Gauge boson and fermion masses. Yukawa coupling. Charged current couplings. The Cabibbo-Kobayashi-Maskawa matrix and CP violation. Neutral current couplings. The Glashow-Iliopoulos-Maiani mechanism. Gauge boson and Higgs couplings. Radiative corrections and loops. Cancellation of the chiral anomaly. Limits on the Higgs mass. Problems of the Standard Model. Outlook.
DEFF Research Database (Denmark)
Wæver, Ole
2009-01-01
Kenneth N. Waltz's 1979 book, Theory of International Politics, is the most influential in the history of the discipline. It worked its effects to a large extent through raising the bar for what counted as theoretical work, in effect reshaping not only realism but rivals like liberalism and reflectivism. Yet, ironically, there has been little attention to Waltz's very explicit and original arguments about the nature of theory. This article explores and explicates Waltz's theory of theory. Central attention is paid to his definition of theory as ‘a picture, mentally formed' and to the radical anti...
Diestel, Reinhard
2012-01-01
This standard textbook of modern graph theory, now in its fourth edition, combines the authority of a classic with the engaging freshness of style that is the hallmark of active mathematics. It covers the core material of the subject with concise yet reliably complete proofs, while offering glimpses of more advanced methods in each field by one or two deeper results, again with proofs given in full detail. The book can be used as a reliable text for an introductory course, as a graduate text, and for self-study. From the reviews: "Deep, clear, wonderful. This is a serious book about the
Hall, Marshall
2018-01-01
This 1959 text offers an unsurpassed resource for learning and reviewing the basics of a fundamental and ever-expanding area. "This remarkable book undoubtedly will become a standard text on group theory." - American Scientist.
International Nuclear Information System (INIS)
Wilczek, F.
1993-01-01
The standard model of particle physics is highly successful, although it is obviously not a complete or final theory. In this presentation the author argues that the structure of the standard model gives some quite concrete, compelling hints regarding what lies beyond. Essentially, this presentation is a record of the author's own judgement of what the central clues for physics beyond the standard model are, and also it is an attempt at some pedagogy. 14 refs., 6 figs
Cooney, Adeline
2011-01-01
This paper explores ways to enhance and demonstrate rigour in a grounded theory study. Grounded theory is sometimes criticised for a lack of rigour. Beck (1993) identified credibility, auditability and fittingness as the main standards of rigour for qualitative research methods. These criteria were evaluated for applicability to a Straussian grounded theory study and expanded or refocused where necessary. The author uses a Straussian grounded theory study (Cooney, in press) to examine how the revised criteria can be applied when conducting a grounded theory study. Strauss and Corbin's (1998b) criteria for judging the adequacy of a grounded theory were examined in the context of the wider literature examining rigour in qualitative research studies in general and grounded theory studies in particular. A literature search for 'rigour' and 'grounded theory' was carried out to support this analysis. Criteria are suggested for enhancing and demonstrating the rigour of a Straussian grounded theory study. These include: cross-checking emerging concepts against participants' meanings, asking experts if the theory 'fit' their experiences, and recording detailed memos outlining all analytical and sampling decisions. IMPLICATIONS FOR RESEARCH PRACTICE: The criteria identified have been expressed as questions to enable novice researchers to audit the extent to which they are demonstrating rigour when writing up their studies. However, it should not be forgotten that rigour is built into the grounded theory method through the inductive-deductive cycle of theory generation. Care in applying the grounded theory methodology correctly is the single most important factor in ensuring rigour.
DEFF Research Database (Denmark)
Løvengreen, Hans Henrik
2002-01-01
In this set of notes, we present some of the basic theory underlying the discipline of programming with concurrent processes/threads. The notes are intended to supplement a standard textbook on concurrent programming.
Directory of Open Access Journals (Sweden)
Gabriel Constantino Blain
2014-01-01
Full Text Available The Standardized Precipitation Index (SPI) is a mathematical algorithm developed for detecting and characterizing precipitation departures with regard to an expected regional climate condition. Thus, this study aimed to verify the possibility of using the time-independent general extreme value distribution (GEV) for modeling the probability of occurrence of both SPI annual maxima (the maximum monthly SPI value; SPImax) and SPI annual minima (the minimum monthly SPI value; SPImin) obtained from the weather station of Campinas, State of São Paulo, Brazil (1891-2011), and to evaluate the presence of trends, temporal persistence and periodical components in these two datasets. The goodness-of-fit tests used in this study quantify the agreement between the empirical cumulative distribution and the GEV cumulative function. Our results have indicated that such a parametric function can be used to assess the probability of occurrence of SPImin and SPImax values. No significant serial correlation and no trend were detected in either series. For the SPImin, the wavelet analysis has detected a dominant mode in the 4-8 year band. Future studies should focus on the development of a GEV model capable of accounting for such a feature. No dominant mode was found for the annual SPI maxima.
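The GEV cumulative function used in those goodness-of-fit tests has a closed form, and return levels follow from its inverse. A minimal sketch with made-up shape, location, and scale parameters rather than values fitted to the Campinas series:

```python
import math

def gev_cdf(x, mu, sigma, xi):
    """GEV cumulative distribution F(x) for shape xi != 0:
    F(x) = exp(-(1 + xi*(x - mu)/sigma)**(-1/xi)) on its support."""
    t = 1.0 + xi * (x - mu) / sigma
    if t <= 0.0:
        # Outside the support: below the lower endpoint when xi > 0,
        # above the upper endpoint when xi < 0.
        return 0.0 if xi > 0 else 1.0
    return math.exp(-t ** (-1.0 / xi))

def gev_quantile(p, mu, sigma, xi):
    """Inverse CDF: the level not exceeded with probability p (xi != 0)."""
    return mu + (sigma / xi) * ((-math.log(p)) ** (-xi) - 1.0)

# Illustrative parameters (assumed; not the fitted Campinas values).
mu, sigma, xi = 1.5, 0.5, 0.1

# The T-year return level of SPImax is the (1 - 1/T) quantile.
ten_year_level = gev_quantile(1.0 - 1.0 / 10.0, mu, sigma, xi)
assert abs(gev_cdf(ten_year_level, mu, sigma, xi) - 0.9) < 1e-9
```

In a real analysis the three parameters would be estimated from the annual-extreme series (e.g. by maximum likelihood) before quantile or return-level statements are made; the block only shows the distribution-side machinery.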
Extended Theories of Gravitation
Directory of Open Access Journals (Sweden)
Fatibene Lorenzo
2013-09-01
Full Text Available Extended theories of gravitation are naturally singled out by an analysis inspired by the Ehlers-Pirani-Schild framework. In this framework the structure of spacetime is described by a Weyl geometry which is enforced by dynamics. Standard General Relativity is just one possible theory within the class of extended theories of gravitation. All Palatini f(R) theories are also shown to be extended theories of gravitation. This more general setting allows a more general interpretation scheme and more general possible couplings between gravity and matter. The definitions and constructions of extended theories will be reviewed. A general interpretation scheme will be considered for extended theories and some examples will be considered.
Energy Technology Data Exchange (ETDEWEB)
Gerlich, G. [Universitaet Carolo-Wilhelmina, Braunschweig (Germany)
1992-07-01
The first three of these axioms describe quantum theory and classical mechanics as statistical theories from the very beginning. With these, it can be shown in what sense a probability theory more general than the conventional measure-theoretic one is used in quantum theory. One gets this generalization by defining transition probabilities on pairs of events (not sets of pairs) as a fundamental, not derived, concept. A comparison with standard theories of stochastic processes gives a very general formulation of the non-existence of quantum theories with hidden variables. The Cartesian product of probability spaces can be given a natural algebraic structure, the structure of an orthocomplemented, orthomodular, quasimodular, not modular, not distributive lattice, which can be compared with quantum logic (the lattice of all closed subspaces of an infinite dimensional Hilbert space). It is shown how our given system of axioms suggests generalized quantum theories, especially Schroedinger equations, for phase space amplitudes. 38 refs., 3 figs., 1 tab.
Fundamentals of electroweak theory
Hořejší, Jiří
2002-01-01
This monograph of Prof. Hořejší is based on a series of his lectures given at the Faculty of Mathematics and Physics of Charles University during the 1990s. The author gives a thorough and easy-to-read account of the basic principles of the standard model of electroweak interactions, describes various theories of electromagnetic and weak interactions, and explains the gauge theory of electroweak interactions. Five appendices expound on some special techniques of the Standard Model used in the main body of the text. Thanks to the author's pedagogical skills and professional erudition, the book can be read with only a preliminary knowledge of quantum field theory.
Boonstra, Harm Jan Hugo
1996-01-01
The physics of elementary particles is currently described in terms of a very successful theory called the standard model. It describes all known elementary particles and their interactions except gravitational interactions. The standard model accommodates the quarks and the leptons which are the
Burgess, Cliff; Moore, Guy
2012-04-01
List of illustrations; List of tables; Preface; Acknowledgments; Part I. Theoretical Framework: 1. Field theory review; 2. The standard model: general features; 3. Cross sections and lifetimes; Part II. Applications: Leptons: 4. Elementary boson decays; 5. Leptonic weak interactions: decays; 6. Leptonic weak interactions: collisions; 7. Effective Lagrangians; Part III. Applications: Hadrons: 8. Hadrons and QCD; 9. Hadronic interactions; Part IV. Beyond the Standard Model: 10. Neutrino masses; 11. Open questions, proposed solutions; Appendix A. Experimental values for the parameters; Appendix B. Symmetries and group theory review; Appendix C. Lorentz group and the Dirac algebra; Appendix D. ξ-gauge Feynman rules; Appendix E. Metric convention conversion table; Select bibliography; Index.
Introduction to quantum field theory
International Nuclear Information System (INIS)
Kazakov, D.I.
1988-01-01
The lectures are a continuation of the introduction to the elementary principles of quantum field theory. The work is aimed at constructing the formalism of the standard model of particle interactions. Efforts are made to go beyond the limits of the standard model in the quantum field theory context. Grand unification models including the strong and electroweak interactions, supersymmetric generalizations of the standard model and grand unification theories, and, finally, supergravity theories, which incorporate the gravitational interaction into the unified scheme, are considered. 3 refs.; 19 figs.; 2 tabs
Evans, N
2003-01-01
String theory began life in the late 1960s as an attempt to understand the properties of nuclear matter such as protons and neutrons. Although it was not successful, it has since developed a life of its own as a possible theory of everything - with the potential to incorporate quantum gravity as well as the other forces of nature. However, in a remarkable about-face in the last five years, it has now been discovered that string theory and the standard theory of nuclear matter - QCD - might in fact describe the same physics. This is an exciting development that was the centre of discussion at a major workshop in Seattle in February. After spending 30 years as a possible theory of everything, string theory is returning to its roots to describe the interactions of quarks and gluons. (U.K.)
International Nuclear Information System (INIS)
Pleitez, V.
1994-01-01
The search for physics laws beyond the standard model is discussed in a general way, along with some topics in supersymmetry theories. Recent possibilities arising in the leptonic sector are also addressed. Finally, models with SU(3) c X SU(2) L X U(1) Y symmetry are considered as alternatives for the extensions of the elementary particles standard model. 36 refs., 1 fig., 4 tabs
DEFF Research Database (Denmark)
Hendricks, Vincent F.
Game Theory is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in game theory. We hear their views on game theory, its aim, scope, use, the future direction of game theory and how their work fits in these respects....
CERN. Geneva
2005-01-01
The necessity for new physics beyond the Standard Model will be motivated. Theoretical problems will be exposed and possible solutions will be described. The goal is to present the exciting new physics ideas that will be tested in the near future. Supersymmetry, grand unification, extra dimensions and string theory will be presented.
When is a theory a theory? A case example.
Alkin, Marvin C
2017-08-01
This discussion comments on the approximately 20-year history of writings on the prescriptive theory called Empowerment Evaluation. Doing so involves examining how "Empowerment Evaluation Theory" has been defined at various points in time (particularly 1996 and now in 2015). Defining a theory is different from judging the success of a theory. This latter topic has been addressed elsewhere by Michael Scriven, Michael Patton, and Brad Cousins. I am initially guided by the work of Robin Miller (2010), who has written on the issue of how to judge the success of a theory. In doing so, she provided potential standards for judging the adequacy of theories. My task is not judging the adequacy or success of the Empowerment Evaluation prescriptive theory in practice, but determining how well the theory is delineated. That is, to what extent do the writings qualify as a prescriptive theory. Copyright © 2016 Elsevier Ltd. All rights reserved.
International Nuclear Information System (INIS)
Chan Hongmo.
1987-10-01
The paper traces the development of string theory, and was presented at Professor Sir Rudolf Peierls' 80th Birthday Symposium. String theory is discussed with respect to the interaction of strings, the inclusion of both gauge theory and gravitation, inconsistencies in the theory, and the role of space-time. The physical principles underlying string theory are also outlined. (U.K.)
Beyond the Standard Model (1/5)
CERN. Geneva
2000-01-01
After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.
Beyond the Standard Model (5/5)
CERN. Geneva
2000-01-01
After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.
Beyond the Standard Model (3/5)
CERN. Geneva
2000-01-01
After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.
Beyond the Standard Model (2/5)
CERN. Geneva
2000-01-01
After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.
Beyond the Standard Model (4/5)
CERN. Geneva
2000-01-01
After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.
Big bang nucleosynthesis: An update
Energy Technology Data Exchange (ETDEWEB)
Olive, Keith A. [William I. Fine Theoretical Physics Institute, School of Physics and Astronomy, University of Minnesota, Minneapolis, MN 55455 (United States)
2013-07-23
An update on the standard model of big bang nucleosynthesis (BBN) is presented. With the value of the baryon-to-photon ratio determined to high precision by WMAP, standard BBN is a parameter-free theory. In this context, the theoretical prediction for the abundances of D, 4 He, and 7 Li is discussed and compared to their observational determination. While concordance for D and 4 He is satisfactory, the prediction for 7 Li exceeds the observational determination by a factor of about four. Possible solutions to this problem are discussed.
Langacker, Paul
2017-01-01
This new edition of The Standard Model and Beyond presents an advanced introduction to the physics and formalism of the standard model and other non-abelian gauge theories. It provides a solid background for understanding supersymmetry, string theory, extra dimensions, dynamical symmetry breaking, and cosmology. In addition to updating all of the experimental and phenomenological results from the first edition, it contains a new chapter on collider physics; expanded discussions of Higgs, neutrino, and dark matter physics; and many new problems. The book first reviews calculational techniques in field theory and the status of quantum electrodynamics. It then focuses on global and local symmetries and the construction of non-abelian gauge theories. The structure and tests of quantum chromodynamics, collider physics, the electroweak interactions and theory, and the physics of neutrino mass and mixing are thoroughly explored. The final chapter discusses the motivations for extending the standard model and examin...
String theory or field theory?
International Nuclear Information System (INIS)
Marshakov, A.V.
2002-01-01
The status of string theory is reviewed, and major recent developments - especially those in going beyond perturbation theory in the string theory and quantum field theory frameworks - are analyzed. This analysis helps better understand the role and place of experimental phenomena. It is emphasized that there are some insurmountable problems inherent in quantum field theory - notably the impossibility of formulating a quantum theory of gravity on its basis - which prevent it from being a fundamental physical theory of the world of microscopic distances. It is this task, the creation of such a theory, which string theory, currently far from completion, is expected to solve. In spite of its somewhat vague current form, string theory has already led to a number of serious results and greatly contributed to progress in the understanding of quantum field theory. It is these developments which are our concern in this review [ru
International Nuclear Information System (INIS)
Francaviglia, M.
1990-01-01
Although general relativity is a well-established discipline, the theory deserves efforts aimed at producing alternative or more general frameworks for investigating the classical properties of gravity. These are devoted either to producing alternative viewpoints or interpretations of standard general relativity, or to constructing, discussing and proposing experimental tests for alternative descriptions of the dynamics of the gravitational field and its interaction (or unification) with external matter fields. Classical alternative theories of gravitation can be roughly classified as follows: theories based on a still 4-dimensional picture, under the assumption that the dynamics of the gravitational field is more complicated than Einstein's, and theories based on higher-dimensional pictures, on the assumption that space-time is replaced by a higher-dimensional manifold. The latter leads to supergravity and strings, which are not included here. Papers on these classifications are reviewed. (author)
Supersymmetric gauge theories from string theory
International Nuclear Information System (INIS)
Metzger, St.
2005-12-01
This thesis presents various ways to construct four-dimensional quantum field theories from string theory. In a first part we study the generation of a supersymmetric Yang-Mills theory, coupled to an adjoint chiral superfield, from type IIB string theory on non-compact Calabi-Yau manifolds, with D-branes wrapping certain sub-cycles. Properties of the gauge theory are then mapped to the geometric structure of the Calabi-Yau space. Even if the Calabi-Yau geometry is too complicated to evaluate the geometric integrals explicitly, one can always use matrix model perturbation theory to calculate the effective superpotential. The second part of this work covers the generation of four-dimensional supersymmetric gauge theories, carrying several important characteristic features of the standard model, from compactifications of eleven-dimensional supergravity on G2-manifolds. If the latter contain conical singularities, chiral fermions are present in the four-dimensional gauge theory, which potentially lead to anomalies. We show that, locally at each singularity, these anomalies are cancelled by the non-invariance of the classical action through a mechanism called 'anomaly inflow'. Unfortunately, no explicit metric of a compact G2-manifold is known. Here we construct families of metrics on compact weak G2-manifolds, which contain two conical singularities. Weak G2-manifolds have properties that are similar to the ones of proper G2-manifolds, and hence the explicit examples might be useful to better understand the generic situation. Finally, we reconsider the relation between eleven-dimensional supergravity and the E8 x E8 heterotic string. This is done by carefully studying the anomalies that appear if the supergravity theory is formulated on a ten-manifold times the interval. Again we find that the anomalies cancel locally at the boundaries of the interval through anomaly inflow, provided one suitably modifies the classical action. (author)
Eves, Howard
1980-01-01
The usefulness of matrix theory as a tool in disciplines ranging from quantum mechanics to psychometrics is widely recognized, and courses in matrix theory are increasingly a standard part of the undergraduate curriculum. This outstanding text offers an unusual introduction to matrix theory at the undergraduate level. Unlike most texts dealing with the topic, which tend to remain on an abstract level, Dr. Eves' book employs a concrete elementary approach, avoiding abstraction until the final chapter. This practical method renders the text especially accessible to students of physics, engineering...
'Gold standard', not 'golden standard'
Claassen, J.A.H.R.
2005-01-01
In medical literature, both 'gold standard' and 'golden standard' are employed to describe a reference test used for comparison with a novel method. The term 'gold standard' in its current sense in medical research was coined by Rudd in 1979, in reference to the monetary gold standard. In the same
Big-bang nucleosynthesis in the new cosmology
International Nuclear Information System (INIS)
Fields, B.D.
2005-01-01
Big bang nucleosynthesis (BBN) describes the production of the lightest elements in the first minutes of cosmic time. I will review the physics of cosmological element production, and the observations of the primordial element abundances. The comparison between theory and observation has heretofore provided our earliest probe of the universe, and given the best measure of the cosmic baryon content. However, BBN has now taken a new role in cosmology, in light of new precision measurements of the cosmic microwave background (CMB). Recent CMB anisotropy data yield a wealth of cosmological parameters; in particular, the baryon-to-photon ratio η = n_B/n_γ is measured to high precision. The confrontation between the BBN and CMB 'baryometers' poses a new and stringent test of the standard cosmology; the status of this test will be discussed. Moreover, it is now possible to recast the role of BBN by using the CMB to fix the baryon density and even some light element abundances. This strategy sharpens BBN into a more powerful probe of early universe physics, and of galactic nucleosynthesis processes. The impact of the CMB results on particle physics beyond the Standard Model, and on non-standard cosmology, will be illustrated. Prospects for improvement of these bounds via additional astronomical observations and nuclear experiments will be discussed, as will the lingering 'lithium problem.' (author)
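The baryometer comparison described in this abstract can be made concrete with a short sketch. The linear relation Ω_B h² ≈ η₁₀/273.9 and the illustrative value η₁₀ ≈ 6.1 are assumptions taken from the standard literature, not from this record:

```python
# Hedged sketch: relate the baryon-to-photon ratio eta = n_B/n_gamma to the
# baryon density parameter Omega_B h^2. The coefficient 273.9 and the
# illustrative CMB-era value eta_10 = 6.1 are assumed literature values.

def omega_b_h2(eta10: float) -> float:
    """Omega_B h^2 from eta_10 = 10^10 * (n_B / n_gamma), approximate linear relation."""
    return eta10 / 273.9

if __name__ == "__main__":
    eta10 = 6.1  # illustrative CMB-inferred value
    print(f"eta_10 = {eta10}  ->  Omega_B h^2 ~ {omega_b_h2(eta10):.4f}")
```

With these assumed inputs the sketch returns Ω_B h² of roughly 0.022, the order of magnitude the abstract's BBN/CMB comparison turns on.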
International Nuclear Information System (INIS)
Uehara, S.
1985-01-01
Of all supergravity theories, the maximal one, i.e., N = 8 in 4 dimensions or N = 1 in 11 dimensions, should achieve the unification, since it possesses the highest degree of symmetry. As to the N = 1, d = 11 theory, it has been investigated how to compactify it to d = 4 theories. From the phenomenological point of view, local SUSY GUTs, i.e., N = 1 SUSY GUTs with soft breaking terms, have been studied from various angles. The structures of extended supergravity theories are less understood than those of N = 1 supergravity theories, and matter couplings in N = 2 extended supergravity theories are under investigation. The harmonic superspace was recently proposed, which may be useful for investigating the quantum effects of extended supersymmetry and supergravity theories. As to the so-called Kaluza-Klein supergravity, there is another possibility. (Mori, K.)
Johnstone, PT
2014-01-01
Focusing on topos theory's integration of geometric and logical ideas into the foundations of mathematics and theoretical computer science, this volume explores internal category theory, topologies and sheaves, geometric morphisms, and other subjects. 1977 edition.
Strings - Links between conformal field theory, gauge theory and gravity
International Nuclear Information System (INIS)
Troost, J.
2009-05-01
String theory is a candidate framework for unifying the gauge theories of interacting elementary particles with a quantum theory of gravity. In recent years we have made considerable progress in understanding non-perturbative aspects of string theory, and in bringing string theory closer to experiment, via the search for the Standard Model within string theory, but also via phenomenological models inspired by the physics of strings. Despite these advances, many deep problems remain, amongst which are a non-perturbative definition of string theory, a better understanding of holography, and the cosmological constant problem. My research has concentrated on various theoretical aspects of quantum theories of gravity, including holography, black hole physics and cosmology. In this Habilitation thesis I have laid bare many more links between conformal field theory, gauge theory and gravity. Most contributions were motivated by string theory, like the analysis of supersymmetry-preserving states in compactified gauge theories and their relation to affine algebras, time-dependent aspects of the holographic map between quantum gravity in anti-de Sitter space and conformal field theories in the bulk, the direct quantization of strings on black hole backgrounds, the embedding of the no-boundary proposal for a wave-function of the universe in string theory, a non-rational Verlinde formula, and the construction of non-geometric solutions to supergravity.
DEFF Research Database (Denmark)
Linder, Stefan; Foss, Nicolai Juul
2015-01-01
Agency theory studies the problems and solutions linked to delegation of tasks from principals to agents in the context of conflicting interests between the parties. Beginning from clear assumptions about rationality, contracting, and informational conditions, the theory addresses problems of ex...... agency theory to enjoy considerable scientific impact on social science; however, it has also attracted considerable criticism....
Stellinga, B.; Mügge, D.
2014-01-01
The European and global regulation of accounting standards have witnessed remarkable changes over the past twenty years. In the early 1990s, EU accounting practices were fragmented along national lines and US accounting standards were the de facto global standards. Since 2005, all EU listed
Information theory of molecular systems
Nalewajski, Roman F
2006-01-01
As well as providing a unified outlook on physics, Information Theory (IT) has numerous applications in chemistry and biology owing to its ability to provide a measure of the entropy/information contained within probability distributions and criteria of their information "distance" (similarity) and independence. Information Theory of Molecular Systems applies standard IT to classical problems in the theory of electronic structure and chemical reactivity. The book starts by introducing the basic concepts of modern electronic structure/reactivity theory based upon the Density Functional Theory
Harris, Tina
2015-04-29
Grounded theory is a popular research approach in health care and the social sciences. This article provides a description of grounded theory methodology and its key components, using examples from published studies to demonstrate practical application. It aims to demystify grounded theory for novice nurse researchers, by explaining what it is, when to use it, why they would want to use it and how to use it. It should enable nurse researchers to decide if grounded theory is an appropriate approach for their research, and to determine the quality of any grounded theory research they read.
Number theory via Representation theory
Indian Academy of Sciences (India)
2014-11-09
Number theory via Representation theory. Eknath Ghate. November 9, 2014. Eightieth Annual Meeting, Chennai, Indian Academy of Sciences. This is a non-technical 20-minute talk intended for a general Academy audience.
International Nuclear Information System (INIS)
Schwarz, J.H.
1985-01-01
Dual string theories, initially developed as phenomenological models of hadrons, now appear more promising as candidates for a unified theory of fundamental interactions. Type I superstring theory (SST I) is a ten-dimensional theory of interacting open and closed strings, with one supersymmetry, that is free from ghosts and tachyons. It requires that an SO(n) or Sp(2n) gauge group be used. A light-cone-gauge string action with space-time supersymmetry automatically incorporates the superstring restrictions and leads to the discovery of type II superstring theory (SST II). SST II is an interacting theory of closed strings only, with two D=10 supersymmetries, that is also free from ghosts and tachyons. By taking six of the spatial dimensions to form a compact space, it becomes possible to reconcile the models with our four-dimensional perception of spacetime and to define low-energy limits in which SST I reduces to N=4, D=4 super Yang-Mills theory and SST II reduces to N=8, D=4 supergravity theory. The superstring theories can be described by a light-cone-gauge action principle based on fields that are functionals of string coordinates. With this formalism any physical quantity should be calculable. There is some evidence that, unlike any conventional field theory, the superstring theories provide perturbatively renormalizable (SST I) or finite (SST II) unifications of gravity with other interactions.
Dependence theory via game theory
Grossi, D.; Turrini, P.
2011-01-01
In the multi-agent systems community, dependence theory and game theory are often presented as two alternative perspectives on the analysis of social interaction. Up till now no research has been done relating these two approaches. The unification presented provides dependence theory with the sort
String theory or field theory?
International Nuclear Information System (INIS)
Marshakov, Andrei V
2002-01-01
The status of string theory is reviewed, and major recent developments - especially those in going beyond perturbation theory in the string theory and quantum field theory frameworks - are analyzed. This analysis helps better understand the role and place of string theory in the modern picture of the physical world. Even though quantum field theory describes a wide range of experimental phenomena, it is emphasized that there are some insurmountable problems inherent in it - notably the impossibility to formulate the quantum theory of gravity on its basis - which prevent it from being a fundamental physical theory of the world of microscopic distances. It is this task, the creation of such a theory, which string theory, currently far from completion, is expected to solve. In spite of its somewhat vague current form, string theory has already led to a number of serious results and greatly contributed to progress in the understanding of quantum field theory. It is these developments which are our concern in this review. (reviews of topical problems)
Higher YM Theories and the Compactification in String Theory
Król, J.
2007-11-01
Higher YM theories generalize those of the standard model of particle physics. On the string theory side, the geometrical constructions of gerbes appear when describing non-vanishing B-fields on branes. Higher YM theories and gerbes prove to be the same mathematical object; thus a natural candidate for the intermediate stage of the compactification in string theory appears. Moreover, replacing smooth 2-spaces by certain categories of smooth topoi gives rise to a generalized spacetime based on topoi.
BBN technical memorandum W1310 hydroacoustic network capability studies
Energy Technology Data Exchange (ETDEWEB)
Angell, J., LLNL
1997-12-01
This report summarizes work performed under contract to Lawrence Livermore National Laboratory during the period 1 August to 30 November 1997. Four separate tasks were undertaken during this period which investigated various aspects of hydroacoustic network performance using the Hydroacoustic Coverage Assessment Model (HydroCAM). The purpose of this report is to document each of these tasks.
BBN PLUM: MUC-4 Test Results and Analysis
National Research Council Canada - National Science Library
Weischedel, Ralph; Ayuso, Damaris; Boisen, Sean; Fox, Heidi; Gish, Herbert; Ingria, Robert
1992-01-01
Our mid-term to long-term goals in data extraction from text for the next one to three years are to achieve much greater portability to new languages and new domains, greater robustness, and greater scalability...
Perkembangan Proses Pembuatan Biodiesel sebagai Bahan Bakar Nabati (BBN) [Development of the Biodiesel Production Process as a Biofuel]
Directory of Open Access Journals (Sweden)
Joelianingsih
2006-12-01
As energy demands increase and fossil fuel reserves are limited, research is directed towards alternative renewable fuels. A potential diesel fuel substitute is biodiesel, obtained from fatty acid methyl esters (FAME) and produced by the transesterification reaction of triglycerides or free fatty acids (FFA) of vegetable oils with a short-chain alcohol, mainly methanol. Most current processes employ a catalyst and an excess of alcohol. Although the removal of the excess alcohol can be easily achieved by distillation, the removal of the catalyst and the by-product formed from its reaction with the reactants is complicated, although several methods for glycerol purification have been reported. The disadvantages resulting from the use of a catalyst and its removal from the products can be eliminated if a non-catalytic reaction of the vegetable oils with alcohol can be realized, so that a simpler and cheaper process can be developed. Indonesia has the opportunity to expand oil palm and other plantations such as Jatropha curcas (jarak pagar) in order to provide a sufficient amount of crude oil for the development of a biodiesel industry.
Quantum corrections in classicalon theories
Energy Technology Data Exchange (ETDEWEB)
Asimakis, P.; Brouzakis, N., E-mail: nbruzak@phys.uoa.gr; Katsis, A.; Tetradis, N.
2015-04-09
We use the heat kernel in order to compute the one-loop effective action on a classicalon background. We find that the UV divergences are suppressed relative to the predictions of standard perturbation theory in the interior of the classicalon. There is a strong analogy with the suppression of quantum fluctuations in Galileon theories within the regions where the Vainshtein mechanism operates (discussed in arXiv:1401.2775). Both classicalon and Galileon theories display reduced UV sensitivity on certain backgrounds.
DEFF Research Database (Denmark)
Aarseth, Espen
2012-01-01
In this article I present a narrative theory of games, building on standard narratology, as a solution to the conundrum that has haunted computer game studies from the start: how to approach software that combines games and stories?
String theory and cosmological singularities
Indian Academy of Sciences (India)
In recent times, string theory is providing new perspectives on such singularities which may lead to an understanding of these in the standard framework of time evolution in quantum mechanics. In this article, we describe some of these approaches. Keywords. String theory; cosmological singularities. PACS Nos 11.25.
International Nuclear Information System (INIS)
Souza, Manoelito M. de
1997-01-01
We discuss the physical meaning and the geometric interpretation of causality implementation in classical field theories. The origin of infinities and other inconsistencies in field theories is traced to fields defined with support on the light cone; a finite and consistent field theory requires a light-cone generator as the field support. We therefore introduce a classical field theory with support on the light-cone generators. It results in a description of discrete (point-like) interactions in terms of localized particle-like fields. We find the propagators of these particle-like fields and discuss their physical meaning, properties and consequences. They are conformally invariant, singularity-free, and describe a manifestly covariant (1 + 1)-dimensional dynamics in a (3 + 1) spacetime. Remarkably, this conformal symmetry remains even for the propagation of a massive field in four spacetime dimensions. We apply this formalism to classical electrodynamics and to the general relativity theory. The standard formalism with its distributed fields is retrieved in terms of spacetime averages of the discrete field. Singularities are the by-products of the averaging process. This new formalism clarifies the meaning and the problems of field theory, and may allow a softer transition to a quantum theory. (author)
Stokes, A V
1986-01-01
Communications Standards deals with the standardization of computer communication networks. This book examines the types of local area networks (LANs) that have been developed and looks at some of the relevant protocols in more detail. The work of Project 802 is briefly discussed, along with a protocol which has developed from one of the LAN standards and is now a de facto standard in one particular area, namely the Manufacturing Automation Protocol (MAP). Factors that affect the usage of networks, such as network management and security, are also considered. This book is divided into three sections.
Aubin, Jean-Pierre; Saint-Pierre, Patrick
2011-01-01
Viability theory designs and develops mathematical and algorithmic methods for investigating the adaptation to viability constraints of evolutions governed by complex systems under uncertainty that are found in many domains involving living beings, from biological evolution to economics, from environmental sciences to financial markets, from control theory and robotics to cognitive sciences. It involves interdisciplinary investigations spanning fields that have traditionally developed in isolation. The purpose of this book is to present an initiation to applications of viability theory, explaining...
Chang, CC
2012-01-01
Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Skolem functions...
What's Next for Big Bang Nucleosynthesis?
International Nuclear Information System (INIS)
Cyburt, R.H.
2005-01-01
Big bang nucleosynthesis (BBN) plays an important role in the standard hot big bang cosmology. BBN theory is used to predict the primordial abundances of the lightest elements, hydrogen, helium and lithium. Comparison between the predicted and observationally determined light element abundances provides a general test of concordance and can be used to fix the baryon content in the universe. Measurements of the cosmic microwave background (CMB) anisotropies now supplant BBN as the premier baryometer, especially with the latest results from the WMAP satellite. With the WMAP baryon density, the test of concordance can be made even more precise. Any disagreement between theory predictions and observations requires careful discussion. Several possibilities exist to explain discrepancies: (1) observational systematics (either physical or technical) may not be properly treated in determining primordial light element abundances; (2) nuclear inputs that determine the BBN predictions may have unknown systematics or may be incomplete; and (3) physics beyond that included in the standard BBN scenario may need to be included in the theory calculation. Before we can be absolutely sure new physics is warranted, points (1) and (2) must be addressed and ruled out. All of these scenarios rely on experimental or observational data to make definitive statements of their applicability and range of validity, which currently is not at the level necessary to discern between these possibilities with high confidence. Thus, new light element abundance observations and nuclear experiments are needed to probe these further. Assuming concordance is established, one can use the light element observations to explore the evolution from their primordial values. This can provide useful information on stellar evolution, cosmic rays and other nuclear astrophysics. When combined with detailed models, BBN, the CMB anisotropy and nuclear astrophysics can provide us with information about the populations...
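The helium prediction at the heart of the concordance test in this abstract can be estimated in a few lines. The freeze-out temperature, the assumed drop of n/p from roughly 1/6 to 1/7 by neutron decay, and the assumption that every surviving neutron ends up in 4He are textbook simplifications introduced here, not inputs from the abstract:

```python
from math import exp

# Rough sketch of the primordial 4He mass fraction Y_p.
# Assumed textbook inputs (not from the abstract): neutron-proton mass
# difference 1.293 MeV, weak freeze-out at T_f ~ 0.8 MeV, and free-neutron
# decay lowering n/p to ~1/7 by the time nucleosynthesis begins.

delta_m_mev = 1.293      # neutron-proton mass difference
t_freeze_mev = 0.8       # approximate weak freeze-out temperature

n_over_p_freeze = exp(-delta_m_mev / t_freeze_mev)  # equilibrium ratio ~ 0.2
n_over_p_bbn = 1.0 / 7.0                            # after neutron decay (assumed)

# If essentially every remaining neutron is bound into 4He:
y_p = 2.0 * n_over_p_bbn / (1.0 + n_over_p_bbn)
print(f"n/p at freeze-out ~ {n_over_p_freeze:.2f}, Y_p ~ {y_p:.2f}")
```

The result, Y_p ≈ 0.25, is close to what full BBN codes obtain with far greater care; the point is only that the order of magnitude follows from two ratios.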
Cox, David A
2012-01-01
Praise for the First Edition ". . .will certainly fascinate anyone interested in abstract algebra: a remarkable book!"—Monatshefte fur Mathematik Galois theory is one of the most established topics in mathematics, with historical roots that led to the development of many central concepts in modern algebra, including groups and fields. Covering classic applications of the theory, such as solvability by radicals, geometric constructions, and finite fields, Galois Theory, Second Edition delves into novel topics like Abel's theory of Abelian equations, casus irreducibilis, and the Galo
Dufwenberg, Martin
2011-03-01
Game theory is a toolkit for examining situations where decision makers influence each other. I discuss the nature of game-theoretic analysis, the history of game theory, why game theory is useful for understanding human psychology, and why game theory has played a key role in the recent explosion of interest in the field of behavioral economics. WIREs Cogn Sci 2011 2 167-173 DOI: 10.1002/wcs.119 For further resources related to this article, please visit the WIREs website. Copyright © 2010 John Wiley & Sons, Ltd.
Hashiguchi, Koichi
2009-01-01
This book details the mathematics and continuum mechanics necessary as a foundation of elastoplasticity theory. It explains physical backgrounds with illustrations and provides descriptions of detailed derivation processes.
DEFF Research Database (Denmark)
Henningsson, Stefan
2016-01-01
International e-Customs is going through a standardization process. Driven by the need to increase control in the trade process to address security challenges stemming from threats of terrorists, diseases, and counterfeit products, and to lower the administrative burdens on traders to stay competitive, national customs and regional economic organizations are seeking to establish a standardized solution for digital reporting of customs data. However, standardization has proven hard to achieve in the socio-technical e-Customs solution. In this chapter, the authors identify and describe what has to be harmonized in order for a global company to perceive e-Customs as standardized. In doing so, they contribute an explanation of the challenges associated with using a standardization mechanism for harmonizing socio-technical information systems.
International Nuclear Information System (INIS)
Agnihotri, Newal
2003-01-01
The article describes the benefits of and required process and recommendations for implementing the standardization of training in the nuclear power industry in the United States and abroad. Current Information and Communication Technologies (ICT) enable training standardization in the nuclear power industry. The delivery of training through the Internet, Intranet and video over IP will facilitate this standardization and bring multiple benefits to the nuclear power industry worldwide. As the amount of available qualified and experienced professionals decreases because of retirements and fewer nuclear engineering institutions, standardized training will help increase the number of available professionals in the industry. Technology will make it possible to use the experience of retired professionals who may be interested in working part-time from a remote location. Well-planned standardized training will prevent a fragmented approach among utilities, and it will save the industry considerable resources in the long run. It will also ensure cost-effective and safe nuclear power plant operation
What Is Sociolinguistic Theory?
Coupland, Nikolas
1998-01-01
Discusses three positions regarding the definition of sociolinguistic theory: (1) sociolinguistic theory is proper linguistic theory; (2) sociolinguistic theory is an accumulation of mini-theories; and (3) sociolinguistic theory as social theory. (Author/VWL)
International Nuclear Information System (INIS)
Bartlett, R.; Kirtman, B.; Davidson, E.R.
1978-01-01
After noting some advantages of using perturbation theory, some of the various types are related on a chart and described, including many-body nonlinear summations, quartic force-field fits for geometry, fourth-order correlation approximations, and a survey of some recent work. Alternative initial approximations in perturbation theory are also discussed. 25 references
Indian Academy of Sciences (India)
This article tries to outline what game theory is all about. It illustrates game theory's fundamental solution concept viz., Nash equilibrium, using various examples. The Genesis. In the late thirties, the mathematician John von Neumann turned his prodigious innovative talents towards economics. This brief encounter of his with ...
Directory of Open Access Journals (Sweden)
Ion N.Chiuta
2009-05-01
The paper determines relations for shielding effectiveness relative to several variables, including metal type, metal properties, thickness, distance, frequency, etc. It starts by presenting some relationships regarding magnetic, electric and electromagnetic fields as a pertinent background to understanding and applying field theory. Since the literature about electromagnetic compatibility is replete with discussions of Maxwell equations and field theory, only a few aspects are presented.
1999-11-08
In these lectures I will build up the concept of field theory using the language of Feynman diagrams. As a starting point, field theory in zero spacetime dimensions is used as a vehicle to develop all the necessary techniques: path integral, Feynman diagrams, Schwinger-Dyson equations, asymptotic series, effective action, renormalization etc. The theory is then extended to more dimensions, with emphasis on the combinatorial aspects of the diagrams rather than their particular mathematical structure. The concept of unitarity is used to, finally, arrive at the various Feynman rules in an actual, four-dimensional theory. The concept of gauge-invariance is developed, and the structure of a non-abelian gauge theory is discussed, again on the level of Feynman diagrams and Feynman rules.
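The zero-dimensional starting point of these lectures can be explored directly: there the "partition function" is an ordinary integral, and comparing it with its perturbative expansion exhibits the asymptotic-series behaviour the notes develop. The quartic action, coupling, and normalization below are illustrative choices made here, not the notes' own conventions:

```python
from math import exp, factorial, sqrt, pi

# Zero-dimensional "field theory": Z(g) = integral dx exp(-x^2/2 - g*x^4/4!),
# normalized by Z(0) = sqrt(2*pi). The perturbative series for Z(g)/Z(0) is
# built from Gaussian moments <x^(4n)> = (4n-1)!!, which grow factorially,
# so the series is asymptotic rather than convergent.

def z_exact(g: float, half_width: float = 10.0, steps: int = 20_000) -> float:
    """Trapezoidal estimate of Z(g)/Z(0)."""
    h = 2.0 * half_width / steps
    total = 0.0
    for i in range(steps + 1):
        x = -half_width + i * h
        weight = 0.5 if i in (0, steps) else 1.0
        total += weight * exp(-x * x / 2.0 - g * x**4 / 24.0)
    return total * h / sqrt(2.0 * pi)

def double_factorial(k: int) -> int:
    """(k)!! with the convention (-1)!! = 1."""
    out = 1
    while k > 1:
        out *= k
        k -= 2
    return out

def z_series(g: float, order: int) -> float:
    """Partial sum of the perturbative expansion up to the given order in g."""
    return sum((-g / 24.0) ** n / factorial(n) * double_factorial(4 * n - 1)
               for n in range(order + 1))

g = 1.0
exact = z_exact(g)
for order in (1, 2, 3, 8):
    print(f"order {order}: |series - exact| = {abs(z_series(g, order) - exact):.3f}")
```

At this coupling the truncation error is smallest after just one or two terms and then grows rapidly as more terms are added, which is exactly the asymptotic-series phenomenon mentioned in the lecture summary.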
Manturov, Vassily
2004-01-01
Since discovery of the Jones polynomial, knot theory has enjoyed a virtual explosion of important results and now plays a significant role in modern mathematics. In a unique presentation with contents not found in any other monograph, Knot Theory describes, with full proofs, the main concepts and the latest investigations in the field. The book is divided into six thematic sections. The first part discusses "pre-Vassiliev" knot theory, from knot arithmetics through the Jones polynomial and the famous Kauffman-Murasugi theorem. The second part explores braid theory, including braids in different spaces and simple word recognition algorithms. A section devoted to the Vassiliev knot invariants follows, wherein the author proves that Vassiliev invariants are stronger than all polynomial invariants and introduces Bar-Natan's theory on Lie algebra representations and knots. The fourth part describes a new way, proposed by the author, to encode knots by d-diagrams. This method allows the encoding of topological objects...
Liu, Baoding
2015-01-01
When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...
Lukeš, Jaroslav; Netuka, Ivan; Veselý, Jiří
1988-01-01
Within the tradition of meetings devoted to potential theory, a conference on potential theory took place in Prague on 19-24 July 1987. The Conference was organized by the Faculty of Mathematics and Physics, Charles University, with the collaboration of the Institute of Mathematics, Czechoslovak Academy of Sciences, the Department of Mathematics, Czech University of Technology, the Union of Czechoslovak Mathematicians and Physicists, the Czechoslovak Scientific and Technical Society, and supported by IMU. During the Conference, 69 scientific communications from different branches of potential theory were presented; the majority of them are included in the present volume. (Papers based on survey lectures delivered at the Conference, its program as well as a collection of problems from potential theory will appear in a special volume of the Lecture Notes Series published by Springer-Verlag). Topics of these communications truly reflect the vast scope of contemporary potential theory. Some contributions deal...
DEFF Research Database (Denmark)
Hjørland, Birger
2009-01-01
Concept theory is an extremely broad, interdisciplinary and complex field of research related to many deep fields with very long historical traditions without much consensus. However, information science and knowledge organization cannot avoid relating to theories of concepts. Knowledge...... organizing systems (e.g. classification systems, thesauri and ontologies) should be understood as systems basically organizing concepts and their semantic relations. The same is the case with information retrieval systems. Different theories of concepts have different implications for how to construe......, evaluate and use such systems. Based on "a post-Kuhnian view" of paradigms this paper puts forward arguments that the best understanding and classification of theories of concepts is to view and classify them in accordance with epistemological theories (empiricism, rationalism, historicism and pragmatism...
Sarason, Donald
2007-01-01
Complex Function Theory is a concise and rigorous introduction to the theory of functions of a complex variable. Written in a classical style, it is in the spirit of the books by Ahlfors and by Saks and Zygmund. Being designed for a one-semester course, it is much shorter than many of the standard texts. Sarason covers the basic material through Cauchy's theorem and applications, plus the Riemann mapping theorem. It is suitable for either an introductory graduate course or an undergraduate course for students with adequate preparation. The first edition was published with the title Notes on Co
Standardization of depression measurement
DEFF Research Database (Denmark)
Wahl, Inka; Löwe, Bernd; Bjørner, Jakob
2014-01-01
OBJECTIVES: To provide a standardized metric for the assessment of depression severity to enable comparability among results of established depression measures. STUDY DESIGN AND SETTING: A common metric for 11 depression questionnaires was developed applying item response theory (IRT) methods. Data...... of 33,844 adults were used for secondary analysis including routine assessments of 23,817 in- and outpatients with mental and/or medical conditions (46% with depressive disorders) and a general population sample of 10,027 randomly selected participants from three representative German household surveys....... RESULTS: A standardized metric for depression severity was defined by 143 items, and scores were normed to a general population mean of 50 (standard deviation = 10) for easy interpretability. It covers the entire range of depression severity assessed by established instruments. The metric allows...
Scalar strong interaction hadron theory
Hoh, Fang Chao
2015-01-01
The scalar strong interaction hadron theory, SSI, is a first-principles, nonlocal theory at the quantum-mechanical level that provides an alternative to low-energy QCD and the Higgs-related part of the standard model. The quark-quark interaction is scalar rather than color-vectorial. A set of equations of motion for mesons and another set for baryons have been constructed. This book provides an account of the present state of a theory supposedly still at its early stage of development. This work will facilitate researchers interested in entering this field and serve as a basis for possible future development of this theory.
Lectures on quantum field theory
Das, Ashok
2008-01-01
This book consists of the lectures for a two-semester course on quantum field theory, and as such is presented in a quite informal and personal manner. The course starts with relativistic one-particle systems, and develops the basics of quantum field theory with an analysis of the representations of the Poincaré group. Canonical quantization is carried out for scalar, fermion, Abelian and non-Abelian gauge theories. Covariant quantization of gauge theories is also carried out with a detailed description of the BRST symmetry. The Higgs phenomenon and the standard model of electroweak interactio
Lubliner, Jacob
2008-01-01
The aim of Plasticity Theory is to provide a comprehensive introduction to the contemporary state of knowledge in basic plasticity theory and to its applications. It treats several areas not commonly found between the covers of a single book: the physics of plasticity, constitutive theory, dynamic plasticity, large-deformation plasticity, and numerical methods, in addition to a representative survey of problems treated by classical methods, such as elastic-plastic problems, plane plastic flow, and limit analysis; the problems discussed come from areas of interest to mechanical, structural, and
Schmidli, Hanspeter
2017-01-01
This book provides an overview of classical actuarial techniques, including material that is not readily accessible elsewhere such as the Ammeter risk model and the Markov-modulated risk model. Other topics covered include utility theory, credibility theory, claims reserving and ruin theory. The author treats both theoretical and practical aspects and also discusses links to Solvency II. Written by one of the leading experts in the field, these lecture notes serve as a valuable introduction to some of the most frequently used methods in non-life insurance. They will be of particular interest to graduate students, researchers and practitioners in insurance, finance and risk management.
Andrews, George E
1994-01-01
Although mathematics majors are usually conversant with number theory by the time they have completed a course in abstract algebra, other undergraduates, especially those in education and the liberal arts, often need a more basic introduction to the topic. In this book the author solves the problem of maintaining the interest of students at both levels by offering a combinatorial approach to elementary number theory. In studying number theory from such a perspective, mathematics majors are spared repetition and provided with new insights, while other students benefit from the consequent simpl
DEFF Research Database (Denmark)
Smith, Shelley
This paper came about within the context of a 13-month research project, Focus Area 1 - Method and Theory, at the Center for Public Space Research at the Royal Academy of the Arts School of Architecture in Copenhagen, Denmark. This project has been funded by RealDania. The goals of the research...... project, Focus Area 1 - Method and Theory, which forms the framework for this working paper, are: * To provide a basis from which to discuss the concept of public space in a contemporary architectural and urban context - specifically relating to theory and method * To broaden the discussion of the concept...
Hodges, Wilfrid
1993-01-01
An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.
Bestvina, Mladen; Vogtmann, Karen
2014-01-01
Geometric group theory refers to the study of discrete groups using tools from topology, geometry, dynamics and analysis. The field is evolving very rapidly and the present volume provides an introduction to and overview of various topics which have played critical roles in this evolution. The book contains lecture notes from courses given at the Park City Math Institute on Geometric Group Theory. The institute consists of a set of intensive short courses offered by leaders in the field, designed to introduce students to exciting, current research in mathematics. These lectures do not duplicate standard courses available elsewhere. The courses begin at an introductory level suitable for graduate students and lead up to currently active topics of research. The articles in this volume include introductions to CAT(0) cube complexes and groups, to modern small cancellation theory, to isometry groups of general CAT(0) spaces, and a discussion of nilpotent genus in the context of mapping class groups and CAT(0) gro...
Effective actions for heterotic string theory
Suelmann, H
Heterotic String Theory is an attempt to construct a description of nature that is more satisfying than the Standard Model. A major problem is that it is very difficult to do explicit calculations in string theory. Therefore, it is useful to construct a 'normal' field theory that approximates HST.
Modern actuarial risk theory: using R
Kaas, R.; Goovaerts, M.; Dhaene, J.; Denuit, M.
2008-01-01
Modern Actuarial Risk Theory -- Using R contains what every actuary needs to know about non-life insurance mathematics. It starts with the standard material like utility theory, individual and collective model and basic ruin theory. Other topics are risk measures and premium principles, bonus-malus
SAIDANI Lassaad
2017-01-01
The nokton theory is an attempt to construct a theory adapted to every physical phenomenon. Space and time have been discretized. Its laws are iterative and precise. Probability plays an important role here. At first I defined the notion of image function and its mathematical framework. The notion of nokton and its state are the basis of several definitions. I later defined the canonical image function and the canonical contribution. Two constants have been necessary to define the dynam...
SAIDANI Lassaad
2015-01-01
The nokton theory is an attempt to construct a theory adapted to every physical phenomenon. Space and time have been discretized. Its laws are iterative and precise. Probability plays an important role here. At first I defined the notion of image function and its mathematical framework. The notion of nokton and its state are the basis of several definitions. I later defined the canonical image function and the canonical contribution. Two constants have been necessary to define the dynam...
Gould, Ronald
2012-01-01
This introduction to graph theory focuses on well-established topics, covering primary techniques and including both algorithmic and theoretical problems. The algorithms are presented with a minimum of advanced data structures and programming details. This thoroughly corrected 1988 edition provides insights to computer scientists as well as advanced undergraduates and graduate students of topology, algebra, and matrix theory. Fundamental concepts and notation and elementary properties and operations are the first subjects, followed by examinations of paths and searching, trees, and networks. S
Lyndon, Roger C
2001-01-01
From the reviews: "This book (...) defines the boundaries of the subject now called combinatorial group theory. (...) it is a considerable achievement to have concentrated a survey of the subject into 339 pages. This includes a substantial and useful bibliography (over 1100 items). ...the book is a valuable and welcome addition to the literature, containing many results not previously available in a book. It will undoubtedly become a standard reference." Mathematical Reviews, AMS, 1979.
Extensions of the Standard Model
Zwirner, Fabio
1996-01-01
Rapporteur talk at the International Europhysics Conference on High Energy Physics, Brussels (Belgium), July 27-August 2, 1995. This talk begins with a brief general introduction to the extensions of the Standard Model, reviewing the ideology of effective field theories and its practical implications. The central part deals with candidate extensions near the Fermi scale, focusing on some phenomenological aspects of the Minimal Supersymmetric Standard Model. The final part discusses some possible low-energy implications of further extensions near the Planck scale, namely superstring theories.
Riehle, Fritz
2006-01-01
Of all measurement units, frequency is the one that may be determined with the highest degree of accuracy. It equally allows precise measurements of other physical and technical quantities, whenever they can be measured in terms of frequency. This volume covers the central methods and techniques relevant for frequency standards developed in physics, electronics, quantum electronics, and statistics. After a review of the basic principles, the book looks at the realisation of commonly used components. It then continues with the description and characterisation of important frequency standards
International Nuclear Information System (INIS)
Friedberg, R; Hohenberg, P C
2014-01-01
completion of the theory requires a macroscopic mechanism for selecting a physical framework, which is part of the macroscopic theory (MAQM). The selection of a physical framework involves the breaking of the microscopic ‘framework symmetry’, which can proceed either phenomenologically as in the standard quantum measurement theory, or more fundamentally by considering the quantum system under study to be a subsystem of a macroscopic quantum system. The decoherent histories formulation of Gell-Mann and Hartle, as well as that of Omnès, are theories of this fundamental type, where the physical framework is selected by a coarse-graining procedure in which the physical phenomenon of decoherence plays an essential role. Various well-known interpretations of QM are described from the perspective of CQT. Detailed definitions and proofs are presented in the appendices. (key issues reviews)
Indian Academy of Sciences (India)
X.86: Ethernet over LAPS. Standard in China and India. G.7041: Generic Framing Procedure (GFP). Supports Ethernet as well as other data formats (e.g., Fibre Channel); Protocol of ... IEEE 802.3x for flow control of incoming Ethernet data ...
Creating COMFORT: A Communication-Based Model for Breaking Bad News
Villagran, Melinda; Goldsmith, Joy; Wittenberg-Lyles, Elaine; Baldwin, Paula
2010-01-01
This study builds upon existing protocols for breaking bad news (BBN), and offers an interaction-based approach to communicating comfort to patients and their families. The goal was to analyze medical students' (N = 21) videotaped standardized patient BBN interactions after completing an instructional unit on a commonly used BBN protocol, commonly…
Gohberg, Israel
2001-01-01
rii application of linear operators on a Hilbert space. We begin with a chapter on the geometry of Hilbert space and then proceed to the spectral theory of compact self adjoint operators; operational calculus is next presented as a natural outgrowth of the spectral theory. The second part of the text concentrates on Banach spaces and linear operators acting on these spaces. It includes, for example, the three 'basic principles' of linear analysis and the Riesz-Fredholm theory of compact operators. Both parts contain plenty of applications. All chapters deal exclusively with linear problems, except for the last chapter which is an introduction to the theory of nonlinear operators. In addition to the standard topics in functional analysis, we have presented relatively recent results which appear, for example, in Chapter VII. In general, in writing this book, the authors were strongly influenced by recent developments in operator theory which affected the choice of topics, proofs and exercises. One ...
International Nuclear Information System (INIS)
Akhmeteli, Andrey
2012-01-01
Is it possible to offer a 'no drama' quantum theory? Something as simple (in principle) as classical electrodynamics - a theory described by a system of partial differential equations in 3+1 dimensions, but reproducing unitary evolution of a quantum field theory in the configuration space? The following results suggest an affirmative answer: 1. The scalar field can be algebraically eliminated from scalar electrodynamics; the resulting equations describe independent evolution of the electromagnetic field. 2. After introduction of a complex 4-potential (producing the same electromagnetic field as the standard real 4-potential), the spinor field can be algebraically eliminated from spinor electrodynamics; the resulting equations describe independent evolution of the electromagnetic field. 3. The resulting theories for the electromagnetic field can be embedded into quantum field theories. Another fundamental result: in the general case, the Dirac equation is equivalent to a fourth-order partial differential equation for just one component, which can be made real by a gauge transform. Issues related to the Bell theorem are discussed.
Energy Technology Data Exchange (ETDEWEB)
Brandt, Bastian B. [Institute for Theoretical Physics, Goethe-University of Frankfurt,60438 Frankfurt (Germany); Institute for Theoretical Physics, University of Regensburg,93040 Regensburg (Germany); Lohmayer, Robert; Wettig, Tilo [Institute for Theoretical Physics, University of Regensburg,93040 Regensburg (Germany)
2016-11-14
We explore an alternative discretization of continuum SU(N{sub c}) Yang-Mills theory on a Euclidean spacetime lattice, originally introduced by Budczies and Zirnbauer. In this discretization the self-interactions of the gauge field are induced by a path integral over N{sub b} auxiliary boson fields, which are coupled linearly to the gauge field. The main progress compared to earlier approaches is that N{sub b} can be as small as N{sub c}. In the present paper we (i) extend the proof that the continuum limit of the new discretization reproduces Yang-Mills theory in two dimensions from gauge group U(N{sub c}) to SU(N{sub c}), (ii) derive refined bounds on N{sub b} for non-integer values, and (iii) perform a perturbative calculation to match the bare parameter of the induced gauge theory to the standard lattice coupling. In follow-up papers we will present numerical evidence in support of the conjecture that the induced gauge theory reproduces Yang-Mills theory also in three and four dimensions, and explore the possibility to integrate out the gauge fields to arrive at a dual formulation of lattice QCD.
Some topics in quantum field theory
International Nuclear Information System (INIS)
Symanzik, K.
1981-10-01
After a few general remarks on lattice theory, I describe the relation of lattice to continuum theory on the basis of perturbation theory, and deduce herefrom the principles of constructing 'improved' lattice actions. Then I briefly describe some recent perturbative and nonperturbative results in continuum theory. Finally, I point out a few recent approaches of more speculative nature that appear to merit particular attention. In the appendix, a few standard formulae from renormalization group analysis are collected for reference. (orig./HSI)
Kodaira, Kunihiko
2017-01-01
This book deals with the classical theory of Nevanlinna on the value distribution of meromorphic functions of one complex variable, based on minimum prerequisites for complex manifolds. The theory was extended to several variables by S. Kobayashi, T. Ochiai, J. Carleson, and P. Griffiths in the early 1970s. K. Kodaira took up this subject in his course at The University of Tokyo in 1973 and gave an introductory account of this development in the context of his final paper, contained in this book. The first three chapters are devoted to holomorphic mappings from C to complex manifolds. In the fourth chapter, holomorphic mappings between higher dimensional manifolds are covered. The book is a valuable treatise on the Nevanlinna theory, of special interests to those who want to understand Kodaira's unique approach to basic questions on complex manifolds.
International Nuclear Information System (INIS)
Kenyon, I.R.
1986-01-01
Modern theories of the interactions between fundamental particles are all gauge theories. In the case of gravitation, application of this principle to space-time leads to Einstein's theory of general relativity. All the other interactions involve the application of the gauge principle to internal spaces. Electromagnetism serves to introduce the idea of a gauge field, in this case the electromagnetic field. The next example, the strong force, shows unique features at long and short range which have their origin in the self-coupling of the gauge fields. Finally the unification of the description of the superficially dissimilar electromagnetic and weak nuclear forces completes the picture of successes of the gauge principle. (author)
DEFF Research Database (Denmark)
Carroll, Joseph; Clasen, Mathias; Jonsson, Emelie
2017-01-01
Biocultural theory is an integrative research program designed to investigate the causal interactions between biological adaptations and cultural constructions. From the biocultural perspective, cultural processes are rooted in the biological necessities of the human life cycle: specifically human...... and ideological beliefs, and artistic practices such as music, dance, painting, and storytelling. Establishing biocultural theory as a program that self-consciously encompasses the different particular forms of human evolutionary research could help scholars and scientists envision their own specialized areas...... of research as contributions to a coherent, collective research program. This article argues that a mature biocultural paradigm needs to be informed by at least 7 major research clusters: (a) gene-culture coevolution; (b) human life history theory; (c) evolutionary social psychology; (d) anthropological...
International Nuclear Information System (INIS)
Peccei, R.D.
1986-01-01
Possible small extensions of the standard model are considered, which are motivated by the strong CP problem and by the baryon asymmetry of the Universe. Phenomenological arguments are given which suggest that imposing a PQ symmetry to solve the strong CP problem is only tenable if the scale of the PQ breakdown is much above M_W. Furthermore, an attempt is made to connect the scale of the PQ breakdown to that of the breakdown of lepton number. It is argued that in these theories the same intermediate scale may be responsible for the baryon number of the Universe, provided the Kuzmin-Rubakov-Shaposhnikov (B+L) erasing mechanism is operative. (orig.)
Donagi, Ron; Pantev, Tony; Waldram, Dan; Donagi, Ron; Ovrut, Burt; Pantev, Tony; Waldram, Dan
2002-01-01
We describe a family of genus one fibered Calabi-Yau threefolds with fundamental group ${\mathbb Z}/2$. On each Calabi-Yau $Z$ in the family we exhibit a positive dimensional family of Mumford stable bundles whose symmetry group is the Standard Model group $SU(3)\times SU(2)\times U(1)$ and which have $c_{3} = 6$. We also show that for each bundle $V$ in our family, $c_{2}(Z) - c_{2}(V)$ is the class of an effective curve on $Z$. These conditions ensure that $Z$ and $V$ can be used for a phenomenologically relevant compactification of Heterotic M-theory.
International Nuclear Information System (INIS)
Sitenko, A.
1991-01-01
This book emerged out of graduate lectures given by the author at the University of Kiev and is intended as a graduate text. The fundamentals of non-relativistic quantum scattering theory are covered, including some topics, such as the phase-function formalism, separable potentials, and inverse scattering, which are not always covered in textbooks on scattering theory. Criticisms of the text are minor, but the reviewer feels an inadequate index is provided and the citing of references in the Russian language is a hindrance in a graduate text.
Stewart, Ian
2003-01-01
Ian Stewart's Galois Theory has been in print for 30 years. Resoundingly popular, it still serves its purpose exceedingly well. Yet mathematics education has changed considerably since 1973, when theory took precedence over examples, and the time has come to bring this presentation in line with more modern approaches. To this end, the story now begins with polynomials over the complex numbers, and the central quest is to understand when such polynomials have solutions that can be expressed by radicals. Reorganization of the material places the concrete before the abstract, thus motivating the g
Towards a Theory Grounded Theory of Language
Prince, Christopher G.; Mislivec, Eric J.; Kosolapov, Oleksandr V.; Lykken, Troy R.
2002-01-01
In this paper, we build upon the idea of theory grounding and propose one specific form of theory grounding, a theory of language. Theory grounding is the idea that we can imbue our embodied artificially intelligent systems with theories by modeling the way humans, and specifically young children, develop skills with theories. Modeling theory development promises to increase the conceptual and behavioral flexibility of these systems. An example of theory development in children is the social ...
Energy Technology Data Exchange (ETDEWEB)
Peskin, M.E.
1997-05-01
These lectures constitute a short course in "Beyond the Standard Model" for students of experimental particle physics. The author discusses the general ideas which guide the construction of models of physics beyond the Standard Model. The central principle, the one which most directly motivates the search for new physics, is the search for the mechanism of the spontaneous symmetry breaking observed in the theory of weak interactions. To illustrate models of weak-interaction symmetry breaking, the author gives a detailed discussion of the idea of supersymmetry and that of new strong interactions at the TeV energy scale. He discusses experiments that will probe the details of these models at future pp and e{sup +}e{sup {minus}} colliders.
Riles, K
1998-01-01
The Large Electron-Positron (LEP) collider near Geneva, more than any other instrument, has rigorously tested the predictions of the Standard Model of elementary particles. LEP measurements have probed the theory from many different directions and, so far, the Standard Model has prevailed. The rigour of these tests has allowed LEP physicists to determine unequivocally the number of fundamental 'generations' of elementary particles. These tests also allowed physicists to ascertain the mass of the top quark in advance of its discovery. Recent increases in the accelerator's energy allow new measurements to be undertaken, measurements that may uncover directly or indirectly the long-sought Higgs particle, believed to impart mass to all other particles.
Phenomenology of unified gauge theories
International Nuclear Information System (INIS)
Ellis, J.
1983-01-01
Part I of these lectures treats the standard Glashow-Weinberg-Salam model of weak and electromagnetic interactions, discussing in turn its basic structure and weak neutral currents, charged currents, mixing angles and CP violation, and the phenomenology of weak vector and Higgs bosons. Part II of the lectures discusses the structure of theories of dynamical symmetry breaking such as technicolour, phenomenological consequences, frustrations and alternatives. The third part of these lectures offers the standard menu of grand unified theories (GUTs) of the strong, weak and electromagnetic interactions, including an hors d'oeuvre of constraints on the parameters of the standard model, a main course of baryon number violating processes, and desserts which violate lepton number and CP. The fourth and final part goes through different attempts to remedy the inadequacies of previous theories by invoking supersymmetry and reaching out towards gravitation. (orig./HSI)
Dual Symmetry in Gauge Theories
Koshkarov, A. L.
1997-01-01
Continuous dual symmetry in electrodynamics, Yang-Mills theory and gravitation is investigated. A dual invariant which leads to highly nonlinear equations of motion is chosen as the Lagrangian of the pure classical dual nonlinear electrodynamics. In a natural manner, a dual angle, determined by the electromagnetic strengths at each space-time point, appears in the model. The equations of motion may well be interpreted as the equations of the standard Maxwell theory with a source. Alternative in...
Theory Advances in BSM Physics
McCullough, Matthew
2016-01-01
Rather than attempting to summarise the full spectrum of recent advances in Beyond the Standard Model (BSM) theory, which are many, in this talk I will instead take the opportunity to focus on two frameworks related to the hierarchy problem currently receiving significant attention. They are the `Twin Higgs' and the `Relaxion'. I will summarise the basic underlying structure of these theories at a non-expert level and highlight some interesting phenomenological signatures or outstanding problems.
International Nuclear Information System (INIS)
Friedrich, Harald
2013-01-01
Written by the author of the widely acclaimed textbook Theoretical Atomic Physics. Includes sections on quantum reflection, tunable Feshbach resonances and Efimov states. Useful for advanced students and researchers. This book presents a concise and modern coverage of scattering theory. It is motivated by the fact that experimental advances have shifted and broadened the scope of applications where concepts from scattering theory are used, e.g. to the field of ultracold atoms and molecules, which has been experiencing enormous growth in recent years, largely triggered by the successful realization of Bose-Einstein condensates of dilute atomic gases in 1995. In the present treatment, special attention is given to the role played by the long-range behaviour of the projectile-target interaction, and a theory is developed which is well suited to describe near-threshold bound and continuum states in realistic binary systems such as diatomic molecules or molecular ions. The level of abstraction is kept as low as possible, and deeper questions related to the mathematical foundations of scattering theory are passed over. The book should be understandable for anyone with a basic knowledge of nonrelativistic quantum mechanics. It is intended for advanced students and researchers, and it is hoped that it will be useful for theorists and experimentalists alike.
DEFF Research Database (Denmark)
Monthoux, Pierre Guillet de; Statler, Matt
2014-01-01
The recent Carnegie report (Colby, et al., 2011) characterizes the goal of business education as the development of practical wisdom. In this chapter, the authors reframe Scharmer’s Theory U as an attempt to develop practical wisdom by applying certain European philosophical concepts. Specifically...
Indian Academy of Sciences (India)
profession in their now classic book 'Theory of Games and Economic Behavior'. In this book, they developed ... Professor at Indira Gandhi Institute of Development Research .... academic career offers a complete contrast to the fairy tale career of John Nash Jr, along with whom he shared a Nobel Prize in 1994. Rational ...
Plummer, MD
1986-01-01
This study of matching theory deals with bipartite matching, network flows, and presents fundamental results for the non-bipartite case. It goes on to study elementary bipartite graphs and elementary graphs in general. Further discussed are 2-matchings, general matching problems as linear programs, the Edmonds Matching Algorithm (and other algorithmic approaches), f-factors and vertex packing.
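The bipartite case covered above can be illustrated with a short sketch of the classic augmenting-path (Kuhn) algorithm; the worker/task instance below is made up for illustration and is not from the book.

```python
# Maximum bipartite matching via augmenting paths (Kuhn's algorithm).
# adj[u] lists the right-side vertices adjacent to left vertex u.

def max_bipartite_matching(adj, n_right):
    match_right = [-1] * n_right  # match_right[v] = left vertex matched to v

    def try_augment(u, seen):
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                # v is free, or its current partner can be re-matched elsewhere
                if match_right[v] == -1 or try_augment(match_right[v], seen):
                    match_right[v] = u
                    return True
        return False

    return sum(try_augment(u, set()) for u in range(len(adj)))

# 3 workers, 3 tasks: worker 0 can do tasks 0 or 1, and so on.
adj = [[0, 1], [0], [1, 2]]
print(max_bipartite_matching(adj, 3))  # 3 (a perfect matching exists)
```

The augmenting-path idea is also the core of the Edmonds algorithm mentioned in the abstract, which extends it to the non-bipartite case by contracting odd cycles.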
Hall, Marshall
2011-01-01
Includes proof of van der Waerden's 1926 conjecture on permanents, Wilson's theorem on asymptotic existence, and other developments in combinatorics since 1967. Also covers coding theory and its important connection with designs, problems of enumeration, and partition. Presents fundamentals in addition to latest advances, with illustrative problems at the end of each chapter. Enlarged appendixes include a longer list of block designs.
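Van der Waerden's 1926 conjecture, whose proof the book includes, states that every n x n doubly stochastic matrix has permanent at least n!/n^n, with equality for the uniform matrix. A brute-force check for n = 3 (illustrative only):

```python
# Brute-force permanent: sum over all permutations of products of entries.
from itertools import permutations
from math import factorial, prod

def permanent(M):
    n = len(M)
    return sum(prod(M[i][p[i]] for i in range(n)) for p in permutations(range(n)))

n = 3
uniform = [[1 / n] * n for _ in range(n)]
print(permanent(uniform), factorial(n) / n**n)  # both equal 6/27, about 0.2222
```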
DEFF Research Database (Denmark)
Christensen, Lars Thøger
2016-01-01
feel associated with the organization in question. People take pride in working for companies that are positively evaluated by the general public and use such reputations to boost the images they hold of themselves. Thus, for internal audiences the reputation of their organization is a mirror in which...... covers the theory of autocommunication and its implications for corporate reputation and managerial applications....
International Nuclear Information System (INIS)
Tang, W.M.
2001-01-01
This is a summary of the advances in magnetic fusion energy theory research presented at the 17th International Atomic Energy Agency Fusion Energy Conference from 19-24 October 1998 in Yokohama, Japan. Theory and simulation results from this conference provided encouraging evidence of significant progress in understanding the physics of thermonuclear plasmas. Indeed, the grand challenge for this field is to acquire the basic understanding that can readily enable the innovations which would make fusion energy practical. In this sense, research in fusion energy is increasingly able to be categorized as fitting well the 'Pasteur's Quadrant' paradigm, where the research strongly couples basic science ('Bohr's Quadrant') to technological impact ('Edison's Quadrant'). As supported by some of the work presented at this conference, this trend will be further enhanced by advanced simulations. Eventually, realistic three-dimensional modeling capabilities, when properly combined with rapid and complete data interpretation of results from both experiments and simulations, can contribute to a greatly enhanced cycle of understanding and innovation. Plasma science theory and simulation have provided reliable foundations for this improved modeling capability, and the exciting advances in high-performance computational resources have further accelerated progress. There were 68 papers presented at this conference in the area of magnetic fusion energy theory.
Toso, Robert B.
2000-01-01
Inspired by William Glasser's Reality Therapy ideas, Control Theory (CT) is a disciplinary approach that stresses people's ability to control only their own behavior, based on internal motivations to satisfy five basic needs. At one North Dakota high school, CT-trained teachers are the program's best recruiters. (MLH)
DEFF Research Database (Denmark)
Bjerg, Ole; Presskorn-Thygesen, Thomas
2017-01-01
’. It is demonstrated how such a designation relegates these questions and explanations beyond the realm of meaningful discourse. In addition, Agamben’s concept of sovereignty is applied to explore the political effects of using the concept of conspiracy theory. The exceptional epistemological status assigned...
Indian Academy of Sciences (India)
Sanjay Jain for this opportunity. I thank the editor-in-charge of the paper for valuable comments. This is an expository article; no originality, other than that of the ..... and left with equal probability 1/2. The last strategy, that of randomizing, is what is known as a 'mixed strategy'. Classical game theory gives no clues as to what ...
DEFF Research Database (Denmark)
Bertelsen, Olav Wedege; Bødker, Susanne
2003-01-01
the young HCI research tradition. But HCI was already facing problems: lack of consideration for other aspects of human behavior, for interaction with other people, for culture. Cognitive science-based theories lacked means to address several issues that came out of the empirical projects....
Lee, William H K.
2016-01-01
A complex system consists of many interacting parts, generates new collective behavior through self-organization, and adaptively evolves through time. Many theories have been developed to study complex systems, including chaos, fractals, cellular automata, self-organization, stochastic processes, turbulence, and genetic algorithms.
Zwirner, F
1992-01-01
We summarize the present status of low-energy supersymmetry, exemplified by the Minimal Supersymmetric extension of the Standard Model (MSSM). We review the searches for Supersymmetric particles and supersymmetric Higgs bosons. We conclude with some comments on the open theoretical problems related to spontaneous supersymmetry breaking in the underlying fundamental theory.
Beyond the Standard Model course
CERN. Geneva HR-RFA
2006-01-01
The necessity for new physics beyond the Standard Model will be motivated. Theoretical problems will be exposed and possible solutions will be described. The goal is to present the exciting new physics ideas that will be tested in the near future, at LHC and elsewhere. Supersymmetry, grand unification, extra dimensions and a glimpse of string theory will be presented.
Item Banking with Embedded Standards
MacCann, Robert G.; Stanley, Gordon
2009-01-01
An item banking method that does not use Item Response Theory (IRT) is described. This method provides a comparable grading system across schools that would be suitable for low-stakes testing. It uses the Angoff standard-setting method to obtain item ratings that are stored with each item. An example of such a grading system is given, showing how…
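The Angoff procedure described above can be sketched in a few lines: each judge estimates, per item, the probability that a minimally competent examinee answers the item correctly, and the cut score is the sum over items of the mean rating. All numbers below are hypothetical, not taken from the article.

```python
# Angoff cut score from a small hypothetical item bank.
# ratings[item][judge] -> judge's probability estimate in [0, 1].

def angoff_cut_score(ratings):
    return sum(sum(item) / len(item) for item in ratings)

bank = [
    [0.8, 0.7, 0.9],  # item 1: three judges' estimates
    [0.5, 0.6, 0.4],  # item 2
    [0.3, 0.2, 0.4],  # item 3
]
print(round(angoff_cut_score(bank), 2))  # 1.6
```

Here the cut score 1.6 (out of 3 raw-score points) is the expected score of a minimally competent examinee; storing the per-item ratings with each item is what makes grades comparable across test forms.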
Observation of interstellar lithium in the low-metallicity Small Magellanic Cloud.
Howk, J Christopher; Lehner, Nicolas; Fields, Brian D; Mathews, Grant J
2012-09-06
The primordial abundances of light elements produced in the standard theory of Big Bang nucleosynthesis (BBN) depend only on the cosmic ratio of baryons to photons, a quantity inferred from observations of the microwave background. The predicted primordial (7)Li abundance is four times that measured in the atmospheres of Galactic halo stars. This discrepancy could be caused by modification of surface lithium abundances during the stars' lifetimes or by physics beyond the Standard Model that affects early nucleosynthesis. The lithium abundance of low-metallicity gas provides an alternative constraint on the primordial abundance and cosmic evolution of lithium that is not susceptible to the in situ modifications that may affect stellar atmospheres. Here we report observations of interstellar (7)Li in the low-metallicity gas of the Small Magellanic Cloud, a nearby galaxy with a quarter the Sun's metallicity. The present-day (7)Li abundance of the Small Magellanic Cloud is nearly equal to the BBN predictions, severely constraining the amount of possible subsequent enrichment of the gas by stellar and cosmic-ray nucleosynthesis. Our measurements can be reconciled with standard BBN with an extremely fine-tuned depletion of stellar Li with metallicity. They are also consistent with non-standard BBN.
Illiopoulos, Jean
2007-01-01
A fundamental theory of particles and their interactions has existed for 35 years. Physicists are eager to catch this "standard model" at fault and to replace it with a more complete theory. (7 pages + photos)
Energy Technology Data Exchange (ETDEWEB)
Metzger, St
2005-12-15
This thesis presents various ways to construct four-dimensional quantum field theories from string theory. In a first part we study the generation of a supersymmetric Yang-Mills theory, coupled to an adjoint chiral superfield, from type IIB string theory on non-compact Calabi-Yau manifolds, with D-branes wrapping certain sub-cycles. Properties of the gauge theory are then mapped to the geometric structure of the Calabi-Yau space. Even if the Calabi-Yau geometry is too complicated to evaluate the geometric integrals explicitly, one can then always use matrix model perturbation theory to calculate the effective superpotential. The second part of this work covers the generation of four-dimensional supersymmetric gauge theories, carrying several important characteristic features of the standard model, from compactifications of eleven-dimensional supergravity on G{sub 2}-manifolds. If the latter contain conical singularities, chiral fermions are present in the four-dimensional gauge theory, which potentially lead to anomalies. We show that, locally at each singularity, these anomalies are cancelled by the non-invariance of the classical action through a mechanism called 'anomaly inflow'. Unfortunately, no explicit metric of a compact G{sub 2}-manifold is known. Here we construct families of metrics on compact weak G{sub 2}-manifolds, which contain two conical singularities. Weak G{sub 2}-manifolds have properties that are similar to the ones of proper G{sub 2}-manifolds, and hence the explicit examples might be useful to better understand the generic situation. Finally, we reconsider the relation between eleven-dimensional supergravity and the E{sub 8} x E{sub 8}-heterotic string. This is done by carefully studying the anomalies that appear if the supergravity theory is formulated on a ten-manifold times the interval. Again we find that the anomalies cancel locally at the boundaries of the interval through anomaly inflow, provided one suitably modifies the
A dynamical theory of nucleation
Lutsko, James F.
2013-05-01
A dynamical theory of nucleation based on fluctuating hydrodynamics is described. It is developed in detail for the case of diffusion-limited nucleation appropriate to colloids and macro-molecules in solution. By incorporating fluctuations, realistic fluid-transport and realistic free energy models the theory is able to give a unified treatment of both the pre-critical development of fluctuations leading to a critical cluster as well as of post-critical growth. Standard results from classical nucleation theory are shown to follow in the weak noise limit while the generality of the theory allows for many extensions including the description of very high supersaturations (small clusters), multiple order parameters and strong-noise effects to name a few. The theory is applied to homogeneous and heterogeneous nucleation of a model globular protein in a confined volume and it is found that nucleation depends critically on the existence of long-wavelength, small-amplitude density fluctuations.
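The classical-nucleation-theory results recovered in the weak-noise limit can be made concrete with the standard textbook formulas for the free-energy barrier, Delta G(r) = 4*pi*r^2*gamma - (4/3)*pi*r^3*n*dmu, maximized at the critical radius r* = 2*gamma/(n*dmu). The numerical inputs below are illustrative, not values from the paper.

```python
# Critical cluster size and barrier height in classical nucleation theory.
import math

def critical_cluster(gamma, n, dmu):
    """gamma: surface tension; n: number density of the new phase;
    dmu: chemical-potential gain per particle on transferring to it."""
    r_star = 2.0 * gamma / (n * dmu)
    barrier = (16.0 * math.pi / 3.0) * gamma**3 / (n * dmu) ** 2
    return r_star, barrier

r_star, barrier = critical_cluster(gamma=1.0, n=1.0, dmu=1.0)
print(r_star)  # 2.0 (in the reduced units of the illustrative inputs)
```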
Quantum corrections in classicalon theories
Directory of Open Access Journals (Sweden)
Asimakis, P.; Brouzakis, N.; Katsis, A.; Tetradis, N.
2015-04-01
We use the heat kernel in order to compute the one-loop effective action on a classicalon background. We find that the UV divergences are suppressed relative to the predictions of standard perturbation theory in the interior of the classicalon. There is a strong analogy with the suppression of quantum fluctuations in Galileon theories, within the regions where the Vainshtein mechanism operates (discussed in arXiv:1401.2775). Both classicalon and Galileon theories display reduced UV sensitivity on certain backgrounds.
Electromagnetic reciprocity in antenna theory
Stumpf, Martin
2018-01-01
The reciprocity theorem is among the most intriguing concepts in wave field theory and has become an integral part of almost all standard textbooks on electromagnetic (EM) theory. This book makes use of the theorem to quantitatively describe EM interactions concerning general multiport antenna systems. It covers a general reciprocity-based description of antenna systems, their EM scattering properties, and further related aspects. Beginning with an introduction to the subject, Electromagnetic Reciprocity in Antenna Theory provides readers first with the basic prerequisites before offering coverage of the equivalent multiport circuit antenna representations, EM coupling between multiport antenna systems and their EM interactions with scatterers, accompanied with the corresponding EM compensation theorems.
Fundamental principles of quantum theory
International Nuclear Information System (INIS)
Bugajski, S.
1980-01-01
After introducing general versions of three fundamental quantum postulates - the superposition principle, the uncertainty principle and the complementarity principle - the question of whether the three principles are sufficiently strong to restrict the general Mackey description of quantum systems to the standard Hilbert-space quantum theory is discussed. An example which shows that the answer must be negative is constructed. An abstract version of the projection postulate is introduced and it is demonstrated that it could serve as the missing physical link between the general Mackey description and the standard quantum theory. (author)
Helms, Lester L
2014-01-01
Potential Theory presents a clear path from calculus to classical potential theory and beyond, with the aim of moving the reader into the area of mathematical research as quickly as possible. The subject matter is developed from first principles using only calculus. Commencing with the inverse square law for gravitational and electromagnetic forces and the divergence theorem, the author develops methods for constructing solutions of Laplace's equation on a region with prescribed values on the boundary of the region. The latter half of the book addresses more advanced material aimed at those with the background of a senior undergraduate or beginning graduate course in real analysis. Starting with solutions of the Dirichlet problem subject to mixed boundary conditions on the simplest of regions, methods of morphing such solutions onto solutions of Poisson's equation on more general regions are developed using diffeomorphisms and the Perron-Wiener-Brelot method, culminating in application to Brownian motion. In ...
DEFF Research Database (Denmark)
Jensen, Klaus Bruhn
2016-01-01
This article revisits the place of normative and other practical issues in the wider conceptual architecture of communication theory, building on the tradition of philosophical pragmatism. The article first characterizes everyday concepts of communication as the accumulated outcome of natural...... evolution and history: practical resources for human existence and social coexistence. Such practical concepts have served as the point of departure for diverse theoretical conceptions of what communication is. The second part of the article highlights the past neglect and current potential of normative...... communication theories that ask, in addition, what communication ought to be, and what it could be, taking the relationship between communication and justice as a case in point. The final section returns to empirical conceptualizations of different institutions, practices and discourses of communication...
DEFF Research Database (Denmark)
Stein, Irene F.; Stelter, Reinhard
2011-01-01
Communication theory covers a wide variety of theories related to the communication process (Littlejohn, 1999). Communication is not simply an exchange of information, in which we have a sender and a receiver. This very technical concept of communication is clearly outdated; a human being...... is not a data processing device. In this chapter, communication is understood as a process of shared meaning-making (Bruner, 1990). Human beings interpret their environment, other people, and themselves on the basis of their dynamic interaction with the surrounding world. Meaning is essential because people...... ascribe specific meanings to their experiences, their actions in life or work, and their interactions. Meaning is reshaped, adapted, and transformed in every communication encounter. Furthermore, meaning is cocreated in dialogues or in communities of practice, such as in teams at a workplace or in school...
2015-01-01
A one-sentence definition of operator theory could be: The study of (linear) continuous operations between topological vector spaces, these being in general (but not exclusively) Fréchet, Banach, or Hilbert spaces (or their duals). Operator theory is thus a very wide field, with numerous facets, both applied and theoretical. There are deep connections with complex analysis, functional analysis, mathematical physics, and electrical engineering, to name a few. Fascinating new applications and directions regularly appear, such as operator spaces, free probability, and applications to Clifford analysis. In our choice of the sections, we tried to reflect this diversity. This is a dynamic ongoing project, and more sections are planned, to complete the picture. We hope you enjoy the reading, and profit from this endeavor.
Dynamics of the standard model
Donoghue, John F; Holstein, Barry R
2014-01-01
Describing the fundamental theory of particle physics and its applications, this book provides a detailed account of the Standard Model, focusing on techniques that can produce information about real observed phenomena. The book begins with a pedagogic account of the Standard Model, introducing essential techniques such as effective field theory and path integral methods. It then focuses on the use of the Standard Model in the calculation of physical properties of particles. Rigorous methods are emphasized, but other useful models are also described. This second edition has been updated to include recent theoretical and experimental advances, such as the discovery of the Higgs boson. A new chapter is devoted to the theoretical and experimental understanding of neutrinos, and major advances in CP violation and electroweak physics have been given a modern treatment. This book is valuable to graduate students and researchers in particle physics, nuclear physics and related fields.
Wierman, John C.
1982-01-01
An introduction is provided to the mathematical tools and problems of percolation theory. A discussion of Bernoulli percolation models shows the role of graph duality and correlation inequalities in the recent determination of the critical probability in the square, triangular, and hexagonal lattice bond models. An introduction to first passage percolation concentrates on the problems of existence of optimal routes, length of optimal routes, and conditions for convergence of first passage tim...
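The Bernoulli bond model on the square lattice, whose critical probability p_c = 1/2 is cited above, can be explored with a rough Monte Carlo sketch; the lattice size and trial count below are arbitrary choices, not from the article.

```python
# Bernoulli bond percolation on an L x L square lattice: open each bond
# with probability p, then test for a left-to-right spanning cluster
# using a union-find over the sites.
import random

def spans(L, p, rng):
    parent = list(range(L * L))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for r in range(L):
        for c in range(L):
            i = r * L + c
            if c + 1 < L and rng.random() < p:  # horizontal bond open
                union(i, i + 1)
            if r + 1 < L and rng.random() < p:  # vertical bond open
                union(i, i + L)
    left = {find(r * L) for r in range(L)}
    return any(find(r * L + L - 1) in left for r in range(L))

rng = random.Random(0)
trials = 200
frac = sum(spans(20, 0.5, rng) for _ in range(trials)) / trials
print(f"spanning fraction at p = 0.5: {frac:.2f}")  # of order 1/2 near p_c
```

Sweeping p from 0 to 1 shows the spanning fraction jumping from near 0 to near 1 around p = 1/2, which is the transition the rigorous results pin down exactly.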
International Nuclear Information System (INIS)
Casten, R F
2015-01-01
This paper discusses some simple issues that arise in testing models, with a focus on models for low energy nuclear structure. By way of simplified examples, we illustrate some dangers in blind statistical assessments, pointing out especially the need to include theoretical uncertainties, the danger of over-weighting precise or physically redundant experimental results, the need to assess competing theories with independent and physically sensitive observables, and the value of statistical tests properly evaluated. (paper)
DEFF Research Database (Denmark)
Guillet de Monthoux, Pierre; Statler, Matt
2017-01-01
The recent Carnegie report (Colby, et al., 2011) characterizes the goal of business education as the development of practical wisdom. In this chapter, the authors reframe Scharmer's Theory U as an attempt to develop practical wisdom by applying certain European philosophical concepts. Specifically......, they trace a genealogy of social sculpture, Schwungspiel, poetic creation, and spiritual science, and suggest that Scharmer's work integrates these concepts into a pragmatic pedagogy that has implications for business practice as well as business education....
Friedrich, Harald
2016-01-01
This corrected and updated second edition of "Scattering Theory" presents a concise and modern coverage of the subject. In the present treatment, special attention is given to the role played by the long-range behaviour of the projectile-target interaction, and a theory is developed, which is well suited to describe near-threshold bound and continuum states in realistic binary systems such as diatomic molecules or molecular ions. It is motivated by the fact that experimental advances have shifted and broadened the scope of applications where concepts from scattering theory are used, e.g. to the field of ultracold atoms and molecules, which has been experiencing enormous growth in recent years, largely triggered by the successful realization of Bose-Einstein condensates of dilute atomic gases in 1995. The book contains sections on special topics such as near-threshold quantization, quantum reflection, Feshbach resonances and the quantum description of scattering in two dimensions. The level of abstraction is k...
String theory compactifications
Graña, Mariana
2017-01-01
The lectures in this book provide graduate students and non-specialist researchers with a concise introduction to the concepts and formalism required to reduce the ten-dimensional string theories to the observable four-dimensional space-time - a procedure called string compactification. The text starts with a very brief introduction to string theory, first working out its massless spectrum and showing how the condition on the number of dimensions arises. It then dwells on the different possible internal manifolds, from the simplest to the most relevant phenomenologically, thereby showing that the most elegant description is through an extension of ordinary Riemannian geometry termed generalized geometry, which was first introduced by Hitchin. Last but not least, the authors review open problems in string phenomenology, such as the embedding of the Standard Model and obtaining de Sitter solutions.
Undergraduate Lecture Notes in Topological Quantum Field Theory
Ivancevic, Vladimir G.; Ivancevic, Tijana T.
2008-01-01
These third-year lecture notes are designed for a 1-semester course in topological quantum field theory (TQFT). Assumed background in mathematics and physics are only standard second-year subjects: multivariable calculus, introduction to quantum mechanics and basic electromagnetism. Keywords: quantum mechanics/field theory, path integral, Hodge decomposition, Chern-Simons and Yang-Mills gauge theories, conformal field theory
Prestage, John D.; Tjoelker, Robert L.; Maleki, Lute
2000-01-01
In this paper we review the development of Hg(+) microwave frequency standards for use in high-reliability and continuous-operation applications. In recent work we have demonstrated a short-term frequency stability of 3 x 10^-14/sqrt(tau) when a cryogenic oscillator of stability 2-3 x 10^-15 was used as the local oscillator. The trapped-ion frequency standard employs a Hg-202 discharge lamp to optically pump the trapped Hg(+)-199 clock ions and a helium buffer gas to cool the ions to near room temperature. We describe a small Hg(+) ion trap based frequency standard with an extended linear ion trap (LITE) architecture which separates the optical state selection region from the clock resonance region. This separation allows the use of novel trap configurations in the resonance region since no optical pumping is carried out there. A method for measuring the size of an ion cloud inside a linear trap with a 12-rod trap is currently being investigated. At approximately 10^-12, the 2nd-order Doppler shift for trapped mercury ion frequency standards is one of the largest frequency offsets, and its measurement to the 1% level would represent an advance in ensuring the very long-term stability of these standards at the 10^-14 or better level. Finally, we describe atomic clock comparison experiments that can probe for a time variation of the fine structure constant, alpha = e^2/(hbar*c), at the level of 10^-20/year as predicted in some Grand Unified String Theories.
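Assuming the quoted short-term stability scales as white frequency noise, sigma_y(tau) = 3 x 10^-14/sqrt(tau), the averaging time needed to reach a target stability follows from a one-line calculation. This is a back-of-the-envelope sketch, not the authors' analysis.

```python
# Averaging time under a white-frequency-noise stability law
# sigma_y(tau) = sigma_1s / sqrt(tau).

def averaging_time(sigma_1s, target):
    """Seconds of averaging needed to reach `target`, given 1-s stability."""
    return (sigma_1s / target) ** 2

tau = averaging_time(3e-14, 1e-15)
print(f"{tau:.0f} s")  # 900 s, i.e. 15 minutes to reach 1e-15
```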
Neutrinos: Theory and Phenomenology
Energy Technology Data Exchange (ETDEWEB)
Parke, Stephen
2013-10-22
The theory and phenomenology of neutrinos will be addressed, especially that relating to the observation of neutrino flavor transformations. The current status and implications for future experiments will be discussed with special emphasis on the experiments that will determine the neutrino mass ordering, the dominant flavor content of the neutrino mass eigenstate with the smallest electron neutrino content and the size of CP violation in the neutrino sector. Beyond the neutrino Standard Model, the evidence for and a possible definitive experiment to confirm or refute the existence of light sterile neutrinos will be briefly discussed.
DEFF Research Database (Denmark)
Wisniewski, Rafal
2003-01-01
The work is intended to provide some insight about concurrency theory using ideas from geometry and algebraic topology. We define a topological space containing all traces of execution of the computer program and the information about how time flows. This is the main difference with standard...... topological reasoning in which there is no information about relation "in time" among points. The main task is to define equivalence of paths reflecting execution of a program. We use the notion of homotopy history equivalence relation. The model space considered in this work is a differentiable manifold...
Graphs Theory and Applications
Fournier, Jean-Claude
2008-01-01
This book provides a pedagogical and comprehensive introduction to graph theory and its applications. It contains all the standard basic material and develops significant topics and applications, such as: colorings and the timetabling problem, matchings and the optimal assignment problem, and Hamiltonian cycles and the traveling salesman problem, to name but a few. Exercises at various levels are given at the end of each chapter, and a final chapter presents a few general problems with hints for solutions, thus providing the reader with the opportunity to test and refine their knowledge on the
2009-01-01
This book deals with the basic subjects of design theory. It begins with balanced incomplete block designs, various constructions of which are described in ample detail. In particular, finite projective and affine planes, difference sets and Hadamard matrices, as tools to construct balanced incomplete block designs, are included. Orthogonal latin squares are also treated in detail. Zhu's simpler proof of the falsity of Euler's conjecture is included. The construction of some classes of balanced incomplete block designs, such as Steiner triple systems and Kirkman triple systems, are also given.
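The Steiner triple systems mentioned above are balanced incomplete block designs with block size 3; the smallest nontrivial example is the Fano plane, S(2, 3, 7). A small check that every pair of its seven points lies in exactly one triple:

```python
# Verify the defining property of the Fano plane as a Steiner triple system.
from itertools import combinations

triples = [(0, 1, 2), (0, 3, 4), (0, 5, 6), (1, 3, 5),
           (1, 4, 6), (2, 3, 6), (2, 4, 5)]
count = {pair: 0 for pair in combinations(range(7), 2)}
for t in triples:
    for pair in combinations(t, 2):
        count[pair] += 1
print(all(c == 1 for c in count.values()))  # True
```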
Goldie, Charles M
1991-01-01
This book is an introduction, for mathematics students, to the theories of information and codes. They are usually treated separately but, as both address the problem of communication through noisy channels (albeit from different directions), the authors have been able to exploit the connection to give a reasonably self-contained treatment, relating the probabilistic and algebraic viewpoints. The style is discursive and, as befits the subject, plenty of examples and exercises are provided. Some examples of computer codes are given to provide concrete illustrations of abstract ideas.
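As a minimal illustration of the noisy-channel theme connecting the two subjects, the capacity of a binary symmetric channel with crossover probability p is C = 1 - H(p), where H is the binary entropy function:

```python
# Capacity of a binary symmetric channel: C = 1 - H(p).
from math import log2

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0), bsc_capacity(0.5))  # 1.0 0.0
```

A noiseless channel (p = 0) carries one bit per use, while at p = 1/2 the output is independent of the input and the capacity vanishes; good codes approach the rate C.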
Sawyer, Eric T
2009-01-01
These lecture notes take the reader from Lennart Carleson's first deep results on interpolation and corona problems in the unit disk to modern analogues in the disk and ball. The emphasis is on introducing the diverse array of techniques needed to attack these problems rather than producing an encyclopedic summary of achievements. Techniques from classical analysis and operator theory include duality, Blaschke product constructions, purely Hilbert space arguments, bounded mean oscillation, best approximation, boundedness of the Beurling transform, estimates on solutions to the \\bar\\partial equ
Blyth, T S; Sneddon, I N; Stark, M
1972-01-01
Residuation Theory aims to contribute to literature in the field of ordered algebraic structures, especially on the subject of residual mappings. The book is divided into three chapters. Chapter 1 focuses on ordered sets; directed sets; semilattices; lattices; and complete lattices. Chapter 2 tackles Baer rings; Baer semigroups; Foulis semigroups; residual mappings; the notion of involution; and Boolean algebras. Chapter 3 covers residuated groupoids and semigroups; group homomorphic and isotone homomorphic Boolean images of ordered semigroups; Dubreil-Jacotin and Brouwer semigroups; and loli
Effective Field Theories and the Role of Consistency in Theory Choice
Wells, James D
2012-01-01
Promoting a theory with a finite number of terms into an effective field theory with an infinite number of terms worsens simplicity, predictability, falsifiability, and other attributes often favored in theory choice. However, the importance of these attributes pales in comparison with consistency, both observational and mathematical consistency, which propels the effective theory to be superior to its simpler truncated version of finite terms, whether that theory be renormalizable (e.g., Standard Model of particle physics) or nonrenormalizable (e.g., gravity). Some implications for the Large Hadron Collider and beyond are discussed, including comments on how directly acknowledging the preeminence of consistency can affect future theory work.
International business theory and marketing theory
Soldner, Helmut
1984-01-01
International business theory and marketing theory : elements for internat. marketing theory building. - In: Marketing aspects of international business / Gerald M. Hampton ... (eds.). - Boston u.a. : Kluwer, 1984. - S. 25-57
Quantum Field Theory A Modern Perspective
Parameswaran Nair, V
2005-01-01
Quantum field theory, which started with Paul Dirac’s work shortly after the discovery of quantum mechanics, has produced an impressive and important array of results. Quantum electrodynamics, with its extremely accurate and well-tested predictions, and the standard model of electroweak and chromodynamic (nuclear) forces are examples of successful theories. Field theory has also been applied to a variety of phenomena in condensed matter physics, including superconductivity, superfluidity and the quantum Hall effect. The concept of the renormalization group has given us a new perspective on field theory in general and on critical phenomena in particular. At this stage, a strong case can be made that quantum field theory is the mathematical and intellectual framework for describing and understanding all physical phenomena, except possibly for a quantum theory of gravity. Quantum Field Theory: A Modern Perspective presents Professor Nair’s view of certain topics in field theory loosely knit together as it gr...
Energy Technology Data Exchange (ETDEWEB)
Svrcek, Peter; /Stanford U., Phys. Dept. /SLAC; Witten, Edward; /Princeton, Inst. Advanced Study
2006-06-09
In the context of string theory, axions appear to provide the most plausible solution of the strong CP problem. However, as has been known for a long time, in many string-based models, the axion coupling parameter Fa is several orders of magnitude higher than the standard cosmological bounds. We re-examine this problem in a variety of models, showing that Fa is close to the GUT scale or above in many models that have GUT-like phenomenology, as well as some that do not. On the other hand, in some models with Standard Model gauge fields supported on vanishing cycles, it is possible for Fa to be well below the GUT scale.
Three essays in econometric theory
Gan, Zhuojiong
2015-01-01
This thesis consists of three essays in econometric theory. In the first essay, the author considers a prediction problem with a large number of predictors, improving the prediction precision of the standard factor model by allowing some variables to have idiosyncratic factors that are relevant for
String theory and cosmological singularities
Indian Academy of Sciences (India)
Well-known examples are singularities inside black holes and initial or final singularities in expanding or contracting universes. In recent times, string theory is providing new perspectives of such singularities which may lead to an understanding of these in the standard framework of time evolution in quantum mechanics.
Developments in high energy theory
Indian Academy of Sciences (India)
It provides a panoramic view of the main theoretical developments in high energy physics since its inception more than half a century ago, a period in which experiments have spanned an enormous range of energies, theories have been developed leading up to the Standard Model, and proposals – including the radical ...
Developments in high energy theory
Indian Academy of Sciences (India)
decay or cosmological dark matter) at both overground and underground locations, each involving a gigantic apparatus. In addition, this field has been a fertile ground for innovative, if sometimes speculative, ideas trying to go beyond the Standard Model. These have provided a rich kaleidoscope of theories, some of them ...
International Nuclear Information System (INIS)
Markland, J.T.
1992-01-01
Techniques used in conventional project appraisal are mathematically very simple in comparison to those used in reservoir modelling and in the geosciences. Clearly it would be possible to value assets in mathematically more sophisticated ways if it were meaningful and worthwhile to do so. The DCF approach in common use has recognized limitations, the inability to select a meaningful discount rate being particularly significant. Financial theory has advanced enormously over the last few years, along with computational techniques, and methods are beginning to appear which may change the way we do project evaluations in practice. The starting point for all of this was a paper by Black and Scholes, which asserts that almost all corporate liabilities can be viewed as options of varying degrees of complexity. Although the financial presentation may be unfamiliar to engineers and geoscientists, some of the concepts used will not be. This paper outlines, in plain English, the basis of option pricing theory for assessing the market value of a project. It also attempts to assess the future role of this type of approach in practical petroleum exploration and engineering economics. Reference is made to relevant published natural resource literature.
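As a concrete illustration of the option-pricing machinery the paper builds on, the Black-Scholes value of a European call can be sketched in a few lines (a textbook formula, not taken from the paper; `S`, `K`, `T`, `r` and `sigma` are the usual spot price, strike, time to expiry, risk-free rate and volatility):

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# A standard textbook case: at-the-money call, one year, 5% rate, 20% vol.
print(round(bs_call(100, 100, 1.0, 0.05, 0.2), 2))  # -> 10.45
```

In the real-options reading the paper alludes to, S plays the role of the present value of the developed asset and K the development cost; that analogy, not this particular parameterization, is the paper's point.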
String theory as a Lilliputian world
International Nuclear Information System (INIS)
Ambjørn, J.; Makeenko, Y.
2016-01-01
Lattice regularizations of the bosonic string do not allow us to probe the tachyon. This has often been viewed as the reason why these theories have never managed to make contact with standard continuum string theories when the dimension of spacetime is larger than two. We study the continuum string theory in large spacetime dimensions, where simple mean field theory is reliable. By keeping the cutoff carefully, we show that it is precisely the existence of a tachyon that makes it possible to take a scaling limit which reproduces the lattice-string results. We compare this scaling limit with another scaling limit which reproduces standard continuum-string results. If the people working with lattice regularizations of string theories are akin to Gulliver, they will view the standard string-world as a Lilliputian world no larger than a few lattice spacings.
String theory as a Lilliputian world
Energy Technology Data Exchange (ETDEWEB)
Ambjørn, J., E-mail: ambjorn@nbi.dk [The Niels Bohr Institute, Copenhagen University, Blegdamsvej 17, DK-2100 Copenhagen (Denmark); IMAPP, Radboud University, Heyendaalseweg 135, 6525 AJ, Nijmegen (Netherlands); Makeenko, Y., E-mail: makeenko@nbi.dk [The Niels Bohr Institute, Copenhagen University, Blegdamsvej 17, DK-2100 Copenhagen (Denmark); Institute of Theoretical and Experimental Physics, B. Cheremushkinskaya 25, 117218 Moscow (Russian Federation)
2016-05-10
Energy Technology Data Exchange (ETDEWEB)
Lykken, Joseph D.; /Fermilab
2010-05-01
'BSM physics' is a phrase used in several ways. It can refer to physical phenomena established experimentally but not accommodated by the Standard Model, in particular dark matter and neutrino oscillations (technically also anything that has to do with gravity, since gravity is not part of the Standard Model). 'Beyond the Standard Model' can also refer to possible deeper explanations of phenomena that are accommodated by the Standard Model but only with ad hoc parameterizations, such as Yukawa couplings and the strong CP angle. More generally, BSM can be taken to refer to any possible extension of the Standard Model, whether or not the extension solves any particular set of puzzles left unresolved in the SM. In this general sense one sees reference to the BSM 'theory space' of all possible SM extensions, this being a parameter space of coupling constants for new interactions, new charges or other quantum numbers, and parameters describing possible new degrees of freedom or new symmetries. Despite decades of model-building, it seems unlikely that we have mapped out most of, or even the most interesting parts of, this theory space. Indeed, we do not even know the dimensionality of this parameter space, or what fraction of it is already ruled out by experiment. Since Nature is implementing at most one point in this BSM theory space (at least in our neighborhood of space and time), it might seem an impossible task to map back from a finite number of experimental discoveries and measurements to a unique BSM explanation. Fortunately for theorists, the inevitable limitations of experiments themselves, in terms of resolutions, rates, and energy scales, mean that in practice there are only a finite number of BSM model 'equivalence classes' competing at any given time to explain any given set of results. BSM phenomenology is a two-way street: not only do experimental results test or constrain BSM models, they also suggest
Standard Model, Higgs Boson and What Next?
Indian Academy of Sciences (India)
IAS Admin
Resonance, October 2012, General Article. The Standard Model is now known to be the basis of almost all of known physics except gravity. It is the dynamical theory of electromagnetism and of the strong and weak nuclear forces. The Standard Model has been constructed by generalizing the century-old electrodynamics of...
Bootstrapping N=3 superconformal theories
Energy Technology Data Exchange (ETDEWEB)
Lemos, Madalena; Liendo, Pedro [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Theory Group; Meneghelli, Carlo [Stony Brook Univ., Stony Brook, NY (United States). Simons Center for Geometry and Physics; Mitev, Vladimir [Mainz Univ. (Germany). PRISMA Cluster of Excellence
2016-12-15
We initiate the bootstrap program for N=3 superconformal field theories (SCFTs) in four dimensions. The problem is considered from two fronts: the protected subsector described by a 2d chiral algebra, and crossing symmetry for half-BPS operators whose superconformal primaries parametrize the Coulomb branch of N=3 theories. With the goal of describing a protected subsector of a family of N=3 SCFTs, we propose a new 2d chiral algebra with super Virasoro symmetry that depends on an arbitrary parameter, identified with the central charge of the theory. Turning to the crossing equations, we work out the superconformal block expansion and apply standard numerical bootstrap techniques in order to constrain the CFT data. We obtain bounds valid for any theory but also, thanks to input from the chiral algebra results, we are able to exclude solutions with N=4 supersymmetry, allowing us to zoom in on a specific N=3 SCFT.
International Nuclear Information System (INIS)
Gong, Ha Seong
2006-02-01
This book explains electrical theory in four chapters. The first chapter covers electricity and materials, electric fields, capacitance, magnetic fields and electromagnetic force, and inductance. The second chapter covers electric circuit analysis, electric resistance, heating and power, and the chemical action of current and batteries with electrolysis. The third chapter deals with alternating current circuits: the basics of an AC circuit; the behaviour of resistance, inductance and capacitance; series and parallel RLC circuits; three-phase alternating current; two-terminal-pair networks; and voltage and current in non-linear circuits. The last chapter explains transient phenomena in RC series circuits, RL series circuits, AC circuits, and RLC series circuits.
International Nuclear Information System (INIS)
Nobile, G.
1993-07-01
With reference to the highly debated sustainable growth strategies proposed to counter pressing, interrelated global environmental and socio-economic problems, this paper reviews economic and resource development theories proposed by classical and neoclassical economists. The review highlights the growing debate among public administration decision makers regarding appropriate methods to assess the worth of natural resources and ecosystems; proposed methods tend to be biased either towards environmental protection or towards economic development. Two major difficulties in the effective implementation of sustainable growth strategies are also identified: the management of such strategies would require appropriate revisions to national accounting systems, and the dynamic flow of energy and materials between an economic system and the environment would generate a sequence of unstable structures evolving in a chaotic and unpredictable way.
Anders, M; Trezzi, D; Menegazzo, R; Aliotta, M; Bellini, A; Bemmerer, D; Broggini, C; Caciolli, A; Corvisiero, P; Costantini, H; Davinson, T; Elekes, Z; Erhard, M; Formicola, A; Fülöp, Zs; Gervino, G; Guglielmetti, A; Gustavino, C; Gyürky, Gy; Junker, M; Lemut, A; Marta, M; Mazzocchi, C; Prati, P; Rossi Alvarez, C; Scott, D A; Somorjai, E; Straniero, O; Szücs, T
2014-07-25
Recent observations of 6Li in metal-poor stars suggest a large production of this isotope during big bang nucleosynthesis (BBN). In standard BBN calculations, the 2H(α,γ)6Li reaction dominates 6Li production. This reaction has never been measured inside the BBN energy region, because its cross section drops exponentially at low energy and because the electric dipole transition is strongly suppressed for the isoscalar particles 2H and α at energies below the Coulomb barrier. Indirect measurements using the Coulomb dissociation of 6Li only give upper limits owing to the dominance of nuclear breakup processes. Here, we report on the results of the first measurement of the 2H(α,γ)6Li cross section at big bang energies. The experiment was performed deep underground at the LUNA 400 kV accelerator in Gran Sasso, Italy. The primordial 6Li/7Li isotopic abundance ratio has been determined to be (1.5 ± 0.3) × 10^-5, from our experimental data and standard BBN theory. The much higher 6Li/7Li values reported for halo stars will likely require a nonstandard physics explanation, as discussed in the literature.
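The exponential fall-off of the cross section mentioned above comes from Coulomb-barrier penetration. A rough, illustrative sketch of the Gamow factor exp(-2πη) for d + α (the constant 31.29 is the standard numerical form of the Sommerfeld parameter for E in keV and reduced mass in atomic mass units; these are order-of-magnitude estimates, not LUNA data):

```python
from math import exp, sqrt

def gamow_factor(E_keV, Z1=1, Z2=2, mu_amu=4.0 / 3.0):
    """Coulomb-barrier penetration factor exp(-2*pi*eta) for a
    non-resonant charged-particle reaction, with
    2*pi*eta ~= 31.29 * Z1 * Z2 * sqrt(mu/E)  (E in keV, mu in u).
    Defaults correspond to d + alpha, i.e. 2H(alpha,gamma)6Li."""
    return exp(-31.29 * Z1 * Z2 * sqrt(mu_amu / E_keV))

# The factor falls by orders of magnitude across the BBN energy window,
# which is why the measurement had to be done underground at LUNA.
for E in (30.0, 100.0, 400.0):
    print(E, gamow_factor(E))
```

Between 100 keV and 30 keV the penetration factor drops by more than two orders of magnitude, illustrating why direct measurements at BBN energies are so difficult.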
Directory of Open Access Journals (Sweden)
Ricardo Lopes Cardoso
2009-08-01
Full Text Available. The convergence process of local accounting standards into international standards requires significant changes in accounting regulation. Accountants and auditors are working hard to understand and familiarize themselves with these "new" standards in order to adopt and audit them at their firms and/or clients. However, adopting and auditing the adoption of International Financial Reporting Standards (IFRS) are just as important as understanding the changes in local accounting regulation. Also, the impact of the new regulation has been little discussed. This theory-based article examines the Brazilian IFRS convergence experience through an interdisciplinary perspective, analysing the change in accounting regulation in the light of five theories of regulation. Although the theories compete, they can be used in a complementary way to understand the changes introduced by Law No. 11,638/07 and Provisional Measure No. 449/08 into Law No. 6,404/76. The Realean and Habermasian theories are considered the ones that best contribute to the democratization of accounting, since they take social values into account in the drafting and subsequent interpretation of regulation.
Applied Hypergame Theory for Network Defense
2013-06-01
The majority of game theory models are identified as either strategic, used to represent simultaneous games, or extensive, more often used to ... the normal strategic form used in standard game theory analysis. The new model is referred to as Hypergame Normal Form (HNF); see Figure 2.7. The full ... well with the selection of game theory attributes described. They provide a basic two-player (attacker and defender) game in strategic form with
Physics beyond the Standard Model
Valle, José W F
1991-01-01
We discuss some of the signatures associated with extensions of the Standard Model related to the neutrino and electroweak symmetry breaking sectors, with and without supersymmetry. The topics include a basic discussion of the theory of neutrino mass and the corresponding extensions of the Standard Model that incorporate massive neutrinos; an overview of the present observational status of neutrino mass searches, with emphasis on solar neutrinos, as well as the cosmological data on the amplitude of primordial density fluctuations; the implications of neutrino mass in cosmological nucleosynthesis, non-accelerator, as well as in high energy particle collider experiments. Turning to the electroweak breaking sector, we discuss the physics potential for Higgs boson searches at LEP200, including Majoron extensions of the Standard Model, and the physics of invisibly decaying Higgs bosons. We discuss the minimal supersymmetric Standard Model phenomenology, as well as some of the laboratory signatures that would be as...
Dual symmetry in gauge theories
International Nuclear Information System (INIS)
Koshkarov, A.L.
1997-01-01
Continuous dual symmetry in electrodynamics, Yang-Mills theory and gravitation is investigated. A dual invariant, which leads to highly nonlinear equations of motion, is chosen as the Lagrangian of pure classical dual nonlinear electrodynamics. In a natural manner, a dual angle determined by the electromagnetic field strengths at a given spacetime point appears in the model. The equations of motion may be interpreted as the equations of standard Maxwell theory with a source; an alternative interpretation is a quasi-Maxwell linear theory with magnetic charge. An analogous approach is possible in Yang-Mills theory; in this case the equations of motion of the dual-invariant non-Abelian theory possess the same instanton solutions as the conventional Yang-Mills equations. An Abelian two-parameter dual group is found to exist in gravitation. Irreducible representations have been obtained: the curvature tensor was expanded into the sum of twice anti-self-dual and self-dual parts. Gravitational instantons are defined as (real) solutions to the usual duality equations, and centrally symmetric solutions to these equations are obtained. The twice anti-self-dual part of the curvature tensor may be used to introduce new gravitational equations generalizing Einstein's equations. However, the theory obtained reduces to the conformally flat Nordström theory.
Vergados, J D
2017-01-01
This book contains a systematic and pedagogical exposition of recent developments in particle physics and cosmology. It starts with two introductory chapters on group theory and the Dirac theory. Then it proceeds with the formulation of the Standard Model (SM) of Particle Physics, particle content and symmetries, fully exploiting the first chapters. It discusses the concept of gauge symmetries and emphasizes their role in particle physics. It then analyses the Higgs mechanism and the spontaneous symmetry breaking (SSB). It explains how the particles (gauge bosons and fermions) after SSB acquire a mass and get admixed. The various forms of charged currents are discussed in detail as well as how the parameters of the SM, which cannot be determined by the theory, are fixed by experiment, including the recent LHC data and the Higgs discovery. Quantum chromodynamics is discussed and various low energy approximations to it are presented. The Feynman diagrams are introduced and applied, in a way understandable by fir...
Standard model of knowledge representation
Yin, Wensheng
2016-09-01
Knowledge representation is the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic networks, computer programming languages, databases, mathematical models, graphics languages, natural language, etc. To establish the intrinsic link between the various knowledge representation methods, a unified knowledge representation model is necessary. Drawing on ontology, system theory, and control theory, a standard model of knowledge representation that reflects change in the objective world is proposed. The model is composed of input, processing, and output. This knowledge representation method does not contradict traditional knowledge representation methods: it can express knowledge in multivariate and multidimensional terms, it can express process knowledge, and at the same time it has a strong problem-solving ability. In addition, the standard model of knowledge representation provides a way to handle imprecise and inconsistent knowledge.
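The input-processing-output structure the abstract describes can be caricatured in a few lines of code (the class name and the example are illustrative, not taken from the paper):

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class KnowledgeUnit:
    """Minimal sketch of an input-processing-output knowledge unit:
    'input' arrives as the argument, 'processing' is the stored
    callable, 'output' is the returned value."""
    name: str
    process: Callable[[Any], Any]

    def apply(self, inputs: Any) -> Any:
        return self.process(inputs)

# Procedural ("process") knowledge and a fact lookup share the same form:
to_fahrenheit = KnowledgeUnit("celsius_to_fahrenheit", lambda c: c * 9 / 5 + 32)
print(to_fahrenheit.apply(100))  # -> 212.0
```

The point of the sketch is only that heterogeneous knowledge (rules, procedures, lookups) can be forced into one uniform input-processing-output shape, which is the unification the paper argues for.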
The Dynamics of Standardization
DEFF Research Database (Denmark)
Brunsson, Nils; Rasche, Andreas; Seidl, David
2012-01-01
This paper suggests that when the phenomenon of standards and standardization is examined from the perspective of organization studies, three aspects stand out: the standardization of organizations, standardization by organizations and standardization as (a form of) organization. Following a comp...
Self-consistent normal ordering of gauge field theories
International Nuclear Information System (INIS)
Ruehl, W.
1987-01-01
Mean-field theories with a real action of unconstrained fields can be self-consistently normal ordered. This leads to a considerable improvement over standard mean-field theory. This concept is applied to lattice gauge theories. First an appropriate real action mean-field theory is constructed. The equations determining the Gaussian kernel necessary for self-consistent normal ordering of this mean-field theory are derived. (author). 4 refs
Antoniadis, Ignatios; Tomaras, T N
2001-01-01
The minimal embedding of the Standard Model in type I string theory is described. The SU(3) color and SU(2) weak interactions arise from two different collections of branes. The correct prediction of the weak angle is obtained for a string scale of 6-8 TeV. Two Higgs doublets are necessary and proton stability is guaranteed. It predicts two massive vector bosons with masses at the TeV scale, as well as a new superweak interaction.
Boolean Approach to Dichotomic Quantum Measurement Theories
Energy Technology Data Exchange (ETDEWEB)
Nagata, K. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Nakamura, T. [Keio University, Yokohama (Japan); Batle, J. [Universitat de les Illes Balears, Balearic Islands (Spain); Abdalla, S. [King Abdulaziz University Jeddah, Jeddah (Saudi Arabia); Farouk, A. [Al-Zahra College for Women, Muscat (Egypt)
2017-02-15
Recently, a new measurement theory based on truth values was proposed by Nagata and Nakamura [Int. J. Theor. Phys. 55, 3616 (2016)], that is, a theory where the results of measurements are either 0 or 1. The standard measurement theory accepts a hidden variable model for a single Pauli observable. Hence, we can introduce a classical probability space for the measurement theory in this particular case. Additionally, we discuss in the present contribution the fact that projective measurement theories (the results of which are either +1 or −1) imply the Bell, Kochen, and Specker (BKS) paradox for a single Pauli observable. To justify our assertion, we present the BKS theorem in almost all two-dimensional states by using a projective measurement theory. As an example, we present the BKS theorem in two dimensions with white noise. Our discussion provides new insight into the quantum measurement problem by using this measurement theory based on truth values.
Vocation in theology-based nursing theories.
Lundmark, Mikael
2007-11-01
By using the concepts of intrinsicality/extrinsicality as analytic tools, the theology-based nursing theories of Ann Bradshaw and Katie Eriksson are analyzed regarding their explicit and/or implicit understanding of vocation as a motivational factor for nursing. The results show that both theories view intrinsic values as guarantees against reducing nursing practice to mechanistic applications of techniques and as being a way of reinforcing a high ethical standard. The theories explicitly (Bradshaw) or implicitly (Eriksson) advocate a vocational understanding of nursing as being essential for nursing theories. Eriksson's theory has a potential for conceptualizing an understanding of extrinsic and intrinsic motivational factors for nursing but one weakness in the theory could be the risk of slipping over to moral judgments where intrinsic factors are valued as being superior to extrinsic. Bradshaw's theory is more complex and explicit in understanding the concept of vocation and is theologically more plausible, although also more confessional.
Operator theoretic aspects of ergodic theory
Eisner, Tanja; Haase, Markus; Nagel, Rainer
2015-01-01
Stunning recent results by Host–Kra, Green–Tao, and others, highlight the timeliness of this systematic introduction to classical ergodic theory using the tools of operator theory. Assuming no prior exposure to ergodic theory, this book provides a modern foundation for introductory courses on ergodic theory, especially for students or researchers with an interest in functional analysis. While basic analytic notions and results are reviewed in several appendices, more advanced operator theoretic topics are developed in detail, even beyond their immediate connection with ergodic theory. As a consequence, the book is also suitable for advanced or special-topic courses on functional analysis with applications to ergodic theory. Topics include: •an intuitive introduction to ergodic theory •an introduction to the basic notions, constructions, and standard examples of topological dynamical systems •Koopman operators, Banach lattices, lattice and algebra homomorphisms, and the Gelfand–Naimark theorem •m...
International Nuclear Information System (INIS)
Maillard, S.; Skorek, R.; Maugis, P.; Dumont, M.
2015-01-01
This chapter presents the basic principles of cluster dynamics as a particular case of the mesoscopic rate theory models developed to investigate fuel behaviour under irradiation, such as in UO2. Because this method simulates the evolution of the concentration of every type of point or aggregated defect in a grain of material, it produces rich information that sheds light on mechanisms of microstructure evolution and gas behaviour that are not accessible through conventional models, and can in turn provide improvements to those models. The cluster dynamics parameters are mainly the energetic values governing the basic evolution mechanisms of the material (diffusion, trapping and thermal resolution). In this sense, the model is applicable to very different operational situations (irradiation, ion-beam implantation, annealing) provided that they rely on the same basic mechanisms, without requiring the additional data fitting needed by more empirical conventional models. Applied to krypton-implanted and annealed samples, this technique yields a precise interpretation of the release curves and helps assess migration mechanisms and the krypton diffusion coefficient, for which data are very difficult to obtain owing to the low solubility of the gas. (authors)
Standard Model-like corrections to Dilatonic Dynamics
DEFF Research Database (Denmark)
Antipin, Oleg; Krog, Jens; Mølgaard, Esben
2013-01-01
We examine the effects of standard model-like interactions on the near-conformal dynamics of a theory featuring a dilatonic state identified with the standard model-like Higgs. As a template for near-conformal dynamics we use a gauge theory with fermionic matter and elementary mesons possessing ... conformal dynamics could accommodate the observed Higgs-like properties.
Theories of inflation and conformal transformations
International Nuclear Information System (INIS)
Kalara, S.; Kaloper, N.; Olive, K.A.
1990-01-01
We show that several different theories of inflation including R 2 , Brans-Dicke, and induced-gravity inflation are all related to generalized or power-law inflation by means of conformal transformations. These theories all involve non-standard gravity, and the use of conformal transformations allows one to obtain standard inflationary predictions such as the expansion time-scale, reheating and density perturbations in each case very simply. We also discuss the possibilities of this method to be applied to string theory. (orig.)
International Nuclear Information System (INIS)
Anon.
1995-01-01
For 1994, the traditional annual DESY Theory Workshop was devoted to supersymmetry. This is a novel symmetry relating bosons (normally force-carrying particles) and fermions (which normally feel the forces). In supersymmetry, bosons could have fermion counterparts, and vice versa. Although this subject is still largely a theorist's playground, many of the particles and phenomena predicted by models of low energy supersymmetry now seem within reach of present and planned future accelerator experiments, and this was one of the main reasons for choosing a more speculative theme after the more phenomenological orientations of recent DESY Theory Workshops. After the welcome by DESY Director General Bjorn Wiik, attention was immediately focused on experimental aspects. P. Steffen (DESY) presented the latest results from HERA. In the following talks, K. Honscheid (Ohio), S. Lammel (Fermilab) and S. Komamiya (CERN and Tokyo) reviewed the experimental situation at electron-proton, hadron and electron-positron colliders, respectively. They discussed the most recent limits for supersymmetric particles (still none in sight!), as well as precision experiments where deviations from the standard model might show up. The workshop was treated to a first-rate introduction to the MSSM ('minimal supersymmetric standard model') by F. Zwirner (CERN), who clearly explained the motivation for going supersymmetric and reviewed the basic structure of the MSSM, its particle content and couplings, as well as the soft breaking terms necessary to avoid immediate conflict with experiment. This was followed by a systematic discussion of the Higgs sector by H. Haber (Santa Cruz), where the first hints of new physics could appear. However, he also made clear that it may not be easy to distinguish standard and non-standard Higgs bosons. Symmetries beyond the standard model, and in particular supersymmetric grand unification, were treated in detail by G. Ross (Oxford) and S
General Theory of Absorption in Porous Materials: Restricted Multilayer Theory.
Aduenko, Alexander A; Murray, Andy; Mendoza-Cortes, Jose L
2018-04-18
In this article, we present an approach for generalizing the adsorption of light gases in porous materials. This new theory goes beyond the Langmuir and Brunauer-Emmett-Teller theories, the standard approaches, whose application to crystalline porous materials is limited by their unphysical assumptions about the number of possible adsorption layers. The derivation of a more general equation for any crystalline porous framework, the restricted multilayer theory, is presented. Our approach allows the determination of gas uptake considering only the geometrical constraints of the porous framework and the interaction energy of the guest molecule with the framework. On the basis of this theory, we calculated optimal values for the adsorption enthalpy at different temperatures and pressures. We also present the use of this theory to determine the optimal linker length for a topologically equivalent framework series. We validate this theoretical approach by applying it to metal-organic frameworks (MOFs) and show that it reproduces the experimental results for seven different reported materials. We obtained the universal equation for the optimal linker length, given the topology of a porous framework. This work applied the general equation to MOFs and H2 to create energy-storage materials; however, this theory can be applied to other crystalline porous materials and light gases, which opens the possibility of designing the next generations of energy-storage materials by first considering only the geometrical constraints of the porous materials.
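For orientation, the two baseline isotherms that the restricted multilayer theory generalizes, Langmuir (one layer) and BET (unlimited layers), can be written down directly. This is a textbook sketch with illustrative parameters, not the authors' generalized equation:

```python
def langmuir(p, q_max, K):
    """Langmuir isotherm: uptake q at pressure p, with monolayer
    saturation capacity q_max and affinity constant K. Saturates at q_max."""
    return q_max * K * p / (1.0 + K * p)

def bet(x, q_m, c):
    """BET isotherm in terms of relative pressure x = p/p0 (0 <= x < 1),
    monolayer capacity q_m and BET constant c. Being an unrestricted
    multilayer model, it diverges as x -> 1 -- the unphysical behaviour
    the restricted multilayer theory removes for crystalline pores."""
    return q_m * c * x / ((1.0 - x) * (1.0 + (c - 1.0) * x))

# Langmuir levels off near q_max; BET keeps growing near saturation:
print(round(langmuir(100.0, 1.0, 1.0), 2))  # -> 0.99
print(round(bet(0.9, 1.0, 100.0), 2))       # -> 9.99
```

The contrast between the bounded Langmuir limit and the diverging BET limit is exactly the gap a geometry-restricted multilayer model is designed to fill.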
Baal, Pierre Van
2014-01-01
"… a pleasant novelty that manages the impossible: a full course in field theory from a derivation of the Dirac equation to the standard electroweak theory in less than 200 pages. Moreover, the final chapter consists of a careful selection of assorted problems, which are original and either anticipate or detail some of the topics discussed in the bulk of the chapters. Instead of building a treatise out of a collection of lecture notes, the author took the complementary approach and constructed a course out of a number of well-known and classic treatises. The result is fresh and useful. … the
The Standard Model and Higgs physics
Torassa, Ezio
2018-05-01
The Standard Model is a consistent and computable theory that successfully describes the elementary particle interactions. The strong, electromagnetic and weak interactions have been included in the theory exploiting the relation between group symmetries and group generators, in order to smartly introduce the force carriers. The group properties lead to constraints between boson masses and couplings. All the measurements performed at the LEP, Tevatron, LHC and other accelerators proved the consistency of the Standard Model. A key element of the theory is the Higgs field, which, together with the spontaneous symmetry breaking, gives mass to the vector bosons and to the fermions. Unlike the case of the vector bosons, the theory does not provide a prediction for the Higgs boson mass. The LEP experiments, while providing very precise measurements of the Standard Model theory, searched for evidence of the Higgs boson until the year 2000. The discovery of the top quark in 1994 by the Tevatron experiments and of the Higgs boson in 2012 by the LHC experiments were considered the completion of the list of fundamental particles of the Standard Model theory. Nevertheless, neutrino oscillations, dark matter and the baryon asymmetry of the Universe are evidence that we need a new, extended model. In the Standard Model there are also some unattractive theoretical aspects, like the divergent loop corrections to the Higgs boson mass and the very small Yukawa couplings needed to describe the neutrino masses. For all these reasons, the hunt for discrepancies between the Standard Model and data is still going on, with the aim of finally describing the new extended theory.
MACCIA, ELIZABETH S.; AND OTHERS
An annotated bibliography of 20 items and a discussion of its significance are presented to describe current utilization of subject theories in the construction of an educational theory. In addition, a theory model is used to demonstrate construction of a scientific educational theory. The theory model incorporates set theory (S), information theory…
The Development of the Standard Lithuanian Language: Ecolinguistic Approach
Directory of Open Access Journals (Sweden)
Vaida Buivydienė
2014-06-01
Full Text Available The theory of standard languages is closely linked with standardization policy and the prevailing ideology. Language ideology comprises the values, experience and convictions related to language usage, and its discourse is influenced at institutional, local and global levels. In recent decades, foreign linguists have linked theories of the development of standard languages and their ideologies with an ecolinguistic approach to language standardization phenomena. The article is based on Einar Haugen's theory of the development of standard languages and on ecolinguistic statements, and presents the stages of developing a standard language as well as the factors influencing them. In conclusion, a strong political and social impact has been made on the development of the standard Lithuanian language. The stages of its development succeeded one another rapidly; some overlapped closely, and some are still under way.
Mean fields and self consistent normal ordering of lattice spin and gauge field theories
International Nuclear Information System (INIS)
Ruehl, W.
1986-01-01
Classical Heisenberg spin models on lattices possess mean field theories that are well defined real field theories on finite lattices. These mean field theories can be self consistently normal ordered. This leads to a considerable improvement over standard mean field theory. This concept is carried over to lattice gauge theories. We construct first an appropriate real mean field theory. The equations determining the Gaussian kernel necessary for self-consistent normal ordering of this mean field theory are derived. (orig.)
Galois theory through exercises
Brzeziński, Juliusz
2018-01-01
This textbook offers a unique introduction to classical Galois theory through many concrete examples and exercises of varying difficulty (including computer-assisted exercises). In addition to covering standard material, the book explores topics related to classical problems such as Galois’ theorem on solvable groups of polynomial equations of prime degrees, Nagell's proof of non-solvability by radicals of quintic equations, Tschirnhausen's transformations, lunes of Hippocrates, and Galois' resolvents. Topics related to open conjectures are also discussed, including exercises related to the inverse Galois problem and cyclotomic fields. The author presents proofs of theorems, historical comments and useful references alongside the exercises, providing readers with a well-rounded introduction to the subject and a gateway to further reading. A valuable reference and a rich source of exercises with sample solutions, this book will be useful to both students and lecturers. Its original concept makes it particula...
Becker, Katrin; Becker, Melanie; Schwarz, John H.
String theory is one of the most exciting and challenging areas of modern theoretical physics. This book guides the reader from the basics of string theory to recent developments. It introduces the basics of perturbative string theory, world-sheet supersymmetry, space-time supersymmetry, conformal field theory and the heterotic string, before describing modern developments, including D-branes, string dualities and M-theory. It then covers string geometry and flux compactifications, applications to cosmology and particle physics, black holes in string theory and M-theory, and the microscopic origin of black-hole entropy. It concludes with Matrix theory, the AdS/CFT duality and its generalizations. This book is ideal for graduate students and researchers in modern string theory, and will make an excellent textbook for a one-year course on string theory. It contains over 120 exercises with solutions, and over 200 homework problems with solutions available on a password protected website for lecturers at www.cambridge.org/9780521860697. Comprehensive coverage of topics from the basics of string theory to recent developments. Ideal textbook for a one-year course in string theory. Includes over 120 exercises with solutions. Contains over 200 homework problems with solutions available to lecturers online.
P. Wakke (Paul); K. Blind (Knut); H.J. de Vries (Henk)
2012-01-01
textabstractExtant research suggests a positive and bidirectional relation between innovation and standardization. Focusing on the service industries, this paper relates the theory of innovation in services to the participation of service providers in standardization committees. For this purpose, we
Potential Theory of Multicomponent Adsorption
DEFF Research Database (Denmark)
Shapiro, Alexander; Stenby, Erling Halfdan
1998-01-01
We developed a theory of multicomponent adsorption on the basis of the potential concept originally suggested by Polanyi. The mixture is considered as a heterogeneous substance segregated in the external field emitted by the adsorbent. The same standard equation of state, with no additional fitting parameters, is used for the segregated and for the bulk phases. With this approach, few parameters are needed to correlate pure component adsorption isotherms. These parameters may be used to predict adsorption equilibria of multicomponent mixtures without additional adjustment. A connection between the potential theory and the spreading pressure concept is established, and problems of the theory's consistency are studied. Numerical algorithms are suggested for evaluation of the segregated state of the mixture in the potential field of adsorption forces. Comparison with experimental data shows good agreement.
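The potential concept this abstract builds on can be illustrated with the classical single-component Polanyi adsorption potential, A = RT ln(p_sat/p) — a textbook relation, not the paper's multicomponent equation of state. A minimal Python sketch:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def adsorption_potential(T, p, p_sat):
    """Polanyi adsorption potential A = R*T*ln(p_sat/p), in J/mol.

    T: temperature (K); p: gas-phase pressure; p_sat: saturation pressure
    (any consistent pressure units). A is the work of transferring a mole
    of adsorbate from the bulk gas into the segregated (adsorbed) phase;
    it vanishes at saturation (p = p_sat) and grows as p decreases.
    """
    return R * T * math.log(p_sat / p)

# At 298 K and 10% of the saturation pressure:
A = adsorption_potential(298.0, 0.1, 1.0)   # ~5.7 kJ/mol
```

In the potential theory, A plays the role of the external field in which the adsorbed mixture is treated as a segregated phase, which is the picture the abstract generalizes to multicomponent systems.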
Nonlocal Theories in Continuum Mechanics
Directory of Open Access Journals (Sweden)
M. Jirásek
2004-01-01
Full Text Available The purpose of this paper is to explain why the standard continuum theory fails to properly describe certain mechanical phenomena and how the description can be improved by enrichments that incorporate the influence of gradients or weighted spatial averages of strain or of an internal variable. Three typical mechanical problems that require such enrichments are presented: (i) dispersion of short elastic waves in heterogeneous or discrete media, (ii) size effects in microscale elastoplasticity, in particular the size dependence of the apparent hardening modulus, and (iii) localization of strain and damage in quasibrittle structures, with the resulting transitional size effect. Problems covered in the examples encompass static and dynamic phenomena, linear and nonlinear behavior, and three constitutive frameworks, namely elasticity, plasticity and continuum damage mechanics. This shows that enrichments of the standard continuum theory can be useful in a wide range of mechanical problems.
Causal quantum theory and the collapse locality loophole
International Nuclear Information System (INIS)
Kent, Adrian
2005-01-01
Causal quantum theory is an umbrella term for ordinary quantum theory modified by two hypotheses: state vector reduction is a well-defined process, and strict local causality applies. The first of these holds in some versions of Copenhagen quantum theory and need not necessarily imply practically testable deviations from ordinary quantum theory. The second implies that measurement events which are spacelike separated have no nonlocal correlations. To test this prediction, which sharply differs from standard quantum theory, requires a precise definition of state vector reduction. Formally speaking, any precise version of causal quantum theory defines a local hidden variable theory. However, causal quantum theory is most naturally seen as a variant of standard quantum theory. For that reason it seems a more serious rival to standard quantum theory than local hidden variable models relying on the locality or detector efficiency loopholes. Some plausible versions of causal quantum theory are not refuted by any Bell experiments to date, nor is it evident that they are inconsistent with other experiments. They evade refutation via a neglected loophole in Bell experiments--the collapse locality loophole--which exists because of the possible time lag between a particle entering a measurement device and a collapse taking place. Fairly definitive tests of causal versus standard quantum theory could be made by observing entangled particles separated by ≅0.1 light seconds
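To put the quoted separation in scale (an illustrative back-of-the-envelope calculation, not from the paper): 0.1 light-seconds is roughly 30,000 km, more than twice the Earth's diameter, so a "fairly definitive" test of this kind would require space-based detectors rather than terrestrial Bell experiments.

```python
c = 299_792_458.0          # speed of light, m/s
separation = 0.1 * c       # 0.1 light-seconds expressed in metres

separation_km = separation / 1e3   # ~29,979 km
earth_diameter_km = 12_742.0       # mean Earth diameter, for comparison
```

Since no two points on Earth are more than about one Earth diameter (~0.04 light-seconds) apart, the collapse locality loophole cannot be closed at this separation with ground-based detectors alone.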
Review of Hydroelasticity Theories
DEFF Research Database (Denmark)
Chen, Xu-jun; Wu, You-sheng; Cui, Wei-cheng
2006-01-01
Existing hydroelastic theories are reviewed. The theories are classified into different types: two-dimensional linear theory, two-dimensional nonlinear theory, three-dimensional linear theory and three-dimensional nonlinear theory. Applications to the analysis of very large floating structures (VLFS) are reviewed and discussed in detail. Special emphasis is placed on papers from China and Japan (in native languages), as these papers are not generally known in the rest of the world.
Einstein's struggles with quantum theory: a reappraisal
Home, Dipankar
2007-01-01
Einstein’s Struggles with Quantum Theory: A Reappraisal by Dipankar Home and Andrew Whitaker provides a detailed account of Albert Einstein’s thinking in regard to quantum physics. Until recently, most of Einstein’s views on quantum physics were dismissed and even ridiculed; some critics even suggested that Einstein was not able to grasp the complexities of the formalism of quantum theory and subtleties of the standard interpretation of this theory known as the Copenhagen interpretation put forward by Niels Bohr and his colleagues. But was that true? Modern scholarship argues otherwise, insist Drs. Home and Whitaker, who painstakingly explain the questions Einstein raised as well as offer a detailed discussion of Einstein’s position and major contributions to quantum theory, connecting them with contemporary studies on fundamental aspects of this theory. This unique book presents a mathematical as well as a non-mathematical route through the theories, controversies, and investigations, making the disc...
Grounded theory, feminist theory, critical theory: toward theoretical triangulation.
Kushner, Kaysi Eastlick; Morrow, Raymond
2003-01-01
Nursing and social science scholars have examined the compatibility between feminist and grounded theory traditions in scientific knowledge generation, concluding that they are complementary, yet not without certain tensions. This line of inquiry is extended to propose a critical feminist grounded theory methodology. The construction of symbolic interactionist, feminist, and critical feminist variants of grounded theory methodology is examined in terms of the presuppositions of each tradition and their interplay as a process of theoretical triangulation.
Boley, Bruno A
1997-01-01
Highly regarded text presents detailed discussion of fundamental aspects of theory, background, problems with detailed solutions. Basics of thermoelasticity, heat transfer theory, thermal stress analysis, more. 1985 edition.
International Nuclear Information System (INIS)
Marciano, W.J.
1984-12-01
The present state of the art in elementary particle theory is reviewed. Topics include quantum electrodynamics, weak interactions, electroweak unification, quantum chromodynamics, and grand unified theories. 113 references
Standards for Standardized Logistic Regression Coefficients
Menard, Scott
2011-01-01
Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…
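One common way to standardize logistic coefficients — the partially standardized form b*_k = b_k · s_k, which is only one of the proposals the literature compares, chosen here purely for illustration — can be sketched with a plain Newton-Raphson logistic fit on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x1 = rng.normal(0.0, 2.0, n)   # predictor with a large spread
x2 = rng.normal(0.0, 0.5, n)   # predictor with a small spread
true_logits = 0.8 * x1 + 3.0 * x2 - 0.2
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logits))).astype(float)

X = np.column_stack([np.ones(n), x1, x2])  # design matrix with intercept

def fit_logistic(X, y, iters=25):
    """Maximum-likelihood logistic regression via Newton-Raphson."""
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ b))      # fitted probabilities
        W = p * (1.0 - p)                     # IRLS weights
        b = b + np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (y - p))
    return b

b = fit_logistic(X, y)
# Partially standardized coefficients b_k * s_k: effect of a one-SD
# change in each predictor, comparable across different scales.
std_b = b[1:] * np.array([x1.std(), x2.std()])
```

Here the raw coefficient on x2 is much larger than the one on x1 only because x2 has a smaller spread; after scaling by each predictor's standard deviation, the two effects come out comparable in size.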
Jardine, John F
2015-01-01
This monograph on the homotopy theory of topologized diagrams of spaces and spectra gives an expert account of a subject at the foundation of motivic homotopy theory and the theory of topological modular forms in stable homotopy theory. Beginning with an introduction to the homotopy theory of simplicial sets and topos theory, the book covers core topics such as the unstable homotopy theory of simplicial presheaves and sheaves, localized theories, cocycles, descent theory, non-abelian cohomology, stacks, and local stable homotopy theory. A detailed treatment of the formalism of the subject is interwoven with explanations of the motivation, development, and nuances of ideas and results. The coherence of the abstract theory is elucidated through the use of widely applicable tools, such as Barr's theorem on Boolean localization, model structures on the category of simplicial presheaves on a site, and cocycle categories. A wealth of concrete examples convey the vitality and importance of the subject in topology, n...
Extensions of the standard model
International Nuclear Information System (INIS)
Ramond, P.
1983-01-01
In these lectures we focus on several issues that arise in theoretical extensions of the standard model. First we describe the kinds of fermions that can be added to the standard model without affecting known phenomenology. We focus in particular on three types: the vector-like completion of the existing fermions as would be predicted by a Kaluza-Klein type theory, which we find cannot be realistically achieved without some chiral symmetry; fermions which are vector-like by themselves, such as do appear in supersymmetric extensions, and finally anomaly-free chiral sets of fermions. We note that a chiral symmetry, such as the Peccei-Quinn symmetry can be used to produce a vector-like theory which, at scales less than M_W, appears to be chiral. Next, we turn to the analysis of the second hierarchy problem which arises in Grand Unified extensions of the standard model, and plays a crucial role in proton decay of supersymmetric extensions. We review the known mechanisms for avoiding this problem and present a new one which seems to lead to the (family) triplication of the gauge group. Finally, this being a summer school, we present a list of homework problems. 44 references
The standard model in a nutshell
Goldberg, Dave
2017-01-01
For a theory as genuinely elegant as the Standard Model--the current framework describing elementary particles and their forces--it can sometimes appear to students to be little more than a complicated collection of particles and ranked list of interactions. The Standard Model in a Nutshell provides a comprehensive and uncommonly accessible introduction to one of the most important subjects in modern physics, revealing why, despite initial appearances, the entire framework really is as elegant as physicists say. Dave Goldberg uses a "just-in-time" approach to instruction that enables students to gradually develop a deep understanding of the Standard Model even if this is their first exposure to it. He covers everything from relativity, group theory, and relativistic quantum mechanics to the Higgs boson, unification schemes, and physics beyond the Standard Model. The book also looks at new avenues of research that could answer still-unresolved questions and features numerous worked examples, helpful illustrat...
O-Theory: a hybrid uncertainty theory
Energy Technology Data Exchange (ETDEWEB)
Oblow, E.M.
1985-10-01
A hybrid uncertainty theory is developed to bridge the gap between fuzzy set theory and Bayesian inference theory. Its basis is the Dempster-Shafer formalism (a probability-like, set-theoretic approach), which is extended and expanded upon so as to include a complete set of basic operations for manipulating uncertainties in approximate reasoning. The new theory, operator-belief theory (OT), retains the probabilistic flavor of Bayesian inference but includes the potential for defining a wider range of operators like those found in fuzzy set theory. The basic operations defined for OT in this paper include those for: dominance and order, union, intersection, complement and general mappings. A formal relationship between the membership function in fuzzy set theory and the upper probability function in the Dempster-Shafer formalism is also developed. Several sample problems in logical inference are worked out to illustrate the results derived from this new approach as well as to compare them with the other theories currently being used. A general method of extending the theory using the historical development of fuzzy set theory as an example is suggested.
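The Dempster-Shafer machinery that OT extends can be illustrated by Dempster's rule of combination — the standard rule, not the new OT operators defined in the paper. A small Python sketch with hypothetical weather evidence:

```python
def combine(m1, m2):
    """Dempster's rule: pool two mass functions over a frame of discernment.

    Focal sets are frozensets; mass assigned to conflicting (disjoint)
    pairs is discarded, and the remainder is renormalized by 1 - K,
    where K is the total conflict.
    """
    pooled, conflict = {}, 0.0
    for a, pa in m1.items():
        for b, pb in m2.items():
            inter = a & b
            if inter:
                pooled[inter] = pooled.get(inter, 0.0) + pa * pb
            else:
                conflict += pa * pb
    return {s: v / (1.0 - conflict) for s, v in pooled.items()}

# Hypothetical evidence over the frame {rain, sun}:
rain, sun = frozenset({"rain"}), frozenset({"sun"})
theta = rain | sun                   # the whole frame: total ignorance
m1 = {rain: 0.6, theta: 0.4}         # source 1 leans towards rain
m2 = {sun: 0.5, theta: 0.5}          # source 2 leans towards sun
m = combine(m1, m2)                  # m[rain] = 3/7, m[sun] = 2/7, m[theta] = 2/7
```

The mass left on the whole frame (theta) is what distinguishes this set-theoretic formalism from a Bayesian posterior: ignorance is represented explicitly rather than split between the alternatives.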
International Nuclear Information System (INIS)
Khazali Mohd Zin
2001-01-01
In order to become a developed country, Malaysia needs to develop her own national standards. It has been projected that by the year 2020, Malaysia will require about 8,000 standards (Department of Standards Malaysia). Currently, more than 2,000 Malaysian Standards have been gazetted by the government, which is considerably too low for the year 2020 target. NDT standards have been identified by the standards working group as one of the areas in which to promote our national standards. In this paper the author describes the steps taken to establish Malaysia's very own NDT standards. The project starts with the establishment of radiographic standards. (Author)
Apsche, Jack A.
2005-01-01
In his work on the Theory of Modes, Beck (1996) suggested that there were flaws with his cognitive theory. He suggested that although there are shortcomings in his cognitive theory, there are no similar shortcomings in the practice of Cognitive Therapy. The author suggests that if there are shortcomings in cognitive theory, the same shortcomings…
Contemporary theories of democracy
Directory of Open Access Journals (Sweden)
Mladenović Ivan
2008-01-01
Full Text Available The aim of this paper is two-fold: first, to analyze several contemporary theories of democracy, and secondly, to propose a theoretical framework for further investigations based on analyzed theories. The following four theories will be analyzed: pluralism, social choice theory, deliberative democracy and participatory democracy.
't Hooft, Gerardus; Witten, Edward
2005-01-01
In his later years, Einstein sought a unified theory that would extend general relativity and provide an alternative to quantum theory. There is now talk of a "theory of everything"; fifty years after his death, how close are we to such a theory? (3 pages)
de Bruin, B.P.
2005-01-01
Game theory is the mathematical study of strategy and conflict. It has wide applications in economics, political science, sociology, and, to some extent, in philosophy. Where rational choice theory or decision theory is concerned with individual agents facing games against nature, game theory deals
Moschovakis, YN
1987-01-01
Now available in paperback, this monograph is a self-contained exposition of the main results and methods of descriptive set theory. It develops all the necessary background material from logic and recursion theory, and treats both classical descriptive set theory and the effective theory developed by logicians.
From the standard model to dark matter
International Nuclear Information System (INIS)
Wilczek, F.
1995-01-01
The standard model of particle physics is marvelously successful. However, it is obviously not a complete or final theory. I shall argue here that the structure of the standard model gives some quite concrete, compelling hints regarding what lies beyond. Taking these hints seriously, one is led to predict the existence of new types of very weakly interacting matter, stable on cosmological time scales and produced with cosmologically interesting densities--that is, ''dark matter''. copyright 1995 American Institute of Physics
Small numbers in supersymmetric theories of nature
Energy Technology Data Exchange (ETDEWEB)
Graesser, Michael Lawrence [Univ. of California, Berkeley, CA (United States)
1999-05-01
The Standard Model of particle interactions is a successful theory for describing the interactions of quarks, leptons and gauge bosons at microscopic distance scales. Despite these successes, the theory contains many unsatisfactory features. The origin of particle masses is a central mystery that has eluded experimental elucidation. In the Standard Model the known particles obtain their mass from the condensate of the so-called Higgs particle. Quantum corrections to the Higgs mass require an unnatural fine tuning of the Higgs mass of one part in 10^{32} to obtain the correct mass scale of electroweak physics. In addition, the origin of the vast hierarchy between the mass scales of the electroweak and quantum gravity physics is not explained in the current theory. Supersymmetric extensions to the Standard Model are not plagued by this fine tuning issue and may therefore be relevant in Nature. In the minimal supersymmetric Standard Model there is also a natural explanation for electroweak symmetry breaking. Supersymmetric Grand Unified Theories also correctly predict a parameter of the Standard Model. This provides non-trivial indirect evidence for these theories. The most general supersymmetric extension to the Standard Model, however, is excluded by many physical processes, such as rare flavor changing processes, and the non-observation of the instability of the proton. These processes provide important information about the possible structure of such a theory. In particular, certain parameters in this theory must be rather small. A physics explanation for why this is the case would be desirable. It is striking that the gauge couplings of the Standard Model unify if there is supersymmetry close to the weak scale. This suggests that at high energies Nature is described by a supersymmetric Grand Unified Theory. But the mass scale of unification must be introduced into the theory since it does not coincide with the probable mass scale of strong quantum gravity.
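The quoted degree of tuning can be reproduced with a one-line estimate (a rough back-of-the-envelope calculation added for illustration, taking the electroweak scale as ~1 TeV and the quantum-gravity scale as ~10^19 GeV): the cancellation required among quadratically divergent corrections scales as the squared ratio of the two scales.

```python
ew_scale = 1e3    # electroweak scale, GeV (~1 TeV)
planck = 1e19     # Planck (quantum gravity) scale, GeV

# Fractional cancellation required between the bare Higgs mass-squared
# and its quadratically divergent corrections:
tuning = (ew_scale / planck) ** 2   # 1e-32, i.e. one part in 10^32
```

This is the "hierarchy problem" in numerical form; supersymmetry removes it because the quadratic divergences cancel between particles and their superpartners.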
Theory reduction and non-Boolean theories.
Primas, H
1977-07-19
It is suggested that biological theories should be embedded into the family of non-Boolean theories based on an orthomodular propositional calculus. The structure of universal theories that include quantal phenomena is investigated and it is shown that their subtheories form a directed set which cannot be totally ordered. A precise definition of theory reduction is given; it turns out that hierarchically different descriptive levels are not related by a homomorphic map. A subtheory that is reducible to a more general theory can be associated with the emergence of novel concepts and is in general subject to a wider empirical classification scheme than the reducing theory. The implications of these results for reductionism, holism, emergence, and their conceptual unification are discussed.
Zimmerman Jones, Andrew
2010-01-01
Making Everything Easier! String Theory for Dummies. Learn: the basic concepts of this controversial theory; how string theory builds on physics concepts; the different viewpoints in the field; string theory's physical implications. Andrew Zimmerman Jones, Physics Guide, About.com, with Daniel Robbins, PhD in Physics. Your plain-English guide to this complex scientific theory. String theory is one of the most complicated sciences being explored today. Not to worry though! This informative guide clearly explains the basics of this hot topic, discusses the theory's hypotheses and prediction
A and B Theories of Closed Time
Directory of Open Access Journals (Sweden)
Phill Dowe
Full Text Available ABSTRACT Closed time is possible in several senses of ‘possible’. One might like to know, therefore, whether closed time is possible in the sense that it is compatible with standard metaphysical theories of time. In this paper I am concerned with whether closed time is compatible with A and/or B theories of time. A common enough view amongst philosophers is that B theories do but A theories do not allow closed time. However, I show that prima-facie neither approach allows closed time, but that with a little work standard versions of both approaches do. This shows that there’s no special problem with the notion of eternal return.
Teaching Theory X and Theory Y in Organizational Communication
Noland, Carey
2014-01-01
The purpose of the activity described here is to integrate McGregor's Theory X and Theory Y into a group application: design a syllabus that embodies either Theory X or Theory Y tenets. Students should be able to differentiate between Theory X and Theory Y, create a syllabus based on Theory X or Theory Y tenets, evaluate the different syllabi…
Foundations of Information Theory
Burgin, Mark
2008-01-01
Information is the basic concept of information theory. However, there is no definition of this concept that can encompass all uses of the term information in information theories and beyond. Many question the possibility of such a definition. However, foundations of information theory developed in the context of the general theory of information made it possible to build a definition that is both relevant and encompassing. Foundations of information theory are built in a form of onto...
Towards a theory of spacetime theories
Schiemann, Gregor; Scholz, Erhard
2017-01-01
This contributed volume is the result of a July 2010 workshop at the University of Wuppertal Interdisciplinary Centre for Science and Technology Studies which brought together world-wide experts from physics, philosophy and history, in order to address a set of questions first posed in the 1950s: How do we compare spacetime theories? How do we judge, objectively, which is the “best” theory? Is there even a unique answer to this question? The goal of the workshop, and of this book, is to contribute to the development of a meta-theory of spacetime theories. Such a meta-theory would reveal insights about specific spacetime theories by distilling their essential similarities and differences, deliver a framework for a class of theories that could be helpful as a blueprint to build other meta-theories, and provide a higher level viewpoint for judging which theory most accurately describes nature. But rather than drawing a map in broad strokes, the focus is on particularly rich regions in the “space of spaceti...
Gauge theory loop operators and Liouville theory
Energy Technology Data Exchange (ETDEWEB)
Drukker, Nadav [Humboldt Univ. Berlin (Germany). Inst. fuer Physik; Gomis, Jaume; Okuda, Takuya [Perimeter Inst. for Theoretical Physics, Waterloo, ON (Canada); Teschner, Joerg [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)
2009-10-15
We propose a correspondence between loop operators in a family of four dimensional N=2 gauge theories on S^4 - including Wilson, 't Hooft and dyonic operators - and Liouville theory loop operators on a Riemann surface. This extends the beautiful relation between the partition function of these N=2 gauge theories and Liouville correlators found by Alday, Gaiotto and Tachikawa. We show that the computation of these Liouville correlators with the insertion of a Liouville loop operator reproduces Pestun's formula capturing the expectation value of a Wilson loop operator in the corresponding gauge theory. We prove that our definition of Liouville loop operators is invariant under modular transformations, which given our correspondence, implies the conjectured action of S-duality on the gauge theory loop operators. Our computations in Liouville theory make an explicit prediction for the exact expectation value of 't Hooft and dyonic loop operators in these N=2 gauge theories. The Liouville loop operators are also found to admit a simple geometric interpretation within quantum Teichmueller theory as the quantum operators representing the length of geodesics. We study the algebra of Liouville loop operators and show that it gives evidence for our proposal as well as providing definite predictions for the operator product expansion of loop operators in gauge theory. (orig.)
The International Standards Organisation offshore structures standard
International Nuclear Information System (INIS)
Snell, R.O.
1994-01-01
The International Standards Organisation has initiated a program to develop a suite of ISO Codes and Standards for the Oil Industry. The Offshore Structures Standard is one of seven topics being addressed. The scope of the standard will encompass fixed steel and concrete structures, floating structures, Arctic structures and the site specific assessment of mobile drilling and accommodation units. The standard will use as base documents the existing recommended practices and standards most frequently used for each type of structure, and will develop them to incorporate best published and recognized practice and knowledge where it provides a significant improvement on the base document. Work on the Code has commenced under the direction of an internationally constituted sub-committee comprising representatives from most of the countries with a substantial offshore oil and gas industry. This paper outlines the background to the code and the format, content and work program
A first course in topos quantum theory
International Nuclear Information System (INIS)
Flori, Cecilia
2013-01-01
Written by a leading researcher in the field. Concise course-tested textbook. Includes worked-out problems. In the last five decades various attempts to formulate theories of quantum gravity have been made, but none has fully succeeded in becoming the quantum theory of gravity. One possible explanation for this failure might be the unresolved fundamental issues in quantum theory as it stands now. Indeed, most approaches to quantum gravity adopt standard quantum theory as their starting point, with the hope that the theory's unresolved issues will get solved along the way. However, these fundamental issues may need to be solved before attempting to define a quantum theory of gravity. The present text adopts this point of view, addressing the following basic questions: What are the main conceptual issues in quantum theory? How can these issues be solved within a new theoretical framework of quantum theory? A possible way to overcome critical issues in present-day quantum physics - such as a priori assumptions about space and time that are not compatible with a theory of quantum gravity, and the impossibility of talking about systems without reference to an external observer - is through a reformulation of quantum theory in terms of a different mathematical framework called topos theory. This course-tested primer sets out to explain to graduate students and newcomers to the field alike, the reasons for choosing topos theory to resolve the above-mentioned issues and how it brings quantum physics back to looking more like a "neo-realist" classical physics theory again.
What is "Standard" About the Standard Deviation
Newberger, Florence; Safer, Alan M.; Watson, Saleem
2010-01-01
The choice of the formula for standard deviation is explained in elementary statistics textbooks in various ways. We give an explanation for this formula by representing the data as a vector in $\mathbb{R}^n$ and considering its distance from a central tendency vector. In this setting the "standard" formula represents a shortest distance in the standard metric. We also show that different metrics lead to different measures of central tendency.
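The vector view described in this abstract is easy to check numerically: treating the data as a point in R^n and the central tendency vector as (m, ..., m) with m the mean, the standard deviation is the Euclidean distance between them scaled by 1/sqrt(n). A short numpy sketch (with illustrative data):

```python
import numpy as np

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])  # data as a vector in R^8
n = len(x)
m = np.full(n, x.mean())         # central tendency vector (m, m, ..., m)
dist = np.linalg.norm(x - m)     # Euclidean distance in the standard metric
sd = dist / np.sqrt(n)           # the "standard" formula recovered: here 2.0
```

The mean is precisely the choice of m that minimizes this distance, which is the sense in which different metrics would select different measures of central tendency.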
On supersymmetric effective theories of axion
Energy Technology Data Exchange (ETDEWEB)
Higaki, Tetsutaro [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Kitano, Ryuichiro [Tohoku Univ., Sendai (Japan). Dept. of Physics
2011-04-15
We study effective theories of an axion in spontaneously broken supersymmetric theories. We consider a system where the axion supermultiplet is directly coupled to a supersymmetry breaking sector, whereas the standard model sector communicates with those sectors through loops of messenger fields. The gaugino masses and the axion-gluon coupling necessary for solving the strong CP problem are both obtained by the same effective interaction. We discuss cosmological constraints on this framework. (orig.)
Magnetic monopoles in field theory and cosmology.
Rajantie, Arttu
2012-12-28
The existence of magnetic monopoles is predicted by many theories of particle physics beyond the standard model. However, in spite of extensive searches, there is no experimental or observational sign of them. I review the role of magnetic monopoles in quantum field theory and discuss their implications for particle physics and cosmology. I also highlight their differences and similarities with monopoles found in frustrated magnetic systems.
Department of Transportation — The Standard Reference Tables (SRT) provide consistent reference data for the various applications that support Flight Standards Service (AFS) business processes and...
Simple recursion relations for general field theories
International Nuclear Information System (INIS)
Cheung, Clifford; Shen, Chia-Hsien; Trnka, Jaroslav
2015-01-01
On-shell methods offer an alternative definition of quantum field theory at tree-level, replacing Feynman diagrams with recursion relations and interaction vertices with a handful of seed scattering amplitudes. In this paper we determine the simplest recursion relations needed to construct a general four-dimensional quantum field theory of massless particles. For this purpose we define a covering space of recursion relations which naturally generalizes all existing constructions, including those of BCFW and Risager. The validity of each recursion relation hinges on the large momentum behavior of an n-point scattering amplitude under an m-line momentum shift, which we determine solely from dimensional analysis, Lorentz invariance, and locality. We show that all amplitudes in a renormalizable theory are 5-line constructible. Amplitudes are 3-line constructible if an external particle carries spin or if the scalars in the theory carry equal charge under a global or gauge symmetry. Remarkably, this implies the 3-line constructibility of all gauge theories with fermions and complex scalars in arbitrary representations, all supersymmetric theories, and the standard model. Moreover, all amplitudes in non-renormalizable theories without derivative interactions are constructible; with derivative interactions, a subset of amplitudes is constructible. We illustrate our results with examples from both renormalizable and non-renormalizable theories. Our study demonstrates both the power and limitations of recursion relations as a self-contained formulation of quantum field theory.
Prospects of experimentally reachable beyond Standard Model ...
Indian Academy of Sciences (India)
2016-01-06
Ram Lal Awasthi, "Prospects of experimentally reachable beyond Standard Model physics in inverse see-saw motivated SO(10) GUT", Special: Supersymmetric Unified Theories and Higgs Physics, Pramana – Journal of Physics, Volume 86, Issue 2, February 2016, pp. 223- ...
Why supersymmetry? Physics beyond the standard model
Indian Academy of Sciences (India)
The Naturalness Principle as a requirement that the heavy mass scales decouple from the physics of light mass scales is reviewed. In quantum field theories containing elementary scalar fields, such as the Standard Model of electroweak interactions containing the Higgs particle, mass of the scalar field is not a natural ...
Standardized Curriculum for Electricity/Electronics.
Mississippi State Dept. of Education, Jackson. Office of Vocational, Technical and Adult Education.
Standardized vocational education course titles and core contents are provided for two courses in Mississippi: electricity/electronics I and II. The first course contains the following units: (1) orientation, safety, and leadership; (2) basic principles of electricity/electronics; (3) direct current (DC) theory; (4) magnetism and DC motors; (5)…
The making of the standard model
Hooft, G. 't
2007-01-01
The standard model of particle physics is more than a model. It is a detailed theory that encompasses nearly all that is known about the subatomic particles and forces in a concise set of principles and equations. The extensive research that culminated in this model includes numerous small and
Why supersymmetry? Physics beyond the standard model
Indian Academy of Sciences (India)
2016-08-23
Abstract. The Naturalness Principle as a requirement that the heavy mass scales decouple from the physics of light mass scales is reviewed. In quantum field theories containing elementary scalar fields, such as the Standard Model of electroweak interactions containing the Higgs particle, mass of the ...
Beyond the Standard Model for Montaneros
Bustamante, M; Ellis, John
2010-01-01
These notes cover (i) electroweak symmetry breaking in the Standard Model (SM) and the Higgs boson, (ii) alternatives to the SM Higgs boson, including an introduction to composite Higgs models and Higgsless models that invoke extra dimensions, (iii) the theory and phenomenology of supersymmetry, and (iv) various topics further beyond, including Grand Unification, proton decay and neutrino masses, supergravity, superstrings and extra dimensions.
Equivariant surgery theories and their periodicity properties
Dovermann, Karl Heinz
1990-01-01
The theory of surgery on manifolds has been generalized to categories of manifolds with group actions in several different ways. This book discusses some basic properties that such theories have in common. Special emphasis is placed on analogs of the fourfold periodicity theorems in ordinary surgery and the roles of standard general position hypotheses on the strata of manifolds with group actions. The contents of the book presuppose some familiarity with the basic ideas of surgery theory and transformation groups, but no previous knowledge of equivariant surgery is assumed. The book is designed to serve either as an introduction to equivariant surgery theory for advanced graduate students and researchers in related areas, or as an account of the authors' previously unpublished work on periodicity for specialists in surgery theory or transformation groups.
Effective nonrenormalizable theories at one loop
Energy Technology Data Exchange (ETDEWEB)
Gaillard, M.K.
1987-10-12
The paper focuses on a nonrenormalizable theory that is more closely related to those suggested by superstrings, namely a gauged nonlinear δ-model, but one which can also be obtained analytically in a particular limit of a parameter (m_H → ∞) of the standard, renormalizable electroweak theory. This will provide another laboratory for testing the validity of calculations using the effective theory. We find (as for certain superstring inspired models to be discussed later) features similar to those for the Fermi theory: quadratic divergences can be reinterpreted as renormalizations, while new terms are generated at the level of logarithmic divergences. Also introduced in the context of more familiar physics are notions such as scalar metric, scalar curvature and nonlinear symmetries, that play an important role in formal aspects of string theories. 58 refs., 12 figs.
International Nuclear Information System (INIS)
Raby, S.; Walker, T.; Babu, K.S.; Baer, H.; de Gouvea, A.; Gabadadze, G.; Gal, A.; Gondolo, P.; Lande, K.; Olive, K.A.; Profumo, S.; Shrock, R.; Tavartkiladze, Z.; Whisnant, K.; Wolfenstein, L.
2011-01-01
The scientific case for a Deep Underground Science and Engineering Laboratory (DUSEL) located at the Homestake mine in Lead, South Dakota is exceptional. The site of this future laboratory already claims a discovery for the detection of solar neutrinos, leading to a Nobel Prize for Ray Davis. Moreover this work provided the first step to our present understanding of solar neutrino oscillations and a chink in the armor of the Standard Model of particle physics. We now know, from several experiments located in deep underground experimental laboratories around the world, that neutrinos have mass and even more importantly this mass appears to fit into the framework of theories which unify all the known forces of nature, i.e. the strong, weak, electromagnetic and gravitational. Similarly, DUSEL can forge forward in the discovery of new realms of nature, housing six fundamental experiments that will test the frontiers of our knowledge: (1) Searching for nucleon decay (the decay of protons and neutrons predicted by grand unified theories of nature); (2) Searching for neutrino oscillations and CP violation by detecting neutrinos produced at a neutrino source (possibly located at Brookhaven National Laboratory and/or Fermi National Laboratory); (3) Searching for astrophysical neutrinos originating from the sun, from cosmic rays hitting the upper atmosphere or from other astrophysical sources, such as supernovae; (4) Searching for dark matter particles (the type of matter which does not interact electromagnetically, yet provides 24% of the mass of the Universe); (5) Looking for the rare process known as neutrino-less double beta decay which is predicted by most theories of neutrino mass and allows two neutrons in a nucleus to spontaneously change into two protons and two electrons; and (6) Searching for the rare process of neutron–antineutron oscillations, which would establish violation of baryon number symmetry. A large megaton water Cherenkov detector for neutrinos and
Energy Technology Data Exchange (ETDEWEB)
Raby, S.; /Ohio State U.; Walker, T.; /Ohio State U. /Ohio State U., Dept. Astron. /Ohio State U., CCAPP; Babu, K.S.; /Oklahoma State U.; Baer, H.; /Florida State U.; Balantekin, A.B.; Barger, V.; /Wisconsin U., Madison; Berezhiani, Z.; /Gran Sasso; de Gouvea, A.; /Northwestern U.; Dermisek, R.; /Princeton U.; Dolgov, A.; /Moscow, ITEP /Ferrara U.; Fileviez Perez, P.; /Wisconsin U., Madison; Gabadadze, G.; /New York U.; Gal, A.; /Hebrew U.; Gondolo, P.; /Utah U.; Haxton, W.; /Washington U., Seattle; Kamyshkov, Y.; /Tennessee U.; Kayser, B.; /Fermilab; Kearns, E.; /Boston U.; Kopeliovich, B.; /Santa Maria U., Valparaiso; Lande, K.; /Pennsylvania U.; Marfatia, D.; /Kansas U. /Maryland U. /Northeastern U. /UC, Berkeley /LBL, Berkeley /Minnesota U. /SLAC /UC, Santa Cruz /SUNY, Stony Brook /Oklahoma State U. /Iowa State U. /Carnegie Mellon U.
2011-11-14
The scientific case for a Deep Underground Science and Engineering Laboratory [DUSEL] located at the Homestake mine in Lead, South Dakota is exceptional. The site of this future laboratory already claims a discovery for the detection of solar neutrinos, leading to a Nobel Prize for Ray Davis. Moreover this work provided the first step to our present understanding of solar neutrino oscillations and a chink in the armor of the Standard Model of particle physics. We now know, from several experiments located in deep underground experimental laboratories around the world, that neutrinos have mass and even more importantly this mass appears to fit into the framework of theories which unify all the known forces of nature, i.e. the strong, weak, electromagnetic and gravitational. Similarly, DUSEL can forge forward in the discovery of new realms of nature, housing six fundamental experiments that will test the frontiers of our knowledge: (1) Searching for nucleon decay (the decay of protons and neutrons predicted by grand unified theories of nature); (2) Searching for neutrino oscillations and CP violation by detecting neutrinos produced at a neutrino source (possibly located at Brookhaven National Laboratory and/or Fermi National Laboratory); (3) Searching for astrophysical neutrinos originating from the sun, from cosmic rays hitting the upper atmosphere or from other astrophysical sources, such as supernovae; (4) Searching for dark matter particles (the type of matter which does not interact electromagnetically, yet provides 24% of the mass of the Universe); (5) Looking for the rare process known as neutrino-less double beta decay which is predicted by most theories of neutrino mass and allows two neutrons in a nucleus to spontaneously change into two protons and two electrons; and (6) Searching for the rare process of neutron–antineutron oscillations, which would establish violation of baryon number symmetry. A large megaton water Cherenkov detector for neutrinos and
Energy Technology Data Exchange (ETDEWEB)
Shafi, Qaisar [Univ. of Delaware, Newark, DE (United States); Barr, Steven [Univ. of Delaware, Newark, DE (United States); Gaisser, Thomas [Univ. of Delaware, Newark, DE (United States); Stanev, Todor [Univ. of Delaware, Newark, DE (United States)
2015-03-31
1. Executive Summary (April 1, 2012 - March 31, 2015) Title: Particle Theory, Particle Astrophysics and Cosmology. Qaisar Shafi, University of Delaware (Principal Investigator); Stephen M. Barr, University of Delaware (Co-Principal Investigator); Thomas K. Gaisser, University of Delaware (Co-Principal Investigator); Todor Stanev, University of Delaware (Co-Principal Investigator). The proposed research was carried out at the Bartol Research Institute; the group included Professors Qaisar Shafi, Stephen Barr, Thomas K. Gaisser, and Todor Stanev, two postdoctoral fellows (Ilia Gogoladze and Liucheng Wang), and several graduate students. Five students of Qaisar Shafi completed their PhD during the period August 2011 - August 2014. Measures of the group's high-caliber performance during the 2012-2015 funding cycle included publications in excellent refereed journals, contributions to working groups as well as white papers, and conference activities, which together provide an exceptional record of both individual performance and overall strength. Another important indicator of success is the outstanding quality of the past and current cohort of graduate students. The PhD students under our supervision regularly win the top departmental and university awards, and their publication records show excellence both in terms of quality and quantity. The topics covered under this grant span the frontline research areas in today's High Energy Theory & Phenomenology. For Professors Shafi and Barr they include LHC-related topics including supersymmetry, collider physics, flavor physics, dark matter physics, Higgs boson and seesaw physics, grand unification, and neutrino physics. The LHC two years ago discovered the Standard Model Higgs boson, thereby at least partially unlocking the secrets behind electroweak symmetry breaking. We remain optimistic that new and exciting physics will be found at LHC 14, which explains our focus on physics beyond the Standard Model. Professor Shafi continued his
The Gribov theory of quark confinement
2001-01-01
V N Gribov, one of the founders of modern particle physics, shaped our understanding of QCD as the microscopic dynamics of hadrons. This volume collects his papers on quark confinement, showing the road he followed to arrive at the theory and formulating the theory itself. It begins with papers providing a beautiful physical explanation of asymptotic freedom based on the phenomenon of antiscreening and demonstrating the inconsistency of the standard perturbative treatment of the gluon fields (Gribov copies, Gribov horizon). It continues with papers presenting the Gribov theory according to whi
Multiparameter eigenvalue problems Sturm-Liouville theory
Atkinson, FV
2010-01-01
One of the masters in the differential equations community, the late F.V. Atkinson contributed seminal research to multiparameter spectral theory and Sturm-Liouville theory. His ideas and techniques have long inspired researchers and continue to stimulate discussion. With the help of co-author Angelo B. Mingarelli, Multiparameter Eigenvalue Problems: Sturm-Liouville Theory reflects much of Dr. Atkinson's final work. After covering standard multiparameter problems, the book investigates the conditions for eigenvalues to be real and form a discrete set. It gives results on the determinants of fun
Improved thermodynamics of SU(2) gauge theory
Energy Technology Data Exchange (ETDEWEB)
Giudice, Pietro [University of Muenster, Institute for Theoretical Physics, Muenster (Germany); Piemonte, Stefano [University of Regensburg, Institute for Theoretical Physics, Regensburg (Germany)
2017-12-15
In this work we present the results of our investigation of the thermodynamics of SU(2) gauge theory. We employ a Symanzik improved action to strongly reduce discretisation effects, and we use scaling relations to take into account finite-volume effects close to the critical temperature. We determine the β-function for this particular theory and use it in the determination of different thermodynamic observables. Finally we compare our results with previous work where only the standard Wilson action was considered. We confirm the relevance of using the improved action to easily access the correct continuum thermodynamics of the theory. (orig.)
International Nuclear Information System (INIS)
Kaku, M.
1987-01-01
In this article, the authors summarize the rapid progress in constructing string field theory actions, such as the development of the covariant BRST theory. They also present the newer geometric formulation of string field theory, from which the BRST theory and the older light cone theory can be derived from first principles. This geometric formulation allows us to derive the complete field theory of strings from two geometric principles, in the same way that general relativity and Yang-Mills theory can be derived from two principles based on global and local symmetry. The geometric formalism therefore reduces string field theory to a problem of finding an invariant under a new local gauge group they call the universal string group (USG). Thus, string field theory is the gauge theory of the universal string group in much the same way that Yang-Mills theory is the gauge theory of SU(N). The geometric formulation places superstring theory on the same rigorous group theoretical level as general relativity and gauge theory
Rotor theories by Professor Joukowsky: Momentum theories
DEFF Research Database (Denmark)
van Kuik, G. A. M.; Sørensen, Jens Nørkær; Okulov, V. L.
2015-01-01
This paper is the first of two papers on the history of rotor aerodynamics with special emphasis on the role of Joukowsky. The present one focuses on the development of the momentum theory while the second one surveys the development of vortex theory for rotors. Joukowsky has played a major role ...
Generalizability theory and item response theory
Glas, Cornelis A.W.; Eggen, T.J.H.M.; Veldkamp, B.P.
2012-01-01
Item response theory is usually applied to items with a selected-response format, such as multiple choice items, whereas generalizability theory is usually applied to constructed-response tasks assessed by raters. However, in many situations, raters may use rating scales consisting of items with a
Separation-individuation theory and attachment theory.
Blum, Harold P
2004-01-01
Separation-individuation and attachment theories are compared and assessed in the context of psychoanalytic developmental theory and their application to clinical work. As introduced by Margaret Mahler and John Bowlby, respectively, both theories were initially regarded as diverging from traditional views. Separation-individuation theory, though it has had to be corrected in important respects, and attachment theory, despite certain limitations, have nonetheless enriched psychoanalytic thought. Without attachment an infant would die, and an infant with severely insecure attachment is at greater risk for serious disorders. Development depends on continued attachment to a responsive and responsible caregiver. Continued attachment to the primary object was regarded by Mahler as intrinsic to the process of separation-individuation. Attachment theory does not account for the essential development of separateness, and separation-individuation is important for the promotion of autonomy, independence, and identity. Salient historical and theoretical issues are addressed, including the renewed interest in attachment theory and the related decline of interest in separation-individuation theory.
Generalizability Theory and Classical Test Theory
Brennan, Robert L.
2011-01-01
Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…
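The kind of decomposition G theory performs can be sketched for the simplest fully crossed persons × raters design with one score per cell, estimating variance components by the standard ANOVA method. The score table and component labels below are a textbook-style illustration, not data from the article:

```python
import numpy as np

# Hypothetical scores: persons (rows) x raters (columns), one score per cell.
scores = np.array([
    [7.0, 8.0, 7.0],
    [5.0, 6.0, 5.0],
    [9.0, 9.0, 8.0],
    [4.0, 5.0, 5.0],
])
n_p, n_r = scores.shape
grand = scores.mean()
pm = scores.mean(axis=1)            # person means
rm = scores.mean(axis=0)            # rater means

# Sums of squares for the two-way crossed design.
ss_p = n_r * ((pm - grand) ** 2).sum()
ss_r = n_p * ((rm - grand) ** 2).sum()
ss_tot = ((scores - grand) ** 2).sum()
ss_res = ss_tot - ss_p - ss_r

ms_p = ss_p / (n_p - 1)
ms_r = ss_r / (n_r - 1)
ms_res = ss_res / ((n_p - 1) * (n_r - 1))

# ANOVA-method variance-component estimates (negative estimates set to 0).
var_res = ms_res                          # person x rater interaction + error
var_p = max((ms_p - ms_res) / n_r, 0.0)   # universe-score (person) variance
var_r = max((ms_r - ms_res) / n_p, 0.0)   # rater variance

# A relative generalizability coefficient for the mean over n_r raters.
g_coef = var_p / (var_p + var_res / n_r)
print(var_p, var_r, var_res, g_coef)
```

The three estimates separate the sources of inconsistency in observed scores (persons, raters, residual), which is exactly the bookkeeping G theory adds beyond a single classical reliability coefficient.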
Physics beyond the standard model
Energy Technology Data Exchange (ETDEWEB)
Valle, J.W.F. [Valencia Univ. (Spain). Dept. de Fisica Teorica]. E-mail: valle@flamenco.uv.es
1996-07-01
We discuss some of the signatures associated with extensions of the Standard Model related to the neutrino and electroweak symmetry breaking sectors, with and without supersymmetry. The topics include a basic discussion of the theory of neutrino mass and the corresponding extensions of the Standard Model that incorporate massive neutrinos; an overview of the present observational status of neutrino mass searches, with emphasis on solar neutrinos, as well as cosmological data on the amplitude of primordial density fluctuations; the implications of neutrino mass in cosmological nucleosynthesis, non-accelerator, as well as in high energy particle collider experiments. Turning to the electroweak breaking sector, we discuss the physics potential for Higgs boson searches at LEP200, including Majorana extensions of the Standard Model, and the physics of invisibly decaying Higgs bosons. We discuss the minimal supersymmetric Standard Model phenomenology, as well as some of the laboratory signatures that would be associated to models with R parity violation, especially in Z and scalar boson decays. (author)
International Nuclear Information System (INIS)
Marciano, W.J.
1989-05-01
In these lectures, my aim is to present a status report on the standard model and some key tests of electroweak unification. Within that context, I also discuss how and where hints of new physics may emerge. To accomplish those goals, I have organized my presentation as follows. I survey the standard model parameters with particular emphasis on the gauge coupling constants and vector boson masses. Examples of new physics appendages are also commented on. In addition, I have included an appendix on dimensional regularization and a simple example which employs that technique. I focus on weak charged current phenomenology. Precision tests of the standard model are described and up-to-date values for the Cabibbo-Kobayashi-Maskawa (CKM) mixing matrix parameters are presented. Constraints implied by those tests for a 4th generation, extra Z' bosons, and compositeness are discussed. An overview of the physics of tau decays is also included. I discuss weak neutral current phenomenology and the extraction of sin²θ_W from experiment. The results presented there are based on a global analysis of all existing data. I have chosen to concentrate that discussion on radiative corrections, the effect of a heavy top quark mass, implications for grand unified theories (GUTs), extra Z' gauge bosons, and atomic parity violation. The potential for further experimental progress is also commented on. Finally, I depart from the narrowest version of the standard model and discuss effects of neutrino masses, mixings, and electromagnetic moments. 32 refs., 3 figs., 5 tabs
Standardization and the European Standards Organisations
Directory of Open Access Journals (Sweden)
Marta Orviska
2014-01-01
Standardization is a relatively neglected aspect of the EU regulatory process and yet it is fundamental to that process and arguably has recently been the key vehicle in making the single market an economic reality. Yet the key standardization bodies in the EU, the ESOs, are scarcely known to the public and seldom discussed in the literature. In this article we redress this imbalance, arguing that standardization and integration are closely related concepts. We also argue that the ESOs have developed a degree of autonomy in expanding the boundaries of standardization and even in developing their own links with the rest of the world. Recent proposals put forward by the European Commission can be seen as an attempt to reduce that autonomy. These proposals emphasize the speed of, and stakeholder involvement in, standards production, which we further suggest are somewhat conflicting aims.
Utilitarianism and Double Standards: A Discussion of R. M. Hare's "Moral Thinking."
Annas, Julia
1982-01-01
Criticizes R. M. Hare's theory of moral thinking. Hare identifies two levels of moral thinking: critical and intuitive thinking. The author argues that Hare's theory suggests a double standard and makes moral conflicts appear trivial. (AM)
Standards for holdup measurement
International Nuclear Information System (INIS)
Zucker, M.S.
1982-01-01
Holdup measurements, needed for material balance, depend intensively on standards and on interpretation of the calibration procedure. More than in other measurements, the calibration procedure using the standard becomes part of the standard. Standards practical for field use and calibration techniques have been developed. While accuracy in holdup measurements is comparatively poor, avoidance of bias is a necessary goal.
Fundamental statistical theories
International Nuclear Information System (INIS)
Demopoulos, W.
1976-01-01
Einstein argued that since quantum mechanics is not a fundamental theory it cannot be regarded as in any sense final. The pure statistical states of the quantum theory are not dispersion-free. In this sense, the theory is significantly statistical. The problem investigated in this paper is to determine under what conditions a significantly statistical theory is correctly regarded as fundamental. The solution developed in this paper is that a statistical theory is fundamental only if it is complete; moreover, the quantum theory is complete. (B.R.H.)
Schürmann, Michael
2008-01-01
This volume contains the revised and completed notes of lectures given at the school "Quantum Potential Theory: Structure and Applications to Physics," held at the Alfried-Krupp-Wissenschaftskolleg in Greifswald from February 26 to March 10, 2007. Quantum potential theory studies noncommutative (or quantum) analogs of classical potential theory. These lectures provide an introduction to this theory, concentrating on probabilistic potential theory and its quantum analogs, i.e. quantum Markov processes and semigroups, quantum random walks, Dirichlet forms on C* and von Neumann algebras, and boundary theory. Applications to quantum physics, in particular the filtering problem in quantum optics, are also presented.
Lurie, Jacob
2009-01-01
Higher category theory is generally regarded as technical and forbidding, but part of it is considerably more tractable: the theory of infinity-categories, higher categories in which all higher morphisms are assumed to be invertible. In Higher Topos Theory, Jacob Lurie presents the foundations of this theory, using the language of weak Kan complexes introduced by Boardman and Vogt, and shows how existing theorems in algebraic topology can be reformulated and generalized in the theory's new language. The result is a powerful theory with applications in many areas of mathematics. The book's firs
From F/M-theory to K-theory and back
Garcia-Etxebarria, I; Garcia-Etxebarria, Inaki; Uranga, Angel M.
2006-01-01
We consider discrete K-theory tadpole cancellation conditions in type IIB orientifolds with magnetised 7-branes. Cancellation of K-theory charge constrains the choices of world-volume magnetic fluxes on the latter. We describe the F-/M-theory lift of these configurations, where 7-branes are encoded in the geometry of an elliptic fibration, and their magnetic quanta correspond to supergravity 4-form field strength fluxes. In a K3 compactification example, we show that standard quantization of 4-form fluxes as integer cohomology classes in K3 automatically implies the K-theory charge cancellation constraints on the 7-brane worldvolume magnetic fluxes in string theory (as well as new previously unnoticed discrete constraints, which we also interpret). Finally, we show that flux quantization in F-/M-theory implies that 7-brane world-volume flux quantization conditions are modified in the presence of 3-form fluxes.
Collaboration Between Multistakeholder Standards
DEFF Research Database (Denmark)
Rasche, Andreas; Maclean, Camilla
Public interest in corporate social responsibility (CSR) has resulted in a wide variety of multistakeholder CSR standards in which companies can choose to participate. While such standards reflect collaborative governance arrangements between public and private actors, the market for corporate responsibility is unlikely to support a great variety of partly competing and overlapping standards. Increased collaboration between these standards would enhance both their impact and their adoption by firms. This report examines the nature, benefits, and shortcomings of existing multistakeholder standards ...
Weak interactions and gauge theories
International Nuclear Information System (INIS)
Gaillard, M.K.
1979-12-01
The status of the electroweak gauge theory, also known as quantum asthenodynamics (QAD), is examined. The major result is that the standard WS-GIM model describes the data well, although one should still look for signs of further complexity and better tests of its gauge theory aspect. A second important result is that the measured values of the three basic coupling constants of present-energy physics, g_s, g, and √(5/3)g' of SU(3)_c × SU(2)_L × U(1), are compatible with the idea that these interactions are unified at high energies. Much of the paper deals with open questions, and it takes up the following topics: the status of QAD, the scalar meson spectrum, the fermion spectrum, CP violation, and decay dynamics. 118 references, 20 figures
Wrapping rules (in) string theory
Bergshoeff, Eric A.; Riccioni, Fabio
2018-01-01
In this paper we show that the number of all 1/2-BPS branes in string theory compactified on a torus can be derived by universal wrapping rules whose formulation we present. These rules even apply to branes in less than ten dimensions whose ten-dimensional origin is an exotic brane. In that case the wrapping rules contain an additional combinatorial factor that is related to the highest dimension in which the ten-dimensional exotic brane, after compactification, can be realized as a standard brane. We show that the wrapping rules also apply to cases with less supersymmetry. As a specific example, we discuss the compactification of IIA/IIB string theory on (T⁴/ℤ₂) × Tⁿ.
Theories of Career Development. A Comparison of the Theories.
Osipow, Samuel H.
These seven theories of career development are examined in previous chapters: (1) Roe's personality theory, (2) Holland's career typology theory, (3) the Ginzberg, Ginsburg, Axelrod, and Herma Theory, (4) psychoanalytic conceptions, (5) Super's developmental self-concept theory, (6) other personality theories, and (7) social systems theories.…
Technicolor and Beyond: Unification in Theory Space
DEFF Research Database (Denmark)
Sannino, Francesco
2010-01-01
supersymmetry and technicolor. The reason is to provide a unification of different extensions of the standard model. For example, this means that one can recover, according to the parameters and spectrum of the theory, distinct extensions of the standard model, from supersymmetry to technicolor and unparticle...... physics. A surprising result is that a minimal (in terms of the smallest number of fields) supersymmetrization of the MWT model leads to the maximal supersymmetry in four dimensions, i.e. N=4 SYM....
[Mathematics and string theory
Energy Technology Data Exchange (ETDEWEB)
Jaffe, A.; Yau, Shing-Tung.
1993-01-01
Work on this grant was centered on connections between non-commutative geometry and physics. Topics covered included: cyclic cohomology, non-commutative manifolds, index theory, reflection positivity, space quantization, quantum groups, number theory, etc.
Introduction to percolation theory
Stauffer, Dietrich
1991-01-01
Percolation theory deals with clustering, criticality, diffusion, fractals, phase transitions and disordered systems. This book covers the basic theory for graduate students, and also for professionals dealing with it for the first time
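The clustering and phase-transition behaviour this book covers can be illustrated with a minimal site-percolation sketch (a hypothetical toy, not taken from the book): sites on a square lattice are occupied with probability p, and a flood-fill pass checks whether an occupied cluster spans from the top row to the bottom row. Well below the 2D site-percolation threshold (p_c ≈ 0.5927) spanning is rare; well above it, spanning is almost certain.

```python
import random

def spans(lattice):
    """Flood fill: does an occupied cluster connect the top row to the bottom row?"""
    n = len(lattice)
    seen = set((0, c) for c in range(n) if lattice[0][c])
    stack = list(seen)
    while stack:
        r, c = stack.pop()
        if r == n - 1:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < n and 0 <= nc < n and lattice[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                stack.append((nr, nc))
    return False

def spanning_probability(n, p, trials, rng):
    """Fraction of random n x n lattices (site occupied with probability p) that span."""
    hits = 0
    for _ in range(trials):
        lattice = [[rng.random() < p for _ in range(n)] for _ in range(n)]
        hits += spans(lattice)
    return hits / trials

rng = random.Random(0)
low = spanning_probability(20, 0.45, 200, rng)   # below p_c: spanning is rare
high = spanning_probability(20, 0.75, 200, rng)  # above p_c: spanning is typical
print(low, high)
```

The sharp jump in spanning probability as p crosses p_c is the finite-lattice shadow of the percolation phase transition discussed in the book.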
Henneaux, Marc; Vasiliev, Mikhail A
2017-01-01
Symmetries play a fundamental role in physics. Non-Abelian gauge symmetries are the symmetries behind theories for massless spin-1 particles, while the reparametrization symmetry is behind Einstein's gravity theory for massless spin-2 particles. In supersymmetric theories these particles can also be connected to massless fermionic particles. Does Nature stop at spin-2, or can there also be massless higher-spin theories? In the past, strong indications were given that such theories do not exist. However, in recent times ways to evade those constraints have been found and higher-spin gauge theories have been constructed. With the advent of the AdS/CFT duality correspondence, even stronger indications have been given that higher-spin gauge theories play an important role in fundamental physics. All these issues were discussed at an international workshop in Singapore in November 2015 where the leading scientists in the field participated. This volume presents an up-to-date, detailed overview of the theories i...
Economic theories of dictatorship
Alexandre Debs
2010-01-01
This article reviews recent advances in economic theories of dictatorships and their lessons for the political stability and economic performance of dictatorships. It reflects on the general usefulness of economic theories of dictatorship, with an application to foreign relations.
Algebraic conformal field theory
International Nuclear Information System (INIS)
Fuchs, J.; Nationaal Inst. voor Kernfysica en Hoge-Energiefysica
1991-11-01
Many features of conformal field theories are special versions of structures that are present in arbitrary 2-dimensional quantum field theories, so it makes sense to describe 2-dimensional conformal field theories in the context of the algebraic theory of superselection sectors. While most of the results of the algebraic theory are rather abstract, conformal field theories offer the possibility to work out many formulae explicitly. In particular, one can construct the full algebra A-bar of global observables and the endomorphisms of A-bar which represent the superselection sectors. Some explicit results are presented for the level 1 so(N) WZW theories; the algebra A-bar is found to be the enveloping algebra of a Lie algebra L-bar which is an extension of the chiral symmetry algebra of the WZW theory. (author). 21 refs., 6 figs
International Nuclear Information System (INIS)
Bonara, L.; Cotta-Ramusino, P.; Rinaldi, M.
1987-01-01
It is well known that type I and heterotic superstring theories have a zero-mass spectrum which corresponds to the field content of N=1 supergravity theory coupled to supersymmetric Yang-Mills theory in 10-D. The authors study the field theory ''per se'', in the hope that simple consistency requirements will determine the theory completely once one knows the field content inherited from string theory. The simplest consistency requirements are: N=1 supersymmetry; and absence of chiral anomalies. This is what the authors discuss in this paper, leaving undetermined the question of the range of validity of the resulting field theory. As is known, a model of N=1 supergravity (SUGRA) coupled to supersymmetric Yang-Mills (SYM) theory was known in the form given by Chapline and Manton. The coupling of SUGRA to SYM was determined by the definition of the ''field strength'' 3-form H.
Zielenkiewicz, Wojciech
2004-01-01
The purpose of this book is to give a comprehensive description of the theoretical fundamentals of calorimetry. The considerations are based on the relations deduced from the laws and general equations of heat exchange theory and steering theory.
DEFF Research Database (Denmark)
Clemmensen, Torkil; Kaptelinin, Victor; Nardi, Bonnie
2016-01-01
This paper reports a study of the use of activity theory in human–computer interaction (HCI) research. We analyse activity theory in HCI since its first appearance about 25 years ago. Through an analysis and meta-synthesis of 109 selected HCI activity theory papers, we created a taxonomy of 5...... different ways of using activity theory: (1) analysing unique features, principles, and problematic aspects of the theory; (2) identifying domain-specific requirements for new theoretical tools; (3) developing new conceptual accounts of issues in the field of HCI; (4) guiding and supporting empirical...... analyses of HCI phenomena; and (5) providing new design illustrations, claims, and guidelines. We conclude that HCI researchers are not only users of imported theory, but also theory-makers who adapt and develop theory for different purposes....
Stabilizing bottomless action theories
International Nuclear Information System (INIS)
Greensite, J.; Halpern, M.B.
1983-12-01
The authors show how to construct the Euclidean quantum theory corresponding to classical actions which are unbounded from below. The method preserves the classical limit, the large-N limit, and the perturbative expansion of the unstabilized theories. (Auth.)
Molder, te H.F.M.
2009-01-01
Available in both print and electronic formats, the Encyclopedia of Communication Theory provides students and researchers with a comprehensive two-volume overview of contemporary communication theory. Reference librarians report that students frequently approach them seeking a source that will
Rothbart, Andrea
2012-01-01
An imaginative introduction to number theory and abstract algebra, this unique approach employs a pair of fictional characters whose dialogues explain theories and demonstrate applications in terms of football scoring, chess moves, and more.
Dimensional comparison theory.
Möller, Jens; Marsh, Herb W
2013-07-01
Although social comparison (Festinger, 1954) and temporal comparison (Albert, 1977) theories are well established, dimensional comparison is a largely neglected yet influential process in self-evaluation. Dimensional comparison entails a single individual comparing his or her ability in a (target) domain with his or her ability in a standard domain (e.g., "How good am I in math compared with English?"). This article reviews empirical findings from introspective, path-analytic, and experimental studies on dimensional comparisons, categorized into 3 groups according to whether they address the "why," "with what," or "with what effect" question. As the corresponding research shows, dimensional comparisons are made in everyday life situations. They impact on domain-specific self-evaluations of abilities in both domains: Dimensional comparisons reduce self-concept in the worse off domain and increase self-concept in the better off domain. The motivational basis for dimensional comparisons, their integration with recent social cognitive approaches, and the interdependence of dimensional, temporal, and social comparisons are discussed. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Philosophical theories of probability
Gillies, Donald
2000-01-01
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
International Nuclear Information System (INIS)
Adler, S.L.; Wilczek, F.
1993-11-01
Areas of emphasis include acceleration algorithms for the Monte Carlo analysis of lattice field and gauge theories, quaternionic generalizations of complex quantum mechanics and field theory, application of the renormalization group to the QCD phase transition, the quantum Hall effect, and black holes. Other work involved string theory, statistical properties of energy levels in integrable quantum systems, baryon asymmetry and the electroweak phase transition, anisotropies of the cosmic microwave background, and theory of superconductors
Constructivist Grounded Theory?
Glaser, Barney G.
2007-01-01
In my contribution I draw on CHARMAZ's excellent and inspiring article on constructivist grounded theory in order to discuss, using this example, that and why grounded theory is not a constructivist undertaking. I try to show that "constructivist data", or constructivist applications of grounded theory, insofar as they exist or could be meaningful at all, make up only a vanishingly small part of grounded theory...
International Nuclear Information System (INIS)
Partovi, M.H.
1982-01-01
From a generalization of the covariant derivative, nonlocal gauge theories are developed. These theories enjoy local gauge invariance and associated Ward identities, a corresponding locally conserved current, and a locally conserved energy-momentum tensor, with the Ward identities implying the masslessness of the gauge field as in local theories. Their ultraviolet behavior allows the presence as well as the absence of the Adler-Bell-Jackiw anomaly, the latter in analogy with lattice theories
2006: Particle Physics in the Standard Model and beyond
Indian Academy of Sciences (India)
Journal of Physics, October 2006, pp. 561–577. 2006: Particle Physics in the Standard Model and beyond. GUIDO ALTARELLI, Department of Physics, Theory Division. ... that the gauge symmetry is unbroken in the vertices of the theory: all currents and charges ... Here, when talking of divergences, we are not worried of ac-...
Vazzana, Anthony; Garth, David
2007-01-01
One of the oldest branches of mathematics, number theory is a vast field devoted to studying the properties of whole numbers. Offering a flexible format for a one- or two-semester course, Introduction to Number Theory uses worked examples, numerous exercises, and two popular software packages to describe a diverse array of number theory topics.
Missinne, Leo E.; Wilcox, Victoria
This paper discusses the life, theories, and therapeutic techniques of psychotherapist, Viktor E. Frankl. A brief biography of Frankl is included discussing the relationship of his early experiences as a physician to his theory of personality. Frankl's theory focusing on man's need for meaning and emphasizing the spiritual dimension in each human…
DEFF Research Database (Denmark)
Javadi, Hossein; Forouzbakhsh, Farshid; Daei Kasmaei, Hamed
2016-01-01
There are various theories in physics, but nature is unique. It is not nature's problem that we have various theories; nature obeys a simple and unique law. We should improve our theories. The universal constancy of the speed of light raises the question of whether the limit on the light speed origin...
Matsumoto, Kohji
2002-01-01
The book includes several survey articles on prime numbers, divisor problems, and Diophantine equations, as well as research papers on various aspects of analytic number theory such as additive problems, Diophantine approximations, and the theory of zeta- and L-functions. Audience: researchers and graduate students interested in recent developments in number theory
Pais, Alexandre; Valero, Paola
2014-01-01
What is the place of social theory in mathematics education research, and what is it for? This special issue of "Educational Studies in Mathematics" offers insights on what could be the role of some sociological theories in a field that has historically privileged learning theories coming from psychology and mathematics as the main…
Algorithmic information theory
Grünwald, P.D.; Vitányi, P.M.B.; Adriaans, P.; van Benthem, J.
2008-01-01
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining 'information'. We discuss the extent to which Kolmogorov's and Shannon's information theory have a common purpose, and where they are
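Kolmogorov complexity itself is uncomputable, but the contrast between the algorithmic and Shannon viewpoints sketched in this abstract can be illustrated by using a real compressor as a crude, computable upper bound (an illustrative sketch, not material from the chapter): a highly regular string compresses far below its length, while pseudo-random bytes barely compress at all.

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data: a computable upper bound
    (up to an additive constant) on the Kolmogorov complexity of `data`."""
    return len(zlib.compress(data, 9))

regular = b"ab" * 500  # 1000 bytes with an obvious short description
rng = random.Random(42)
noisy = bytes(rng.getrandbits(8) for _ in range(1000))  # 1000 pseudo-random bytes

print(compressed_size(regular), compressed_size(noisy))
```

The regular string has low (approximate) complexity despite its length, while the pseudo-random one is nearly incompressible, mirroring the quantitative notion of 'information' the entry describes.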
Algorithmic information theory
Grünwald, P.D.; Vitányi, P.M.B.
2008-01-01
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining `information'. We discuss the extent to which Kolmogorov's and Shannon's information theory have a common purpose, and where they are
Schmidt, Ulrich; Zank, Horst
2010-01-01
In previous models of (cumulative) prospect theory reference-dependence of preferences is imposed beforehand and the location of the reference point is exogenously determined. This paper provides an axiomatization of a new specification of cumulative prospect theory, termed endogenous prospect theory, where reference-dependence is derived from preference conditions and a unique reference point arises endogenously.
Reflections on Activity Theory
Bakhurst, David
2009-01-01
It is sometimes suggested that activity theory represents the most important legacy of Soviet philosophy and psychology. But what exactly "is" activity theory? The canonical account in the West is given by Engestrom, who identifies three stages in the theory's development: from Vygotsky's insights, through Leontiev's articulation of the…
Superspace conformal field theory
International Nuclear Information System (INIS)
Quella, Thomas
2013-07-01
Conformal sigma models and WZW models on coset superspaces provide important examples of logarithmic conformal field theories. They possess many applications to problems in string and condensed matter theory. We review recent results and developments, including the general construction of WZW models on type I supergroups, the classification of conformal sigma models and their embedding into string theory.
Iachello, F
1995-01-01
1. The Wave Mechanics of Diatomic Molecules. 2. Summary of Elements of Algebraic Theory. 3. Mechanics of Molecules. 4. Three-Body Algebraic Theory. 5. Four-Body Algebraic Theory. 6. Classical Limit and Coordinate Representation. 8. Prologue to the Future. Appendices. Properties of Lie Algebras; Coupling of Algebras; Hamiltonian Parameters
DEFF Research Database (Denmark)
Andersen, Jack
2015-01-01
Purpose To provide a small overview of genre theory and its associated concepts and to show how genre theory has had its antecedents in certain parts of the social sciences and not in the humanities. Findings The chapter argues that the explanatory force of genre theory may be explained with its ...
Rudner, Lawrence M.
This paper describes and evaluates the use of decision theory as a tool for classifying examinees based on their item response patterns. Decision theory, developed by A. Wald (1947) and now widely used in engineering, agriculture, and computing, provides a simple model for the analysis of categorical data. Measurement decision theory requires only…
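The classification model described above reduces to a few lines of Bayes' rule. The sketch below is a hypothetical numerical illustration (the item probabilities and prior are made up, not taken from Rudner's paper): given per-item probabilities of a correct response under a 'master' and a 'non-master' class, the posterior probability of mastery follows from the examinee's 0/1 response pattern, assuming conditional independence of items within each class.

```python
def classify(responses, p_master, p_nonmaster, prior_master=0.5):
    """Posterior probability of mastery given a 0/1 response pattern,
    assuming items are conditionally independent within each class."""
    like_m, like_n = prior_master, 1.0 - prior_master
    for r, pm, pn in zip(responses, p_master, p_nonmaster):
        like_m *= pm if r else (1.0 - pm)
        like_n *= pn if r else (1.0 - pn)
    return like_m / (like_m + like_n)

# Hypothetical 5-item test: masters answer each item correctly more often.
p_master = [0.9, 0.8, 0.85, 0.9, 0.75]
p_nonmaster = [0.3, 0.4, 0.25, 0.35, 0.5]

posterior = classify([1, 1, 1, 0, 1], p_master, p_nonmaster)
print(round(posterior, 3))
```

Classifying the examinee as a master whenever the posterior exceeds 0.5 is the decision rule under symmetric misclassification losses; asymmetric losses shift the threshold accordingly.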
Constructor theory of probability
2016-01-01
Unitary quantum theory, having no Born Rule, is non-probabilistic. Hence the notorious problem of reconciling it with the unpredictability and appearance of stochasticity in quantum measurements. Generalizing and improving upon the so-called ‘decision-theoretic approach’, I shall recast that problem in the recently proposed constructor theory of information—where quantum theory is represented as one of a class of superinformation theories, which are local, non-probabilistic theories conforming to certain constructor-theoretic conditions. I prove that the unpredictability of measurement outcomes (to which constructor theory gives an exact meaning) necessarily arises in superinformation theories. Then I explain how the appearance of stochasticity in (finitely many) repeated measurements can arise under superinformation theories. And I establish sufficient conditions for a superinformation theory to inform decisions (made under it) as if it were probabilistic, via a Deutsch–Wallace-type argument—thus defining a class of decision-supporting superinformation theories. This broadens the domain of applicability of that argument to cover constructor-theory compliant theories. In addition, in this version some of the argument's assumptions, previously construed as merely decision-theoretic, follow from physical properties expressed by constructor-theoretic principles. PMID:27616914
Superspace conformal field theory
Energy Technology Data Exchange (ETDEWEB)
Quella, Thomas [Koeln Univ. (Germany). Inst. fuer Theoretische Physik; Schomerus, Volker [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)
2013-07-15
Conformal sigma models and WZW models on coset superspaces provide important examples of logarithmic conformal field theories. They possess many applications to problems in string and condensed matter theory. We review recent results and developments, including the general construction of WZW models on type I supergroups, the classification of conformal sigma models and their embedding into string theory.
Influences of Economic Theories on Accounting Theory: the case of the Objective Function of the Firm
Directory of Open Access Journals (Sweden)
Lineker Costa Passos
2016-10-01
Full Text Available This essay aims to establish the relationship between the theoretical precepts that guide the accounting disclosure procedures for its stakeholders, both internal and external, and the two main theoretical trends that address the firm’s objective function: the Shareholder theory and the Stakeholder theory. In the perspective of the Shareholder theory, the firm has to define a single objective, which is to maximize shareholder wealth. In the context of the Stakeholder theory, the firm must establish a multiple objective, which is to meet the interests of all those involved with its activities. We discuss to what extent theories, standards and accounting practices emanate from the concepts of the two models, especially regarding the users’ demand for useful and relevant information. There is a predominance of the Shareholder theory in influencing accounting principles that guide the disclosure of information, although different accounting reports are already discussed and presented, oriented to the Stakeholders of the firm, without establishing a set of concepts that explain and justify them within the scope of Accounting theory. Additionally, it is argued that, all things taken into consideration, both currents of the Economic theory point in the same direction: to seek the wellbeing of the firm’s stakeholders. The research contributes to the accounting literature, in the sense of clarifying the impacts arising from the two economic models that deal with the objective function of the firm in the evolution of Accounting theory, not yet captured directly in the discussion of the fundamentals of accounting theory.
Essential methodological considerations when using grounded theory.
Achora, Susan; Matua, Gerald Amandu
2016-07-01
To suggest important methodological considerations when using grounded theory. A research method widely used in nursing research is grounded theory, at the centre of which is theory construction. However, researchers still struggle with some of its methodological issues. Although grounded theory is widely used to study and explain issues in nursing practice, many researchers are still failing to adhere to its rigorous standards. Researchers should articulate the focus of their investigations - the substantive area of interest as well as the focal population. This should be followed by a succinct explanation of the strategies used to collect and analyse data, supported by clear coding processes. Finally, the resolution of the core issues, including the core category and related categories, should be explained to advance readers' understanding. Researchers should endeavour to understand the tenets of grounded theory. This enables 'neophytes' in particular to make methodological decisions that will improve their studies' rigour and fit with grounded theory. This paper complements the current dialogue on improving the understanding of grounded theory methodology in nursing research. The paper also suggests important procedural decisions researchers need to make to preserve their studies' scientific merit and fit with grounded theory.
Standards, Standards, Standards: The Unintended Consequences of Widening Participation?
Stuart, Mary
2002-01-01
Debate over widening access to higher education is narrowing to a focus on preservation of standards. Examination of the discourses of school policy, classroom environment, and peer culture shows how these competing cultures can work against efforts to increase participation. (Contains 17 references.) (SK)
An introduction to gauge theories
Cabibbo, Nicola; Benhar, Omar
2017-01-01
Written by three of the world's leading experts on particle physics and the standard model, including an award-winning former director general of CERN, this book provides a completely up-to-date account of gauge theories. Starting from Feynman’s path integrals, Feynman rules are derived, gauge fixing and Faddeev-Popov ghosts are discussed, and renormalization group equations are derived. Several important applications to quantum electrodynamics and quantum chromodynamics (QCD) are discussed, including the one-loop derivation of asymptotic freedom for QCD.
Theory and context / Theory in context
DEFF Research Database (Denmark)
Glaveanu, Vlad Petre
2014-01-01
questions? This depends on how one understands theory. Against a view of theoretical work as aiming towards generality, universality, uniformity, completeness, and singularity, I advocate for a dynamic perspective in which theory is plural, multifaceted, and contextual. Far from ‘waiting for the Messiah......It is debatable whether the psychology of creativity is a field in crisis or not. There are clear signs of increased fragmentation and a scarcity of integrative efforts, but is this necessarily bad? Do we need more comprehensive theories of creativity and a return to old epistemological......’, theoretical work in the psychology of creativity can be integrative without having the ambition to explain or, even more, predict, creative expression across all people, at all times, and in all domains. To avoid such ambition, the psychology of creativity requires a theory of context that doesn...
Introduction to lattice gauge theory
International Nuclear Information System (INIS)
Gupta, R.
1987-01-01
The lattice formulation of Quantum Field Theory (QFT) can be exploited in many ways. We can derive the lattice Feynman rules and carry out weak coupling perturbation expansions. The lattice then serves as a manifestly gauge invariant regularization scheme, albeit one that is more complicated than standard continuum schemes. Strong coupling expansions: these give us useful qualitative information, but unfortunately no hard numbers. The lattice theory is amenable to numerical simulations by which one calculates the long distance properties of a strongly interacting theory from first principles. The observables are measured as a function of the bare coupling g and a gauge invariant cut-off ≈ 1/a, where a is the lattice spacing. The continuum (physical) behavior is recovered in the limit a → 0, at which point the lattice artifacts go to zero. This is the more powerful use of the lattice formulation, so in these lectures the author focuses on setting up the theory for the purpose of numerical simulations to get hard numbers. The numerical techniques used in Lattice Gauge Theories have their roots in statistical mechanics, so it is important to develop an intuition for the interconnection between quantum mechanics and statistical mechanics. This will be the emphasis of the first lecture. In the second lecture, the author reviews the essential ingredients of formulating QCD on the lattice and discusses scaling and the continuum limit. In the last lecture the author summarizes the status of some of the main results. He also mentions the bottlenecks and possible directions for research. 88 refs
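The interconnection with statistical mechanics that these lectures emphasize can be made concrete with the simplest Euclidean lattice model: a 2D Ising system sampled by the Metropolis algorithm. This is an illustrative toy, vastly simpler than lattice QCD, but it shows the same workflow — sample configurations with the Boltzmann weight, then measure observables as ensemble averages.

```python
import math
import random

def metropolis_ising(n, beta, sweeps, rng):
    """Metropolis sampling of the 2D Ising model on an n x n periodic lattice.
    Returns the mean |magnetization| measured over the second half of the run."""
    spins = [[1] * n for _ in range(n)]  # ordered ("cold") start
    mags = []
    for sweep in range(sweeps):
        for _ in range(n * n):
            i, j = rng.randrange(n), rng.randrange(n)
            nb = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
                  + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
            dE = 2 * spins[i][j] * nb  # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < math.exp(-beta * dE):
                spins[i][j] *= -1
        if sweep >= sweeps // 2:  # discard first half as equilibration
            m = sum(map(sum, spins)) / (n * n)
            mags.append(abs(m))
    return sum(mags) / len(mags)

rng = random.Random(1)
cold = metropolis_ising(12, beta=0.6, sweeps=200, rng=rng)  # below T_c (beta_c ~ 0.4407)
hot = metropolis_ising(12, beta=0.2, sweeps=200, rng=rng)   # above T_c
print(cold, hot)
```

The ordered and disordered phases on either side of the critical coupling are the statistical-mechanics analogue of the phase structure probed in lattice gauge simulations.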
Theory Choice and Social Choice: Kuhn meets Arrow
Okasha, S
2011-01-01
Kuhn’s famous thesis that there is ‘no unique algorithm’ for choosing between rival scientific theories is analysed using the machinery of social choice theory. It is shown that the problem of theory choice as posed by Kuhn is formally identical to a standard social choice problem. This suggests that analogues of well-known results from the social choice literature, such as Arrow’s impossibility theorem, may apply to theory choice. If an analogue of Arrow’s theorem does hold for theory choice...
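Okasha's mapping can be made concrete with a toy instance (hypothetical criteria and theories, not drawn from the paper): treat criteria such as accuracy, simplicity, and scope as 'voters' that each rank rival theories, and aggregate by pairwise majority. The majority verdict can then cycle, exactly as in Condorcet's paradox, so 'pick the majority-preferred theory' fails to yield a unique algorithm.

```python
# Each criterion ranks three rival theories, best first (hypothetical rankings).
rankings = {
    "accuracy":   ["T1", "T2", "T3"],
    "simplicity": ["T2", "T3", "T1"],
    "scope":      ["T3", "T1", "T2"],
}

def majority_prefers(a, b):
    """True if a majority of criteria rank theory a above theory b."""
    wins = sum(r.index(a) < r.index(b) for r in rankings.values())
    return wins > len(rankings) / 2

theories = ["T1", "T2", "T3"]
beats = {(a, b) for a in theories for b in theories
         if a != b and majority_prefers(a, b)}
# Pairwise majorities: T1 beats T2, T2 beats T3, yet T3 beats T1 -- a cycle.
print(sorted(beats))
```

With these rankings the majority relation is intransitive, which is the formal sense in which theory choice, cast as a social choice problem, runs into Arrow-style obstructions.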
Reassessment of the theory of stimulated Raman scattering
Fralick, G. C.; Deck, R. T.
1985-01-01
A modification of the standard theory of stimulated Raman scattering (SRS) first proposed by Sparks (1974, 1975) is analyzed and shown to incorporate a possibly important physical effect; however, its original formulation is incorrect. The analysis is based on an exact numerical integration of the coupled equations of the modified theory, the results of which are compared with both the conventional theory of SRS and with one set of experimental data. A reformulation of the modified theory is suggested that leads to a gain which is in somewhat better agreement with the data than is the conventional theory.
Non-unique factorizations algebraic, combinatorial and analytic theory
Geroldinger, Alfred
2006-01-01
From its origins in algebraic number theory, the theory of non-unique factorizations has emerged as an independent branch of algebra and number theory. Focused efforts over the past few decades have wrought a great number and variety of results. However, these remain dispersed throughout the vast literature. For the first time, Non-Unique Factorizations: Algebraic, Combinatorial, and Analytic Theory offers a look at the present state of the theory in a single, unified resource.Taking a broad look at the algebraic, combinatorial, and analytic fundamentals, this book derives factorization results and applies them in concrete arithmetical situations using appropriate transfer principles. It begins with a basic introduction that can be understood with knowledge of standard basic algebra. The authors then move to the algebraic theory of monoids, arithmetic theory of monoids, the structure of sets of lengths, additive group theory, arithmetical invariants, and the arithmetic of Krull monoids. They also provide a s...
F-Theory - From Geometry to Physics and Back
CERN. Geneva
2017-01-01
Compactifications of string theory have the potential to form a bridge between what we believe is a consistent quantum theory of gravity in 10 spacetime dimensions and observed physics in four dimensions. At the same time, beautiful results from mathematics, especially algebraic geometry, are directly linked to some of the key concepts in modern particle and quantum field theory. This theory colloquium will illustrate some of these ideas in the context of F-theory, which provides a non-perturbative formulation of a class of string compactifications in their geometric regime. Recent applications of F-theory range from very concrete suggestions to address known challenges in physics beyond the Standard Model to the 'physicalization of geometry' to the construction and investigations of strongly coupled quantum field theories in various dimensions. After reviewing examples of such applications we will conclude by demonstrating the close links between geometry and physics in F-theory via some new results on the r...
DEFF Research Database (Denmark)
Wæver, Ole
2011-01-01
’ is distinct from both the study of political practices of securitization and explorations of competing concepts of politics among security theories. It means tracking what kinds of analysis the theory can produce and whether such analysis systematically impacts real-life political struggles. Securitization...... theory is found to ‘act politically’ through three structural features that systematically shape the political effects of using the theory. The article further discusses – on the basis of the preceding articles in the special issue – three emerging debates around securitization theory: ethics...
Variational Transition State Theory
Energy Technology Data Exchange (ETDEWEB)
Truhlar, Donald G. [Univ. of Minnesota, Minneapolis, MN (United States)
2016-09-29
This is the final report on a project involving the development and applications of variational transition state theory. This project involved the development of variational transition state theory for gas-phase reactions, including optimized multidimensional tunneling contributions and the application of this theory to gas-phase reactions with a special emphasis on developing reaction rate theory in directions that are important for applications to combustion. The development of variational transition state theory with optimized multidimensional tunneling as a useful computational tool for combustion kinetics involved eight objectives.
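The conventional transition-state-theory starting point that variational TST refines is the Eyring expression k(T) = (k_B·T/h)·exp(−ΔG‡/RT). A minimal numerical sketch (the barrier heights and temperatures here are illustrative, not data from the project) shows the strong temperature dependence relevant to combustion kinetics:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # gas constant, J/(mol*K)

def tst_rate(delta_g_kjmol, temp):
    """Conventional TST rate constant (1/s) from the Eyring equation,
    for a free-energy barrier given in kJ/mol."""
    return (K_B * temp / H) * math.exp(-delta_g_kjmol * 1000.0 / (R * temp))

k_hot = tst_rate(80.0, 1000.0)   # modest barrier at a combustion-like temperature
k_cold = tst_rate(80.0, 300.0)   # same barrier at room temperature
print(k_hot, k_cold)
```

Variational TST goes beyond this by optimizing the location of the dividing surface (and adding multidimensional tunneling corrections), which lowers the computed rate relative to the conventional fixed-surface expression.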
Bollobas, Bela
2004-01-01
The ever-expanding field of extremal graph theory encompasses a diverse array of problem-solving methods, including applications to economics, computer science, and optimization theory. This volume, based on a series of lectures delivered to graduate students at the University of Cambridge, presents a concise yet comprehensive treatment of extremal graph theory. Unlike most graph theory treatises, this text features complete proofs for almost all of its results. Further insights into theory are provided by the numerous exercises of varying degrees of difficulty that accompany each chapter. A
Gross, Jonathan L
2003-01-01
The Handbook of Graph Theory is the most comprehensive single-source guide to graph theory ever published. Best-selling authors Jonathan Gross and Jay Yellen assembled an outstanding team of experts to contribute overviews of more than 50 of the most significant topics in graph theory, including those related to algorithmic and optimization approaches as well as "pure" graph theory. They then carefully edited the compilation to produce a unified, authoritative work ideal for ready reference. Designed and edited with non-experts in mind, the Handbook of Graph Theory makes information easy to fi
Directory of Open Access Journals (Sweden)
Antonio De Felice
2010-06-01
Over the past decade, f(R) theories have been extensively studied as one of the simplest modifications to General Relativity. In this article we review various applications of f(R) theories to cosmology and gravity, such as inflation, dark energy, local gravity constraints, cosmological perturbations, and spherically symmetric solutions in weak and strong gravitational backgrounds. We present a number of ways to distinguish those theories from General Relativity observationally and experimentally. We also discuss the extension to other modified gravity theories such as Brans–Dicke theory and Gauss–Bonnet gravity, and address models that can satisfy both cosmological and local gravity constraints.
Barron, E N
2013-01-01
An exciting new edition of the popular introduction to game theory and its applications The thoroughly expanded Second Edition presents a unique, hands-on approach to game theory. While most books on the subject are too abstract or too basic for mathematicians, Game Theory: An Introduction, Second Edition offers a blend of theory and applications, allowing readers to use theory and software to create and analyze real-world decision-making models. With a rigorous, yet accessible, treatment of mathematics, the book focuses on results that can be used to
Prest, M
1988-01-01
In recent years the interplay between model theory and other branches of mathematics has led to many deep and intriguing results. In this, the first book on the topic, the theme is the interplay between model theory and the theory of modules. The book is intended to be a self-contained introduction to the subject and introduces the requisite model theory and module theory as it is needed. Dr Prest develops the basic ideas concerning what can be said about modules using the information which may be expressed in a first-order language. Later chapters discuss stability-theoretic aspects of module
National Aeronautics and Space Administration — The innovation of the work herein proposed is the development of standards for software autonomous agents. These standards are essential to achieve software...
Catalytic Functions of Standards
K. Blind (Knut)
2009-01-01
The three different areas and the examples have illustrated several catalytic functions of standards for innovation. First, the standardisation process reduces the time to market of inventions, research results and innovative technologies. Second, standards themselves promote the
BTS statistical standards manual
2005-10-01
The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...
Dental Assisting Program Standards.
Georgia Univ., Athens. Dept. of Vocational Education.
This publication contains statewide standards for the dental assisting program in Georgia. The standards are divided into 12 categories: foundations (philosophy, purpose, goals, program objectives, availability, evaluation); admissions (admission requirements, provisional admission requirements, recruitment, evaluation and planning); program…
Metastring theory and modular space-time
International Nuclear Information System (INIS)
Freidel, Laurent; Leigh, Robert G.; Minic, Djordje
2015-01-01
String theory is canonically accompanied with a space-time interpretation which determines S-matrix-like observables, and connects to the standard physics at low energies in the guise of local effective field theory. Recently, we have introduced a reformulation of string theory which does not rely on an a priori space-time interpretation or a pre-assumption of locality. This metastring theory is formulated in such a way that stringy symmetries (such as T-duality) are realized linearly. In this paper, we study metastring theory on a flat background and develop a variety of technical and interpretational ideas. These include a formulation of the moduli space of Lorentzian worldsheets, a careful study of the symplectic structure and consequently consistent closed and open boundary conditions, and the string spectrum and operator algebra. What emerges from these studies is a new quantum notion of space-time that we refer to as a quantum Lagrangian or equivalently a modular space-time. This concept embodies the standard tenets of quantum theory and implements in a precise way a notion of relative locality. The usual string backgrounds (non-compact space-time along with some toroidally compactified spatial directions) are obtained from modular space-time by a limiting procedure that can be thought of as a correspondence limit.
Spontaneous symmetry breakdown in gauge theories
International Nuclear Information System (INIS)
Scadron, M.D.
1982-01-01
The dynamical theory of spontaneous breakdown correctly predicts the bound states and relates the order parameters of electron-photon superconductivity and quark-gluon chiral symmetry. A similar statement cannot be made for the standard electro-weak gauge symmetry. (author)
Research program in elementary particle theory
International Nuclear Information System (INIS)
Balachandran, A.P.; Rosenzweig, C.; Schechter, J.; Wali, K.C.
1992-01-01
In this paper we give a brief account of the work of the group during the past year. The topics covered here include (1) Effective Lagrangians and Solitons; (2) Chern-Simons and Conformal Field Theories; (3) Spin and Statistics; (4) The Standard Model and Beyond; (5) Non-Abelian Monopoles; (6) The Inflationary Universe; (7) The Hubbard Model, and (8) Miscellaneous
Evolutionary economic theories of sustainable development
Mulder, P.; van den Bergh, J.C.J.M.
2001-01-01
Sustainable development has become the dominant concept in the study of interactions between the economy and the biophysical environment, as well as a generally accepted goal of environmental policy. So far, economists have predominantly applied standard or neo-classical theory to environmental
Rhythmic licensing theory : an extended typology
Kager, R.W.J.
2005-01-01
The standard model of directional stress assignment in Optimality Theory uses two gradient alignment constraints which assess the distance between edges of feet and words. This model predicts a large amount of symmetry in metrical typology, in terms of directionality and in terms of foot type.
Search for a Final Theory of Matter.
Indian Academy of Sciences (India)
Many theorists have put forward proposals for new mathematically consistent theories where this happens. But there is a much more compelling reason why the standard model cannot be the final story. This has to do with gravity. Recall that we have ignored the gravitational force in our discussion of elementary particles.
Covariant perturbation theory and chiral superpropagators
Ecker, G
1972-01-01
The authors use a covariant formulation of perturbation theory for the non-linear chiral invariant pion model to define chiral superpropagators leading to S-matrix elements which are independent of the choice of the pion field coordinates. The relation to the standard definition of chiral superpropagators is discussed. (11 refs).
Gauge theories of the weak interactions
International Nuclear Information System (INIS)
Quinn, H.
1978-08-01
Two lectures are presented on the Weinberg--Salam--Glashow--Iliopoulos--Maiani gauge theory for weak interactions. An attempt is made to give some impressions of the generality of this model, how it was developed, variations found in the literature, and the status of the standard model. 21 references
International Nuclear Information System (INIS)
Eloranta, E.
2003-11-01
The geophysical field theory includes the basic principles of electromagnetism, continuum mechanics, and potential theory upon which the computational modelling of geophysical phenomena is based. Vector analysis is the main mathematical tool in the field analyses. Electrostatics, stationary electric currents, magnetostatics, and electrodynamics form a central part of electromagnetism in geophysical field theory. Potential theory concerns especially gravity, but also electrostatics and magnetostatics. Solid state mechanics and fluid mechanics are central parts of continuum mechanics. The theories of elastic waves and rock mechanics also belong to geophysical solid state mechanics, and the theories of geohydrology and mass transport form one central field theory in geophysical fluid mechanics. Heat transfer is also included in continuum mechanics. (orig.)
Niederreiter, Harald
2015-01-01
This textbook effectively builds a bridge from basic number theory to recent advances in applied number theory. It presents the first unified account of the four major areas of application where number theory plays a fundamental role, namely cryptography, coding theory, quasi-Monte Carlo methods, and pseudorandom number generation, allowing the authors to delineate the manifold links and interrelations between these areas. Number theory, which Carl-Friedrich Gauss famously dubbed the queen of mathematics, has always been considered a very beautiful field of mathematics, producing lovely results and elegant proofs. While only very few real-life applications were known in the past, today number theory can be found in everyday life: in supermarket bar code scanners, in our cars’ GPS systems, in online banking, etc. Starting with a brief introductory course on number theory in Chapter 1, which makes the book more accessible for undergraduates, the authors describe the four main application areas in Chapters...
Resource Theory of Superposition.
Theurer, T; Killoran, N; Egloff, D; Plenio, M B
2017-12-08
The superposition principle lies at the heart of many nonclassical properties of quantum mechanics. Motivated by this, we introduce a rigorous resource theory framework for the quantification of superposition of a finite number of linearly independent states. This theory is a generalization of resource theories of coherence. We determine the general structure of operations which do not create superposition, find a fundamental connection to unambiguous state discrimination, and propose several quantitative superposition measures. Using this theory, we show that trace decreasing operations can be completed for free which, when specialized to the theory of coherence, resolves an outstanding open question and is used to address the free probabilistic transformation between pure states. Finally, we prove that linearly independent superposition is a necessary and sufficient condition for the faithful creation of entanglement in discrete settings, establishing a strong structural connection between our theory of superposition and entanglement theory.
Computational invariant theory
Derksen, Harm
2015-01-01
This book is about the computational aspects of invariant theory. Of central interest is the question how the invariant ring of a given group action can be calculated. Algorithms for this purpose form the main pillars around which the book is built. There are two introductory chapters, one on Gröbner basis methods and one on the basic concepts of invariant theory, which prepare the ground for the algorithms. Then algorithms for computing invariants of finite and reductive groups are discussed. Particular emphasis lies on interrelations between structural properties of invariant rings and computational methods. Finally, the book contains a chapter on applications of invariant theory, covering fields as disparate as graph theory, coding theory, dynamical systems, and computer vision. The book is intended for postgraduate students as well as researchers in geometry, computer algebra, and, of course, invariant theory. The text is enriched with numerous explicit examples which illustrate the theory and should be ...
The Distance Standard Deviation
Edelmann, Dominic; Richards, Donald; Vogel, Daniel
2017-01-01
The distance standard deviation, which arises in distance correlation analysis of multivariate data, is studied as a measure of spread. New representations for the distance standard deviation are obtained in terms of Gini's mean difference and in terms of the moments of spacings of order statistics. Inequalities for the distance variance are derived, proving that the distance standard deviation is bounded above by the classical standard deviation and by Gini's mean difference. Further, it is ...
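The bounds stated in this abstract can be checked numerically. Below is a minimal sketch, assuming the V-statistic (biased, n²-normalized) sample versions of the distance standard deviation and of Gini's mean difference; since these are the population functionals of the empirical distribution, the inequalities dSD ≤ σ and dSD ≤ Gini mean difference should hold exactly for any sample. Function names are illustrative, not from the paper.

```python
import numpy as np

def distance_sd(x):
    """V-statistic sample distance standard deviation of a univariate sample."""
    x = np.asarray(x, dtype=float)
    a = np.abs(x[:, None] - x[None, :])  # pairwise distances a_ij = |x_i - x_j|
    # double centering: A_ij = a_ij - row mean - column mean + grand mean
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    return np.sqrt((A * A).mean())       # distance variance = mean of A_ij^2

def gini_mean_difference(x):
    """V-statistic Gini mean difference: average pairwise absolute difference."""
    x = np.asarray(x, dtype=float)
    return np.abs(x[:, None] - x[None, :]).mean()

rng = np.random.default_rng(0)
x = rng.normal(size=500)

dsd = distance_sd(x)
sd = x.std()                 # classical sd with ddof=0, matching the V-statistics
gmd = gini_mean_difference(x)

# The upper bounds proved in the paper, verified on this sample:
assert dsd <= sd and dsd <= gmd
```

The ddof=0 normalization matters: the inequalities compare population functionals of the same (empirical) distribution, so the unbiased sample variance would not be the matching comparison quantity.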
Higgsless grand unified theory breaking and trinification
International Nuclear Information System (INIS)
Carone, Christopher D.; Conroy, Justin M.
2004-01-01
Boundary conditions on an extra-dimensional interval can be chosen to break bulk gauge symmetries and to reduce the rank of the gauge group. We consider this mechanism in models with gauge trinification. We determine the boundary conditions necessary to break the trinified gauge group directly down to that of the standard model. Working in an effective theory for the gauge-symmetry-breaking parameters on a boundary, we examine the limit in which the grand-unified-theory-breaking sector is Higgsless and show how one may obtain the low-energy particle content of the minimal supersymmetric standard model. We find that gauge unification is preserved in this scenario, and that the differential gauge coupling running is logarithmic above the scale of compactification. We compare the phenomenology of our model to that of four-dimensional 'trinified' theories
Radiological Control Technician: Standardized technician Qualification Standard
International Nuclear Information System (INIS)
1992-10-01
The Qualification Standard states and defines the knowledge and skill requirements necessary for successful completion of the Radiological Control Technician Training Program. The standard is divided into three phases: Phase I concerns RCT academic training. There are 13 lessons associated with the core academics program and 19 lessons associated with the site academics program. The staff member should sign the appropriate blocks upon successful completion of the examination for that lesson or group of lessons. In addition, facility-specific lesson plans may be added to meet the knowledge requirements in the Job Performance Measures (JPM) of the practical program. Phase II concerns RCT core/site practical (JPM) training. There are thirteen generic tasks associated with the core practical program. Both the trainer/evaluator and student should sign the appropriate block upon successful completion of the JPM. In addition, facility-specific tasks may be added or generic tasks deleted based on the results of the facility job evaluation. Phase III concerns the oral examination board; successful completion of the oral examination board is documented by the signature of the chairperson of the board. Upon completion of all of the standardized technician qualification requirements, final qualification is verified by the student and the manager of the Radiological Control Department and acknowledged by signatures on the qualification standard. The completed Qualification Standard shall be maintained as an official training record
Folmer, E.J.A.
2012-01-01
Little scientific literature addresses the issue of quality of semantic standards, albeit a problem with high economic and social impact. Our problem survey, including 34 semantic Standard Setting Organizations (SSOs), gives evidence that quality of standards can be improved, but for improvement a
Automotive Technology Skill Standards
Garrett, Tom; Asay, Don; Evans, Richard; Barbie, Bill; Herdener, John; Teague, Todd; Allen, Scott; Benshoof, James
2009-01-01
The standards in this document are for Automotive Technology programs and are designed to clearly state what the student should know and be able to do upon completion of an advanced high-school automotive program. Minimally, the student will complete a three-year program to achieve all standards. Although these exit-level standards are designed…
[Ophthalmology and standardization].
Heitz, R
1989-01-01
The standards are the references for the quality and safety of materials, instruments and devices in ophthalmological use. The French standardisation association, "Association Française de Normalisation" (AFNOR), drafts its standards in connection with the professionals concerned. Ophthalmologists are concerned by standards for diagnostic and therapeutic instruments, intraocular and orbital implants, contact lenses, spectacle frames and glasses, and ocular protectors.
Non standard analysis, polymer models, quantum fields
International Nuclear Information System (INIS)
Albeverio, S.
1984-01-01
We give an elementary introduction to non-standard analysis and its applications to the theory of stochastic processes. This is based on a joint book with J.E. Fenstad, R. Høegh-Krohn and T. Lindstrøm. In particular we give a discussion of a hyperfinite theory of Dirichlet forms with applications to the study of the Hamiltonian for a quantum mechanical particle in the potential created by a polymer. We also discuss new results on the existence of attractive polymer measures in dimension d = 1, 2 and on the (φ⁴)_d-model of interacting quantum fields. (orig.)
Supersymmetry: Theory, Experiment and Cosmology
Energy Technology Data Exchange (ETDEWEB)
Jones, Tim [Department of Mathematical Sciences, University of Liverpool, Liverpool (United Kingdom)
2008-06-21
This volume presents a comprehensive introduction to supersymmetry, concentrating mainly on the Minimal Supersymmetric Standard Model (MSSM) and its possible embedding in a grand unified theory, but also including material on supergravity, non-perturbative aspects of supersymmetry, string theory and cosmology. There is an excellent self-contained appendix on the standard model which could be read first; other appendices provide introductions to spinor representations of the Lorentz group, superfields, and cosmology, and there is a short appendix listing the MSSM renormalisation group beta-functions. The appendices in fact occupy over a quarter of the volume. Substantial knowledge of quantum field theory is required of the reader; and also a working knowledge of group theory as employed in the construction of particle physics models: while there is some useful material on this in the section on grand unification, an appendix on it might perhaps have been a useful addition. Supersymmetry is introduced via the particle physicist's concern with the hierarchy problem and developed in the component formalism beginning with the Wess-Zumino model and proceeding to supersymmetric gauge theories. The treatment is detailed and authoritative; the author has 25 years of high-level research experience in the area and it shows. The level of presentation is high, and difficult concepts are explained clearly. The examples and associated hints are excellent. One topic I would have liked to see more on is the renormalisation of supersymmetric theories; presentation of the explicit calculation of the anomalous dimension of a chiral superfield (gamma) at one loop for at least the Wess-Zumino model might perhaps have been pedagogically useful. Associated, perhaps, with this omission is an inconsistency in the definition of gamma; the sign of gamma in the treatment in section 8.3.2 clearly differs from its sign in the appendix section E.3. In the text the formalism of supersymmetry
Big-Bang nucleosynthesis and lithium abundance
International Nuclear Information System (INIS)
Singh, Vinay; Lahiri, Joydev; Bhowmick, Debasis; Basu, D.N.
2017-01-01
The predictions of the standard big-bang nucleosynthesis (BBN) theory depend on the astrophysical nuclear reaction rates and on three additional parameters: the number of flavours of light neutrinos, the neutron lifetime and the baryon-to-photon ratio in the universe. The effect of the modification of thirty-five reaction rates on light-element abundance yields in BBN was investigated earlier by us. In the present work we have replaced the neutron lifetime and baryon-to-photon ratio with the most recent values, and have further modified the 3 He( 4 He,γ) 7 Be reaction rate, which is used directly for estimating the formation of 7 Li as a result of β + decay, with the most recent equation. We find that these modifications reduce the calculated abundance of 7 Li by ∼ 12%
Poulin, Vivian; Serpico, Pasquale Dario
2015-03-06
The standard theory of electromagnetic cascades onto a photon background predicts a quasiuniversal shape for the resulting nonthermal photon spectrum. This has been applied to very disparate fields, including nonthermal big bang nucleosynthesis (BBN). However, once the energy of the injected photons falls below the pair-production threshold the spectral shape is much harder, a fact that has been overlooked in past literature. This loophole may have important phenomenological consequences, since it generically alters the BBN bounds on nonthermal relics; for instance, it allows us to reopen the possibility of purely electromagnetic solutions to the so-called "cosmological lithium problem," which were thought to be excluded by other cosmological constraints. We show this with a proof-of-principle example and a simple particle physics model, compared with previous literature.
Revisiting big-bang nucleosynthesis constraints on long-lived decaying particles
Kawasaki, Masahiro; Kohri, Kazunori; Moroi, Takeo; Takaesu, Yoshitaro
2018-01-01
We study the effects of long-lived massive particles, which decayed during the big-bang nucleosynthesis (BBN) epoch, on the primordial abundance of light elements. Compared to previous studies, (i) the reaction rates of standard BBN reactions are updated, (ii) the most recent observational data on the light element abundance and cosmological parameters are used, (iii) the effects of the interconversion of energetic nucleons at the time of inelastic scattering with background nuclei are considered, and (iv) the effects of the hadronic shower induced by energetic high-energy antinucleons are included. We compare the theoretical predictions on the primordial abundance of light elements with the latest observational constraints, and we derive upper bounds on the relic abundance of the decaying particle as a function of its lifetime. We also apply our analysis to an unstable gravitino, the superpartner of a graviton in supersymmetric theories, and obtain constraints on the reheating temperature after inflation.
The cooperative game theory of networks and hierarchies
Gilles, Robert P
2010-01-01
This book details standard concepts in cooperative game theory with applications to the analysis of social networks and hierarchical authority organizations. It covers the multi-linear extension, the Core, the Shapley value, and the cooperative potential.
Mathematical aspects of quantum field theory
de Faria, Edson
2010-01-01
Over the last century quantum field theory has made a significant impact on the formulation and solution of mathematical problems and inspired powerful advances in pure mathematics. However, most accounts are written by physicists, and mathematicians struggle to find clear definitions and statements of the concepts involved. This graduate-level introduction presents the basic ideas and tools from quantum field theory to a mathematical audience. Topics include classical and quantum mechanics, classical field theory, quantization of classical fields, perturbative quantum field theory, renormalization, and the standard model. The material is also accessible to physicists seeking a better understanding of the mathematical background, providing the necessary tools from differential geometry on such topics as connections and gauge fields, vector and spinor bundles, symmetries and group representations.
Hydrodynamics, fields and constants in gravitational theory
International Nuclear Information System (INIS)
Stanyukovich, K.P.; Mel'nikov, V.N.
1983-01-01
Results of original investigations into problems of standard gravitation theory and its generalizations are presented. The main attention is paid to the application of continuum-mechanics methods in gravitation theory; to the specification of the role of gravitation in phenomena of the macro- and microworld; to exact solutions for the case when the medium is matter described by a hydrodynamic energy-momentum tensor; and to exact solutions for the case when the medium is a field. Generalizations of GRT are analyzed, such as the new cosmological hypothesis based on the gravitational vacuum theory. Investigations are performed into the quantization of cosmological models and into effects of spontaneous symmetry violation and particle production in cosmology. A gravity theory with a fundamental Higgs field is suggested, in the framework of which, in atomic units, one can explain possible variations of the effective gravitational constant and, in gravitational units, variations of the masses of all particles
On the entanglement entropy for gauge theories
International Nuclear Information System (INIS)
Ghosh, Sudip; Soni, Ronak M; Trivedi, Sandip P.
2015-01-01
We propose a definition for the entanglement entropy of a gauge theory on a spatial lattice. Our definition applies to any subset of links in the lattice, and is valid for both Abelian and non-Abelian gauge theories. For ℤ_N and U(1) theories without matter, our definition agrees with a particular case of the definition given by Casini, Huerta and Rosabal. We also argue that in general, both for Abelian and non-Abelian theories, our definition agrees with the entanglement entropy calculated using the replica trick. Our definition, however, does not agree with some standard ways to measure entanglement, like the number of Bell pairs which can be produced by entanglement distillation.
Measurement Errors and Uncertainties Theory and Practice
Rabinovich, Semyon G
2006-01-01
Measurement Errors and Uncertainties addresses the most important problems that physicists and engineers encounter when estimating errors and uncertainty. Building from the fundamentals of measurement theory, the author develops the theory of accuracy of measurements and offers a wealth of practical recommendations and examples of applications. This new edition covers a wide range of subjects, including: - Basic concepts of metrology - Measuring instruments characterization, standardization and calibration -Estimation of errors and uncertainty of single and multiple measurements - Modern probability-based methods of estimating measurement uncertainty With this new edition, the author completes the development of the new theory of indirect measurements. This theory provides more accurate and efficient methods for processing indirect measurement data. It eliminates the need to calculate the correlation coefficient - a stumbling block in measurement data processing - and offers for the first time a way to obtain...
Progress in string theory research
2016-01-01
At first look, String Theory seems just an interesting and non-trivial application of quantum mechanics and special relativity to vibrating strings. By itself, the quantization of relativistic strings does not strike the particle physicist as a significant paradigm shift. However, when the string quantization is performed by applying the standard rules of perturbative Quantum Field Theory, one discovers that strings in certain states have the same physical properties as gravity in flat space-time. Chapter one of this book reviews the construction of the thermal bosonic string and D-brane in the framework of Thermo Field Dynamics (TFD). It briefly recalls the well-known light-cone quantization of the bosonic string in the conformal gauge in flat space-time, and gives a bird's eye view of the fundamental concepts of the TFD. Chapter two examines a visual model, inspired by string theory, of a system of interacting anyons. Chapter three investigates the late-ti...
Infrared Constraint on Ultraviolet Theories
Energy Technology Data Exchange (ETDEWEB)
Tsai, Yuhsin [Cornell Univ., Ithaca, NY (United States)
2012-08-01
While our current paradigm of particle physics, the Standard Model (SM), has been extremely successful at explaining experiments, it is theoretically incomplete and must be embedded into a larger framework. In this thesis, we review the main motivations for theories beyond the SM (BSM) and the ways such theories can be constrained using low energy physics. The hierarchy problem, neutrino mass and the existence of dark matter (DM) are the main reasons why the SM is incomplete. Two of the most plausible theories that may solve the hierarchy problem are the Randall-Sundrum (RS) models and supersymmetry (SUSY). RS models usually suffer from strong flavor constraints, while SUSY models produce extra degrees of freedom that need to be hidden from current experiments. To show the importance of infrared (IR) physics constraints, we discuss the flavor bounds on the anarchic RS model in both the lepton and quark sectors. For SUSY models, we discuss the difficulties in obtaining a phenomenologically allowed gaugino mass, its relation to R-symmetry breaking, and how to build a model that avoids this problem. For the neutrino mass problem, we discuss the idea of generating small neutrino masses using compositeness. By requiring successful leptogenesis and the existence of warm dark matter (WDM), we can set various constraints on the hidden composite sector. Finally, to give an example of model-independent bounds from collider experiments, we show how to constrain the DM–SM particle interactions using collider results with an effective coupling description.
A relativistic theory for continuous measurement of quantum fields
International Nuclear Information System (INIS)
Diosi, L.
1990-04-01
A formal theory for the continuous measurement of relativistic quantum fields is proposed. The corresponding scattering equations are derived. The proposed formalism reduces to known equations in the Markovian case. Two recent models for spontaneous quantum state reduction are recovered in the framework of this theory. A possible example of relativistic continuous measurement is outlined in standard Quantum Electrodynamics. The continuous measurement theory possesses an alternative formulation in terms of interacting quantum and stochastic fields. (author) 23 refs
Large $N$ QCD and $q$-Deformed Quantum Field Theories
Aref'eva, I. Ya.
1996-01-01
A construction of the master field describing multicolour QCD is presented. The master fields for large N matrix theories satisfy the standard equations of relativistic field theory, but the fields are quantized according to $q$-deformed commutation relations with $q=0$. These commutation relations are realized in the Boltzmannian Fock space. The master field for gauge theory does not take values in a finite-dimensional Lie algebra; however, there is a non-Abelian gauge symmetry and BRST-invariance.
Topics in graph theory graphs and their Cartesian product
Imrich, Wilfried; Rall, Douglas F
2008-01-01
From specialists in the field, you will learn about interesting connections and recent developments in the field of graph theory by looking in particular at Cartesian products-arguably the most important of the four standard graph products. Many new results in this area appear for the first time in print in this book. Written in an accessible way, this book can be used for personal study in advanced applications of graph theory or for an advanced graph theory course.
Versatility of field theory motivated nuclear effective Lagrangian approach
International Nuclear Information System (INIS)
Arumugam, P.; Sharma, B.K.; Sahu, P.K.; Patra, S.K.; Sil, Tapas; Centelles, M.; Vinas, X.
2004-01-01
We analyze the results for infinite nuclear and neutron matter using the standard relativistic mean field model and its recent effective field theory motivated generalization. For the first time, we show quantitatively that the inclusion in the effective theory of vector meson self-interactions and scalar-vector cross-interactions explains naturally the recent experimental observations of the softness of the nuclear equation of state, without losing the advantages of the standard relativistic model for finite nuclei
Electric charge, early universe and the Superstring Theories
Abbas, Afsar
1999-01-01
Very recently, it has been shown by the author that the Standard Model Higgs cannot be a physical particle. Here, on most general grounds it is established that as per the Standard Model there is no electric charge above the electro-weak phase transition temperature. Hence there was no electric charge present in the early universe. The Superstring Theories are flawed in as much as they are incompatible with this requirement. Hence the Superstring Theories are inconsistent with this basic stru...
Primordial alchemy: from the Big Bang to the present universe
Steigman, Gary
Of the light nuclides observed in the universe today, D, 3He, 4He, and 7Li are relics from its early evolution. The primordial abundances of these relics, produced via Big Bang Nucleosynthesis (BBN) during the first half hour of the evolution of the universe, provide a unique window on Physics and Cosmology at redshifts ~10^10. Comparing the BBN-predicted abundances with those inferred from observational data tests the consistency of the standard cosmological model over ten orders of magnitude in redshift, constrains the baryon and other particle content of the universe, and probes both Physics and Cosmology beyond the current standard models. These lectures are intended to introduce students, both of theory and observation, to those aspects of the evolution of the universe relevant to the production and evolution of the light nuclides from the Big Bang to the present. The current observational data are reviewed and compared with the BBN predictions, and the implications for cosmology (e.g., universal baryon density) and particle physics (e.g., relativistic energy density) are discussed. While this comparison reveals the stunning success of the standard model(s), there are currently some challenges which leave open the door for more theoretical and observational work with potential implications for astronomy, cosmology, and particle physics.
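As a worked illustration of the BBN logic summarized above (the numbers are the standard textbook estimates, not taken from this abstract): since essentially all neutrons available at the onset of nucleosynthesis end up bound in 4He, the primordial helium mass fraction follows from the neutron-to-proton ratio alone,

```latex
Y_p \;\simeq\; \frac{2\,(n/p)}{1 + (n/p)}
\;\approx\; \frac{2 \times \tfrac{1}{7}}{1 + \tfrac{1}{7}} \;=\; 0.25 ,
```

where $n/p \approx 1/7$ is the ratio frozen out by weak interactions and partially depleted by neutron decay before nucleosynthesis begins.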
Variational Wigner-Kirkwood approach to relativistic mean field theory
International Nuclear Information System (INIS)
Del Estal, M.; Centelles, M.; Vinas, X.
1997-01-01
The recently developed variational Wigner-Kirkwood approach is extended to the relativistic mean field theory for finite nuclei. A numerical application to the calculation of the surface energy coefficient in semi-infinite nuclear matter is presented. The new method is contrasted with the standard density functional theory and the fully quantal approach. copyright 1997 The American Physical Society
Organization Theory: Bright Prospects for a Permanently Failing Field
P.P.M.A.R. Heugens (Pursey)
2008-01-01
Organization theory is a paradoxical field of scientific inquiry. It has struggled for more than fifty years to develop a unified theory of organizational effectiveness undergirded by a coherent set of assumptions, and it has thus far failed to produce one. Yet, by other standards it is
Can An Amended Standard Model Account For Cold Dark Matter?
International Nuclear Information System (INIS)
Goldhaber, Maurice
2004-01-01
It is generally believed that one has to invoke theories beyond the Standard Model to account for cold dark matter particles. However, there may be undiscovered universal interactions that, if added to the Standard Model, would lead to new members of the three generations of elementary fermions that might be candidates for cold dark matter particles
Planning Sign Languages: Promoting Hearing Hegemony? Conceptualizing Sign Language Standardization
Eichmann, Hanna
2009-01-01
In light of the absence of a codified standard variety in British Sign Language and German Sign Language ("Deutsche Gebärdensprache"), there have been repeated calls for the standardization of both languages, primarily from outside the Deaf community. The paper is based on a recent grounded theory study which explored perspectives on sign…
The Standard Model with one universal extra dimension
Indian Academy of Sciences (India)
Yang–Mills, Currents, Higgs, and Yukawa sectors is presented. The one-loop renormalizability of the standard Green's functions, which implies that the Standard ..... The quantization of this theory was discussed in [14]. 2.2 The Higgs sector. The Higgs sector is constituted by the kinetic term and the potential: $\mathcal{L}_H = \int_0^{2\pi R} \dots$
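For context, in a model with one universal extra dimension compactified on a circle of radius $R$, the four-dimensional Lagrangian is obtained by integrating the five-dimensional one over the extra coordinate, and every field decomposes into a Kaluza–Klein tower (a standard sketch; the symbols are conventional and not taken from the record):

```latex
\mathcal{L}_{4D}(x) \;=\; \int_0^{2\pi R} dy\; \mathcal{L}_{5D}(x, y),
\qquad m_n^2 \;=\; m_0^2 + \frac{n^2}{R^2}, \quad n = 0, 1, 2, \dots
```

so each Standard Model field of mass $m_0$ acquires a tower of excitations spaced by $1/R$.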
Class field theory from theory to practice
Gras, Georges
2003-01-01
Global class field theory is a major achievement of algebraic number theory, based on the functorial properties of the reciprocity map and the existence theorem. The author works out the consequences and the practical use of these results by giving detailed studies and illustrations of classical subjects (classes, idèles, ray class fields, symbols, reciprocity laws, Hasse's principles, the Grunwald-Wang theorem, Hilbert's towers,...). He also proves some new or less-known results (reflection theorem, structure of the abelian closure of a number field) and lays emphasis on the invariant $\mathcal{T}_p$ of abelian p-ramification, which is related to important Galois cohomology properties and p-adic conjectures. This book, intermediary between the classical literature published in the sixties and the recent computational literature, gives much material in an elementary way, and is suitable for students, researchers, and all who are fascinated by this theory. In the corrected 2nd printing 2005, the author improves s...
Non-perturbative effective interactions in the standard model
Arbuzov, Boris A
2014-01-01
This monograph is devoted to the nonperturbative dynamics in the Standard Model (SM), the basic theory of all fundamental interactions in nature but gravity. The Standard Model is divided into two parts: quantum chromodynamics (QCD) and the electroweak theory (EWT) are well-defined renormalizable theories in which perturbation theory is valid. However, for an adequate description of the real physics, nonperturbative effects are inevitable. This book describes how these nonperturbative effects may be obtained in the framework of spontaneous generation of effective interactions. The well-known example of such an effective interaction is provided by the famous Nambu--Jona-Lasinio effective interaction. A spontaneous generation of this interaction in the framework of QCD is also described and applied to the method for other effective interactions in QCD and EWT. The method is based on N.N. Bogoliubov's conception of compensation equations. As a result we then describe the principal features of the Standard...
DEFF Research Database (Denmark)
Birkedal, Lars; Bizjak, Aleš; Clouston, Ranald
2016-01-01
This paper improves the treatment of equality in guarded dependent type theory (GDTT), by combining it with cubical type theory (CTT). GDTT is an extensional type theory with guarded recursive types, which are useful for building models of program logics, and for programming and reasoning with coinductive types. We wish to implement GDTT with decidable type checking, while still supporting non-trivial equality proofs that reason about the extensions of guarded recursive constructions. CTT is a variation of Martin-Löf type theory in which the identity type is replaced by abstract paths between terms. CTT provides a computational interpretation of functional extensionality, enjoys canonicity for the natural numbers type, and is conjectured to support decidable type-checking. Our new type theory, guarded cubical type theory (GCTT), provides a computational interpretation of extensionality...
Algebraic quantum field theory
International Nuclear Information System (INIS)
Foroutan, A.
1996-12-01
The basic assumption that the complete information relevant for a relativistic, local quantum theory is contained in the net structure of the local observables of this theory results first of all in a concise formulation of the algebraic structure of the superselection theory and an intrinsic formulation of charge composition, charge conjugation and the statistics of an algebraic quantum field theory. In a next step, the locality of massive particles together with their spectral properties are used for the formulation of a selection criterion which opens the access to the massive, non-abelian quantum gauge theories. The role of the electric charge as a superselection rule results in the introduction of charge classes which in turn lead to a set of quantum states with optimum localization properties. Finally, the asymptotic observables of quantum electrodynamics are investigated within the framework of algebraic quantum field theory. (author)
Karlin, Anna R
2016-01-01
This book presents a rigorous introduction to the mathematics of game theory without losing sight of the joy of the subject. This is done by focusing on theoretical highlights (e.g., at least six Nobel Prize winning results are developed from scratch) and by presenting exciting connections of game theory to other fields, such as computer science, economics, social choice, biology, and learning theory. Both classical topics, such as zero-sum games, and modern topics, such as sponsored search auctions, are covered. Along the way, beautiful mathematical tools used in game theory are introduced, including convexity, fixed-point theorems, and probabilistic arguments. The book is appropriate for a first course in game theory at either the undergraduate or graduate level, whether in mathematics, economics, computer science, or statistics. Game theory's influence is felt in a wide range of disciplines, and the authors deliver masterfully on the challenge of presenting both the breadth and coherence of its underlying ...
DEFF Research Database (Denmark)
Zander, Pär Ola
2014-01-01
Jean Baudrillard outlined a theory of value in his early writings that built on, but also criticized, Marxist concepts of use value and exchange value. In this paper, I use a close reading to delineate the diachronic transition of Baudrillard's writings toward anti-Marxism and (allegedly) postmodernism, with specific focus on his value theory, in order to understand his own reasons for abandoning his previous position. I then follow the marginal stream of scholars who are making use of the early Baudrillard. I find his value theory promising but still a mere sketch rather than an actual general theory. The paper concludes that Baudrillard's arguments for abandoning Marxism altogether are problematic and led him away from developing a more finished theory of value. This is unfortunate because it remains a project that may yield interesting insights even in contemporary social theory, not least...
DEFF Research Database (Denmark)
Balle, Søren Hattesen
This paper takes its starting point in a short poem by Wallace Stevens from 1917, which incidentally bears the title "Theory". The poem can be read as a parable of theory, i.e., as something literally 'thrown beside' theory (cf. OED: "..."). In the philosophical tradition this is also how the style of theory has been figured, that is to say: as something that is incidental to it or just happens to be around as so much paraphernalia. In my reading of Stevens' poem I shall argue that this is exactly the position from which Stevens takes off when he assumes the task of writing a personified portrait of theory. Theory emerges as always beside(s) itself in what constitutes its style, but the poem also suggests that theory's style is what gives theory both its power and its contingency. Figured as a duchess, Theoria is only capable of retaining her power...
Petrov, Alexey A
2016-01-01
This book is a broad-based text intended to help the growing student body interested in topics such as gravitational effective theories, supersymmetric effective theories, applications of effective theory techniques to problems in condensed matter physics (superconductivity) and quantum chromodynamics (such as soft-collinear effective theory). It begins with a review of the use of symmetries to identify the relevant degrees of freedom in a problem, and then presents a variety of methods that can be used to solve physical problems. A detailed discussion of canonical examples of effective field theories with increasing complexity is then conducted. Special cases such as supersymmetry and lattice EFT are discussed, as well as recently-found applications to problems in gravitation and cosmology. An appendix includes various factoids from group theory and other topics that are used throughout the text, in an attempt to make the book self-contained.
Requirements of quality standards
International Nuclear Information System (INIS)
Mueller, J.
1977-01-01
The lecture traces the development of nuclear standards, codes, and Federal regulations on quality assurance (QA) for nuclear power plants and associated facilities. The technical evolution of the last twelve years, especially in the area of nuclear technology, led to different activities and regulatory initiatives, with the present result that several nations have their own homemade standards. The lecture discusses the former and especially the current activities in standard development, and gives a description of the requirements of QA standards used in the USA and Europe, especially Western Germany. Furthermore, the lecture attempts to give a comparison and an evaluation of the international quality standards from the author's viewpoint. Finally, the lecture presents an outlook on the future international implications of QA standards. There is an urgent need within the nuclear industry for simplification and standardization of QA standards. The relationship between the various standards, and the applicability of the standards, need clarification and better transparency. To point out these problems is the purpose of the lecture. (orig.) [de
Hypergraph theory an introduction
Bretto, Alain
2013-01-01
This authored monograph presents hypergraph theory and covers both traditional elements of the theory as well as more original concepts such as the entropy of a hypergraph, similarities, and kernels. Moreover, the author gives a detailed account of applications of the theory, including, but not limited to, applications for telecommunications and the modeling of parallel data structures. The target audience primarily comprises researchers and practitioners in applied sciences, but the book may also be beneficial for graduate students.