WorldWideScience

Sample records for models standard assessment

  1. STAMINA - Model description. Standard Model Instrumentation for Noise Assessments

    NARCIS (Netherlands)

    Schreurs EM; Jabben J; Verheijen ENG; CMM; mev

    2010-01-01

    This report describes the STAMINA model, which stands for Standard Model Instrumentation for Noise Assessments and was developed by RIVM. The institute uses this standard model to map environmental noise in the Netherlands. The model is based on the Standaard Karteringsmethode (Standard Mapping Method)…

  2. Higher Education Quality Assessment Model: Towards Achieving Educational Quality Standard

    Science.gov (United States)

    Noaman, Amin Y.; Ragab, Abdul Hamid M.; Madbouly, Ayman I.; Khedra, Ahmed M.; Fayoumi, Ayman G.

    2017-01-01

    This paper presents a developed higher education quality assessment model (HEQAM) that can be applied for enhancement of university services. This is because there is no universal unified quality standard model that can be used to assess the quality criteria of higher education institutes. The analytical hierarchy process is used to identify the…

  3. A Standardized Generalized Dimensionality Discrepancy Measure and a Standardized Model-Based Covariance for Dimensionality Assessment for Multidimensional Models

    Science.gov (United States)

    Levy, Roy; Xu, Yuning; Yel, Nedim; Svetina, Dubravka

    2015-01-01

    The standardized generalized dimensionality discrepancy measure and the standardized model-based covariance are introduced as tools to critique dimensionality assumptions in multidimensional item response models. These tools are grounded in a covariance theory perspective and associated connections between dimensionality and local independence.…

  4. The Relationship between Students' Performance on Conventional Standardized Mathematics Assessments and Complex Mathematical Modeling Problems

    Science.gov (United States)

    Kartal, Ozgul; Dunya, Beyza Aksu; Diefes-Dux, Heidi A.; Zawojewski, Judith S.

    2016-01-01

    Critical to many science, technology, engineering, and mathematics (STEM) career paths is mathematical modeling--specifically, the creation and adaptation of mathematical models to solve problems in complex settings. Conventional standardized measures of mathematics achievement are not structured to directly assess this type of mathematical…

  5. Regional drought assessment using a distributed hydrological model coupled with Standardized Runoff Index

    Directory of Open Access Journals (Sweden)

    H. Shen

    2015-05-01

    Drought assessment is essential for coping with the frequent droughts of recent years. Owing to the large spatio-temporal variations in hydrometeorology in most regions of China, a physically based hydrological model is needed to produce realistic spatial and temporal distributions of hydro-meteorological variables for drought assessment. In this study, the large-scale distributed hydrological model Variable Infiltration Capacity (VIC) was coupled with a modified Standardized Runoff Index (SRI) for drought assessment in the Weihe River basin, northwest China. The results indicate that the coupled model reasonably reproduces the spatial distribution of drought occurrence, reflects the spatial heterogeneity of regional drought, and improves the physical basis of the SRI. The model also has potential for drought forecasting, early warning and mitigation, provided that accurate meteorological forcing data are available.
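
    The Standardized Runoff Index used here parallels the SPI: runoff accumulated over a chosen time scale is fitted to a probability distribution and mapped onto the standard normal scale. Below is a minimal sketch of that transformation, assuming a monthly runoff series and a gamma fit; the modified SRI of the study may differ in the distribution used and in how zero-runoff months are handled.

```python
# Minimal Standardized Runoff Index (SRI) sketch: fit a gamma distribution to
# accumulated runoff and map its CDF onto standard-normal quantiles.
# Assumes strictly positive runoff; the paper's modified SRI may handle the
# distribution choice and zero-flow months differently.
import numpy as np
from scipy import stats

def sri(runoff, window=3):
    """SRI for a monthly runoff series, accumulated over `window` months."""
    r = np.convolve(runoff, np.ones(window), mode="valid")   # running accumulation
    shape, loc, scale = stats.gamma.fit(r, floc=0)           # gamma fit, location fixed at 0
    cdf = stats.gamma.cdf(r, shape, loc=loc, scale=scale)    # non-exceedance probability
    return stats.norm.ppf(cdf)                               # standard-normal transform

# Example with synthetic monthly runoff (mm); negative SRI indicates drought.
rng = np.random.default_rng(0)
print(sri(rng.gamma(2.0, 15.0, size=120))[:5])
```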

  6. Perceptual video quality assessment in H.264 video coding standard using objective modeling.

    Science.gov (United States)

    Karthikeyan, Ramasamy; Sainarayanan, Gopalakrishnan; Deepa, Subramaniam Nachimuthu

    2014-01-01

    Since the use of digital video is widespread nowadays, quality considerations have become essential, and industry demand for video quality measurement is rising. This work provides a method of perceptual quality assessment for the H.264 standard encoder using objective modeling. Quality impairments are calculated and a model is developed to compute a perceptual video quality metric based on a no-reference method. Because of the subtle differences between the original video and the encoded video, the quality of the encoded picture is degraded; this quality difference is introduced by encoding processes such as intra and inter prediction. The proposed model takes into account the artifacts introduced by these spatial and temporal activities in hybrid block-based coding methods, and an objective mapping of these artifacts onto a subjective quality estimate is proposed. The proposed model calculates the objective quality metric using the subjective impairments blockiness, blur and jerkiness, in contrast to the bitrate-only calculation defined in the ITU-T G.1070 model. The accuracy of the proposed perceptual video quality metric is compared against popular full-reference objective methods as defined by VQEG.
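
    As a rough illustration of the pooling step described above, the sketch below combines blockiness, blur and jerkiness measurements into a single MOS-like score with a linear penalty; the feature extractors and the weights are illustrative placeholders, not the coefficients of the proposed metric or of ITU-T G.1070.

```python
# Hypothetical pooling of no-reference impairment features (blockiness, blur,
# jerkiness) into a single MOS-like score in [1, 5]. Weights are placeholders.
from dataclasses import dataclass

@dataclass
class Impairments:
    blockiness: float  # 0 (none) .. 1 (severe), e.g. from block-edge gradients
    blur: float        # 0 .. 1, e.g. from edge-width statistics
    jerkiness: float   # 0 .. 1, e.g. from frame-difference irregularity

def quality_score(imp: Impairments) -> float:
    penalty = 1.8 * imp.blockiness + 1.5 * imp.blur + 1.2 * imp.jerkiness
    return max(1.0, 5.0 - penalty)   # clamp to the bottom of the MOS scale

print(quality_score(Impairments(blockiness=0.2, blur=0.1, jerkiness=0.05)))
```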

  7. Tests of Alignment among Assessment, Standards, and Instruction Using Generalized Linear Model Regression

    Science.gov (United States)

    Fulmer, Gavin W.; Polikoff, Morgan S.

    2014-01-01

    An essential component in school accountability efforts is for assessments to be well-aligned with the standards or curriculum they are intended to measure. However, relatively little prior research has explored methods to determine statistical significance of alignment or misalignment. This study explores analyses of alignment as a special case…

  8. Constructing Assessment Model of Primary and Secondary Educational Quality with Talent Quality as the Core Standard

    Science.gov (United States)

    Chen, Benyou

    2014-01-01

    Quality is the core of education and is central to the standardization of primary and secondary education in urban (U) and rural (R) areas. The ultimate goal of integrating urban and rural education is to pursue quality education in both settings. Based on an analysis of the relevant policy basis and the existing assessment models…

  9. Beyond the standard model

    International Nuclear Information System (INIS)

    Wilczek, F.

    1993-01-01

    The standard model of particle physics is highly successful, although it is obviously not a complete or final theory. In this presentation the author argues that the structure of the standard model gives some quite concrete, compelling hints regarding what lies beyond. Essentially, this presentation is a record of the author's own judgement of what the central clues for physics beyond the standard model are, and also it is an attempt at some pedagogy. 14 refs., 6 figs

  10. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  11. Scan-To Output Validation: Towards a Standardized Geometric Quality Assessment of Building Information Models Based on Point Clouds

    Science.gov (United States)

    Bonduel, M.; Bassier, M.; Vergauwen, M.; Pauwels, P.; Klein, R.

    2017-11-01

    The use of Building Information Modeling (BIM) for existing buildings based on point clouds is increasing. Standardized geometric quality assessment of the BIMs is needed to make them more reliable and thus reusable for future users. First, available literature on the subject is studied. Next, an initial proposal for a standardized geometric quality assessment is presented. Finally, this method is tested and evaluated with a case study. The number of specifications on BIM relating to existing buildings is limited. The Levels of Accuracy (LOA) specification of the USIBD provides definitions and suggestions regarding geometric model accuracy, but lacks a standardized assessment method. A deviation analysis is found to be dependent on (1) the used mathematical model, (2) the density of the point clouds and (3) the order of comparison. Results of the analysis can be graphical and numerical. An analysis on macro (building) and micro (BIM object) scale is necessary. On macro scale, the complete model is compared to the original point cloud and vice versa to get an overview of the general model quality. The graphical results show occluded zones and non-modeled objects respectively. Colored point clouds are derived from this analysis and integrated in the BIM. On micro scale, the relevant surface parts are extracted per BIM object and compared to the complete point cloud. Occluded zones are extracted based on a maximum deviation. What remains is classified according to the LOA specification. The numerical results are integrated in the BIM with the use of object parameters.
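
    A minimal sketch of the deviation-analysis step on macro or micro scale, assuming the captured scan and points sampled from the BIM surfaces are both available as N x 3 arrays; the 50 mm occlusion cutoff and the accuracy bands are illustrative values, not the USIBD LOA thresholds.

```python
# Deviation analysis sketch: nearest-neighbour distances from scan points to
# points sampled on the BIM surfaces. Distances beyond `max_dev` are treated as
# occluded or non-modelled zones; the rest are binned into accuracy bands.
# The cutoff and band edges are illustrative, not the USIBD LOA values.
import numpy as np
from scipy.spatial import cKDTree

def classify_deviations(scan_pts, model_pts, max_dev=0.05, bands=(0.005, 0.015, 0.05)):
    dist, _ = cKDTree(model_pts).query(scan_pts)     # metres
    occluded = dist > max_dev
    return dist, occluded, np.digitize(dist, bands)  # band 0 = tightest

rng = np.random.default_rng(1)
model_pts = rng.uniform(0, 10, size=(5000, 3))
scan_pts = model_pts[:2000] + rng.normal(0, 0.004, size=(2000, 3))
dist, occluded, bands = classify_deviations(scan_pts, model_pts)
print(f"median deviation {np.median(dist)*1000:.1f} mm, occluded points: {occluded.sum()}")
```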

  12. PhD study of reliability and validity: One step closer to a standardized music therapy assessment model

    DEFF Research Database (Denmark)

    Jacobsen, Stine Lindahl

    The paper presents a PhD study concerning the reliability and validity of the music therapy assessment model “Assessment of Parenting Competences” (APC) in the area of families with emotionally neglected children. The study had a multiple-strategy design with a philosophical base of critical realism and pragmatism. The fixed design was a between- and within-groups design testing the APC's reliability and validity; the two groups were parents with neglected children and parents with non-neglected children. The flexible design had a multiple case study strategy specifically…

  13. The Standard Model course

    CERN Multimedia

    CERN. Geneva HR-RFA

    2006-01-01

    Suggested Readings: Aspects of Quantum Chromodynamics / A. Pich, arXiv:hep-ph/0001118; The Standard Model of Electroweak Interactions / A. Pich, arXiv:hep-ph/0502010; The Standard Model of Particle Physics / A. Pich. The Standard Model of elementary particle physics will be described. A detailed discussion of the particle content, structure and symmetries of the theory will be given, together with an overview of the most important experimental facts which have established this theoretical framework as the Standard Theory of particle interactions.

  14. Beyond the standard model

    International Nuclear Information System (INIS)

    Pleitez, V.

    1994-01-01

    The search for physical laws beyond the standard model is discussed in a general way, along with some topics in supersymmetric theories. Recent possibilities arising in the leptonic sector are addressed. Finally, models with SU(3)_c × SU(2)_L × U(1)_Y symmetry are considered as alternative extensions of the standard model of elementary particles. 36 refs., 1 fig., 4 tabs

  15. The Standard Model

    International Nuclear Information System (INIS)

    Sutton, Christine

    1994-01-01

    The initial evidence from Fermilab for the long awaited sixth ('top') quark puts another rivet in the already firm structure of today's Standard Model of physics. Analysis of the Fermilab CDF data gives a top mass of 174 GeV with an error of ten per cent either way. This falls within the mass band predicted by the sum total of world Standard Model data and underlines our understanding of physics in terms of six quarks and six leptons. In this specially commissioned overview, physics writer Christine Sutton explains the Standard Model

  16. Beyond the standard model

    International Nuclear Information System (INIS)

    Gaillard, M.K.

    1990-04-01

    The unresolved issues of the standard model are reviewed, with emphasis on the gauge hierarchy problem. A possible mechanism for generating a hierarchy in the context of superstring theory is described. 24 refs

  17. Testing the standard model

    International Nuclear Information System (INIS)

    Gordon, H.; Marciano, W.; Williams, H.H.

    1982-01-01

    We summarize here the results of the standard model group, which has studied the ways in which different facilities may be used to test in detail what we now call the standard model, that is SU(3)_c × SU(2) × U(1). The topics considered are: W±, Z⁰ mass and width; sin²θ_W and neutral current couplings; W⁺W⁻, Wγ; Higgs; QCD; toponium and naked quarks; glueballs; mixing angles; and heavy ions

  18. Beyond the standard model

    International Nuclear Information System (INIS)

    Cuypers, F.

    1997-05-01

    These lecture notes are intended as a pedagogical introduction to several popular extensions of the standard model of strong and electroweak interactions. The topics include the Higgs sector, the left-right symmetric model, grand unification and supersymmetry. Phenomenological consequences and search procedures are emphasized. (author) figs., tabs., 18 refs

  19. Beyond the standard model

    International Nuclear Information System (INIS)

    Altarelli, G.

    1987-01-01

    The standard model of particle interactions is a complete and relatively simple theoretical framework which describes all the observed fundamental forces. It consists of quantum chromodynamics (QCD) and of the electroweak theory of Glashow, Salam and Weinberg. The former is the theory of colored quarks and gluons, which underlies the observed phenomena of strong interactions; the latter leads to a unified description of electromagnetism and of weak interactions. The inclusion of the classical Einstein theory of gravity completes the set of established basic knowledge. The standard model is in agreement with essentially all of the experimental information, which is by now very rich. The recent discovery of the charged and neutral intermediate vector bosons of weak interactions at the expected masses has closed a really important chapter of particle physics. Never before was the prediction of new particles so neat and quantitatively precise. Yet the experimental proof of the standard model is not complete. For example, the hints of experimental evidence for the top quark at a mass ∼ 40 GeV have not yet been firmly established. The Higgs sector of the theory has not been tested at all. Beyond the realm of pure QED, even remaining within the electroweak sector, the level of quantitative precision in testing the standard model does not exceed 5% or so. Furthermore, the standard model does not look like the ultimate theory. On closer inspection a large class of fundamental questions emerges, and one finds that a host of crucial problems are left open by the standard model

  20. Beyond the Standard Model

    International Nuclear Information System (INIS)

    Peskin, M.E.

    1997-05-01

    These lectures constitute a short course in "Beyond the Standard Model" for students of experimental particle physics. The author discusses the general ideas which guide the construction of models of physics beyond the Standard Model. The central principle, the one which most directly motivates the search for new physics, is the search for the mechanism of the spontaneous symmetry breaking observed in the theory of weak interactions. To illustrate models of weak-interaction symmetry breaking, the author gives a detailed discussion of the idea of supersymmetry and that of new strong interactions at the TeV energy scale. He discusses experiments that will probe the details of these models at future pp and e⁺e⁻ colliders

  1. Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Peskin, M.E.

    1997-05-01

    These lectures constitute a short course in "Beyond the Standard Model" for students of experimental particle physics. The author discusses the general ideas which guide the construction of models of physics beyond the Standard Model. The central principle, the one which most directly motivates the search for new physics, is the search for the mechanism of the spontaneous symmetry breaking observed in the theory of weak interactions. To illustrate models of weak-interaction symmetry breaking, the author gives a detailed discussion of the idea of supersymmetry and that of new strong interactions at the TeV energy scale. He discusses experiments that will probe the details of these models at future pp and e⁺e⁻ colliders.

  2. Conference: STANDARD MODEL @ LHC

    CERN Multimedia

    2012-01-01

    STANDARD MODEL @ LHC, Niels Bohr International Academy and Discovery Center, 10-13 April 2012. Venue: HCØ Institute, Universitetsparken 5, DK-2100 Copenhagen Ø, Denmark, Auditorium 2. This four-day meeting will bring together both experimental and theoretical aspects of Standard Model phenomenology at the LHC. The very latest results from the LHC experiments will be under discussion. Topics covered will be split into the following categories: QCD (Hard, Soft & PDFs); Vector Boson production; Higgs searches; Top Quark Physics; Flavour physics.

  3. The Standard Model

    Science.gov (United States)

    Burgess, Cliff; Moore, Guy

    2012-04-01

    List of illustrations; List of tables; Preface; Acknowledgments; Part I. Theoretical Framework: 1. Field theory review; 2. The standard model: general features; 3. Cross sections and lifetimes; Part II. Applications: Leptons: 4. Elementary boson decays; 5. Leptonic weak interactions: decays; 6. Leptonic weak interactions: collisions; 7. Effective Lagrangians; Part III. Applications: Hadrons: 8. Hadrons and QCD; 9. Hadronic interactions; Part IV. Beyond the Standard Model: 10. Neutrino masses; 11. Open questions, proposed solutions; Appendix A. Experimental values for the parameters; Appendix B. Symmetries and group theory review; Appendix C. Lorentz group and the Dirac algebra; Appendix D. ξ-gauge Feynman rules; Appendix E. Metric convention conversion table; Select bibliography; Index.

  4. Standard deviation analysis of the mastoid fossa temperature differential reading: a potential model for objective chiropractic assessment.

    Science.gov (United States)

    Hart, John

    2011-03-01

    This study describes a model for statistically analyzing follow-up numeric-based chiropractic spinal assessments for an individual patient based on his or her own baseline. Ten mastoid fossa temperature differential readings (MFTD) obtained from a chiropractic patient were used in the study. The first eight readings served as baseline and were compared to post-adjustment readings. One of the two post-adjustment MFTD readings fell outside two standard deviations of the baseline mean and therefore theoretically represents improvement according to pattern analysis theory. This study showed how standard deviation analysis may be used to identify future outliers for an individual patient based on his or her own baseline data. Copyright © 2011 National University of Health Sciences. Published by Elsevier Inc. All rights reserved.
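
    The arithmetic behind the pattern-analysis decision is simple enough to sketch: the first eight readings define the patient's own baseline mean and standard deviation, and any post-adjustment reading falling outside two standard deviations of that mean is flagged. The temperature values below are invented for illustration and are not the study data.

```python
# Flag post-adjustment readings lying outside +/- 2 SD of the patient's own
# baseline, as in the pattern-analysis approach described in the abstract.
# The MFTD values are invented for illustration only.
import statistics

baseline = [0.4, 0.5, 0.3, 0.6, 0.5, 0.4, 0.5, 0.4]   # first eight readings (deg C)
post_adjustment = [0.5, 1.2]                          # two follow-up readings

mean = statistics.mean(baseline)
sd = statistics.stdev(baseline)                       # sample standard deviation
for reading in post_adjustment:
    outside = abs(reading - mean) > 2 * sd
    print(f"{reading:.2f}: {'outside' if outside else 'within'} 2 SD of baseline")
```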

  5. Beyond the Standard Model

    International Nuclear Information System (INIS)

    Ross, G.G.

    1995-01-01

    The attempts to develop models beyond the Standard Model are briefly reviewed paying particular regard to the mechanisms responsible for symmetry breaking and mass generation. A comparison is made of the theoretical expectations with recent precision measurements for theories with composite Higgs and for supersymmetric theories with elementary Higgs boson(s). The implications of a heavy top quark and the origin of the light quark and lepton masses and mixing angles are considered within these frameworks. ((orig.))

  6. Standard Model festival

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-10-15

    The 'Standard Model' of modern particle physics, with the quantum chromodynamics (QCD) theory of inter-quark forces superimposed on the unified electroweak picture, is still unchallenged, but it is not the end of physics. This was the message at the big International Symposium on Lepton and Photon Interactions at High Energies, held in Hamburg from 27-31 July.

  7. Standard Model festival

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    The 'Standard Model' of modern particle physics, with the quantum chromodynamics (QCD) theory of inter-quark forces superimposed on the unified electroweak picture, is still unchallenged, but it is not the end of physics. This was the message at the big International Symposium on Lepton and Photon Interactions at High Energies, held in Hamburg from 27-31 July

  8. Beyond the Standard Model

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    The necessity for new physics beyond the Standard Model will be motivated. Theoretical problems will be exposed and possible solutions will be described. The goal is to present the exciting new physics ideas that will be tested in the near future. Supersymmetry, grand unification, extra dimensions and string theory will be presented.

  9. Beyond the Standard Model

    International Nuclear Information System (INIS)

    Lykken, Joseph D.

    2010-01-01

    'BSM physics' is a phrase used in several ways. It can refer to physical phenomena established experimentally but not accommodated by the Standard Model, in particular dark matter and neutrino oscillations (technically also anything that has to do with gravity, since gravity is not part of the Standard Model). 'Beyond the Standard Model' can also refer to possible deeper explanations of phenomena that are accommodated by the Standard Model but only with ad hoc parameterizations, such as Yukawa couplings and the strong CP angle. More generally, BSM can be taken to refer to any possible extension of the Standard Model, whether or not the extension solves any particular set of puzzles left unresolved in the SM. In this general sense one sees reference to the BSM 'theory space' of all possible SM extensions, this being a parameter space of coupling constants for new interactions, new charges or other quantum numbers, and parameters describing possible new degrees of freedom or new symmetries. Despite decades of model-building it seems unlikely that we have mapped out most of, or even the most interesting parts of, this theory space. Indeed we do not even know what is the dimensionality of this parameter space, or what fraction of it is already ruled out by experiment. Since Nature is only implementing at most one point in this BSM theory space (at least in our neighborhood of space and time), it might seem an impossible task to map back from a finite number of experimental discoveries and measurements to a unique BSM explanation. Fortunately for theorists, the inevitable limitations of experiments themselves, in terms of resolutions, rates, and energy scales, mean that in practice there are only a finite number of BSM model 'equivalence classes' competing at any given time to explain any given set of results. BSM phenomenology is a two-way street: not only do experimental results test or constrain BSM models, they also suggest - to those who get close enough to listen

  10. Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Lykken, Joseph D.; /Fermilab

    2010-05-01

    'BSM physics' is a phrase used in several ways. It can refer to physical phenomena established experimentally but not accommodated by the Standard Model, in particular dark matter and neutrino oscillations (technically also anything that has to do with gravity, since gravity is not part of the Standard Model). 'Beyond the Standard Model' can also refer to possible deeper explanations of phenomena that are accommodated by the Standard Model but only with ad hoc parameterizations, such as Yukawa couplings and the strong CP angle. More generally, BSM can be taken to refer to any possible extension of the Standard Model, whether or not the extension solves any particular set of puzzles left unresolved in the SM. In this general sense one sees reference to the BSM 'theory space' of all possible SM extensions, this being a parameter space of coupling constants for new interactions, new charges or other quantum numbers, and parameters describing possible new degrees of freedom or new symmetries. Despite decades of model-building it seems unlikely that we have mapped out most of, or even the most interesting parts of, this theory space. Indeed we do not even know what is the dimensionality of this parameter space, or what fraction of it is already ruled out by experiment. Since Nature is only implementing at most one point in this BSM theory space (at least in our neighborhood of space and time), it might seem an impossible task to map back from a finite number of experimental discoveries and measurements to a unique BSM explanation. Fortunately for theorists, the inevitable limitations of experiments themselves, in terms of resolutions, rates, and energy scales, mean that in practice there are only a finite number of BSM model 'equivalence classes' competing at any given time to explain any given set of results. BSM phenomenology is a two-way street: not only do experimental results test or constrain BSM models, they also suggest

  11. Testing the Standard Model

    CERN Document Server

    Riles, K

    1998-01-01

    The Large Electron-Positron (LEP) collider near Geneva, more than any other instrument, has rigorously tested the predictions of the Standard Model of elementary particles. LEP measurements have probed the theory from many different directions and, so far, the Standard Model has prevailed. The rigour of these tests has allowed LEP physicists to determine unequivocally the number of fundamental 'generations' of elementary particles. These tests also allowed physicists to ascertain the mass of the top quark in advance of its discovery. Recent increases in the accelerator's energy allow new measurements to be undertaken, measurements that may uncover directly or indirectly the long-sought Higgs particle, believed to impart mass to all other particles.

  12. Standard Model physics

    CERN Multimedia

    Altarelli, Guido

    1999-01-01

    Introduction: structure of gauge theories. The QED and QCD examples. Chiral theories. The electroweak theory. Spontaneous symmetry breaking. The Higgs mechanism. Gauge boson and fermion masses. Yukawa couplings. Charged current couplings. The Cabibbo-Kobayashi-Maskawa matrix and CP violation. Neutral current couplings. The Glashow-Iliopoulos-Maiani mechanism. Gauge boson and Higgs couplings. Radiative corrections and loops. Cancellation of the chiral anomaly. Limits on the Higgs mass and comparison with experiment. Problems of the Standard Model. Outlook.

  13. Standard model and beyond

    International Nuclear Information System (INIS)

    Quigg, C.

    1984-09-01

    The SU(3)_c ⊗ SU(2)_L ⊗ U(1)_Y gauge theory of interactions among quarks and leptons is briefly described, and some recent notable successes of the theory are mentioned. Some shortcomings in our ability to apply the theory are noted, and the incompleteness of the standard model is exhibited. Experimental hints that Nature may be richer in structure than the minimal theory are discussed. 23 references

  14. Impact assessment of commodity standards

    NARCIS (Netherlands)

    Ruben, Ruerd

    2017-01-01

    Voluntary commodity standards are widely used to enhance the performance of tropical agro-food chains and to support the welfare and sustainability of smallholder farmers. Different methods and approaches are used to assess the effectiveness and impact of these certification schemes at

  15. Quasi standard model physics

    International Nuclear Information System (INIS)

    Peccei, R.D.

    1986-01-01

    Possible small extensions of the standard model are considered, motivated by the strong CP problem and by the baryon asymmetry of the Universe. Phenomenological arguments are given which suggest that imposing a PQ symmetry to solve the strong CP problem is only tenable if the scale of the PQ breakdown is well above M_W. Furthermore, an attempt is made to connect the scale of the PQ breakdown to that of the breakdown of lepton number. It is argued that in these theories the same intermediate scale may be responsible for the baryon number of the Universe, provided the Kuzmin-Rubakov-Shaposhnikov (B+L) erasing mechanism is operative. (orig.)

  16. Standard-model bundles

    CERN Document Server

    Donagi, Ron; Pantev, Tony; Waldram, Dan; Donagi, Ron; Ovrut, Burt; Pantev, Tony; Waldram, Dan

    2002-01-01

    We describe a family of genus one fibered Calabi-Yau threefolds with fundamental group ${\mathbb Z}/2$. On each Calabi-Yau $Z$ in the family we exhibit a positive dimensional family of Mumford stable bundles whose symmetry group is the Standard Model group $SU(3)\times SU(2)\times U(1)$ and which have $c_{3} = 6$. We also show that for each bundle $V$ in our family, $c_{2}(Z) - c_{2}(V)$ is the class of an effective curve on $Z$. These conditions ensure that $Z$ and $V$ can be used for a phenomenologically relevant compactification of Heterotic M-theory.

  17. The standard model

    International Nuclear Information System (INIS)

    Marciano, W.J.

    1994-03-01

    In these lectures, my aim is to provide a survey of the standard model with emphasis on its renormalizability and electroweak radiative corrections. Since this is a school, I will try to be somewhat pedagogical by providing examples of loop calculations. In that way, I hope to illustrate some of the commonly employed tools of particle physics. With those goals in mind, I have organized my presentations as follows: In Section 2, renormalization is discussed from an applied perspective. The technique of dimensional regularization is described and used to define running couplings and masses. The utility of the renormalization group for computing leading logs is illustrated for the muon anomalous magnetic moment. In Section 3 electroweak radiative corrections are discussed. Standard model predictions are surveyed and used to constrain the top quark mass. The S, T, and U parameters are introduced and employed to probe for "new physics". The effect of Z' bosons on low energy phenomenology is described. In Section 4, a detailed illustration of electroweak radiative corrections is given for atomic parity violation. Finally, in Section 5, I conclude with an outlook for the future

  18. Beyond the standard model

    International Nuclear Information System (INIS)

    Domokos, G.; Elliott, B.; Kovesi-Domokos, S.; Mrenna, S.

    1992-01-01

    In this paper the authors briefly review the necessity of going beyond the Standard Model. We argue that certain types of composite models of quarks and leptons may resolve some of the difficulties of the SM. Furthermore the authors argue that, even without a full specification of a composite model, one may predict some observable effects following from the compositeness hypothesis. The effects are most easily seen in reaction channels in which there is little competition from known processes predicted by the SM, typically in neutrino induced reactions. The authors suggest that above a certain characteristic energy, neutrino cross sections rise well above those predicted within the framework of the SM and the difference between the characteristic features of lepton and hadron induced reactions is blurred. The authors claim that there is some (so far, tenuous) evidence for the phenomenon we just alluded to: in certain high energy cosmic ray interactions it appears that photons and/or neutrinos behave in a manner which is inconsistent with the SM. The authors analyze the data and conclude that the origin of the anomaly in the observational data arises from an increased neutrino interaction cross section

  19. Standard model baryogenesis

    CERN Document Server

    Gavela, M.B.; Orloff, J.; Pene, O

    1994-01-01

    Simply on CP arguments, we argue against a Standard Model explanation of baryogenesis via the charge transport mechanism. A CP-asymmetry is found in the reflection coefficients of quarks hitting the electroweak phase boundary created during a first order phase transition. The problem is analyzed both in an academic zero temperature case and in the realistic finite temperature one. At finite temperature, a crucial role is played by the damping rate of quasi-quarks in a hot plasma, which induces loss of spatial coherence and suppresses reflection on the boundary even at tree-level. The resulting baryon asymmetry is many orders of magnitude below what observation requires. We comment as well on related works.

  20. Standardizing data exchange for clinical research protocols and case report forms: An assessment of the suitability of the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM).

    Science.gov (United States)

    Huser, Vojtech; Sastry, Chandan; Breymaier, Matthew; Idriss, Asma; Cimino, James J

    2015-10-01

    Efficient communication of a clinical study protocol and case report forms during all stages of a human clinical study is important for many stakeholders. An electronic and structured study representation format that can be used throughout the whole study life-span can improve such communication and potentially lower total study costs. The most relevant standard for representing clinical study data, applicable to unregulated as well as regulated studies, is the Operational Data Model (ODM) in development since 1999 by the Clinical Data Interchange Standards Consortium (CDISC). ODM's initial objective was exchange of case report forms data but it is increasingly utilized in other contexts. An ODM extension called Study Design Model, introduced in 2011, provides additional protocol representation elements. Using a case study approach, we evaluated ODM's ability to capture all necessary protocol elements during a complete clinical study lifecycle in the Intramural Research Program of the National Institutes of Health. ODM offers the advantage of a single format for institutions that deal with hundreds or thousands of concurrent clinical studies and maintain a data warehouse for these studies. For each study stage, we present a list of gaps in the ODM standard and identify necessary vendor or institutional extensions that can compensate for such gaps. The current version of ODM (1.3.2) has only partial support for study protocol and study registration data mainly because it is outside the original development goal. ODM provides comprehensive support for representation of case report forms (in both the design stage and with patient level data). Inclusion of requirements of observational, non-regulated or investigator-initiated studies (outside Food and Drug Administration (FDA) regulation) can further improve future revisions of the standard. Published by Elsevier Inc.
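
    As a small illustration of consuming ODM as a single exchange format, the sketch below lists form definitions from an ODM 1.3 metadata fragment using only the Python standard library; the XML is a skeletal illustration (it omits elements a complete document would carry) and is not a schema-valid ODM file.

```python
# Skeletal example of reading FormDef metadata from a CDISC ODM 1.3 document
# with the standard library. The XML below is a minimal illustration, not a
# complete or schema-valid ODM file.
import xml.etree.ElementTree as ET

NS = {"odm": "http://www.cdisc.org/ns/odm/v1.3"}   # ODM 1.3 namespace

SAMPLE = """<ODM xmlns="http://www.cdisc.org/ns/odm/v1.3">
  <Study OID="ST.001">
    <MetaDataVersion OID="MDV.1" Name="Protocol v1">
      <FormDef OID="F.DEMOG" Name="Demographics" Repeating="No"/>
      <FormDef OID="F.AE" Name="Adverse Events" Repeating="Yes"/>
    </MetaDataVersion>
  </Study>
</ODM>"""

root = ET.fromstring(SAMPLE)
for form in root.findall(".//odm:FormDef", NS):
    print(form.get("OID"), form.get("Name"))
```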

  1. Assessment of hospital performance with a case-mix standardized mortality model using an existing administrative database in Japan

    Directory of Open Access Journals (Sweden)

    Fushimi Kiyohide

    2010-05-01

    Background: Few studies have examined whether risk adjustment is evenly applicable to hospitals with various characteristics and case-mix. In this study, we applied a generic prediction model to nationwide discharge data from hospitals with various characteristics. Method: We used standardized data of 1,878,767 discharged patients provided by 469 hospitals from July 1 to October 31, 2006. We generated and validated a case-mix in-hospital mortality prediction model using 50/50 split-sample validation. We classified hospitals into two groups based on c-index value (hospitals with c-index ≥ 0.8; hospitals with c-index < 0.8). Results: The model demonstrated excellent discrimination, as indicated by the high average c-index and small standard deviation (c-index = 0.88 ± 0.04). The expected mortality rate of each hospital was highly correlated with the observed mortality rate (r = 0.693, p < 0.001). Conclusion: The model fits well for a group of hospitals with a wide variety of acute care events, though model fit is less satisfactory for specialized hospitals and those with convalescent wards. Further sophistication of the generic prediction model would be recommended to obtain optimal indices for region-specific conditions.
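
    A minimal sketch of the validation logic described in this record, assuming a patient-level table with case-mix covariates and an in-hospital death flag, and assuming a logistic model (the record does not specify the model form): fit on one half of the data, compute the c-index (area under the ROC curve) on the held-out half, and compare expected with observed mortality per hospital. Column names are hypothetical.

```python
# Case-mix mortality model sketch: 50/50 split-sample validation, c-index on
# the held-out half, and expected vs observed mortality per hospital.
# The logistic model and the column names are assumptions for illustration.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def validate(df: pd.DataFrame, covariates, outcome="died", hospital="hospital_id"):
    train, test = train_test_split(df, test_size=0.5, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(train[covariates], train[outcome])
    pred = model.predict_proba(test[covariates])[:, 1]
    c_index = roc_auc_score(test[outcome], pred)        # discrimination on held-out half
    per_hospital = (test.assign(expected=pred)
                        .groupby(hospital)[["expected", outcome]].mean())
    return c_index, per_hospital                        # expected vs observed rates
```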

  2. Premise for Standardized Sepsis Models.

    Science.gov (United States)

    Remick, Daniel G; Ayala, Alfred; Chaudry, Irshad; Coopersmith, Craig M; Deutschman, Clifford; Hellman, Judith; Moldawer, Lyle; Osuchowski, Marcin

    2018-06-05

    Sepsis morbidity and mortality exact a toll on patients and contribute significantly to healthcare costs. Preclinical models of sepsis have been used to study disease pathogenesis and test new therapies, but divergent outcomes have been observed with the same treatment even when using the same sepsis model. Other disorders such as diabetes, cancer, malaria, obesity and cardiovascular diseases have used standardized, preclinical models that allow laboratories to compare results. Standardized models accelerate the pace of research and such models have been used to test new therapies or changes in treatment guidelines. The National Institutes of Health (NIH) mandated that investigators increase data reproducibility and the rigor of scientific experiments and has also issued research funding announcements about the development and refinement of standardized models. Our premise is that refinement and standardization of preclinical sepsis models may accelerate the development and testing of potential therapeutics for human sepsis, as has been the case with preclinical models for other disorders. As a first step towards creating standardized models, we suggest 1) standardizing the technical standards of the widely used cecal ligation and puncture model and 2) creating a list of appropriate organ injury and immune dysfunction parameters. Standardized sepsis models could enhance reproducibility and allow comparison of results between laboratories and may accelerate our understanding of the pathogenesis of sepsis.

  3. Risk assessment using probabilistic standards

    International Nuclear Information System (INIS)

    Avila, R.

    2004-01-01

    A core element of risk is uncertainty represented by plural outcomes and their likelihood. No risk exists if the future outcome is uniquely known and hence guaranteed. The probability that we will die some day is equal to 1, so there would be no fatal risk if sufficiently long time frame is assumed. Equally, rain risk does not exist if there was 100% assurance of rain tomorrow, although there would be other risks induced by the rain. In a formal sense, any risk exists if, and only if, more than one outcome is expected at a future time interval. In any practical risk assessment we have to deal with uncertainties associated with the possible outcomes. One way of dealing with the uncertainties is to be conservative in the assessments. For example, we may compare the maximal exposure to a radionuclide with a conservatively chosen reference value. In this case, if the exposure is below the reference value then it is possible to assure that the risk is low. Since single values are usually compared; this approach is commonly called 'deterministic'. Its main advantage lies in the simplicity and in that it requires minimum information. However, problems arise when the reference values are actually exceeded or might be exceeded, as in the case of potential exposures, and when the costs for realizing the reference values are high. In those cases, the lack of knowledge on the degree of conservatism involved impairs a rational weighing of the risks against other interests. In this presentation we will outline an approach for dealing with uncertainties that in our opinion is more consistent. We will call it a 'fully probabilistic risk assessment'. The essence of this approach consists in measuring the risk in terms of probabilities, where the later are obtained from comparison of two probabilistic distributions, one reflecting the uncertainties in the outcomes and one reflecting the uncertainties in the reference value (standard) used for defining adverse outcomes. Our first aim
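
    The "fully probabilistic" comparison outlined here can be illustrated with a small Monte Carlo experiment: draw from a distribution representing the uncertain outcome (e.g. an exposure) and from a distribution representing the uncertain standard, and report the probability that the outcome exceeds the standard. The lognormal/normal choices and all parameter values below are purely illustrative.

```python
# Fully probabilistic comparison: risk expressed as P(outcome > reference),
# with both the outcome and the reference treated as uncertain quantities.
# Distribution families and parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
exposure = rng.lognormal(mean=-1.0, sigma=0.8, size=n)    # uncertain outcome
reference = rng.normal(loc=1.0, scale=0.2, size=n)        # uncertain standard
print(f"P(exposure > reference) = {np.mean(exposure > reference):.4f}")
```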

  4. Assessing ballast treatment standards for effect on rate of establishment using a stochastic model of the green crab

    Directory of Open Access Journals (Sweden)

    Cynthia Cooper

    2012-03-01

    This paper describes a stochastic model used to characterize the probability/risk of NIS establishment from ships' ballast water discharges. Establishment is defined as the existence of a sufficient number of individuals of a species to provide for a sustained population of the organism. The inherent variability in population dynamics of organisms in their native or established environments is generally difficult to quantify. Much qualitative information is known about organism life cycles and biotic and abiotic environmental pressures on the population, but generally little quantitative data exist to develop a mechanistic model of populations in such complex environments. Moreover, there is little quantitative data to characterize the stochastic fluctuations of population size over time even without accounting for systematic responses to biotic and abiotic pressures. This research applies an approach using life-stage density and fecundity measures reported in research to determine a stochastic model of an organism's population dynamics. The model is illustrated with data from research studies on the green crab that span a range of habitats of the established organism and were collected over some years to represent a range of time-varying biotic and abiotic conditions that are expected to exist in many receiving environments. This model is applied to introductions of NIS at the IMO D-2 and the U.S. ballast water discharge standard levels designated as Phase Two in the United States Coast Guard's Notice of Proposed Rulemaking. Under a representative range of ballast volumes discharged at U.S. ports, the average rate of establishment of green crabs for ballast waters treated to the IMO D-2 concentration standard (less than 10 organisms/m³) is predicted to be reduced to about a third of the average rate from untreated ballast water discharge. The longevity of populations from the untreated ballast water discharges is expected to be reduced by about 90% by…
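
    A toy version of the kind of stochastic projection the record describes: propagate an introduced cohort forward with randomly drawn survival and recruitment, and estimate the fraction of simulations in which the population persists. All rates, thresholds and discharge densities here are invented placeholders, not the green crab parameters of the study.

```python
# Toy stochastic establishment model: simulate an introduced cohort with
# randomly drawn annual survival and recruitment, and estimate the probability
# that the population persists. Parameter values are illustrative placeholders.
import numpy as np

def establishment_probability(n0, years=10, sims=5000, seed=7):
    rng = np.random.default_rng(seed)
    established = 0
    for _ in range(sims):
        n = n0
        for _ in range(years):
            survivors = rng.binomial(n, rng.uniform(0.05, 0.25))        # annual survival
            recruits = rng.poisson(survivors * rng.uniform(0.5, 3.0))   # net recruitment
            n = survivors + recruits
            if n == 0:
                break
        established += n >= n0   # crude persistence criterion
    return established / sims

for density in (10, 100, 1000):   # introduced individuals per discharge (illustrative)
    print(density, establishment_probability(density))
```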

  5. Beyond Standard Model Physics

    Energy Technology Data Exchange (ETDEWEB)

    Bellantoni, L.

    2009-11-01

    There are many recent results from searches for fundamental new physics using the Tevatron, the SLAC B-factory and HERA. This talk quickly reviewed searches for pair-produced stop, for gauge-mediated SUSY breaking, for Higgs bosons in the MSSM and NMSSM models, for leptoquarks, and for v-hadrons. There is a SUSY model which accommodates the recent astrophysical experimental results that suggest that dark matter annihilation is occurring in the center of our galaxy, and a relevant experimental result is presented. Finally, model-independent searches at D0, CDF, and H1 are discussed.

  6. A standardized patient model to teach and assess professionalism and communication skills: the effect of personality type on performance.

    Science.gov (United States)

    Lifchez, Scott D; Redett, Richard J

    2014-01-01

    Teaching and assessing professionalism and interpersonal communication skills can be more difficult for surgical residency programs than teaching medical knowledge or patient care, for which many structured educational curricula and assessment tools exist. Residents often learn these skills indirectly, by observing the behavior of their attendings when communicating with patients and colleagues. The purpose of this study was to assess the results of an educational curriculum we created to teach and assess our residents in professionalism and communication. We assessed resident and faculty prior education in delivering bad news to patients. Residents then participated in a standardized patient (SP) encounter to deliver bad news to a patient's family regarding a severe burn injury. Residents received feedback from the encounter and participated in an education curriculum on communication skills and professionalism. As a part of this curriculum, residents underwent assessment of communication style using the Myers-Briggs type inventory. The residents then participated in a second SP encounter discussing a severe pulmonary embolus with a patient's family. Resident performance on the SP evaluation correlated with an increased comfort in delivering bad news. Comfort in delivering bad news did not correlate with the amount of prior education on the topic for either residents or attendings. Most of our residents demonstrated an intuitive thinking style (NT) on the Myers-Briggs type inventory, very different from population norms. The lack of correlation between comfort in delivering bad news and prior education on the subject may indicate the difficulty in imparting communication and professionalism skills to residents effectively. Understanding communication style differences between our residents and the general population can help us teach professionalism and communication skills more effectively. With the next accreditation system, residency programs would need to

  7. Standard model without Higgs particles

    International Nuclear Information System (INIS)

    Kovalenko, S.G.

    1992-10-01

    A modification of the standard model of electroweak interactions with the nonlocal Higgs sector is proposed. Proper form of nonlocality makes Higgs particles unobservable after the electroweak symmetry breaking. They appear only as a virtual state because their propagator is an entire function. We discuss some specific consequences of this approach comparing it with the conventional standard model. (author). 12 refs

  8. Establishing the isolated Standard Model

    International Nuclear Information System (INIS)

    Wells, James D.; Zhang, Zhengkang; Zhao, Yue

    2017-02-01

    The goal of this article is to initiate a discussion on what it takes to claim "there is no new physics at the weak scale," namely that the Standard Model (SM) is "isolated." The lack of discovery of beyond the SM (BSM) physics suggests that this may be the case. But to truly establish this statement requires proving all "connected" BSM theories are false, which presents a significant challenge. We propose a general approach to quantitatively assess the current status and future prospects of establishing the isolated SM (ISM), which we give a reasonable definition of. We consider broad elements of BSM theories, and show many examples where current experimental results are not sufficient to verify the ISM. In some cases, there is a clear roadmap for the future experimental program, which we outline, while in other cases, further efforts - both theoretical and experimental - are needed in order to robustly claim the establishment of the ISM in the absence of new physics discoveries.

  9. Assessment of hospital performance with a case-mix standardized mortality model using an existing administrative database in Japan.

    Science.gov (United States)

    Miyata, Hiroaki; Hashimoto, Hideki; Horiguchi, Hiromasa; Fushimi, Kiyohide; Matsuda, Shinya

    2010-05-19

    Few studies have examined whether risk adjustment is evenly applicable to hospitals with various characteristics and case-mix. In this study, we applied a generic prediction model to nationwide discharge data from hospitals with various characteristics. We used standardized data of 1,878,767 discharged patients provided by 469 hospitals from July 1 to October 31, 2006. We generated and validated a case-mix in-hospital mortality prediction model using 50/50 split-sample validation. We classified hospitals into two groups based on c-index value (hospitals with c-index ≥ 0.8; hospitals with c-index < 0.8). Hospitals with c-index ≥ 0.8 were classified as the higher c-index group. A significantly higher proportion of hospitals in the lower c-index group were specialized hospitals and hospitals with convalescent wards. The model fits well for a group of hospitals with a wide variety of acute care events, though model fit is less satisfactory for specialized hospitals and those with convalescent wards. Further sophistication of the generic prediction model would be recommended to obtain optimal indices for region-specific conditions.

  10. Measurement system for wind turbine acoustic noise assessment based on IEC standard and Qin′s model

    Institute of Scientific and Technical Information of China (English)

    Sun Lei; Qin Shuren; Bo Lin; Xu Liping; Stephan Joeckel

    2008-01-01

    A novel measurement system specially designed for noise emission assessment and verification of wind turbine generator systems is presented; it complies with the specifications given in IEC 61400-11 to ensure process consistency and accuracy. The theoretical elements of the calculation formula used for the sound power level of a wind turbine are discussed for the first time, and a detailed calculation procedure for tonality and audibility, integrating narrowband analysis and psychoacoustics, is described. With a microphone and two PXI cards inserted into a PC, the system is designed in Qin's model using the VMIDS development system. Benefiting from the virtual instrument architecture, it is the first time that all assessment processes have been integrated into an organic whole, which gives the system advantages in efficiency, price, and ease of use. Extensive experiments show that its assessment results accord with those given by MEASNET members.
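
    One commonly quoted form of the apparent sound power level calculation referenced by IEC 61400-11 subtracts 6 dB for the pressure doubling at the ground measurement board and adds a spherical-spreading term referred to 1 m². The sketch below assumes that form and omits the standard's background-noise and wind-speed-bin corrections; consult the standard for the exact procedure.

```python
# Apparent sound power level from a ground-board measurement, in a commonly
# quoted IEC 61400-11 form: -6 dB for board pressure doubling plus a spherical
# spreading term referred to S0 = 1 m^2. Background-noise and wind-speed-bin
# corrections required by the standard are omitted in this sketch.
import math

def apparent_sound_power(l_aeq_db, slant_distance_m, s0_m2=1.0):
    return l_aeq_db - 6.0 + 10.0 * math.log10(4.0 * math.pi * slant_distance_m**2 / s0_m2)

# Example: 55 dB(A) equivalent level measured at a 130 m slant distance.
print(f"L_WA = {apparent_sound_power(55.0, 130.0):.1f} dB(A)")
```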

  11. Framework for Designing The Assessment Models of Readiness SMEs to Adopt Indonesian National Standard (SNI), Case Study: SMEs Batik in Surakarta

    Science.gov (United States)

    Fahma, Fakhrina; Zakaria, Roni; Fajar Gumilang, Royan

    2018-03-01

    Since the ASEAN Economic Community (AEC) was launched, the opportunity to expand market share has become very open, but the level of competition is also very high. Standardization is believed to be an important factor in seizing opportunities in the AEC era and under other free trade agreements in the future. Standardization activities in industry can be demonstrated by obtaining SNI (Indonesian National Standard) certification. This is a challenge for SMEs, considering that currently only 20% of SMEs hold SNI certification for both product and process. In this research, a model for assessing readiness to obtain SNI certification is designed for SMEs. The stages of model development follow the innovation approach of Roger (2003). Variables that affect the readiness of SMEs are obtained from the product certification requirements established by BSN (the National Standardization Agency) and LSPro (the certification body). The model is used to map SMEs' readiness for SNI certification of their products. The level of readiness of an SME is determined by its percentage of compliance with those requirements. Based on the results of this study, five variables are determined as the main aspects for assessing SME readiness. For model validation, trials were conducted on batik SMEs in Laweyan, Surakarta.
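
    A tiny sketch of the readiness mapping described above: score each certification requirement as met or not, group the requirements under the assessment variables, and express readiness as the overall percentage of compliance. The variable names, requirement counts and the 80% cut-off are invented for illustration; they are not the model's actual criteria.

```python
# Readiness-for-certification sketch: readiness is the share of requirements
# met, grouped under assessment variables. Names and the 80% cut-off are
# illustrative, not the actual criteria of the assessment model.
requirements = {
    "raw materials":   [True, True, False],
    "production":      [True, True, True, False],
    "quality control": [True, False],
    "packaging":       [True, True],
    "documentation":   [False, True],
}

met = sum(sum(checks) for checks in requirements.values())
total = sum(len(checks) for checks in requirements.values())
readiness = 100.0 * met / total
verdict = "ready" if readiness >= 80 else "not yet ready"
print(f"readiness: {readiness:.0f}% ({verdict} for SNI certification)")
```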

  12. Beyond the standard model; Au-dela du modele standard

    Energy Technology Data Exchange (ETDEWEB)

    Cuypers, F. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-05-01

    These lecture notes are intended as a pedagogical introduction to several popular extensions of the standard model of strong and electroweak interactions. The topics include the Higgs sector, the left-right symmetric model, grand unification and supersymmetry. Phenomenological consequences and search procedures are emphasized. (author) figs., tabs., 18 refs.

  13. Phenomenology beyond the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Lykken, Joseph D.; /Fermilab

    2005-03-01

    An elementary review of models and phenomenology for physics beyond the Standard Model (excluding supersymmetry). The emphasis is on LHC physics. Based upon a talk given at the "Physics at LHC" conference, Vienna, 13-17 July 2004.

  14. About the standard solar model

    International Nuclear Information System (INIS)

    Cahen, S.

    1986-07-01

    A discussion of the still controversial solar helium content is presented, based on a comparison of recent standard solar models. Our latest model yields a helium mass fraction of ∼0.276, 6.4 SNU for ³⁷Cl and 126 SNU for ⁷¹Ga

  15. The standard model and colliders

    International Nuclear Information System (INIS)

    Hinchliffe, I.

    1987-03-01

    Some topics in the standard model of strong and electroweak interactions are discussed, as well as how these topics are relevant for the high energy colliders which will become operational in the next few years. The radiative corrections in the Glashow-Weinberg-Salam model are discussed, stressing how these corrections may be measured at LEP and the SLC. CP violation is discussed briefly, followed by a discussion of the Higgs boson; the searches relevant to hadron colliders are then discussed. Some of the problems which the standard model does not solve are discussed, and the energy ranges accessible to the new colliders are indicated

  16. Dynamics of the standard model

    CERN Document Server

    Donoghue, John F; Holstein, Barry R

    2014-01-01

    Describing the fundamental theory of particle physics and its applications, this book provides a detailed account of the Standard Model, focusing on techniques that can produce information about real observed phenomena. The book begins with a pedagogic account of the Standard Model, introducing essential techniques such as effective field theory and path integral methods. It then focuses on the use of the Standard Model in the calculation of physical properties of particles. Rigorous methods are emphasized, but other useful models are also described. This second edition has been updated to include recent theoretical and experimental advances, such as the discovery of the Higgs boson. A new chapter is devoted to the theoretical and experimental understanding of neutrinos, and major advances in CP violation and electroweak physics have been given a modern treatment. This book is valuable to graduate students and researchers in particle physics, nuclear physics and related fields.

  17. Shifting Gears: Standards, Assessments, Curriculum, & Instruction.

    Science.gov (United States)

    Dougherty, Eleanor

    This book is designed to help educators move from a system that measures students against students to one that values mastery of central concepts and skills, striving for proficiency in publicly acknowledged standards of academic performance. It aims to connect the operative parts of standards-based education (standards, assessment, curriculum,…

  18. Establishing the isolated Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Wells, James D.; Zhang, Zhengkang [Michigan Univ., Ann Arbor, MI (United States). Michigan Center for Theoretical Physics; Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Zhao, Yue [Michigan Univ., Ann Arbor, MI (United States). Michigan Center for Theoretical Physics

    2017-02-15

    The goal of this article is to initiate a discussion on what it takes to claim "there is no new physics at the weak scale," namely that the Standard Model (SM) is "isolated." The lack of discovery of beyond the SM (BSM) physics suggests that this may be the case. But to truly establish this statement requires proving all "connected" BSM theories are false, which presents a significant challenge. We propose a general approach to quantitatively assess the current status and future prospects of establishing the isolated SM (ISM), which we give a reasonable definition of. We consider broad elements of BSM theories, and show many examples where current experimental results are not sufficient to verify the ISM. In some cases, there is a clear roadmap for the future experimental program, which we outline, while in other cases, further efforts - both theoretical and experimental - are needed in order to robustly claim the establishment of the ISM in the absence of new physics discoveries.

  19. Perspectives in the standard model

    International Nuclear Information System (INIS)

    Ellis, R.K.; Hill, C.T.; Lykken, J.D.

    1992-01-01

    Particle physics is an experimentally based science, with a need for the best theorists to make contact with data and to enlarge and enhance their theoretical descriptions as the subject evolves. The authors felt it imperative that the TASI (Theoretical Advanced Study Institute) program reflect this need. The goal of this conference was to provide the students with a comprehensive look at the current understanding of the standard model, as well as the techniques which promise to advance that understanding in the future. Topics covered include: symmetry breaking in the standard model; physics beyond the standard model; chiral effective Lagrangians; semi-classical string theory; renormalization of electroweak gauge interactions; electroweak experiments at LEP; the CKM matrix and CP violation; axion searches; lattice QCD; perturbative QCD; heavy quark effective field theory; heavy flavor physics on the lattice; and neutrinos. Separate abstracts were prepared for 13 papers in this conference

  20. The standard model and beyond

    CERN Document Server

    Langacker, Paul

    2017-01-01

    This new edition of The Standard Model and Beyond presents an advanced introduction to the physics and formalism of the standard model and other non-abelian gauge theories. It provides a solid background for understanding supersymmetry, string theory, extra dimensions, dynamical symmetry breaking, and cosmology. In addition to updating all of the experimental and phenomenological results from the first edition, it contains a new chapter on collider physics; expanded discussions of Higgs, neutrino, and dark matter physics; and many new problems. The book first reviews calculational techniques in field theory and the status of quantum electrodynamics. It then focuses on global and local symmetries and the construction of non-abelian gauge theories. The structure and tests of quantum chromodynamics, collider physics, the electroweak interactions and theory, and the physics of neutrino mass and mixing are thoroughly explored. The final chapter discusses the motivations for extending the standard model and examin...

  1. Standard model of knowledge representation

    Science.gov (United States)

    Yin, Wensheng

    2016-09-01

    Knowledge representation is the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic networks, computer programming languages, databases, mathematical models, graphics languages, natural language, etc. To establish the intrinsic link between the various knowledge representation methods, a unified knowledge representation model is necessary. Drawing on ontology, system theory, and control theory, a standard model of knowledge representation that reflects change in the objective world is proposed. The model is composed of input, processing, and output. This method does not contradict traditional knowledge representation methods: it can express knowledge in multivariate and multidimensional terms, it can express process knowledge, and at the same time it has strong problem-solving ability. In addition, the standard model of knowledge representation provides a way to handle imprecise and inconsistent knowledge.
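
    As a loose illustration of the input-processing-output structure described above (a sketch for this summary only, not code from the paper; the class and field names are hypothetical), such a knowledge unit could be expressed as:

      from dataclasses import dataclass
      from typing import Any, Callable, Dict

      @dataclass
      class KnowledgeUnit:
          """Toy input -> processing -> output representation of one piece of knowledge."""
          name: str
          processing: Callable[[Dict[str, Any]], Dict[str, Any]]  # transformation rule

          def apply(self, inputs: Dict[str, Any]) -> Dict[str, Any]:
              # The unit maps a state of the objective world (inputs) to a new state (outputs).
              return self.processing(inputs)

      # Example of process knowledge: "heating raises temperature".
      heating = KnowledgeUnit(
          name="heating",
          processing=lambda s: {**s, "temperature": s["temperature"] + s.get("heat_added", 0)},
      )
      print(heating.apply({"temperature": 20, "heat_added": 5}))  # {'temperature': 25, 'heat_added': 5}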

  2. Extensions of the Standard Model

    CERN Document Server

    Zwirner, Fabio

    1996-01-01

    Rapporteur talk at the International Europhysics Conference on High Energy Physics, Brussels (Belgium), July 27-August 2, 1995. This talk begins with a brief general introduction to the extensions of the Standard Model, reviewing the ideology of effective field theories and its practical implications. The central part deals with candidate extensions near the Fermi scale, focusing on some phenomenological aspects of the Minimal Supersymmetric Standard Model. The final part discusses some possible low-energy implications of further extensions near the Planck scale, namely superstring theories.

  3. Custom v. Standardized Risk Models

    Directory of Open Access Journals (Sweden)

    Zura Kakushadze

    2015-05-01

    We discuss when and why custom multi-factor risk models are warranted and give source code for computing some risk factors. Pension/mutual funds do not require customization but standardization. However, using standardized risk models in quant trading with much shorter holding horizons is suboptimal: (1) longer-horizon risk factors (value, growth, etc.) increase noise trades and trading costs; (2) arbitrary risk factors can neutralize alpha; (3) “standardized” industries are artificial and insufficiently granular; (4) normalization of style risk factors is lost for the trading universe; (5) diversifying risk models lowers P&L correlations, reduces turnover and market impact, and increases capacity. We discuss various aspects of custom risk model building.
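
    As a rough illustration of what computing a style risk factor involves (a generic sketch, not the source code referred to in the record; the factor definition, lookback windows and column layout are assumptions), a cross-sectionally normalized momentum exposure could be built as:

      import numpy as np
      import pandas as pd

      def momentum_factor(prices: pd.DataFrame, lookback: int = 252, skip: int = 21) -> pd.Series:
          """Cross-sectionally normalized momentum exposure for the latest date.

          prices: daily close prices, rows = dates, columns = tickers. The raw factor is the
          trailing (lookback - skip)-day return; it is then z-scored across the trading
          universe so exposures have zero mean and unit variance.
          """
          past = prices.iloc[-lookback]          # price about one year ago
          recent = prices.iloc[-skip]            # price about one month ago (skip the last month)
          raw = recent / past - 1.0              # raw momentum return per ticker
          return (raw - raw.mean()) / raw.std()  # cross-sectional normalization

      # Toy usage with random-walk prices for five names over 300 days
      rng = np.random.default_rng(0)
      px = pd.DataFrame(100 * np.exp(np.cumsum(rng.normal(0, 0.01, (300, 5)), axis=0)),
                        columns=list("ABCDE"))
      print(momentum_factor(px))

    Point (4) in the abstract refers to exactly this normalization step: it is only meaningful relative to the universe over which the mean and standard deviation are taken.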

  4. The standard model and beyond

    International Nuclear Information System (INIS)

    Marciano, W.J.

    1989-05-01

    In these lectures, my aim is to present a status report on the standard model and some key tests of electroweak unification. Within that context, I also discuss how and where hints of new physics may emerge. To accomplish those goals, I have organized my presentation as follows. I survey the standard model parameters with particular emphasis on the gauge coupling constants and vector boson masses. Examples of new physics appendages are also commented on. In addition, I have included an appendix on dimensional regularization and a simple example which employs that technique. I focus on weak charged current phenomenology. Precision tests of the standard model are described and up-to-date values for the Cabibbo-Kobayashi-Maskawa (CKM) mixing matrix parameters are presented. Constraints implied by those tests for a 4th generation, extra Z' bosons, and compositeness are discussed. An overview of the physics of tau decays is also included. I discuss weak neutral current phenomenology and the extraction of sin²θ_W from experiment. The results presented there are based on a global analysis of all existing data. I have chosen to concentrate that discussion on radiative corrections, the effect of a heavy top quark mass, implications for grand unified theories (GUTS), extra Z' gauge bosons, and atomic parity violation. The potential for further experimental progress is also commented on. Finally, I depart from the narrowest version of the standard model and discuss effects of neutrino masses, mixings, and electromagnetic moments. 32 refs., 3 figs., 5 tabs
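
    For orientation (a standard textbook relation quoted here for illustration, not a result from these lectures), the weak mixing angle referred to above can be defined on-shell through the vector boson masses:

      \sin^2\theta_W \;\equiv\; 1 - \frac{M_W^2}{M_Z^2} \;\approx\; 1 - \frac{(80.4\ \mathrm{GeV})^2}{(91.2\ \mathrm{GeV})^2} \;\approx\; 0.223

    so precision measurements of M_W, M_Z and of neutral current processes test the same quantity in independent ways.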

  5. Beyond the Standard Model course

    CERN Multimedia

    CERN. Geneva HR-RFA

    2006-01-01

    The necessity for new physics beyond the Standard Model will be motivated. Theoretical problems will be exposed and possible solutions will be described. The goal is to present the exciting new physics ideas that will be tested in the near future, at LHC and elsewhere. Supersymmetry, grand unification, extra dimensions and a glimpse of string theory will be presented.

  6. Calculating Impacts of Energy Standards on Energy Demand in U.S. Buildings under Uncertainty with an Integrated Assessment Model: Technical Background Data

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Daly, Don S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hathaway, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lansing, Carina S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Ying [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McJeon, Haewon C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Moss, Richard H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Patel, Pralit L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Peterson, Marty J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rice, Jennie S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhou, Yuyu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-06

    This report presents data and assumptions employed in an application of PNNL’s Global Change Assessment Model with a newly-developed Monte Carlo analysis capability. The model is used to analyze the impacts of more aggressive U.S. residential and commercial building-energy codes and equipment standards on energy consumption and energy service costs at the state level, explicitly recognizing uncertainty in technology effectiveness and cost, socioeconomics, presence or absence of carbon prices, and climate impacts on energy demand. The report provides a summary of how residential and commercial buildings are modeled, together with assumptions made for the distributions of state-level population, Gross Domestic Product (GDP) per worker, efficiency and cost of residential and commercial energy equipment by end use, and efficiency and cost of residential and commercial building shells. The cost and performance of equipment and of building shells are reported separately for current building and equipment efficiency standards and for more aggressive standards. The report also details assumptions concerning future improvements brought about by projected trends in technology.
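
    To make the Monte Carlo treatment of uncertainty concrete (a deliberately simplified sketch, not the GCAM implementation; the distributions, parameter names and the single-end-use demand formula are illustrative assumptions), one set of draws for state-level delivered energy might look like:

      import numpy as np

      rng = np.random.default_rng(42)
      N = 10_000  # number of Monte Carlo draws

      # Illustrative uncertain inputs (the triangular/normal choices are assumptions)
      floorspace   = rng.normal(loc=250e9, scale=20e9, size=N)   # total commercial sq ft in the state
      service_int  = rng.triangular(40, 50, 60, size=N)          # kBtu of service demand per sq ft
      shell_factor = rng.triangular(0.75, 0.85, 1.00, size=N)    # 1.0 = current code, <1.0 = tighter shell
      equip_eff    = rng.triangular(0.85, 0.95, 1.10, size=N)    # relative equipment efficiency

      # Delivered energy for one end use: service demand scaled by the shell factor,
      # divided by equipment efficiency, converted from kBtu to quadrillion Btu (quads).
      energy_quads = floorspace * service_int * shell_factor / equip_eff / 1e12

      lo, med, hi = np.percentile(energy_quads, [5, 50, 95])
      print(f"5th/50th/95th percentile demand: {lo:.1f} / {med:.1f} / {hi:.1f} quads")

    Repeating such draws while switching the shell and equipment distributions between "current standards" and "more aggressive standards" gives the uncertainty bands on the estimated savings.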

  7. Standards, Assessments & Opting Out, Spring 2015

    Science.gov (United States)

    Advance Illinois, 2015

    2015-01-01

    In the spring, Illinois students will take new state assessments that reflect the rigor and relevance of the new Illinois Learning Standards. But some classmates will sit out and join the pushback against standardized testing. Opt-out advocates raise concerns about over-testing, and the resulting toll on students as well as the impact on classroom…

  8. The standard model and beyond

    International Nuclear Information System (INIS)

    Gaillard, M.K.

    1989-05-01

    The field of elementary particle, or high energy, physics seeks to identify the most elementary constituents of nature and to study the forces that govern their interactions. Increasing the energy of a probe in a laboratory experiment increases its power as an effective microscope for discerning increasingly smaller structures of matter. Thus we have learned that matter is composed of molecules that are in turn composed of atoms, that the atom consists of a nucleus surrounded by a cloud of electrons, and that the atomic nucleus is a collection of protons and neutrons. The more powerful probes provided by high energy particle accelerators have taught us that a nucleon is itself made of objects called quarks. The forces among quarks and electrons are understood within a general theoretical framework called the "standard model", which accounts for all interactions observed in high energy laboratory experiments to date. These are commonly categorized as the "strong", "weak" and "electromagnetic" interactions. In this lecture I will describe the standard model, and point out some of its limitations. Probing for deeper structures in quarks and electrons defines the present frontier of particle physics. I will discuss some speculative ideas about extensions of the standard model and/or yet more fundamental forces that may underlie our present picture. 11 figs., 1 tab

  9. Extensions of the standard model

    International Nuclear Information System (INIS)

    Ramond, P.

    1983-01-01

    In these lectures we focus on several issues that arise in theoretical extensions of the standard model. First we describe the kinds of fermions that can be added to the standard model without affecting known phenomenology. We focus in particular on three types: the vector-like completion of the existing fermions as would be predicted by a Kaluza-Klein type theory, which we find cannot be realistically achieved without some chiral symmetry; fermions which are vector-like by themselves, such as do appear in supersymmetric extensions, and finally anomaly-free chiral sets of fermions. We note that a chiral symmetry, such as the Peccei-Quinn symmetry, can be used to produce a vector-like theory which, at scales less than M_W, appears to be chiral. Next, we turn to the analysis of the second hierarchy problem which arises in Grand Unified extensions of the standard model, and plays a crucial role in proton decay of supersymmetric extensions. We review the known mechanisms for avoiding this problem and present a new one which seems to lead to the (family) triplication of the gauge group. Finally, this being a summer school, we present a list of homework problems. 44 references

  10. State Standards and State Assessment Systems: A Guide to Alignment. Series on Standards and Assessments.

    Science.gov (United States)

    La Marca, Paul M.; Redfield, Doris; Winter, Phoebe C.

    Alignment of content standards, performance standards, and assessments is crucial. This guide contains information to assist states and districts in aligning their assessment systems to their content and performance standards. It includes a review of current literature, both published and fugitive. The research is woven together with a few basic…

  11. Consistency Across Standards or Standards in a New Business Model

    Science.gov (United States)

    Russo, Dane M.

    2010-01-01

    Presentation topics include: standards in a changing business model, the new National Space Policy as a driver of change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, the danger of over-prescriptive standards, the need for a balance between prescriptive and general standards, enabling versus inhibiting, characteristics of success-oriented standards, and conclusions. Additional slides cover NASA Procedural Requirements 8705.2B, which identifies human-rating standards and requirements; draft health and medical standards for human rating; what has been done; government oversight models; examples of consistency from anthropometry; examples of inconsistency from air quality; and appendices of governmental and non-governmental human factors standards.

  12. Institutional model for supporting standardization

    International Nuclear Information System (INIS)

    Sanford, M.O.; Jackson, K.J.

    1993-01-01

    Restoring the nuclear option for utilities requires standardized designs. This premise is widely accepted by all parties involved in ALWR development activities. Achieving and maintaining standardization, however, demands new perspectives on the roles and responsibilities for the various commercial organizations involved in nuclear power. Some efforts are needed to define a workable model for a long-term support structure that will allow the benefits of standardization to be realized. The Nuclear Power Oversight Committee (NPOC) has developed a strategic plan that lays out the steps necessary to enable the nuclear industry to be in a position to order a new nuclear power plant by the mid-1990s. One of the key elements of the plan is the "industry commitment to standardization: through design certification, combined license, first-of-a-kind engineering, construction, operation, and maintenance of nuclear power plants." This commitment is a result of the recognition by utilities of the substantial advantages of standardization. Among these are economic benefits, licensing benefits from being treated as one of a family, sharing risks across a broader ownership group, sharing operating experiences, enhancing public safety, and a more coherent market force. Utilities controlled the construction of the past generation of nuclear units in a largely autonomous fashion, procuring equipment and designs from a vendor, engineering services from an architect/engineer, and construction from a construction management firm. This, in addition to forcing the utility to assume virtually all of the risks associated with the project, typically resulted in highly customized designs based on the preferences of the individual utility. However, the benefits of standardization can be realized only through cooperative choices and decision making by the utilities and through working as partners with reactor vendors, architect/engineers, and construction firms.

  13. Temporal assessment of copper speciation, bioavailability and toxicity in UK freshwaters using chemical equilibrium and biotic ligand models: Implications for compliance with copper environmental quality standards.

    Science.gov (United States)

    Lathouri, Maria; Korre, Anna

    2015-12-15

    Although significant progress has been made in understanding how environmental factors modify the speciation, bioavailability and toxicity of metals such as copper in aquatic environments, the current methods used to establish water quality standards do not necessarily consider the different geological and geochemical characteristics of a given site and the factors that affect copper fate, bioavailability potential and toxicity. In addition, the temporal variation in the concentration and bioavailable metal fraction is also important in freshwater systems. The work presented in this paper illustrates the temporal and seasonal variability of a range of water quality parameters, and Cu speciation, bioavailability and toxicity, at four freshwater sites in the UK. The Rivers Coquet, Cree, Lower Clyde and Eden (Kent) were selected to cover a broad range of different geochemical environments and site characteristics. The monitoring data used covered a period of around six years at almost monthly intervals. Chemical equilibrium modelling was used to study temporal variations in Cu speciation and was combined with acute toxicity modelling to assess Cu bioavailability for two aquatic species, Daphnia magna and Daphnia pulex. The estimated copper bioavailability, toxicity levels and the corresponding ecosystem risks were analysed in relation to key water quality parameters (alkalinity, pH and DOC). Although copper concentrations did not vary much during the sampling period or between the seasons at the different sites, copper bioavailability varied markedly. In addition, through the chronic Cu-BLM, based on the voluntary risk assessment approach, the potential environmental risk in terms of chronic toxicity was assessed. A much higher likelihood of toxicity effects was found during the cold period at all sites. It is suggested that besides the metal (copper) concentration in the surface water environment, the variability and seasonality of other important water quality
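
    The role of water chemistry mentioned above can be summarized schematically (a generic equilibrium expression given for illustration, not the parameterization used in the study): dissolved organic carbon (DOC) and other ligands L bind copper through reactions of the form

      \mathrm{Cu}^{2+} + \mathrm{L} \;\rightleftharpoons\; \mathrm{CuL}, \qquad K_{\mathrm{CuL}} \;=\; \frac{[\mathrm{CuL}]}{[\mathrm{Cu}^{2+}]\,[\mathrm{L}]}

    so that, at fixed total copper, higher DOC (together with pH-dependent speciation) lowers the free Cu²⁺ activity and hence the fraction available to bind at the biotic ligand, which is why bioavailability can vary strongly even when total copper does not.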

  14. Review of the standard model

    International Nuclear Information System (INIS)

    Treille, D.

    1992-01-01

    The goal of this review is not to make one more celebration of the accuracy of LEP results, but rather to put them in a broader perspective. This set of measurements is compared with what it could and should be in the future if the various options available at LEP are exploited properly, showing that much is left to be done. Then various classes of non-LEP results are discussed which are already remarkable and still open to improvement; these bring complementary information on the Standard Model by probing it in widely different domains of applicability. (author) 46 refs.; 29 figs.; 12 tabs

  15. Standard for Models and Simulations

    Science.gov (United States)

    Steele, Martin J.

    2016-01-01

    This NASA Technical Standard establishes uniform practices in modeling and simulation to ensure essential requirements are applied to the design, development, and use of models and simulations (M&S), while ensuring acceptance criteria are defined by the program/project and approved by the responsible Technical Authority. It also provides an approved set of requirements, recommendations, and criteria with which M&S may be developed, accepted, and used in support of NASA activities. As the M&S disciplines employed and application areas involved are broad, the common aspects of M&S across all NASA activities are addressed. The discipline-specific details of a given M&S should be obtained from relevant recommended practices. The primary purpose is to reduce the risks associated with M&S-influenced decisions by ensuring the complete communication of the credibility of M&S results.

  16. The standard model and beyond

    CERN Document Server

    Vergados, J D

    2017-01-01

    This book contains a systematic and pedagogical exposition of recent developments in particle physics and cosmology. It starts with two introductory chapters on group theory and the Dirac theory. Then it proceeds with the formulation of the Standard Model (SM) of Particle Physics, particle content and symmetries, fully exploiting the first chapters. It discusses the concept of gauge symmetries and emphasizes their role in particle physics. It then analyses the Higgs mechanism and the spontaneous symmetry breaking (SSB). It explains how the particles (gauge bosons and fermions) after SSB acquire a mass and get admixed. The various forms of charged currents are discussed in detail as well as how the parameters of the SM, which cannot be determined by the theory, are fixed by experiment, including the recent LHC data and the Higgs discovery. Quantum chromodynamics is discussed and various low energy approximations to it are presented. The Feynman diagrams are introduced and applied, in a way understandable by fir...

  17. NUSS safety standards: A critical assessment

    International Nuclear Information System (INIS)

    Minogue, R.B.

    1985-01-01

    The NUSS safety standards are based on systematic review of safety criteria of many countries in a process carefully defined to assure completeness of coverage. They represent an international consensus of accepted safety principles and practices for regulation and for the design, construction, and operation of nuclear power plants. They are a codification of principles and practices already in use by some Member States. Thus, they are not standards which describe methodologies at their present state of evolution as a result of more recent experience and improvements in technological understanding. The NUSS standards assume an underlying body of national standards and a defined technological base. Detailed design and industrial practices vary between countries and the implementation of basic safety standards within countries has taken approaches that conform with national industrial practices. Thus, application of the NUSS standards requires reconciliation with the standards of the country where the reactor will be built as well as with the country from which procurement takes place. Experience in making that reconciliation will undoubtedly suggest areas of needed improvement. After the TMI accident a reassessment of the NUSS programme was made and it was concluded that, given the information at that time and the then level of technology, the basic approach was sound; the NUSS programme should be continued to completion, and the standards should be brought into use. It was also recognized, however, that in areas such as probabilistic risk assessment, human factors methodology, and consideration of detailed accident sequences, more advanced technology was emerging. As these technologies develop, and become more amenable to practical application, it is anticipated that the NUSS standards will need revision. Ideally those future revisions will also flow from experience in their use

  18. MATLAB-implemented estimation procedure for model-based assessment of hepatic insulin degradation from standard intravenous glucose tolerance test data.

    Science.gov (United States)

    Di Nardo, Francesco; Mengoni, Michele; Morettini, Micaela

    2013-05-01

    The present study provides a novel MATLAB-based parameter estimation procedure for the individual assessment of the hepatic insulin degradation (HID) process from standard frequently-sampled intravenous glucose tolerance test (FSIGTT) data. Direct access to the source code, offered by MATLAB, enabled us to design an optimization procedure based on the alternating use of Gauss-Newton's and Levenberg-Marquardt's algorithms, which assures the full convergence of the process and the containment of computational time. Reliability was tested by direct comparison with the application, in eighteen non-diabetic subjects, of the well-known kinetic analysis software package SAAM II, and by application to different data. Agreement between MATLAB and SAAM II was supported by intraclass correlation coefficients ≥0.73, no significant differences between corresponding mean parameter estimates and predictions of HID rate, and consistent residual analysis. Moreover, the MATLAB optimization procedure resulted in a significant 51% reduction of CV% for the parameter worst estimated by SAAM II, while keeping the CV% of all model parameters within acceptable limits. The MATLAB-based procedure was therefore suggested as a suitable tool for the individual assessment of the HID process. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
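
    As a rough illustration of the alternating Gauss-Newton / Levenberg-Marquardt fitting described above (a generic sketch on a toy single-exponential model, not the paper's MATLAB code or its insulin kinetics model; all names and values are made up), the core update loop might look like:

      import numpy as np

      def model(t, p):
          """Toy decay model: amplitude * exp(-rate * t) + baseline."""
          amp, rate, base = p
          return amp * np.exp(-rate * t) + base

      def fit_alternating(t, y, p0, n_iter=50, lam=1e-3):
          """Least-squares fit alternating undamped Gauss-Newton steps with
          Levenberg-Marquardt damped steps (illustrative only)."""
          p = np.asarray(p0, dtype=float)
          for k in range(n_iter):
              r = y - model(t, p)                          # residuals
              J = np.empty((t.size, p.size))               # numerical Jacobian d(model)/d(p)
              for j in range(p.size):
                  dp = np.zeros_like(p)
                  dp[j] = 1e-6 * max(1.0, abs(p[j]))
                  J[:, j] = (model(t, p + dp) - model(t, p)) / dp[j]
              A, g = J.T @ J, J.T @ r
              if k % 2 == 0:
                  step = np.linalg.solve(A, g)                               # Gauss-Newton step
              else:
                  step = np.linalg.solve(A + lam * np.diag(np.diag(A)), g)   # damped LM step
              if np.sum((y - model(t, p + step)) ** 2) < np.sum(r ** 2):
                  p, lam = p + step, lam * 0.5             # accept step, relax damping
              else:
                  lam *= 10.0                              # reject step, increase damping
          return p

      # Synthetic "decay curve" data and a fit from a rough initial guess
      rng = np.random.default_rng(1)
      t = np.linspace(0.0, 180.0, 30)
      y = model(t, (8.0, 0.05, 1.0)) + rng.normal(0.0, 0.1, t.size)
      print(fit_alternating(t, y, p0=(5.0, 0.1, 0.5)))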

  19. Non-commutative standard model: model building

    CERN Document Server

    Chaichian, Masud; Presnajder, P

    2003-01-01

    A non-commutative version of the usual electro-weak theory is constructed. We discuss how to overcome the two major problems: (1) although we can have non-commutative U(n) (which we denote by U_*(n)) gauge theory, we cannot have non-commutative SU(n), and (2) the charges in non-commutative QED are quantized to just 0, ±1. We show how the latter problem with charge quantization, as well as with the gauge group, can be resolved by taking the U_*(3) x U_*(2) x U_*(1) gauge group and reducing the extra U(1) factors in an appropriate way. Then we proceed with building the non-commutative version of the standard model by specifying the proper representations for the entire particle content of the theory, the gauge bosons, the fermions and Higgs. We also present the full action for the non-commutative standard model (NCSM). In addition, among several peculiar features of our model, we address the inherent CP violation and new neutrino interactions. (orig.)

  20. Experiments beyond the standard model

    International Nuclear Information System (INIS)

    Perl, M.L.

    1984-09-01

    This paper is based upon lectures in which I have described and explored the ways in which experimenters can try to find answers, or at least clues toward answers, to some of the fundamental questions of elementary particle physics. All of these experimental techniques and directions have been discussed fully in other papers, for example: searches for heavy charged leptons, tests of quantum chromodynamics, searches for Higgs particles, searches for particles predicted by supersymmetric theories, searches for particles predicted by technicolor theories, searches for proton decay, searches for neutrino oscillations, monopole searches, studies of low transfer momentum hadron physics at very high energies, and elementary particle studies using cosmic rays. Each of these subjects requires several lectures by itself to do justice to the large amount of experimental work and theoretical thought which has been devoted to these subjects. My approach in these tutorial lectures is to describe general ways to experiment beyond the standard model. I will use some of the topics listed to illustrate these general ways. Also, in these lectures I present some dreams and challenges about new techniques in experimental particle physics and accelerator technology; I call these Experimental Needs. 92 references

  1. A Model for Semantic IS Standards

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert; Oude Luttighuis, Paul; van Hillegersberg, Jos

    2011-01-01

    We argue that, in order to suggest improvements of any kind to semantic information system (IS) standards, a better understanding of the conceptual structure of semantic IS standards is required. This study develops a model for semantic IS standards, based on literature and expert knowledge. The model

  2. Control system architecture: The standard and non-standard models

    International Nuclear Information System (INIS)

    Thuot, M.E.; Dalesio, L.R.

    1993-01-01

    Control system architecture development has followed the advances in computer technology through mainframes to minicomputers to micros and workstations. This technology advance and increasingly challenging accelerator data acquisition and automation requirements have driven control system architecture development. In summarizing the progress of control system architecture at the last International Conference on Accelerator and Large Experimental Physics Control Systems (ICALEPCS) B. Kuiper asserted that the system architecture issue was resolved and presented a "standard model". The "standard model" consists of a local area network (Ethernet or FDDI) providing communication between front end microcomputers, connected to the accelerator, and workstations, providing the operator interface and computational support. Although this model represents many present designs, there are exceptions including reflected memory and hierarchical architectures driven by requirements for widely dispersed, large channel count or tightly coupled systems. This paper describes the performance characteristics and features of the "standard model" to determine if the requirements of "non-standard" architectures can be met. Several possible extensions to the "standard model" are suggested, including software as well as hardware architectural features.

  3. MRI assessment of myelination: an age standardization

    Energy Technology Data Exchange (ETDEWEB)

    Staudt, M. (Kinderklinik Dritter Orden, Passau (Germany)); Schropp, C. (Kinderklinik Dritter Orden, Passau (Germany)); Staudt, F. (Kinderklinik Dritter Orden, Passau (Germany)); Obletter, N. (Radiologische Praxis, Klinikum Ingolstadt (Germany)); Bise, K. (Neuropathologisches Inst., Muenchen Univ. (Germany)); Breit, A. (MR Tomographie, Klinikum Passau (Germany)); Weinmann, H.M. (Kinderklinik Schwabing, Muenchen (Germany))

    1994-04-01

    777 cerebral MRI examinations of children aged 3 days to 14 years were staged for myelination to establish an age standardization. Staging was performed using a system proposed in a previous paper, separately ranking 10 different regions of the brain. Interpretation of the results led to the identification of four clinical diagnoses that are frequently associated with delays in myelination: West syndrome, cerebral palsy, developmental retardation, and congenital anomalies. In addition, it was found that assessment of myelination in children with head injuries was not practical as alterations in MRI signal can simulate earlier stages of myelination. Age limits were therefore calculated from the case material after excluding all children with these conditions. When simplifications of the definition of the stages are applied, these age limits for the various stages of myelination of each of the 10 regions of the brain make the staging system applicable for routine assessment of myelination. (orig.)

  4. An alternative to the standard model

    International Nuclear Information System (INIS)

    Baek, Seungwon; Ko, Pyungwon; Park, Wan-Il

    2014-01-01

    We present an extension of the standard model to a dark sector with an unbroken local dark U(1)_X symmetry. Including various singlet portal interactions provided by the standard model Higgs, right-handed neutrinos and kinetic mixing, we show that the model can address most of the phenomenological issues (inflation, neutrino mass and mixing, baryon number asymmetry, dark matter, direct/indirect dark matter searches, some small-scale puzzles of standard collisionless cold dark matter, vacuum stability of the standard model Higgs potential, dark radiation) and can be regarded as an alternative to the standard model. The Higgs signal strength is equal to one as in the standard model for the unbroken U(1)_X case with scalar dark matter, but it could be less than one, independent of decay channels, if the dark matter is a dark-sector fermion or if U(1)_X is spontaneously broken, because of mixing with a new neutral scalar boson in the models

  5. Field theory and the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Dudas, E. [Orsay, LPT (France)]

    2014-07-01

    This brief introduction to Quantum Field Theory and the Standard Model contains the basic building blocks of perturbation theory in quantum field theory, an elementary introduction to gauge theories and the basic classical and quantum features of the electroweak sector of the Standard Model. Some details are given for the theoretical bias concerning the Higgs mass limits, as well as on obscure features of the Standard Model which motivate new physics constructions.

  6. Quality model for semantic IS standards

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert

    2011-01-01

    Semantic IS (Information Systems) standards are essential for achieving interoperability between organizations. However, a recent survey suggests that the full benefits of standards are not being achieved, due to quality issues. This paper presents a quality model for semantic IS standards, that should

  7. A revisited standard solar model

    International Nuclear Information System (INIS)

    Casse, M.; Cahen, S.; Doom, C.

    1987-01-01

    Recent models of the Sun, including our own, based on canonical physics and featuring modern reaction rates and radiative opacities are presented. They lead to a presolar helium abundance in better agreement with the value found in the Orion nebula. Most models predict a neutrino counting rate greater than 6 SNU in the chlorine-argon detector, which is at least 3 times higher than the observed rate. The primordial helium abundance derived from the solar one, on the basis of recent models of helium production from the birth of the Galaxy to the birth of the sun, is significantly higher than the value inferred from observations of extragalactic metal-poor nebulae. This indicates that the stellar production of helium is probably underestimated by the models considered

  8. Modeling in the Common Core State Standards

    Science.gov (United States)

    Tam, Kai Chung

    2011-01-01

    The inclusion of modeling and applications into the mathematics curriculum has proven to be a challenging task over the last fifty years. The Common Core State Standards (CCSS) has made mathematical modeling both one of its Standards for Mathematical Practice and one of its Conceptual Categories. This article discusses the need for mathematical…

  9. Beyond the supersymmetric standard model

    Energy Technology Data Exchange (ETDEWEB)

    Hall, L.J.

    1988-02-01

    The possibility of baryon number violation at the weak scale and an alternative primordial nucleosynthesis scheme arising from the decay of gravitinos are discussed. The minimal low energy supergravity model is defined and a few of its features are described. Renormalization group scaling and flavor physics are mentioned.

  10. A revisited standard solar model

    International Nuclear Information System (INIS)

    Casse, M.; Cahen, S.; Doom, C.

    1985-09-01

    Recent models of the Sun, including our own, based on canonical physics and featuring modern reaction rates and radiative opacities are presented. They lead to a presolar helium abundance of approximately 0.28 by mass, at variance with the value of 0.25 proposed by Bahcall et al. (1982, 1985), but in better agreement with the value found in the Orion nebula. Most models predict a neutrino counting rate greater than 6 SNU in the chlorine-argon detector, which is at least 3 times higher than the observed rate. The primordial helium abundance derived from the solar one, on the basis of recent models of helium production from the birth of the Galaxy to the birth of the sun, Y_P ≈ 0.26, is significantly higher than the value inferred from observations of extragalactic metal-poor nebulae (Y ≈ 0.23). This indicates that the stellar production of helium is probably underestimated by the models considered

  11. Beyond the supersymmetric standard model

    International Nuclear Information System (INIS)

    Hall, L.J.

    1988-02-01

    The possibility of baryon number violation at the weak scale and an alternative primordial nucleosynthesis scheme arising from the decay of gravitinos are discussed. The minimal low energy supergravity model is defined and a few of its features are described. Renormalization group scaling and flavor physics are mentioned.

  12. Physics beyond the Standard Model

    Science.gov (United States)

    Lach, Theodore

    2011-04-01

    Recent discoveries of the excited states of the Bs** meson, along with the discovery of the Ω_b^-, have brought into popular acceptance the concept of orbiting quarks predicted by the Checker Board Model (CBM) 14 years ago. Back then the concept of orbiting quarks was not fashionable. Recent estimates of the velocities of these quarks inside the proton and neutron are in excess of 90% of the speed of light, also in agreement with the CBM. Still, a 2D structure of the nucleus has not been accepted, nor has it been proven wrong. The CBM predicts that the masses of the up and down quarks are 237.31 MeV and 42.392 MeV respectively, and suggests that a lighter generation of quarks, u and d, makes up a different generation that forms the light mesons. The CBM also predicts that the T' and B' quarks do exist and are not as massive as might be expected (this would make it a five-generation world, in conflict with the SM). The details of the CB model and its prediction of quark masses can be found at: http://checkerboard.dnsalias.net/ (1). T.M. Lach, Checkerboard Structure of the Nucleus, Infinite Energy, Vol. 5, issue 30, (2000). (2). T.M. Lach, Masses of the Sub-Nuclear Particles, nucl-th/0008026, @http://xxx.lanl.gov/.

  13. Beyond the Standard Model of Cosmology

    International Nuclear Information System (INIS)

    Ellis, John; Nanopoulos, D. V.

    2004-01-01

    Recent cosmological observations of unprecedented accuracy, by WMAP in particular, have established a 'Standard Model' of cosmology, just as LEP established the Standard Model of particle physics. Both Standard Models raise open questions whose answers are likely to be linked. The most fundamental problems in both particle physics and cosmology will be resolved only within a framework for Quantum Gravity, for which the only game in town is string theory. We discuss novel ways to model cosmological inflation and late acceleration in a non-critical string approach, and discuss possible astrophysical tests

  14. The minimal non-minimal standard model

    International Nuclear Information System (INIS)

    Bij, J.J. van der

    2006-01-01

    In this Letter I discuss a class of extensions of the standard model that have a minimal number of possible parameters, but can in principle explain dark matter and inflation. It is pointed out that the so-called new minimal standard model contains a large number of parameters that can be put to zero, without affecting the renormalizability of the model. With the extra restrictions one might call it the minimal (new) non-minimal standard model (MNMSM). A few hidden discrete variables are present. It is argued that the inflaton should be higher-dimensional. Experimental consequences for the LHC and the ILC are discussed

  15. Savannah River Site peer evaluator standards: Operator assessment for restart

    International Nuclear Information System (INIS)

    1990-01-01

    Savannah River Site has implemented a Peer Evaluator program for the assessment of certified Central Control Room Operators, Central Control Room Supervisors and Shift Technical Engineers prior to restart. This program is modeled after the Nuclear Regulatory Commission's (NRC's) Examiner Standard, ES-601, for the requalification of licensed operators in the commercial utility industry. It has been tailored to reflect the unique differences between Savannah River production reactors and commercial power reactors

  16. Control system architecture: The standard and non-standard models

    International Nuclear Information System (INIS)

    Thuot, M.E.; Dalesio, L.R.

    1993-01-01

    Control system architecture development has followed the advances in computer technology through mainframes to minicomputers to micros and workstations. This technology advance and increasingly challenging accelerator data acquisition and automation requirements have driven control system architecture development. In summarizing the progress of control system architecture at the last International Conference on Accelerator and Large Experimental Physics Control Systems (ICALEPCS) B. Kuiper asserted that the system architecture issue was resolved and presented a "standard model". The "standard model" consists of a local area network (Ethernet or FDDI) providing communication between front end microcomputers, connected to the accelerator, and workstations, providing the operator interface and computational support. Although this model represents many present designs, there are exceptions including reflected memory and hierarchical architectures driven by requirements for widely dispersed, large channel count or tightly coupled systems. This paper describes the performance characteristics and features of the "standard model" to determine if the requirements of "non-standard" architectures can be met. Several possible extensions to the "standard model" are suggested, including software as well as hardware architectural features.
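
    As a toy illustration of the "standard model" architecture described above (a minimal sketch only; the channel names, wire format and port are invented and this is not any actual control system toolkit), a front-end computer serving channel values to an operator workstation over the LAN could be mimicked as:

      import json
      import socket
      import threading
      import time

      CHANNELS = {"BPM01:X": 0.42, "QUAD03:CURRENT": 115.7}   # fake accelerator channels

      def front_end(port: int = 5055) -> None:
          """Front-end micro: answer one 'channel name' request with a JSON value."""
          with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
              srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
              srv.bind(("127.0.0.1", port))
              srv.listen(1)
              conn, _ = srv.accept()
              with conn:
                  name = conn.recv(1024).decode().strip()
                  conn.sendall(json.dumps({"channel": name, "value": CHANNELS.get(name)}).encode())

      def workstation(channel: str, port: int = 5055) -> dict:
          """Operator workstation: request one channel value from the front-end."""
          with socket.create_connection(("127.0.0.1", port)) as cli:
              cli.sendall(channel.encode())
              return json.loads(cli.recv(1024).decode())

      threading.Thread(target=front_end, daemon=True).start()
      time.sleep(0.2)                       # give the front-end time to start listening
      print(workstation("BPM01:X"))         # {'channel': 'BPM01:X', 'value': 0.42}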

  17. Electroweak baryogenesis and the standard model

    International Nuclear Information System (INIS)

    Huet, P.

    1994-01-01

    Electroweak baryogenesis is addressed within the context of the standard model of particle physics. Although the minimal standard model has the means of fulfilling the three Sakharov conditions, it falls short of explaining the making of the baryon asymmetry of the universe. In particular, it is demonstrated that the phase of the CKM mixing matrix is an insufficient source of CP violation. The shortcomings of the standard model could be bypassed by enlarging the symmetry breaking sector and adding a new source of CP violation

  18. Discrete symmetry breaking beyond the standard model

    NARCIS (Netherlands)

    Dekens, Wouter Gerard

    2015-01-01

    The current knowledge of elementary particles and their interactions is summarized in the Standard Model of particle physics. Practically all the predictions of this model, that have been tested, were confirmed experimentally. Nonetheless, there are phenomena which the model cannot explain. For

  19. Beyond the Standard Model (2/5)

    CERN Multimedia

    CERN. Geneva

    2000-01-01

    After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.

  20. Beyond the Standard Model (5/5)

    CERN Multimedia

    CERN. Geneva

    2000-01-01

    After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.

  1. Beyond the Standard Model (3/5)

    CERN Multimedia

    CERN. Geneva

    2000-01-01

    After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.

  2. Beyond the Standard Model (4/5)

    CERN Multimedia

    CERN. Geneva

    2000-01-01

    After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.

  3. The standard model in a nutshell

    CERN Document Server

    Goldberg, Dave

    2017-01-01

    For a theory as genuinely elegant as the Standard Model--the current framework describing elementary particles and their forces--it can sometimes appear to students to be little more than a complicated collection of particles and ranked list of interactions. The Standard Model in a Nutshell provides a comprehensive and uncommonly accessible introduction to one of the most important subjects in modern physics, revealing why, despite initial appearances, the entire framework really is as elegant as physicists say. Dave Goldberg uses a "just-in-time" approach to instruction that enables students to gradually develop a deep understanding of the Standard Model even if this is their first exposure to it. He covers everything from relativity, group theory, and relativistic quantum mechanics to the Higgs boson, unification schemes, and physics beyond the Standard Model. The book also looks at new avenues of research that could answer still-unresolved questions and features numerous worked examples, helpful illustrat...

  4. Is the Standard Model about to crater?

    CERN Multimedia

    Lane, Kenneth

    2015-01-01

    The Standard Model is coming under more and more pressure from experiments. New results from the analysis of LHC's Run 1 data show effects that, if confirmed, would be the signature of new interactions at the TeV scale.

  5. Beyond the Standard Model (1/5)

    CERN Multimedia

    CERN. Geneva

    2000-01-01

    After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.

  6. Assessing the predictive capability of optical imaging techniques, Spatial Frequency Domain Imaging (SFDI) and Laser Speckle Imaging (LSI), to the gold standard of clinical assessment in a controlled animal model

    Science.gov (United States)

    Ponticorvo, A.; Rowland, R.; Baldado, M.; Burmeister, D. M.; Christy, R. J.; Bernal, N.; Durkin, A. J.

    2018-02-01

    The current standard for assessment of burn severity and subsequent wound healing is through clinical examination, which is highly subjective. Accurate early assessment of burn severity is critical for dictating the course of wound management. Complicating matters is the fact that burn wounds are often large and can have multiple regions that vary in severity. In order to manage the treatment more effectively, a tool that can provide spatially resolved information related to mapping burn severity could aid clinicians when making decisions. Several new technologies focus on burn care in an attempt to help clinicians objectively determine burn severity. By quantifying perfusion, laser speckle imaging (LSI) has had success in categorizing burn wound severity at earlier time points than clinical assessment alone. Additionally, spatial frequency domain imaging (SFDI) is a new technique that can quantify the tissue structural damage associated with burns to achieve earlier categorization of burn severity. Here we compared the performance of a commercial LSI device (PeriCam PSI, Perimed Inc.), an SFDI device (Reflect RS™, Modulated Imaging Inc.) and conventional clinical assessment in a controlled (porcine) model of graded burn wound severity over the course of 28 days. Specifically, we focused on the ability of each system to predict the spatial heterogeneity of the healed wound at 28 days, based on the images at an early time point. Spatial heterogeneity was defined by clinical assessment of distinct regions of healing on day 28. Across six pigs, 96 burn wounds (3 cm diameter) were created. Clinical assessment at day 28 indicated that 39 had appeared to heal in a heterogeneous manner. Clinical observation at day 1 found 35 / 39 (90%) to be spatially heterogeneous in terms of burn severity. The LSI system was able to detect spatial heterogeneity of burn severity in 14 / 39 (36%) cases on day 1 and 23 / 39 cases (59%) on day 7. By contrast the SFDI system was able to

  7. From the standard model to dark matter

    International Nuclear Information System (INIS)

    Wilczek, F.

    1995-01-01

    The standard model of particle physics is marvelously successful. However, it is obviously not a complete or final theory. I shall argue here that the structure of the standard model gives some quite concrete, compelling hints regarding what lies beyond. Taking these hints seriously, one is led to predict the existence of new types of very weakly interacting matter, stable on cosmological time scales and produced with cosmologically interesting densities--that is, "dark matter". copyright 1995 American Institute of Physics

  8. Standard Model measurements with the ATLAS detector

    Directory of Open Access Journals (Sweden)

    Hassani Samira

    2015-01-01

    Various Standard Model measurements have been performed in proton-proton collisions at a centre-of-mass energy of √s = 7 and 8 TeV using the ATLAS detector at the Large Hadron Collider. A review of a selection of the latest results of electroweak measurements, W/Z production in association with jets, jet physics and soft QCD is given. Measurements are in general found to be well described by the Standard Model predictions.

  9. Standard Model Particles from Split Octonions

    Directory of Open Access Journals (Sweden)

    Gogberashvili M.

    2016-01-01

    We model physical signals using elements of the algebra of split octonions over the field of real numbers. Elementary particles correspond to special elements of the algebra that nullify the octonionic norm (zero divisors). It is shown that the standard model particle spectrum naturally follows from the classification of the independent primitive zero divisors of split octonions.
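
    To illustrate the notion of a zero divisor invoked above (a textbook example from the simpler split-complex numbers, not a construction taken from the paper): with a unit j obeying j² = +1, the norm N(a + b j) = a² − b² vanishes for a = ±b, and such null elements annihilate one another,

      (1 + j)(1 - j) \;=\; 1 - j^2 \;=\; 0, \qquad N(1 \pm j) \;=\; 1^2 - 1^2 \;=\; 0 .

    The split octonions carry an analogous norm of signature (4,4), and it is the richer family of their primitive null elements that the paper classifies.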

  10. Exploring the Standard Model of Particles

    Science.gov (United States)

    Johansson, K. E.; Watkins, P. M.

    2013-01-01

    With the recent discovery of a new particle at the CERN Large Hadron Collider (LHC), the Higgs boson could be about to be discovered. This paper provides a brief summary of the standard model of particle physics and the importance of the Higgs boson and field in that model for non-specialists. The role of Feynman diagrams in making predictions for…

  11. Noncommutative geometry and the standard model vacuum

    International Nuclear Information System (INIS)

    Barrett, John W.; Dawe Martins, Rachel A.

    2006-01-01

    The space of Dirac operators for the Connes-Chamseddine spectral action for the standard model of particle physics coupled to gravity is studied. The model is extended by including right-handed neutrino states, and the S⁰-reality axiom is not assumed. The possibility of allowing more general fluctuations than the inner fluctuations of the vacuum is proposed. The maximal case of all possible fluctuations is studied by considering the equations of motion for the vacuum. While there are interesting nontrivial vacua with Majorana-type mass terms for the leptons, the conclusion is that the equations are too restrictive to allow solutions with the standard model mass matrix

  12. Towards LHC physics with nonlocal Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Biswas, Tirthabir, E-mail: tbiswas@loyno.edu [Department of Physics, Loyola University, 6363 St. Charles Avenue, Box 92, New Orleans, LA 70118 (United States); Okada, Nobuchika, E-mail: okadan@ua.edu [Department of Physics and Astronomy, University of Alabama, Tuscaloosa, AL 35487-0324 (United States)

    2015-09-15

    We take a few steps towards constructing a string-inspired nonlocal extension of the Standard Model. We start by illustrating how quantum loop calculations can be performed in nonlocal scalar field theory. In particular, we show the potential to address the hierarchy problem in the nonlocal framework. Next, we construct a nonlocal abelian gauge model and derive modifications of the gauge interaction vertex and field propagators. We apply the modifications to a toy version of the nonlocal Standard Model and investigate collider phenomenology. We find the lower bound on the scale of nonlocality from the 8 TeV LHC data to be 2.5–3 TeV.
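
    For context (a schematic form commonly used in string-inspired nonlocal field theories; the exact form factor of this particular model may differ), the nonlocality is typically introduced by dressing the kinetic operator with an entire function of the d'Alembertian,

      \mathcal{L} \;\sim\; \frac{1}{2}\,\phi\, e^{-\Box/M^2}\left(\Box - m^2\right)\phi ,

    which exponentially suppresses propagators and loop integrands above the nonlocality scale M; it is this scale M that the 8 TeV LHC data are said above to bound at roughly 2.5-3 TeV.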

  13. Big Bang nucleosynthesis: The standard model

    International Nuclear Information System (INIS)

    Steigman, G.

    1989-01-01

    Current observational data on the abundances of deuterium, helium-3, helium-4 and lithium-7 are reviewed and these data are used to infer (or to bound) the primordial abundances of these elements. The physics of primordial nucleosynthesis in the context of the "standard" (isotropic, homogeneous, ...) hot big bang model is outlined and the primordial abundances predicted within the context of this model are presented. The theoretical predictions are then confronted with the observational data. This confrontation reveals the remarkable consistency of the standard model, constrains the nucleon abundance to lie within a narrow range and permits the existence of no more than one additional flavor of light neutrinos
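
    As a back-of-the-envelope illustration of the standard picture (a textbook estimate, not a number taken from this review): if essentially all neutrons present at the onset of nucleosynthesis end up bound in helium-4, the primordial helium mass fraction follows from the neutron-to-proton ratio n/p ≈ 1/7 as

      Y_p \;\simeq\; \frac{2\,(n/p)}{1 + (n/p)} \;\approx\; \frac{2/7}{8/7} \;=\; 0.25 ,

    close to the observationally inferred value; the detailed predictions confronted in the review refine this estimate as a function of the nucleon (baryon) abundance and the number of light neutrino flavors.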

  14. Assessment of Safety Standards for Automotive Electronic Control Systems

    Science.gov (United States)

    2016-06-01

    This report summarizes the results of a study that assessed and compared six industry and government safety standards relevant to the safety and reliability of automotive electronic control systems. These standards include ISO 26262 (Road Vehicles - ...

  15. Looking for physics beyond the standard model

    International Nuclear Information System (INIS)

    Binetruy, P.

    2002-01-01

    Motivations for new physics beyond the Standard Model are presented. The most successful and best motivated option, supersymmetry, is described in some detail, and the associated searches performed at LEP are reviewed. These include searches for additional Higgs bosons and for supersymmetric partners of the standard particles. These searches constrain the mass of the lightest supersymmetric particle which could be responsible for the dark matter of the universe. (authors)

  16. Setting technical standards for visual assessment procedures

    Science.gov (United States)

    Kenneth H. Craik; Nickolaus R. Feimer

    1979-01-01

    Under the impetus of recent legislative and administrative mandates concerning analysis and management of the landscape, governmental agencies are being called upon to adopt or develop visual resource and impact assessment (VRIA) systems. A variety of techniques that combine methods of psychological assessment and landscape analysis to serve these purposes is being...

  17. The Standard Model and Higgs physics

    Science.gov (United States)

    Torassa, Ezio

    2018-05-01

    The Standard Model is a consistent and computable theory that successfully describes the elementary particle interactions. The strong, electromagnetic and weak interactions have been included in the theory by exploiting the relation between group symmetries and group generators, in order to introduce the force carriers in an elegant way. The group properties lead to constraints between boson masses and couplings. All the measurements performed at LEP, the Tevatron, the LHC and other accelerators have confirmed the consistency of the Standard Model. A key element of the theory is the Higgs field, which, together with spontaneous symmetry breaking, gives mass to the vector bosons and to the fermions. Unlike the case of the vector bosons, the theory does not predict the Higgs boson mass. The LEP experiments, while providing very precise measurements of the Standard Model theory, searched for evidence of the Higgs boson until the year 2000. The discovery of the top quark in 1994 by the Tevatron experiments and of the Higgs boson in 2012 by the LHC experiments were considered the completion of the list of fundamental particles of the Standard Model theory. Nevertheless, neutrino oscillations, dark matter and the baryon asymmetry of the Universe are evidence that a new, extended model is needed. In the Standard Model there are also some unattractive theoretical aspects, such as the divergent loop corrections to the Higgs boson mass and the very small Yukawa couplings needed to describe the neutrino masses. For all these reasons, the hunt for discrepancies between the Standard Model and data is still going on, with the aim of finally describing the new, extended theory.
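
    As an example of the constraints between boson masses and couplings mentioned above (standard electroweak relations quoted for illustration, not derived in this lecture), spontaneous symmetry breaking with a Higgs vacuum expectation value v ≈ 246 GeV fixes

      M_W \;=\; \frac{g\,v}{2}, \qquad M_Z \;=\; \frac{\sqrt{g^2 + g'^2}\,v}{2} \;=\; \frac{M_W}{\cos\theta_W} ,

    so measuring the gauge couplings g and g' predicts the W/Z mass ratio, one of the consistency tests carried out at LEP, the Tevatron and the LHC; the Higgs boson mass itself, by contrast, enters only through the unpredicted self-coupling.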

  18. The Cosmological Standard Model and Its Implications for Beyond the Standard Model of Particle Physics

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    While the cosmological standard model has many notable successes, it assumes 95% of the mass-energy density of the universe is dark and of unknown nature, and there was an early stage of inflationary expansion driven by physics far beyond the range of the particle physics standard model. In the colloquium I will discuss potential particle-physics implications of the standard cosmological model.

  19. General formulation of the standard model: the standard model is in need of new concepts

    International Nuclear Information System (INIS)

    Khodjaev, L.Sh.

    2001-01-01

    The phenomenological basis for the formulation of the Standard Model is reviewed, and the Standard Model is formulated from its fundamental postulates. The concept of fundamental symmetries is introduced: the aim is to look not for fundamental particles but for fundamental symmetries. In searching for a more general theory it is natural to look first for global symmetries and then to study the consequences of localising these global symmetries, as is done in the Standard Model.

  20. LHC Higgs physics beyond the Standard Model

    International Nuclear Information System (INIS)

    Spannowsky, M.

    2007-01-01

    The Large Hadron Collider (LHC) at CERN will be able to perform proton collisions at a much higher center-of-mass energy and luminosity than any other collider. Its main purpose is to detect the Higgs boson, the last unobserved particle of the Standard Model, explaining the riddle of the origin of mass. Studies have shown that, for the whole allowed range of the Higgs mass, processes exist to detect the Higgs at the LHC. However, the Standard Model cannot be a theory of everything and is not able to provide a complete understanding of physics. It is at most an effective theory up to a presently unknown energy scale. Hence, extensions of the Standard Model are necessary which can affect the Higgs-boson signals. We discuss these effects in two popular extensions of the Standard Model: the Minimal Supersymmetric Standard Model (MSSM) and the Standard Model with four generations (SM4G). Constraints on these models come predominantly from flavor physics and electroweak precision measurements. We show that the SM4G is still viable and that a fourth generation has strong impact on decay and production processes of the Higgs boson. Furthermore, we study the charged Higgs boson in the MSSM, yielding a clear signal for physics beyond the Standard Model. For small tan β in minimal flavor violation (MFV), no processes for the detection of a charged Higgs boson exist at the LHC. However, MFV is motivated only by the experimental agreement of flavor-physics results with Standard Model predictions, not by any basic theoretical consideration. In this thesis, we calculate charged Higgs boson production cross sections beyond the assumption of MFV, where a large number of free parameters is present in the MSSM. We find that the soft-breaking parameters which enhance the charged-Higgs boson production most are just bound to large values, e.g. by rare B-meson decays. Although the charged-Higgs boson cross sections beyond MFV turn out to be sizeable, only a detailed

  1. LHC Higgs physics beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Spannowsky, M.

    2007-09-22

    The Large Hadron Collider (LHC) at CERN will be able to perform proton collisions at a much higher center-of-mass energy and luminosity than any other collider. Its main purpose is to detect the Higgs boson, the last unobserved particle of the Standard Model, explaining the riddle of the origin of mass. Studies have shown that, for the whole allowed range of the Higgs mass, processes exist to detect the Higgs at the LHC. However, the Standard Model cannot be a theory of everything and is not able to provide a complete understanding of physics. It is at most an effective theory up to a presently unknown energy scale. Hence, extensions of the Standard Model are necessary which can affect the Higgs-boson signals. We discuss these effects in two popular extensions of the Standard Model: the Minimal Supersymmetric Standard Model (MSSM) and the Standard Model with four generations (SM4G). Constraints on these models come predominantly from flavor physics and electroweak precision measurements. We show that the SM4G is still viable and that a fourth generation has strong impact on decay and production processes of the Higgs boson. Furthermore, we study the charged Higgs boson in the MSSM, yielding a clear signal for physics beyond the Standard Model. For small tan β in minimal flavor violation (MFV), no processes for the detection of a charged Higgs boson exist at the LHC. However, MFV is motivated only by the experimental agreement of flavor-physics results with Standard Model predictions, not by any basic theoretical consideration. In this thesis, we calculate charged Higgs boson production cross sections beyond the assumption of MFV, where a large number of free parameters is present in the MSSM. We find that the soft-breaking parameters which enhance the charged-Higgs boson production most are just bound to large values, e.g. by rare B-meson decays. Although the charged-Higgs boson cross sections beyond MFV turn out to be sizeable, only a detailed

  2. A solar neutrino loophole: standard solar models

    Energy Technology Data Exchange (ETDEWEB)

    Rouse, C A [General Atomic Co., San Diego, Calif. (USA)]

    1975-11-01

    The salient aspects of the existence theorem for a unique solution to a system of linear or nonlinear first-order ordinary differential equations are given and applied to the equilibrium stellar structure equations. It is shown that values of pressure, temperature, mass and luminosity are needed at one point - and for the sun, the logical point is the solar radius. It is concluded that, since standard solar model calculations use split boundary conditions, a solar neutrino loophole still remains: solar model calculations that seek to satisfy the necessary condition for a unique solution to the solar structure equations suggest a solar interior quite different from that deduced in standard models. This, in turn, suggests a theory of formation and solar evolution significantly different from the standard theory.
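
    The distinction drawn above between conditions imposed at one point and split boundary conditions can be illustrated with a toy equation (not a stellar model). The sketch below, using entirely hypothetical values, solves the same ODE once as an initial value problem and once by a shooting iteration to satisfy a condition at the far end.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Toy illustration (not a solar model): y'' = -y on [0, 1].
def rhs(t, z):
    y, v = z
    return [v, -y]

# One-point conditions: y(0) = 0 and y'(0) = 1 are both given at t = 0,
# so a single integration determines the unique solution directly.
one_point = solve_ivp(rhs, (0.0, 1.0), [0.0, 1.0])
print(f"IVP value at t = 1: {one_point.y[0, -1]:.4f}")

# Split conditions: y(0) = 0 and y(1) = 1 are given at different points,
# so the unknown slope at t = 0 must be found iteratively (shooting).
def end_miss(slope):
    sol = solve_ivp(rhs, (0.0, 1.0), [0.0, slope])
    return sol.y[0, -1] - 1.0

slope = brentq(end_miss, 0.1, 5.0)
print(f"slope recovered by shooting: {slope:.4f} (exact 1/sin(1) = {1/np.sin(1):.4f})")
```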

  3. Standard model Higgs physics at colliders

    International Nuclear Information System (INIS)

    Rosca, A.

    2007-01-01

    In this report we briefly review the experimental status and prospects to verify the Higgs mechanism of spontaneous symmetry breaking. The focus is on the most relevant aspects of the phenomenology of the Standard Model Higgs boson at current (Tevatron) and future (Large Hadron Collider, LHC and International Linear Collider, ILC) particle colliders. We review the Standard Model searches: searches at the Tevatron, the program planned at the LHC and prospects at the ILC. Emphasis is put on what follows after a candidate discovery at the LHC: the various measurements which are necessary to precisely determine what the properties of this Higgs candidate are. (author)

  4. Physical Activity Stories: Assessing the "Meaning Standard" in Physical Education

    Science.gov (United States)

    Johnson, Tyler G.

    2016-01-01

    The presence of the "meaning standard" in both national and state content standards suggests that professionals consider it an important outcome of a quality physical education program. However, only 10 percent of states require an assessment to examine whether students achieve this standard. The purpose of this article is to introduce…

  5. Standard Model mass spectrum in inflationary universe

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xingang [Institute for Theory and Computation, Harvard-Smithsonian Center for Astrophysics,60 Garden Street, Cambridge, MA 02138 (United States); Wang, Yi [Department of Physics, The Hong Kong University of Science and Technology,Clear Water Bay, Kowloon, Hong Kong (China); Xianyu, Zhong-Zhi [Center of Mathematical Sciences and Applications, Harvard University,20 Garden Street, Cambridge, MA 02138 (United States)

    2017-04-11

    We work out the Standard Model (SM) mass spectrum during inflation with quantum corrections, and explore its observable consequences in the squeezed limit of non-Gaussianity. Both non-Higgs and Higgs inflation models are studied in detail. We also illustrate how some inflationary loop diagrams can be computed neatly by Wick-rotating the inflation background to Euclidean signature and by dimensional regularization.

  6. Standard Model Effective Potential from Trace Anomalies

    Directory of Open Access Journals (Sweden)

    Renata Jora

    2018-01-01

    By analogy with the low energy QCD effective linear sigma model, we construct a standard model effective potential based entirely on the requirement that the tree level and quantum level trace anomalies must be satisfied. We discuss a particular realization of this potential in connection with the Higgs boson mass and Higgs boson effective couplings to two photons and two gluons. We find that this kind of potential may describe well the known phenomenology of the Higgs boson.

  7. Scale gauge symmetry and the standard model

    International Nuclear Information System (INIS)

    Sola, J.

    1990-01-01

    This paper speculates on a version of the standard model of the electroweak and strong interactions coupled to gravity and equipped with a spontaneously broken, anomalous, conformal gauge symmetry. The scalar sector is virtually absent in the minimal model but in the general case it shows up in the form of a nonlinear harmonic map Lagrangian. A Euclidean approach to the cosmological constant problem is also addressed in this framework

  8. Theorists reject challenge to standard model

    CERN Multimedia

    Adam, D

    2001-01-01

    Particle physicists are questioning results that appear to violate the Standard Model. There are concerns that there is not sufficient statistical significance and also charges that the comparison is being made with the 'most convenient' theoretical value for the muon's magnetic moment (1 page).

  9. Precision tests of the Standard Model

    International Nuclear Information System (INIS)

    Ol'shevskij, A.G.

    1996-01-01

    The present status of the precision measurements of electroweak observables is discussed, with special emphasis on the results obtained recently. Altogether, these measurements provide the basis for a stringent test of the Standard Model and the determination of the SM parameters. 22 refs., 23 figs., 11 tabs

  10. Standard Model at the LHC 2017

    CERN Document Server

    2017-01-01

    The SM@LHC 2017 conference will be held May 2-5, 2017 at Nikhef, Amsterdam. The meeting aims to bring together experimentalists and theorists to discuss the phenomenology, observational results and theoretical tools for Standard Model physics at the LHC.

  11. Introduction to physics beyond the Standard Model

    CERN Document Server

    Giudice, Gian Francesco

    1998-01-01

    These lectures will give an introductory review of the main ideas behind the attempts to extend the standard-model description of elementary particle interactions. After analysing the conceptual motivations that lead us to believe in the existence of an underlying fundamental theory, we will discuss the present status of various theoretical constructs: grand unification, supersymmetry and technicolour.

  12. Is the standard model really tested?

    International Nuclear Information System (INIS)

    Takasugi, E.

    1989-01-01

    It is discussed how the standard model is really tested. Among various tests, I concentrate on CP violation phenomena in the K and B meson systems. In particular, the recent hope to overcome the theoretical uncertainty in the evaluation of the CP violation in the K meson system is discussed. (author)

  13. Accidentally safe extensions of the Standard Model

    CERN Document Server

    Di Luzio, Luca; Kamenik, Jernej F.; Nardecchia, Marco

    2015-01-01

    We discuss a class of weak-scale extensions of the Standard Model which is completely invisible to low-energy indirect probes. The typical signature of this scenario is the existence of new charged and/or colored states which are stable on the scale of high-energy particle detectors.

  14. Asymptotically Safe Standard Model via Vectorlike Fermions

    Science.gov (United States)

    Mann, R. B.; Meffe, J. R.; Sannino, F.; Steele, T. G.; Wang, Z. W.; Zhang, C.

    2017-12-01

    We construct asymptotically safe extensions of the standard model by adding gauged vectorlike fermions. Using large number-of-flavor techniques we argue that all gauge couplings, including the hypercharge and, under certain conditions, the Higgs coupling, can achieve an interacting ultraviolet fixed point.

  15. Inflation in the standard cosmological model

    Science.gov (United States)

    Uzan, Jean-Philippe

    2015-12-01

    The inflationary paradigm is now part of the standard cosmological model as a description of its primordial phase. While its original motivation was to solve the standard problems of the hot big bang model, it was soon understood that it offers a natural theory for the origin of the large-scale structure of the universe. Most models rely on a slow-rolling scalar field and enjoy very generic predictions. Besides, all the matter of the universe is produced by the decay of the inflaton field at the end of inflation during a phase of reheating. These predictions can be (and are) tested from their imprint on the large-scale structure and in particular the cosmic microwave background. Inflation stands as a window in physics where both general relativity and quantum field theory are at work and which can be observationally studied. It connects cosmology with high-energy physics. Today most models are constructed within extensions of the standard model, such as supersymmetry or string theory. Inflation also disrupts our vision of the universe, in particular with the ideas of chaotic inflation and eternal inflation that tend to promote the image of a very inhomogeneous universe with fractal structure on a large scale. This idea is also at the heart of further speculations, such as the multiverse. This introduction summarizes the connections between inflation and the hot big bang model and details the basics of its dynamics and predictions.

  16. Teacher Assessment Literacy: A Review of International Standards and Measures

    Science.gov (United States)

    DeLuca, Christopher; LaPointe-McEwan, Danielle; Luhanga, Ulemu

    2016-01-01

    Assessment literacy is a core professional requirement across educational systems. Hence, measuring and supporting teachers' assessment literacy have been a primary focus over the past two decades. At present, there are a multitude of assessment standards across the world and numerous assessment literacy measures that represent different…

  17. Research Notes - Openness and Evolvability - Standards Assessment

    Science.gov (United States)

    2016-08-01

    an unfair advantage. The company not only has the opportunity to be faster to market, but can also impose a level of control on its competitors...The independence of different vendors' implementations must be carefully assessed to ensure a monopolistic or oligopolistic condition does not exist...political affiliation they may have with other implementation vendors. However, this is unlikely to be practical in markets where the customer is not a

  18. Primordial nucleosynthesis: Beyond the standard model

    International Nuclear Information System (INIS)

    Malaney, R.A.

    1991-01-01

    Non-standard primordial nucleosynthesis merits continued study for several reasons. First and foremost are the important implications determined from primordial nucleosynthesis regarding the composition of the matter in the universe. Second, the production and the subsequent observation of the primordial isotopes are the most direct experimental link with the early (t ≲ 1 sec) universe. Third, studies of primordial nucleosynthesis allow for important, and otherwise unattainable, constraints on many aspects of particle physics. Finally, there is tentative evidence which suggests that the Standard Big Bang (SBB) model is incorrect in that it cannot reproduce the inferred primordial abundances for a single value of the baryon-to-photon ratio. Reviewed here are some aspects of non-standard primordial nucleosynthesis which mostly overlap with the author's own personal interests. He begins with a short discussion of the SBB nucleosynthesis theory, highlighting some recent related developments. Next he discusses how recent observations of helium and lithium abundances may indicate looming problems for the SBB model. He then discusses how the QCD phase transition, neutrinos, and cosmic strings can influence primordial nucleosynthesis. He concludes with a short discussion of the multitude of other non-standard nucleosynthesis models found in the literature, and makes some comments on possible progress in the future. 58 refs., 7 figs., 2 tabs

  19. Standard Model Higgs Searches at the Tevatron

    Energy Technology Data Exchange (ETDEWEB)

    Knoepfel, Kyle J.

    2012-06-01

    We present results from the search for a standard model Higgs boson using a data sample corresponding to up to 10 fb⁻¹ of proton-antiproton collisions produced by the Fermilab Tevatron at a center-of-mass energy of 1.96 TeV. The data were recorded by the CDF and D0 detectors between March 2001 and September of 2011. A broad excess is observed between 105 < m_H < 145 GeV/c² with a global significance of 2.2 standard deviations relative to the background-only hypothesis.
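
    As a side note for readers less familiar with the significance language, a quoted global significance can be translated into a tail probability with a standard normal integral. The short sketch below is illustrative only, reusing the 2.2 standard deviations quoted above.

```python
from scipy.stats import norm

significance = 2.2                 # global significance quoted above, in standard deviations
p_value = norm.sf(significance)    # one-sided Gaussian tail probability
print(f"{significance} sigma corresponds to p ≈ {p_value:.3f}")   # roughly 0.014
```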

  20. Beyond the standard model at Tevatron

    International Nuclear Information System (INIS)

    Pagliarone, C.

    2000-01-01

    Tevatron experiments have performed extensive searches for physics beyond the Standard Model. No positive results have been found so far, showing that the data are consistent with the SM expectations. CDF and D0 continue the analysis of Run I data, placing limits on new physics, including supersymmetry, large space-time dimensions and leptoquark models. With the Run II upgrades, providing a higher acceptance and higher luminosity, it will be possible to make important progress in the search for new phenomena as well as in setting limits on a larger variety of theoretical models

  1. Study on Standard Fatigue Vehicle Load Model

    Science.gov (United States)

    Huang, H. Y.; Zhang, J. P.; Li, Y. H.

    2018-02-01

    Based on measured truck data from three arterial expressways in Guangdong Province, a statistical analysis of truck weight was conducted according to axle number. A standard fatigue vehicle model applicable to industrial areas in the middle and late stages was obtained, using the equivalent-damage principle, Miner's linear damage accumulation law, the water discharge method and damage ratio theory. Compared with the fatigue vehicle model specified by the current bridge design code, the proposed model has better applicability. It is of reference value for the fatigue design of bridges in China.
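
    The record above does not reproduce the formulas, but the equivalent-damage idea behind a standard fatigue vehicle can be sketched as follows. The S-N slope m, the weight bins and the frequencies in this snippet are hypothetical placeholders, not values from the study.

```python
# Hypothetical sketch of a damage-equivalent vehicle weight computed from a
# measured gross-weight spectrum via Miner's linear damage accumulation and
# an assumed S-N curve slope m. All numbers below are illustrative only.

def equivalent_vehicle_weight(weights, counts, m=5.0):
    """Gross weight of a single vehicle causing the same fatigue damage per passage."""
    total = sum(counts)
    weighted_damage = sum(n * w**m for w, n in zip(weights, counts))
    return (weighted_damage / total) ** (1.0 / m)

weights = [150, 250, 350, 450, 550]      # gross-weight bin centres (kN), hypothetical
counts = [1200, 800, 400, 150, 50]       # observed passages per bin, hypothetical

print(f"equivalent fatigue vehicle weight: {equivalent_vehicle_weight(weights, counts):.1f} kN")
```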

  2. Standard model beyond the TeV

    International Nuclear Information System (INIS)

    Aurenche, P.

    1987-01-01

    The phenomenology of the standard model in hadronic reactions in the 10 TeV range is described. Since the model's predictions for hadronic cross sections are based on the parton model, we first discuss the behaviour of the structure functions at the low values of x (x > 10⁻⁴) attained at these energies, and we show that solving the leading-logarithm evolution equations allows us to calculate them. The production of W and Z gauge bosons and of gauge-boson pairs is reviewed. Higgs boson production is discussed in detail as a function of its mass value.

  3. Assessing the Genetics Content in the Next Generation Science Standards.

    Directory of Open Access Journals (Sweden)

    Katherine S Lontok

    Science standards have a long history in the United States and currently form the backbone of efforts to improve primary and secondary education in science, technology, engineering, and math (STEM). Although there has been much political controversy over the influence of standards on teacher autonomy and student performance, little light has been shed on how well standards cover science content. We assessed the coverage of genetics content in the Next Generation Science Standards (NGSS) using a consensus list of American Society of Human Genetics (ASHG) core concepts. We also compared the NGSS against state science standards. Our goals were to assess the potential of the new standards to support genetic literacy and to determine if they improve the coverage of genetics concepts relative to state standards. We found that expert reviewers cannot identify ASHG core concepts within the new standards with high reliability, suggesting that the scope of content addressed by the standards may be inconsistently interpreted. Given results that indicate that the disciplinary core ideas (DCIs) included in the NGSS documents produced by Achieve, Inc. clarify the content covered by the standards statements themselves, we recommend that the NGSS standards statements always be viewed alongside their supporting disciplinary core ideas. In addition, gaps exist in the coverage of essential genetics concepts, most worryingly concepts dealing with patterns of inheritance, both Mendelian and complex. Finally, state standards vary widely in their coverage of genetics concepts when compared with the NGSS. On average, however, the NGSS support genetic literacy better than extant state standards.

  4. Assessing the Genetics Content in the Next Generation Science Standards.

    Science.gov (United States)

    Lontok, Katherine S; Zhang, Hubert; Dougherty, Michael J

    2015-01-01

    Science standards have a long history in the United States and currently form the backbone of efforts to improve primary and secondary education in science, technology, engineering, and math (STEM). Although there has been much political controversy over the influence of standards on teacher autonomy and student performance, little light has been shed on how well standards cover science content. We assessed the coverage of genetics content in the Next Generation Science Standards (NGSS) using a consensus list of American Society of Human Genetics (ASHG) core concepts. We also compared the NGSS against state science standards. Our goals were to assess the potential of the new standards to support genetic literacy and to determine if they improve the coverage of genetics concepts relative to state standards. We found that expert reviewers cannot identify ASHG core concepts within the new standards with high reliability, suggesting that the scope of content addressed by the standards may be inconsistently interpreted. Given results that indicate that the disciplinary core ideas (DCIs) included in the NGSS documents produced by Achieve, Inc. clarify the content covered by the standards statements themselves, we recommend that the NGSS standards statements always be viewed alongside their supporting disciplinary core ideas. In addition, gaps exist in the coverage of essential genetics concepts, most worryingly concepts dealing with patterns of inheritance, both Mendelian and complex. Finally, state standards vary widely in their coverage of genetics concepts when compared with the NGSS. On average, however, the NGSS support genetic literacy better than extant state standards.

  5. Performance Standards: Utility for Different Uses of Assessments

    Directory of Open Access Journals (Sweden)

    Robert L. Linn

    2003-09-01

    Performance standards are arguably one of the most controversial topics in educational measurement. There are uses of assessments such as licensure and certification where performance standards are essential. There are many other uses, however, where performance standards have been mandated or become the preferred method of reporting assessment results where the standards are not essential to the use. Distinctions between essential and nonessential uses of performance standards are discussed. It is argued that the insistence on reporting in terms of performance standards in situations where they are not essential has been more harmful than helpful. Variability in the definitions of proficient academic achievement by states for purposes of the No Child Left Behind Act of 2001 is discussed and it is argued that the variability is so great that characterizing achievement is meaningless. Illustrations of the great uncertainty in standards are provided.

  6. Anomalous Abelian symmetry in the standard model

    International Nuclear Information System (INIS)

    Ramond, P.

    1995-01-01

    The observed hierarchy of quark and lepton masses can be parametrized by nonrenormalizable operators with dimensions determined by an anomalous Abelian family symmetry, a gauge extension to the minimal supersymmetric standard model. Such an Abelian symmetry is generic to compactified superstring theories, with its anomalies compensated by the Green-Schwarz mechanism. If we assume these two symmetries to be the same, we find the electroweak mixing angle to be sin²θ_W = 3/8 at the string scale, just by setting the ratio of the product of down quark to charged lepton masses equal to one at the string scale. This assumes no GUT structure. The generality of the result suggests a superstring origin for the standard model. We generalize our analysis to massive neutrinos, and mixings in the lepton sector

  7. Higgs triplets in the standard model

    International Nuclear Information System (INIS)

    Gunion, J.F.; Vega, R.; Wudka, J.

    1990-01-01

    Even though the standard model of the strong and electroweak interactions has proven enormously successful, it need not be the case that a single Higgs-doublet field is responsible for giving masses to the weakly interacting vector bosons and the fermions. In this paper we explore the phenomenology of a Higgs sector for the standard model which contains both doublet and triplet fields [under SU(2)_L]. The resulting Higgs bosons have many exotic features and surprising experimental signatures. Since a critical task of future accelerators will be to either discover or establish the nonexistence of Higgs bosons with mass below the TeV scale, it will be important to keep in mind the alternative possibilities characteristic of this and other nonminimal Higgs sectors

  8. Superconnections: an interpretation of the standard model

    Directory of Open Access Journals (Sweden)

    Gert Roepstorff

    2000-07-01

    The mathematical framework of superbundles as pioneered by D. Quillen suggests that one consider the Higgs field as a natural constituent of a superconnection. I propose to take as superbundle the exterior algebra obtained from a Hermitian vector bundle of rank n, where n=2 for the electroweak theory and n=5 for the full Standard Model. The present setup is similar to but avoids the use of non-commutative geometry.

  9. Neutrons and the new Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Ramsey-Musolf, M.J., E-mail: mjrm@physics.wisc.ed [Department of Physics, University of Wisconsin-Madison, Madison, WI 53706 (United States); Kellogg Radiation Laboratory, California Institute of Technology, Pasadena, CA 91125 (United States)

    2009-12-11

    Fundamental symmetry tests with neutrons can provide unique information about whatever will be the new Standard Model of fundamental interactions. I review two aspects of this possibility: searches for the permanent electric dipole moment of the neutron and its relation to the origin of baryonic matter, and precision studies of neutron decay that can probe new symmetries. I discuss the complementarity of these experiments with other low-energy precision tests and high energy collider searches for new physics.

  10. Beyond the standard model in many directions

    Energy Technology Data Exchange (ETDEWEB)

    Chris Quigg

    2004-04-28

    These four lectures constitute a gentle introduction to what may lie beyond the standard model of quarks and leptons interacting through SU(3)_C ⊗ SU(2)_L ⊗ U(1)_Y gauge bosons, prepared for an audience of graduate students in experimental particle physics. In the first lecture, I introduce a novel graphical representation of the particles and interactions, the double simplex, to elicit questions that motivate our interest in physics beyond the standard model, without recourse to equations and formalism. Lecture 2 is devoted to a short review of the current status of the standard model, especially the electroweak theory, which serves as the point of departure for our explorations. The third lecture is concerned with unified theories of the strong, weak, and electromagnetic interactions. In the fourth lecture, I survey some attempts to extend and complete the electroweak theory, emphasizing some of the promise and challenges of supersymmetry. A short concluding section looks forward.

  11. Standardized assessment of infrared thermographic fever screening system performance

    Science.gov (United States)

    Ghassemi, Pejhman; Pfefer, Joshua; Casamento, Jon; Wang, Quanzeng

    2017-03-01

    Thermal modalities represent the only currently viable mass fever screening approach for outbreaks of infectious disease pandemics such as Ebola and SARS. Non-contact infrared thermometers (NCITs) and infrared thermographs (IRTs) have been previously used for mass fever screening in transportation hubs such as airports to reduce the spread of disease. While NCITs remain a more popular choice for fever screening in the field and at fixed locations, there has been increasing evidence in the literature that IRTs can provide greater accuracy in estimating core body temperature if appropriate measurement practices are applied - including the use of technically suitable thermographs. Therefore, the purpose of this study was to develop a battery of evaluation test methods for standardized, objective and quantitative assessment of thermograph performance characteristics critical to assessing suitability for clinical use. These factors include stability, drift, uniformity, minimum resolvable temperature difference, and accuracy. Two commercial IRT models were characterized. An external temperature reference source with high temperature accuracy was utilized as part of the screening thermograph. Results showed that both IRTs are relatively accurate and stable (<1% error of reading with stability of +/-0.05°C). Overall, results of this study may facilitate development of standardized consensus test methods to enable consistent and accurate use of IRTs for fever screening.
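
    By way of illustration only (the study's actual test protocols and data are not reproduced here), characteristics such as accuracy error, drift and stability of a thermograph viewing a constant-temperature blackbody reference could be summarized along the following lines; the readings and the 37.00 °C set point are hypothetical.

```python
import statistics

# Hypothetical readings (°C) of an infrared thermograph viewing a blackbody
# reference held at 37.00 °C, one sample per minute over a 30-minute period.
reference_temp = 37.00
readings = [36.90, 36.93, 36.96, 36.98, 36.99] + [37.00 + 0.01 * (i % 3) for i in range(25)]

accuracy_error = statistics.mean(readings) - reference_temp   # mean offset from the reference
drift = readings[-1] - readings[0]                            # change over the whole period
stability = max(readings[-10:]) - min(readings[-10:])         # spread once readings have settled

print(f"accuracy error: {accuracy_error:+.3f} °C")
print(f"drift:          {drift:+.3f} °C")
print(f"stability:      {stability:.3f} °C")
```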

  12. Beyond standard model calculations with Sherpa

    Energy Technology Data Exchange (ETDEWEB)

    Hoeche, Stefan [SLAC National Accelerator Laboratory, Menlo Park, CA (United States); Kuttimalai, Silvan [Durham University, Institute for Particle Physics Phenomenology, Durham (United Kingdom); Schumann, Steffen [Universitaet Goettingen, II. Physikalisches Institut, Goettingen (Germany); Siegert, Frank [Institut fuer Kern- und Teilchenphysik, TU Dresden, Dresden (Germany)

    2015-03-01

    We present a fully automated framework as part of the Sherpa event generator for the computation of tree-level cross sections in Beyond Standard Model scenarios, making use of model information given in the Universal FeynRules Output format. Elementary vertices are implemented into C++ code automatically and provided to the matrix-element generator Comix at runtime. Widths and branching ratios for unstable particles are computed from the same building blocks. The corresponding decays are simulated with spin correlations. Parton showers, QED radiation and hadronization are added by Sherpa, providing a full simulation of arbitrary BSM processes at the hadron level. (orig.)

  13. Fitting Simpson's neutrino into the standard model

    International Nuclear Information System (INIS)

    Valle, J.W.F.

    1985-01-01

    I show how to accommodate the 17 keV state recently reported by Simpson as one of the neutrinos of the standard model. Experimental constraints can only be satisfied if the muon and tau neutrinos combine, to a very good approximation, to form a Dirac neutrino of 17 keV, leaving a light ν_e. Neutrino oscillations will provide the most stringent test of the model. The cosmological bounds are also satisfied in a natural way in models with Goldstone bosons. Explicit examples are given in the framework of majoron-type models. Constraints on the lepton symmetry breaking scale which follow from astrophysics, cosmology and laboratory experiments are discussed. (orig.)

  14. Experimentally testing the standard cosmological model

    Energy Technology Data Exchange (ETDEWEB)

    Schramm, D.N. (Chicago Univ., IL (USA) Fermi National Accelerator Lab., Batavia, IL (USA))

    1990-11-01

    The standard model of cosmology, the big bang, is now being tested and confirmed to remarkable accuracy. Recent high precision measurements relate to the microwave background and big bang nucleosynthesis. This paper focuses on the latter since that relates more directly to high energy experiments. In particular, the recent LEP (and SLC) results on the number of neutrinos are discussed as a positive laboratory test of the standard cosmology scenario. Discussion is presented on the improved light element observational data as well as the improved neutron lifetime data. Alternate nucleosynthesis scenarios of decaying matter or of quark-hadron induced inhomogeneities are discussed. It is shown that when these scenarios are made to fit the observed abundances accurately, the resulting conclusions on the baryonic density relative to the critical density, Ω_b, remain approximately the same as in the standard homogeneous case, thus adding to the robustness of the standard model conclusion that Ω_b ∼ 0.06. This latter point is the driving force behind the need for non-baryonic dark matter (assuming Ω_total = 1) and the need for dark baryonic matter, since Ω_visible < Ω_b. Recent accelerator constraints on non-baryonic matter are discussed, showing that any massive cold dark matter candidate must now have a mass M_x ≳ 20 GeV and an interaction weaker than the Z⁰ coupling to a neutrino. It is also noted that recent hints regarding the solar neutrino experiments coupled with the see-saw model for ν-masses may imply that the ν_τ is a good hot dark matter candidate. 73 refs., 5 figs.

  15. Experimentally testing the standard cosmological model

    International Nuclear Information System (INIS)

    Schramm, D.N.

    1990-11-01

    The standard model of cosmology, the big bang, is now being tested and confirmed to remarkable accuracy. Recent high precision measurements relate to the microwave background and big bang nucleosynthesis. This paper focuses on the latter since that relates more directly to high energy experiments. In particular, the recent LEP (and SLC) results on the number of neutrinos are discussed as a positive laboratory test of the standard cosmology scenario. Discussion is presented on the improved light element observational data as well as the improved neutron lifetime data. Alternate nucleosynthesis scenarios of decaying matter or of quark-hadron induced inhomogeneities are discussed. It is shown that when these scenarios are made to fit the observed abundances accurately, the resulting conclusions on the baryonic density relative to the critical density, Ω_b, remain approximately the same as in the standard homogeneous case, thus adding to the robustness of the standard model conclusion that Ω_b ∼ 0.06. This latter point is the driving force behind the need for non-baryonic dark matter (assuming Ω_total = 1) and the need for dark baryonic matter, since Ω_visible < Ω_b. Recent accelerator constraints on non-baryonic matter are discussed, showing that any massive cold dark matter candidate must now have a mass M_x ≳ 20 GeV and an interaction weaker than the Z⁰ coupling to a neutrino. It is also noted that recent hints regarding the solar neutrino experiments coupled with the see-saw model for ν-masses may imply that the ν_τ is a good hot dark matter candidate. 73 refs., 5 figs

  16. TDA Assessment of Recommendations for Space Data System Standards

    Science.gov (United States)

    Posner, E. C.; Stevens, R.

    1984-01-01

    NASA is participating in the development of international standards for space data systems. Recommendations for standards thus far developed are assessed. The proposed standards for telemetry coding and packet telemetry provide worthwhile benefit to the DSN; their cost impact to the DSN should be small. Because of their advantage to the NASA space exploration program, their adoption should be supported by TDA, JPL, and OSTDS.

  17. Peer Review of Assessment Network: Supporting Comparability of Standards

    Science.gov (United States)

    Booth, Sara; Beckett, Jeff; Saunders, Cassandra

    2016-01-01

    Purpose: This paper aims to test the need in the Australian higher education (HE) sector for a national network for the peer review of assessment in response to the proposed HE standards framework and propose a sector-wide framework for calibrating and assuring achievement standards, both within and across disciplines, through the establishment of…

  18. Assessing changes in drought characteristics with standardized indices

    Science.gov (United States)

    Vidal, Jean-Philippe; Najac, Julien; Martin, Eric; Franchistéguy, Laurent; Soubeyroux, Jean-Michel

    2010-05-01

    Standardized drought indices like the Standardized Precipitation Index (SPI) are more and more frequently adopted for drought reconstruction, monitoring and forecasting, and the SPI has recently been recommended by the World Meteorological Organization to characterize meteorological droughts. Such indices are based on the statistical distribution of a hydrometeorological variable (e.g., precipitation) in a given reference climate, and a drought event is defined as a period with continuously negative index values. Because of the way these indices are constructed, some issues may arise when using them in a non-stationary climate. This work thus aims at highlighting such issues and demonstrating the different ways these indices may - or may not - be applied and interpreted in the context of an anthropogenic climate change. Three major points are detailed through examples taken from both a high-resolution gridded reanalysis dataset over France and transient projections from the ARPEGE general circulation model downscaled over France. The first point deals with the choice of the reference climate, and more specifically its type (from observations/reanalysis or from present-day modelled climate) and its record period. Second, the interpretation of actual changes is closely linked with the type of the selected drought feature over a future period: mean index value, under-threshold frequency, or drought event characteristics (number, mean duration and magnitude, seasonality, etc.). Finally, applicable approaches as well as related uncertainties depend on the availability of data from a future climate, whether in the form of a fully transient time series from present-day or only a future time slice. The projected evolution of drought characteristics under climate change must inform present decisions on long-term water resources planning. An assessment of changes in drought characteristics should therefore provide water managers with appropriate information that can help
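
    As an illustrative sketch only (not the authors' code), a standardized index of this kind can be obtained by fitting a distribution to the accumulated variable over a chosen reference climate and mapping its cumulative probabilities to a standard normal. The gamma fit below is one common choice for precipitation; the synthetic data are placeholders, and zero-precipitation handling is omitted.

```python
import numpy as np
from scipy import stats

def standardized_index(values, reference):
    """Map values to z-scores via a gamma fit to the reference climate.

    A drought event is then any run of continuously negative index values.
    """
    shape, _, scale = stats.gamma.fit(reference, floc=0)      # fit the reference distribution
    cdf = stats.gamma.cdf(values, shape, loc=0, scale=scale)  # cumulative probabilities
    cdf = np.clip(cdf, 1e-6, 1 - 1e-6)                        # keep the quantile transform finite
    return stats.norm.ppf(cdf)                                # standard-normal quantiles

rng = np.random.default_rng(0)
reference = rng.gamma(shape=2.0, scale=40.0, size=360)   # e.g. 30 years of monthly precipitation
recent = rng.gamma(shape=1.5, scale=35.0, size=12)       # a drier recent year (synthetic)
print(np.round(standardized_index(recent, reference), 2))
```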

  19. Standardization of figures and assessment procedures for DTM vertical accuracy

    Directory of Open Access Journals (Sweden)

    Vittorio Casella

    2015-07-01

    Digital Terrain Models (DTMs) are widely used in many sectors. They play a key role in hydrological risk prevention, risk mitigation and numeric simulations. This paper deals with two questions: (i) when it is stated that a DTM has a given vertical accuracy, is this assertion univocal? (ii) when DTM vertical accuracy is assessed by means of checkpoints, does their location influence results? First, the paper illustrates that two vertical accuracy definitions are conceivable: Vertical Accuracy at the Nodes (VAN), the average vertical distance between the model and the terrain, evaluated at the DTM's nodes, and Vertical Accuracy at the interpolated Points (VAP), in which the vertical distance is evaluated at the generic points. These two quantities are not coincident and, when they are calculated for the same DTM, different numeric values are reached. Unfortunately, the two quantities are often interchanged, but this is misleading. Second, the paper shows a simulated example of a DTM vertical accuracy assessment, highlighting that the checkpoints' location plays a key role: when checkpoints coincide with the DTM nodes, VAN is estimated; when checkpoints are randomly located, VAP is estimated, instead. Third, an in-depth, theoretical characterization of the two considered quantities is performed, based on symbolic computation, and suitable standardization coefficients are proposed. Finally, our discussion has a well-defined frame: it doesn't deal with all the items of the DTM vertical accuracy budget, which would require a much longer essay, but only with one, usually called fundamental vertical accuracy.
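
    A toy numerical illustration of the VAN/VAP distinction (not the paper's own simulated example): the mean absolute vertical error is computed once at the grid nodes themselves and once at randomly located checkpoints where the model has to be interpolated. The terrain function, grid spacing and noise level below are all hypothetical.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def terrain(x, y):
    """Synthetic 'true' terrain surface (purely illustrative)."""
    return 50.0 + 5.0 * np.sin(0.10 * x) * np.cos(0.08 * y)

# Build a coarse DTM by sampling the terrain at grid nodes and adding node errors.
xs = np.arange(0.0, 101.0, 10.0)
ys = np.arange(0.0, 101.0, 10.0)
X, Y = np.meshgrid(xs, ys, indexing="ij")
dtm = terrain(X, Y) + np.random.default_rng(1).normal(0.0, 0.3, X.shape)

# VAN: mean absolute model-terrain distance evaluated at the nodes themselves.
van = np.mean(np.abs(dtm - terrain(X, Y)))

# VAP: the same quantity at random checkpoints, where the DTM must be interpolated.
interpolate = RegularGridInterpolator((xs, ys), dtm)
checkpoints = np.random.default_rng(2).uniform(0.0, 100.0, size=(500, 2))
vap = np.mean(np.abs(interpolate(checkpoints) - terrain(checkpoints[:, 0], checkpoints[:, 1])))

print(f"VAN = {van:.2f} m, VAP = {vap:.2f} m")
```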

  20. Skewness of the standard model: possible implications

    International Nuclear Information System (INIS)

    Nielsen, H.B.; Brene, N.

    1989-09-01

    In this paper we consider combinations of gauge algebra and set of rules for quantization of gauge charges. We show that the combination of the algebra of the standard model and the rule satisfied by the electric charges of the quarks and leptons has an exceptionally high degree of a kind of asymmetry which we call skewness. Assuming that skewness has physical significance and adding two other rather plausible assumptions, we may conclude that space-time must have a non-simply-connected topology at very small distances. Such a topology would allow a kind of symmetry breakdown leading to a more skew combination of gauge algebra and set of quantization rules. (orig.)

  1. The renormalization of the electroweak standard model

    International Nuclear Information System (INIS)

    Boehm, M.; Spiesberger, H.; Hollik, W.

    1984-03-01

    A renormalization scheme for the electroweak standard model is presented in which the electric charge and the masses of the gauge bosons, Higgs particle and fermions are used as physical parameters. The photon is treated such that quantum electrodynamics is contained in the usual form. Field renormalization respecting the gauge symmetry gives finite Green functions. The Ward identities between the Green functions of the unphysical sector allow a renormalization that maintains the simple pole structure of the propagators. Explicit results for the renormalized self-energies and vertex functions are given. They can be directly used as building blocks for the evaluation of one-loop radiative corrections. (orig.)

  2. Baryogenesis and standard model CP violation

    International Nuclear Information System (INIS)

    Huet, P.

    1994-08-01

    The standard model possesses a natural source of CP violation contained in the phase of the CKM matrix. Whether the latter participated in the making of the matter-antimatter asymmetry of the observable universe is a fundamental question which has been addressed only recently. The generation of a CP observable occurs through interference of quantum paths along which a sequence of flavor mixings and chirality flips takes place. The coherence of this phenomenon in the primeval plasma is limited by the fast quark-gluon interactions. At the electroweak era, this phenomenon of decoherence forbids a successful baryogenesis based on the sole CP violation of the CKM matrix

  3. Non standard analysis, polymer models, quantum fields

    International Nuclear Information System (INIS)

    Albeverio, S.

    1984-01-01

    We give an elementary introduction to non-standard analysis and its applications to the theory of stochastic processes. This is based on a joint book with J.E. Fenstad, R. Høegh-Krohn and T. Lindstrøm. In particular we give a discussion of a hyperfinite theory of Dirichlet forms with applications to the study of the Hamiltonian for a quantum mechanical particle in the potential created by a polymer. We also discuss new results on the existence of attractive polymer measures in dimension d = 1, 2, as well as on the ((φ²)²)_d-model of interacting quantum fields. (orig.)

  4. The minimally tuned minimal supersymmetric standard model

    International Nuclear Information System (INIS)

    Essig, Rouven; Fortin, Jean-Francois

    2008-01-01

    The regions in the Minimal Supersymmetric Standard Model with the minimal amount of fine-tuning of electroweak symmetry breaking are presented for general messenger scale. No a priori relations among the soft supersymmetry breaking parameters are assumed and fine-tuning is minimized with respect to all the important parameters which affect electroweak symmetry breaking. The superpartner spectra in the minimally tuned region of parameter space are quite distinctive with large stop mixing at the low scale and negative squark soft masses at the high scale. The minimal amount of tuning increases enormously for a Higgs mass beyond roughly 120 GeV

  5. Search for the standard model Higgs boson

    Science.gov (United States)

    Buskulic, D.; de Bonis, I.; Decamp, D.; Ghez, P.; Goy, C.; Lees, J.-P.; Minard, M.-N.; Pietrzyk, B.; Ariztizabal, F.; Comas, P.; Crespo, J. M.; Delfino, M.; Efthymiopoulos, I.; Fernandez, E.; Fernandez-Bosman, M.; Gaitan, V.; Garrido, Ll.; Mattison, T.; Pacheco, A.; Padilla, C.; Pascual, A.; Creanza, D.; de Palma, M.; Farilla, A.; Iaselli, G.; Maggi, G.; Natali, S.; Nuzzo, S.; Quattromini, M.; Ranieri, A.; Raso, G.; Romano, F.; Ruggieri, F.; Selvaggi, G.; Silvestris, L.; Tempesta, P.; Zito, G.; Chai, Y.; Hu, H.; Huang, D.; Huang, X.; Lin, J.; Wang, T.; Xie, Y.; Xu, D.; Xu, R.; Zhang, J.; Zhang, L.; Zhao, W.; Blucher, E.; Bonvicini, G.; Boudreau, J.; Casper, D.; Drevermann, H.; Forty, R. W.; Ganis, G.; Gay, C.; Hagelberg, R.; Harvey, J.; Hilgart, J.; Jacobsen, R.; Jost, B.; Knobloch, J.; Lehraus, I.; Lohse, T.; Maggi, M.; Markou, C.; Martinez, M.; Mato, P.; Meinhard, H.; Minten, A.; Miotto, A.; Miguel, R.; Moser, H.-G.; Palazzi, P.; Pater, J. R.; Perlas, J. A.; Pusztaszeri, J.-F.; Ranjard, F.; Redlinger, G.; Rolandi, L.; Rothberg, J.; Ruan, T.; Saich, M.; Schlatter, D.; Schmelling, M.; Sefkow, F.; Tejessy, W.; Tomalin, I. R.; Veenhof, R.; Wachsmuth, H.; Wasserbaech, S.; Wiedenmann, W.; Wildish, T.; Witzeling, W.; Wotschack, J.; Ajaltouni, Z.; Badaud, F.; Bardadin-Otwinowska, M.; El Fellous, R.; Falvard, A.; Gay, P.; Guicheney, C.; Henrard, P.; Jousset, J.; Michel, B.; Montret, J.-C.; Pallin, D.; Perret, P.; Podlyski, F.; Proriol, J.; Prulhière, F.; Saadi, F.; Fearnley, T.; Hansen, J. B.; Hansen, J. D.; Hansen, J. R.; Hansen, P. H.; Møllerud, R.; Nilsson, B. S.; Kyriakis, A.; Simopoulou, E.; Siotis, I.; Vayaki, A.; Zachariadou, K.; Badier, J.; Blondel, A.; Bonneaud, G.; Brient, J. C.; Fouque, G.; Orteu, S.; Rougé, A.; Rumpf, M.; Tanaka, R.; Verderi, M.; Videau, H.; Candlin, D. J.; Parsons, M. I.; Veitch, E.; Focardi, E.; Moneta, L.; Parrini, G.; Corden, M.; Georgiopoulos, C.; Ikeda, M.; Levinthal, D.; Antonelli, A.; Baldini, R.; Bencivenni, G.; Bologna, G.; Bossi, F.; Campana, P.; Capon, G.; Cerutti, F.; Chiarella, V.; D'Ettorre-Piazzoli, B.; Felici, G.; Laurelli, P.; Mannocchi, G.; Murtas, F.; Murtas, G. P.; Passalacqua, L.; Pepe-Altarelli, M.; Picchi, P.; Colrain, P.; Ten Have, I.; Lynch, J. G.; Maitland, W.; Morton, W. T.; Raine, C.; Reeves, P.; Scarr, J. M.; Smith, K.; Thompson, A. S.; Turnbull, R. M.; Brandl, B.; Braun, O.; Geweniger, C.; Hanke, P.; Hepp, V.; Kluge, E. E.; Maumary, Y.; Putzer, A.; Rensch, B.; Stahl, A.; Tittel, K.; Wunsch, M.; Beuselinck, R.; Binnie, D. M.; Cameron, W.; Cattaneo, M.; Colling, D. J.; Dornan, P. J.; Greene, A. M.; Hassard, J. F.; Lieske, N. M.; Moutoussi, A.; Nash, J.; Patton, S.; Payne, D. G.; Phillips, M. J.; San Martin, G.; Sedgbeer, J. K.; Wright, A. G.; Girtler, P.; Kuhn, D.; Rudolph, G.; Vogl, R.; Bowdery, C. K.; Brodbeck, T. J.; Finch, A. J.; Foster, F.; Hughes, G.; Jackson, D.; Keemer, N. R.; Nuttall, M.; Patel, A.; Sloan, T.; Snow, S. W.; Whelan, E. P.; Kleinknecht, K.; Raab, J.; Renk, B.; Sander, H.-G.; Schmidt, H.; Steeg, F.; Walther, S. M.; Wanke, R.; Wolf, B.; Bencheikh, A. M.; Benchouk, C.; Bonissent, A.; Carr, J.; Coyle, P.; Drinkard, J.; Etienne, F.; Nicod, D.; Papalexiou, S.; Payre, P.; Roos, L.; Rousseau, D.; Schwemling, P.; Talby, M.; Adlung, S.; Assmann, R.; Bauer, C.; Blum, W.; Brown, D.; Cattaneo, P.; Dehning, B.; Dietl, H.; Dydak, F.; Frank, M.; Halley, A. W.; Jakobs, K.; Lauber, J.; Lütjens, G.; Lutz, G.; Männer, W.; Richter, R.; Schröder, J.; Schwarz, A. S.; Settles, R.; Seywerd, H.; Stierlin, U.; Stiegler, U.; Dennis, R. 
St.; Wolf, G.; Alemany, R.; Boucrot, J.; Callot, O.; Cordier, A.; Davier, M.; Duflot, L.; Grivaz, J.-F.; Heusse, Ph.; Jaffe, D. E.; Janot, P.; Kim, D. W.; Le Diberder, F.; Lefrançois, J.; Lutz, A.-M.; Schune, M.-H.; Veillet, J.-J.; Videau, I.; Zhang, Z.; Abbaneo, D.; Bagliesi, G.; Batignani, G.; Bottigli, U.; Bozzi, C.; Calderini, G.; Carpinelli, M.; Ciocci, M. A.; Dell'Orso, R.; Ferrante, I.; Fidecaro, F.; Foà, L.; Forti, F.; Giassi, A.; Giorgi, M. A.; Gregorio, A.; Ligabue, F.; Lusiani, A.; Manneli, E. B.; Marrocchesi, P. S.; Messineo, A.; Palla, F.; Rizzo, G.; Sanguinetti, G.; Spagnolo, P.; Steinberger, J.; Techini, R.; Tonelli, G.; Triggiani, G.; Vannini, C.; Venturi, A.; Verdini, P. G.; Walsh, J.; Betteridge, A. P.; Gao, Y.; Green, M. G.; March, P. V.; Mir, Ll. M.; Medcalf, T.; Quazi, I. S.; Strong, J. A.; West, L. R.; Botterill, D. R.; Clifft, R. W.; Edgecock, T. R.; Haywood, S.; Norton, P. R.; Thompson, J. C.; Bloch-Devaux, B.; Colas, P.; Duarte, H.; Emery, S.; Kozanecki, W.; Lançon, E.; Lemaire, M. C.; Locci, E.; Marx, B.; Perez, P.; Rander, J.; Renardy, J.-F.; Rosowsky, A.; Roussarie, A.; Schuller, J.-P.; Schwindling, J.; Si Mohand, D.; Vallage, B.; Johnson, R. P.; Litke, A. M.; Taylor, G.; Wear, J.; Ashman, J. G.; Babbage, W.; Booth, C. N.; Buttar, C.; Cartwright, S.; Combley, F.; Dawson, I.; Thompson, L. F.; Barberio, E.; Böhrer, A.; Brandt, S.; Cowan, G.; Grupen, C.; Lutters, G.; Rivera, F.; Schäfer, U.; Smolik, L.; Bosisio, L.; Della Marina, R.; Giannini, G.; Gobbo, B.; Ragusa, F.; Bellantoni, L.; Chen, W.; Conway, J. S.; Feng, Z.; Ferguson, D. P. S.; Gao, Y. S.; Grahl, J.; Harton, J. L.; Hayes, O. J.; Nachtman, J. M.; Pan, Y. B.; Saadi, Y.; Schmitt, M.; Scott, I.; Sharma, V.; Shi, Z. H.; Turk, J. D.; Walsh, A. M.; Weber, F. V.; Sau Lan Wu; Wu, X.; Zheng, M.; Zobernig, G.; Aleph Collaboration

    1993-08-01

    Using a data sample corresponding to about 1 233 000 hadronic Z decays collected by the ALEPH experiment at LEP, the reaction e⁺e⁻ → HZ* has been used to search for the standard model Higgs boson, in association with missing energy when Z* → νν̄, or with a pair of energetic leptons when Z* → e⁺e⁻ or μ⁺μ⁻. No signal was found and, at the 95% confidence level, m_H exceeds 58.4 GeV/c².

  6. Integrated Assessment Model Evaluation

    Science.gov (United States)

    Smith, S. J.; Clarke, L.; Edmonds, J. A.; Weyant, J. P.

    2012-12-01

    Integrated assessment models of climate change (IAMs) are widely used to provide insights into the dynamics of the coupled human and socio-economic system, including emission mitigation analysis and the generation of future emission scenarios. Similar to the climate modeling community, the integrated assessment community has a two-decade history of model inter-comparison, which has served as one of the primary venues for model evaluation and confirmation. While analysis of historical trends in the socio-economic system has long played a key role in diagnostics of future scenarios from IAMs, formal hindcast experiments are just now being contemplated as evaluation exercises. Some initial thoughts on setting up such IAM evaluation experiments are discussed. Socio-economic systems do not follow strict physical laws, which means that evaluation needs to take place in a context, unlike that of physical system models, in which there are few fixed, unchanging relationships. Of course strict validation of even earth system models is not possible (Oreskes et al. 2004), a fact borne out by the inability of models to constrain the climate sensitivity. Energy-system models have also been grappling with some of the same questions over the last quarter century. For example, one of "the many questions in the energy field that are waiting for answers in the next 20 years" identified by Hans Landsberg in 1985 was "Will the price of oil resume its upward movement?" Of course we are still asking this question today. While, arguably, even fewer constraints apply to socio-economic systems, numerous historical trends and patterns have been identified, although often only in broad terms, that are used to guide the development of model components, parameter ranges, and scenario assumptions. IAM evaluation exercises are expected to provide useful information for interpreting model results and improving model behavior. A key step is the recognition of model boundaries, that is, what is inside

  7. ASK Standards: Assessment, Skills, and Knowledge Content Standards for Student Affairs Practitioners and Scholars

    Science.gov (United States)

    ACPA College Student Educators International, 2011

    2011-01-01

    The Assessment Skills and Knowledge (ASK) standards seek to articulate the areas of content knowledge, skill and dispositions that student affairs professionals need in order to perform as practitioner-scholars to assess the degree to which students are mastering the learning and development outcomes the professionals intend. Consistent with…

  8. 42 CFR 493.1289 - Standard: Analytic systems quality assessment.

    Science.gov (United States)

    2010-10-01

    42 Public Health 5 (2010-10-01), Section 493.1289, Standard: Analytic systems quality assessment. Centers for Medicare & Medicaid Services, Department of Health and... through 493.1283. (b) The analytic systems quality assessment must include a review of the effectiveness...

  9. 42 CFR 493.1299 - Standard: Postanalytic systems quality assessment.

    Science.gov (United States)

    2010-10-01

    42 Public Health 5 (2010-10-01), Section 493.1299, Standard: Postanalytic systems quality assessment. Centers for Medicare & Medicaid Services, Department of Health and....1291. (b) The postanalytic systems quality assessment must include a review of the effectiveness of...

  10. 42 CFR 493.1249 - Standard: Preanalytic systems quality assessment.

    Science.gov (United States)

    2010-10-01

    42 Public Health 5 (2010-10-01), Section 493.1249, Standard: Preanalytic systems quality assessment. Centers for Medicare & Medicaid Services, Department of Health....1241 through 493.1242. (b) The preanalytic systems quality assessment must include a review of the...

  11. Standard setting and quality of assessment: A conceptual approach ...

    African Journals Online (AJOL)

    Quality performance standards and the effect of assessment outcomes are important in the educational milieu, as assessment remains the representative ... not be seen as a methodological process of setting pass/fail cut-off points only, but as a powerful catalyst for quality improvements in HPE by promoting excellence in ...

  12. 24 CFR 115.206 - Performance assessments; Performance standards.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Performance assessments; Performance standards. 115.206 Section 115.206 Housing and Urban Development Regulations Relating to Housing... AGENCIES Certification of Substantially Equivalent Agencies § 115.206 Performance assessments; Performance...

  13. Can the superstring inspire the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, J.; Enqvist, K.; Nanopoulos, D.V.; Olive, K.A.

    1988-02-01

    We discuss general features of models in which the E_8 x E'_8 heterotic superstring is compactified on a specific Calabi-Yau manifold. The gauge group of rank-6 in four dimensions is supposed to be broken down at an intermediate scale m_I to the standard model group SU(3)_C x SU(2)_L x U(1)_Y, as a result of two neutral scalar fields acquiring large vacuum expectations (vev's) in one of many flat directions of the effective potential. We find that it is difficult to generate such an intermediate scale by radiative symmetry breaking, whilst such models have prima facie problems with baryon decay mediated by massive particles and with non-perturbative behaviour of the gauge couplings, unless m_I > or approx. 10^16 GeV. Rapid baryon decay mediated by light particles, large neutrino masses, other ΔL ≠ 0 processes and flavour-changing neutral currents are generic features of these models. We illustrate these observations with explicit calculations in a number of different models given by vev's in different flat directions.

  14. Can the superstring inspire the standard model?

    International Nuclear Information System (INIS)

    Ellis, J.; Enqvist, K.; Nanopoulos, D.V.; Olive, K.A.

    1988-01-01

    We discuss general features of models in which the E_8 x E'_8 heterotic superstring is compactified on a specific Calabi-Yau manifold. The gauge group of rank-6 in four dimensions is supposed to be broken down at an intermediate scale m_I to the standard model group SU(3)_C x SU(2)_L x U(1)_Y, as a result of two neutral scalar fields acquiring large vacuum expectations (vev's) in one of many flat directions of the effective potential. We find that it is difficult to generate such an intermediate scale by radiative symmetry breaking, whilst such models have prima facie problems with baryon decay mediated by massive particles and with non-perturbative behaviour of the gauge couplings, unless m_I > or approx. 10^16 GeV. Rapid baryon decay mediated by light particles, large neutrino masses, other ΔL ≠ 0 processes and flavour-changing neutral currents are generic features of these models. We illustrate these observations with explicit calculations in a number of different models given by vev's in different flat directions. (orig.)

  15. B physics beyond the Standard Model

    International Nuclear Information System (INIS)

    Hewett, J.A.L.

    1997-12-01

    The ability of present and future experiments to test the Standard Model in the B meson sector is described. The authors examine the loop effects of new interactions in flavor changing neutral current B decays and in Z → b anti b, concentrating on supersymmetry and the left-right symmetric model as specific examples of new physics scenarios. The procedure for performing a global fit to the Wilson coefficients which describe b → s transitions is outlined, and the results of such a fit from Monte Carlo generated data are compared to the predictions of the two sample new physics scenarios. A fit to the Zb anti b couplings from present data is also given.

  16. Complex singlet extension of the standard model

    International Nuclear Information System (INIS)

    Barger, Vernon; McCaskey, Mathew; Langacker, Paul; Ramsey-Musolf, Michael; Shaughnessy, Gabe

    2009-01-01

    We analyze a simple extension of the standard model (SM) obtained by adding a complex singlet to the scalar sector (cxSM). We show that the cxSM can contain one or two viable cold dark matter candidates and analyze the conditions on the parameters of the scalar potential that yield the observed relic density. When the cxSM potential contains a global U(1) symmetry that is both softly and spontaneously broken, it contains both a viable dark matter candidate and the ingredients necessary for a strong first order electroweak phase transition as needed for electroweak baryogenesis. We also study the implications of the model for discovery of a Higgs boson at the Large Hadron Collider.

  17. Leading the Transition from the Alternate Assessment Based on Modified Achievement Standards to the General Assessment

    Science.gov (United States)

    Lazarus, Sheryl S.; Rieke, Rebekah

    2013-01-01

    Schools are facing many changes in the ways that teaching, learning, and assessment take place. Most states are moving from individual state standards to the new Common Core State Standards, which will be fewer, higher, and more rigorous than most current state standards. As the next generation of assessments used for accountability are rolled…

  18. Naturalness of CP Violation in the Standard Model

    International Nuclear Information System (INIS)

    Gibbons, Gary W.; Gielen, Steffen; Pope, C. N.; Turok, Neil

    2009-01-01

    We construct a natural measure on the space of Cabibbo-Kobayashi-Maskawa matrices in the standard model, assuming the fermion mass matrices are randomly selected from a distribution which incorporates the observed quark mass hierarchy. This measure allows us to assess the likelihood of Jarlskog's CP violation parameter J taking its observed value J ≅ 3x10^-5. We find that the observed value, while well below the mathematically allowed maximum, is in fact typical once the observed quark masses are assumed.
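
    For reference, the Jarlskog invariant quoted above has a standard textbook definition (stated here for convenience; it is not spelled out in the record). In the usual CKM parametrization,

        J = Im(V_us V_cb V_ub* V_cs*) = s_12 s_13 s_23 c_12 c_23 c_13^2 sin(δ) ≅ 3 x 10^-5,

    where s_ij = sin(θ_ij), c_ij = cos(θ_ij) and δ is the CP-violating phase.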

  19. Bounds on the Higgs mass in the standard model and the minimal supersymmetric standard model

    CERN Document Server

    Quiros, M.

    1995-01-01

    Depending on the Higgs-boson and top-quark masses, M_H and M_t, the effective potential of the Standard Model can develop a non-standard minimum for values of the field much larger than the weak scale. In those cases the standard minimum becomes metastable and the possibility of decay to the non-standard one arises. Comparison of the decay rate to the non-standard minimum at finite (and zero) temperature with the corresponding expansion rate of the Universe allows one to identify the region, in the (M_H, M_t) plane, where the Higgs field is sitting at the standard electroweak minimum. In the Minimal Supersymmetric Standard Model, approximate analytical expressions for the Higgs mass spectrum and couplings are worked out, providing an excellent approximation to the numerical results which include all next-to-leading-log corrections. An appropriate treatment of squark decoupling allows one to consider large values of the stop and/or sbottom mixing parameters and thus fix a reliable upper bound on the mass o...

  20. Photon defects in noncommutative standard model candidates

    International Nuclear Information System (INIS)

    Abel, S.A.; Khoze, V.V.

    2006-06-01

    Restrictions imposed by gauge invariance in noncommutative spaces together with the effects of ultraviolet/infrared mixing lead to strong constraints on possible candidates for a noncommutative extension of the Standard Model. We study a general class of noncommutative models consistent with these restrictions. Specifically we consider models based upon a gauge theory with the gauge group U(N_1) x U(N_2) x ... x U(N_m) coupled to matter fields transforming in the (anti)-fundamental, bi-fundamental and adjoint representations. We pay particular attention to overall trace-U(1) factors of the gauge group which are affected by the ultraviolet/infrared mixing. Typically, these trace-U(1) gauge fields do not decouple sufficiently fast in the infrared, and lead to sizable Lorentz symmetry violating effects in the low-energy effective theory. In a 4-dimensional theory on a continuous space-time making these effects unobservable would require making the effects of noncommutativity tiny, M_NC >> M_P. This severely limits the phenomenological prospects of such models. However, adding additional universal extra dimensions the trace-U(1) factors decouple with a power law and the constraint on the noncommutativity scale is weakened considerably. Finally, we briefly mention some interesting properties of the photon that could arise if the noncommutative theory is modified at a high energy scale. (Orig.)

  1. Consistency test of the standard model

    International Nuclear Information System (INIS)

    Pawlowski, M.; Raczka, R.

    1997-01-01

    If the 'Higgs mass' is not the physical mass of a real particle but rather an effective ultraviolet cutoff then a process energy dependence of this cutoff must be admitted. Precision data from at least two energy scale experimental points are necessary to test this hypothesis. The first set of precision data is provided by the Z-boson peak experiments. We argue that the second set can be given by 10-20 GeV e+e- colliders. We pay attention to the special role of tau polarization experiments that can be sensitive to the 'Higgs mass' for a sample of ∼ 10^8 produced tau pairs. We argue that such a study may be regarded as a negative self-consistency test of the Standard Model and of most of its extensions.

  2. Standard model fermions and N=8 supergravity

    Energy Technology Data Exchange (ETDEWEB)

    Nicolai, Hermann [Max Planck Institute for Gravitational Physics (Albert Einstein Institute), Am Muehlenberg 1, Potsdam-Golm (Germany)

    2016-07-01

    In a scheme originally proposed by Gell-Mann, and subsequently shown to be realized at the SU(3) x U(1) stationary point of maximal gauged SO(8) supergravity, the 48 spin-1/2 fermions of the theory remaining after the removal of eight Goldstinos can be identified with the 48 quarks and leptons (including right-chiral neutrinos) of the Standard model, provided one identifies the residual SU(3) with the diagonal subgroup of the color group SU(3)_c and a family symmetry SU(3)_f. However, there remained a systematic mismatch in the electric charges by a spurion charge of ± 1/6. We here identify the ''missing'' U(1) that rectifies this mismatch, and that takes a surprisingly simple, though unexpected form, and show how it is related to the conjectured R symmetry K(E10) of M Theory.

  3. Quantum field theory and the standard model

    CERN Document Server

    Schwartz, Matthew D

    2014-01-01

    Providing a comprehensive introduction to quantum field theory, this textbook covers the development of particle physics from its foundations to the discovery of the Higgs boson. Its combination of clear physical explanations, with direct connections to experimental data, and mathematical rigor make the subject accessible to students with a wide variety of backgrounds and interests. Assuming only an undergraduate-level understanding of quantum mechanics, the book steadily develops the Standard Model and state-of-the-art calculation techniques. It includes multiple derivations of many important results, with modern methods such as effective field theory and the renormalization group playing a prominent role. Numerous worked examples and end-of-chapter problems enable students to reproduce classic results and to master quantum field theory as it is used today. Based on a course taught by the author over many years, this book is ideal for an introductory to advanced quantum field theory sequence or for independe...

  4. CMS standard model Higgs boson results

    Directory of Open Access Journals (Sweden)

    Garcia-Abia Pablo

    2013-11-01

    Full Text Available In July 2012 CMS announced the discovery of a new boson with properties resembling those of the long-sought Higgs boson. The analysis of the proton-proton collision data recorded by the CMS detector at the LHC, corresponding to integrated luminosities of 5.1 fb^-1 at √s = 7 TeV and 19.6 fb^-1 at √s = 8 TeV, confirms the Higgs-like nature of the new boson, with a signal strength associated with vector bosons and fermions consistent with the expectations for a standard model (SM) Higgs boson, and spin-parity clearly favouring the scalar nature of the new boson. In this note I review the updated results of the CMS experiment.

  5. Standard model group: Survival of the fittest

    Science.gov (United States)

    Nielsen, H. B.; Brene, N.

    1983-09-01

    The essential content of this paper is related to random dynamics. We speculate that the world seen through a sub-Planck-scale microscope has a lattice structure and that the dynamics on this lattice is almost completely random, except for the requirement that the random (plaquette) action is invariant under some "world (gauge) group". We see that the randomness may lead to spontaneous symmetry breakdown in the vacuum (spontaneous collapse) without explicit appeal to any scalar field associated with the usual Higgs mechanism. We further argue that the subgroup which survives as the end product of a possible chain of collapses is likely to have certain properties; the most important is that it has a topologically connected center. The standard group, i.e. the group of the gauge theory which combines the Salam-Weinberg model with QCD, has this property.

  6. Standard model group: survival of the fittest

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, H.B. (Niels Bohr Inst., Copenhagen (Denmark); Nordisk Inst. for Teoretisk Atomfysik, Copenhagen (Denmark)); Brene, N. (Niels Bohr Inst., Copenhagen (Denmark))

    1983-09-19

    The essential content of this paper is related to random dynamics. We speculate that the world seen through a sub-Planck-scale microscope has a lattice structure and that the dynamics on this lattice is almost completely random, except for the requirement that the random (plaquette) action is invariant under some ''world (gauge) group''. We see that the randomness may lead to spontaneous symmetry breakdown in the vacuum (spontaneous collapse) without explicit appeal to any scalar field associated with the usual Higgs mechanism. We further argue that the subgroup which survives as the end product of a possible chain of collapse is likely to have certain properties; the most important is that it has a topologically connected center. The standard group, i.e. the group of the gauge theory which combines the Salam-Weinberg model with QCD, has this property.

  7. Standard model group: survival of the fittest

    International Nuclear Information System (INIS)

    Nielsen, H.B.; Brene, N.

    1983-01-01

    The essential content of this paper is related to random dynamics. We speculate that the world seen through a sub-Planck-scale microscope has a lattice structure and that the dynamics on this lattice is almost completely random, except for the requirement that the random (plaquette) action is invariant under some ''world (gauge) group''. We see that the randomness may lead to spontaneous symmetry breakdown in the vacuum (spontaneous collapse) without explicit appeal to any scalar field associated with the usual Higgs mechanism. We further argue that the subgroup which survives as the end product of a possible chain of collapses is likely to have certain properties; the most important is that it has a topologically connected center. The standard group, i.e. the group of the gauge theory which combines the Salam-Weinberg model with QCD, has this property. (orig.)

  8. Standard model group survival of the fittest

    International Nuclear Information System (INIS)

    Nielsen, H.B.; Brene, N.

    1983-02-01

    The essential content of this note is related to random dynamics. The authors speculate that the world seen through a sub Planck scale microscope has a lattice structure and that the dynamics on this lattice is almost completely random, except for the requirement that the random (plaquette) action is invariant under some ''world (gauge) group''. It is seen that the randomness may lead to spontaneous symmetry breakdown in the vacuum (spontaneous collapse) without explicit appeal to any scalar field associated with the usual Higgs mechanism. It is further argued that the subgroup which survives as the end product of a possible chain of collapses is likely to have certain properties; the most important is that it has a topologically connected center. The standard group, i.e. the group of the gauge theory which combines the Salam-Weinberg model with QCD, has this property. (Auth.)

  9. Symmetry breaking: The standard model and superstrings

    International Nuclear Information System (INIS)

    Gaillard, M.K.

    1988-01-01

    The outstanding unresolved issue of the highly successful standard model is the origin of electroweak symmetry breaking and of the mechanism that determines its scale, namely the vacuum expectation value (vev) v that is fixed by experiment at the value v^2 = 4m_W^2/g^2 = (√2 G_F)^-1 ≅ (1/4 TeV)^2. In this talk I will discuss aspects of two approaches to this problem. One approach is straightforward and down to earth: the search for experimental signatures, as discussed previously by Pierre Darriulat. This approach covers the energy scales accessible to future and present laboratory experiments: roughly (10^-9 to 10^3) GeV. The second approach involves theoretical speculations, such as technicolor and supersymmetry, that attempt to explain the TeV scale. 23 refs., 5 figs
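
    As a cross-check on the numbers quoted above, the tree-level electroweak relations (standard results, restated here for convenience) give

        m_W = g v / 2   =>   v^2 = 4 m_W^2 / g^2 = (√2 G_F)^-1,   i.e.   v = (√2 G_F)^-1/2 ≈ 246 GeV ≈ 1/4 TeV.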

  10. The standard model 30 years of glory

    International Nuclear Information System (INIS)

    Lefrancois, J.

    2001-03-01

    In these 3 lectures the author reviews the achievements of the past 30 years, which saw the birth and the detailed confirmation of the standard model. The first lecture is dedicated to quantum chromodynamics (QCD), deep inelastic scattering, neutrino scattering results, R(e+e-), scaling violation, Drell-Yan reactions and the observation of jets. The second lecture deals with weak interactions and quark and lepton families; the discoveries of the W and Z bosons, of charm, of the tau lepton and of B quarks are detailed. The third lecture focuses on the stunning progress that has been made in detector accuracy: the typical level of accuracy of previous e+e- experiments was about 5-10%, while the accuracy obtained at LEP/SLC is of order 0.1% to 0.5%. (A.C.)

  11. The standard model 30 years of glory

    Energy Technology Data Exchange (ETDEWEB)

    Lefrancois, J

    2001-03-01

    In these 3 lectures the author reviews the achievements of the past 30 years, which saw the birth and the detailed confirmation of the standard model. The first lecture is dedicated to quantum chromodynamics (QCD), deep inelastic scattering, neutrino scattering results, R(e+e-), scaling violation, Drell-Yan reactions and the observation of jets. The second lecture deals with weak interactions and quark and lepton families; the discoveries of the W and Z bosons, of charm, of the tau lepton and of B quarks are detailed. The third lecture focuses on the stunning progress that has been made in detector accuracy: the typical level of accuracy of previous e+e- experiments was about 5-10%, while the accuracy obtained at LEP/SLC is of order 0.1% to 0.5%. (A.C.)

  12. Orally disintegrating and oral standard olanzapine tablets similarly elevate the homeostasis model assessment of insulin resistance index and plasma triglyceride levels in 12 healthy men: a randomized crossover study.

    Science.gov (United States)

    Vidarsdottir, Solrun; Vlug, Pauline; Roelfsema, Ferdinand; Frölich, Marijke; Pijl, Hanno

    2010-09-01

    Treatment with olanzapine is associated with obesity, diabetes mellitus, and dyslipidemia. Reports have indicated that orally disintegrating tablets (ODT) cause less weight gain than oral standard tablets (OST). The aim of this study was to compare the effect of short-term treatment with these 2 distinct olanzapine formulations on glucose and lipid metabolism in healthy men. Twelve healthy men (mean ± SEM age: 25.1 ± 5.5 years) received olanzapine ODT (10 mg od, 8 days), olanzapine OST (10 mg od, 8 days), or no intervention in a randomized crossover design. At breakfast and dinner, glucose, insulin, free fatty acids (FFA), and triglyceride concentrations were measured at 10-minute intervals from 30 minutes prior to 2 hours after ingestion of standard meals. Leptin and adiponectin concentrations were measured at 20- and 30-minute intervals, respectively, between 0000h-1200h. Physical activity was assessed with an accelerometer. Fuel oxidation was measured in fasting condition by indirect calorimetry. The study was conducted from April 2006 through September 2006. Treatment with olanzapine ODT and OST equally elevated the homeostasis model assessment of insulin resistance (HOMA-IR) (P = .005). At breakfast, both formulations equally increased fasting and postprandial triglyceride concentrations (P = .013 and P = .005, respectively) while decreasing fasting and postprandial FFA concentrations (P = .004 and P = .009, respectively). Body weight, body composition, physical activity, or fuel oxidation did not differ between treatment modalities. Eight days of treatment with both olanzapine formulations similarly increased HOMA-IR and triglyceride concentrations and decreased FFA concentrations in response to standard meals without affecting anthropometrics or physical activity. These data suggest that olanzapine hampers insulin action via mechanistic routes other than body adiposity or physical inactivity. controlled-trials.com. Identifier: ISRCTN17632637. © Copyright
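
    For context, the HOMA-IR index reported in this record is conventionally computed from fasting values using the Matthews formula (a standard definition, not restated in the record itself):

        HOMA-IR = [fasting insulin (µU/mL) x fasting glucose (mmol/L)] / 22.5,

    so a parallel rise in fasting insulin at unchanged glucose raises the index proportionally.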

  13. The Standard Model with one universal extra dimension

    Indian Academy of Sciences (India)

    An exhaustive list of the explicit expressions for all physical couplings induced by the ... the standard Green's functions, which implies that the Standard Model observables do ...... renormalizability of standard Green's functions is implicit in this.

  14. Standardized reporting for rapid relative effectiveness assessments of pharmaceuticals.

    Science.gov (United States)

    Kleijnen, Sarah; Pasternack, Iris; Van de Casteele, Marc; Rossi, Bernardette; Cangini, Agnese; Di Bidino, Rossella; Jelenc, Marjetka; Abrishami, Payam; Autti-Rämö, Ilona; Seyfried, Hans; Wildbacher, Ingrid; Goettsch, Wim G

    2014-11-01

    Many European countries perform rapid assessments of the relative effectiveness (RE) of pharmaceuticals as part of the reimbursement decision making process. Increased sharing of information on RE across countries may save costs and reduce duplication of work. The objective of this article is to describe the development of a tool for rapid assessment of RE of new pharmaceuticals that enter the market, the HTA Core Model® for Rapid Relative Effectiveness Assessment (REA) of Pharmaceuticals. Eighteen member organisations of the European Network of Health Technology Assessment (EUnetHTA) participated in the development of the model. Different versions of the model were developed and piloted in this collaboration and adjusted accordingly based on feedback on the content and feasibility of the model. The final model deviates from the traditional HTA Core Model® used for assessing other types of technologies. This is due to the limited scope (strong focus on RE), the timing of the assessment (just after market authorisation), and strict timelines (e.g. 90 days) required for performing the assessment. The number of domains and assessment elements was limited and it was decided that the primary information sources should preferably be a submission file provided by the marketing authorisation holder and the European Public Assessment Report. The HTA Core Model® for Rapid REA (version 3.0) was developed to produce standardised transparent RE information of pharmaceuticals. Further piloting can provide input for possible improvements, such as further refining the assessment elements and new methodological guidance on relevant areas.

  15. Environmental assessment. Energy efficiency standards for consumer products

    Energy Technology Data Exchange (ETDEWEB)

    McSwain, Berah

    1980-06-01

    The Energy Policy and Conservation Act of 1975 requires DOE to prescribe energy efficiency standards for 13 consumer products. The Consumer Products Efficiency Standards (CPES) program covers: refrigerators and refrigerator-freezers, freezers, clothes dryers, water heaters, room air conditioners, home heating equipment, kitchen ranges and ovens, central air conditioners (cooling and heat pumps), furnaces, dishwashers, television sets, clothes washers, and humidifiers and dehumidifiers. This Environmental Assessment evaluates the potential environmental and socioeconomic impacts expected as a result of setting efficiency standards for all of the consumer products covered by the CPES program. DOE has proposed standards for eight of the products covered by the Program in a Notice of Proposed Rulemaking (NOPR). DOE expects to propose standards for home heating equipment, central air conditioners (heat pumps only), dishwashers, television sets, clothes washers, and humidifiers and dehumidifiers in 1981. No significant adverse environmental or socioeconomic impacts have been found to result from instituting the CPES.

  16. Budget impact model in moderate-to-severe psoriasis vulgaris assessing effects of calcipotriene and betamethasone dipropionate foam on per-patient standard of care costs.

    Science.gov (United States)

    Asche, Carl V; Kim, Minchul; Feldman, Steven R; Zografos, Panagiotis; Lu, Minyi

    2017-09-01

    To develop a budget impact model (BIM) for estimating the financial impact of formulary adoption and uptake of calcipotriene and betamethasone dipropionate (C/BD) foam (0.005%/0.064%) on the costs of biologics for treating moderate-to-severe psoriasis vulgaris in a hypothetical US healthcare plan with 1 million members. This BIM incorporated epidemiologic data, market uptake assumptions, and drug utilization costs, simulating the treatment mix for patients who are candidates for biologics before (Scenario #1) and after (Scenario #2) the introduction of C/BD foam. Predicted outcomes were expressed in terms of the annual cost of treatment (COT) and the COT per member per month (PMPM). At year 1, C/BD foam had the lowest per-patient cost ($9,913) necessary to achieve a Psoriasis Area and Severity Index (PASI)-75 response compared with etanercept ($73,773), adalimumab ($92,871), infliximab ($34,048), ustekinumab ($83,975), secukinumab ($113,858), apremilast ($47,960), and ixekizumab ($62,707). Following addition of C/BD foam to the formulary, the annual COT for moderate-to-severe psoriasis would decrease by $36,112,572 (17.91%, from $201,621,219 to $165,508,647). The COT PMPM is expected to decrease by $3.00 (17.86%, from $16.80 to $13.80). Drug costs were based on Medi-Span reference pricing (January 21, 2016); differences in treatment costs for drug administration, laboratory monitoring, or adverse events were not accounted for. Potentially confounding were the definition of "moderate-to-severe" and the heterogeneous efficacy data. The per-patient cost for PASI-75 response at year 1 was estimated from short-term efficacy data for C/BD foam and apremilast only. The introduction of C/BD foam is expected to decrease the annual COT for moderate-to-severe psoriasis treatable with biologics by $36,112,572 for a hypothetical US healthcare plan with 1 million plan members, and to lower the COT PMPM by $3.00.
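
    The per-member-per-month (PMPM) figures quoted above follow from simple arithmetic over the hypothetical 1-million-member plan; a minimal sketch in Python (illustrative only, variable names are not from the record):

        # Annual cost of treatment (COT) before and after formulary adoption of
        # C/BD foam, expressed per member per month for a 1M-member plan.
        members = 1_000_000
        cot_before = 201_621_219  # USD per year, Scenario #1
        cot_after = 165_508_647   # USD per year, Scenario #2

        def pmpm(annual_cot, plan_members):
            """Per-member-per-month cost in USD."""
            return annual_cot / (plan_members * 12)

        print(round(pmpm(cot_before, members), 2))  # ~16.80
        print(round(pmpm(cot_after, members), 2))   # ~13.79 (reported as 13.80)
        print(cot_before - cot_after)               # 36112572 annual reduction

    The last PMPM figure differs from the reported $13.80 only by rounding.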

  17. Primordial lithium and the standard model(s)

    International Nuclear Information System (INIS)

    Deliyannis, C.P.; Demarque, P.; Kawaler, S.D.; Krauss, L.M.; Romanelli, P.

    1989-01-01

    We present the results of new theoretical work on surface 7Li and 6Li evolution in the oldest halo stars along with a new and refined analysis of the predicted primordial lithium abundance resulting from big-bang nucleosynthesis. This allows us to determine the constraints which can be imposed upon cosmology by a consideration of primordial lithium using both standard big-bang and standard stellar-evolution models. Such considerations lead to a constraint on the baryon density today of 0.0044 < Ω_B h^2 < 0.025 (where the Hubble constant is 100h km sec^-1 Mpc^-1), and impose limitations on alternative nucleosynthesis scenarios
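
    For orientation, the baryon-density window quoted above is often re-expressed in terms of the baryon-to-photon ratio using the approximate conversion η_10 ≡ 10^10 n_B/n_γ ≈ 274 Ω_B h^2 (a standard rule of thumb, not taken from this record), so the quoted range corresponds roughly to 1.2 < η_10 < 6.9.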

  18. STANDARDIZING QUALITY ASSESSMENT OF FUSED REMOTELY SENSED IMAGES

    Directory of Open Access Journals (Sweden)

    C. Pohl

    2017-09-01

    Full Text Available The multitude of available operational remote sensing satellites led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches in assessing image quality: 1. Qualitatively by visual interpretation and 2. Quantitatively using image quality indices. However an objective comparison is difficult due to the fact that a visual assessment is always subjective and a quantitative assessment is done by different criteria. Depending on the criteria and indices the result varies. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective image fusion quality evaluation. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim to provide an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.

  19. Standardizing Quality Assessment of Fused Remotely Sensed Images

    Science.gov (United States)

    Pohl, C.; Moellmann, J.; Fries, K.

    2017-09-01

    The multitude of available operational remote sensing satellites led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches in assessing image quality: 1. Qualitatively by visual interpretation and 2. Quantitatively using image quality indices. However an objective comparison is difficult due to the fact that a visual assessment is always subjective and a quantitative assessment is done by different criteria. Depending on the criteria and indices the result varies. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective image fusion quality evaluation. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim to provide an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.
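
    For reference, the QNR protocol mentioned in both records combines a spectral-distortion index D_λ and a spatial-distortion index D_s into a single no-reference score; its usual form in the image-fusion literature (not restated in the records) is

        QNR = (1 - D_λ)^α · (1 - D_s)^β,   commonly with α = β = 1,

    so that QNR = 1 indicates no detectable spectral or spatial distortion in the fused product.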

  20. Standards of Ombudsman Assessment: A New Normative Concept?

    Directory of Open Access Journals (Sweden)

    Milan Remac

    2013-07-01

    Full Text Available Today, an ombudsman is a traditional component of democratic legal systems. Generally, reports of the ombudsman are not legally binding. Due to this fact, the ombudsman can rely only on his own persuasiveness, on his acceptance by individuals and state institutions, on the understanding of the administration and on the accessibility and transparency of rules that underpin his reports. During investigations, ombudsmen assess whether the administration has acted in accordance with certain legal or extra-legal standards. Depending on the legal system, ombudsmen can investigate whether there is an instance of maladministration in the activities of administrative bodies, whether the administration has acted ‘properly’, whether it has acted in accordance with the law, whether administrative actions have breached the human rights of complainants or whether the actions of the administration were in accordance with anti-corruption rules etc. Regardless of the legislative standard of an ombudsman’s control, the ombudsman should consider and assess the situation described in complaints against certain criteria or against certain normative standards. A distinct set of standards which ombudsmen use during their investigation, or at least a clear statement of their assessment criteria, can increase the transparency of their procedures and the persuasiveness of their reports. Are the normative standards used by different ombudsmen the same? Do they possibly create a new normative concept? And can it possibly lead to a higher acceptance of their reports by the administration?

  1. Safety assessment standards for modern plants in the UK

    International Nuclear Information System (INIS)

    Harbison, S.A.; Hannaford, J.

    1993-01-01

    The NII has revised its safety assessment principles (SAPs). This paper discusses the revised SAPs and their links with international standards. It considers the licensing of foreign designs of plant - a matter under active consideration in the UK -and discusses how the SAPs and the licensing process cater for that possibility. (author)

  2. Psychosocial Assessment as a Standard of Care in Pediatric Cancer

    NARCIS (Netherlands)

    Kazak, Anne E.; Abrams, Annah N.; Banks, Jaime; Christofferson, Jennifer; DiDonato, Stephen; Grootenhuis, Martha A.; Kabour, Marianne; Madan-Swain, Avi; Patel, Sunita K.; Zadeh, Sima; Kupst, Mary Jo

    2015-01-01

    This paper presents the evidence for a standard of care for psychosocial assessment in pediatric cancer. An interdisciplinary group of investigators utilized EBSCO, PubMed, PsycINFO, Ovid, and Google Scholar search databases, focusing on five areas: youth/family psychosocial adjustment, family

  3. Xpand chest drain: assessing equivalence to current standard ...

    African Journals Online (AJOL)

    leakage from 'open to air' system or breakage of glass bottle (with associated risk to ... and an air-leak detection system. It is connected to a ... need to add water. Xpand chest drain: assessing equivalence to current standard therapy – a randomised controlled trial. CHARL COOPER, M.B. CH.B. TIMOTHY HARDCASTLE ...

  4. Motivational Effects of Standardized Language Assessment on Chinese Young Learners

    Science.gov (United States)

    Zhao, Chuqiao

    2016-01-01

    This review paper examines how standardized language assessment affects Chinese young learners' motivation for second-language learning. By presenting the historical and contemporary contexts of the testing system in China, this paper seeks to demonstrate the interrelationship among cultural, social, familial, and individual factors, which…

  5. Designing Standardized Patient Assessments to Measure SBIRT Skills

    Science.gov (United States)

    Wamsley, Maria A.; Julian, Katherine A.; O'Sullivan, Patricia; Satterfield, Jason M.; Satre, Derek D.; McCance-Katz, Elinore; Batki, Steven L.

    2013-01-01

    Objectives: Resident physicians report insufficient experience caring for patients with substance use disorders (SUDs). Resident training in Screening, Brief Intervention, and Referral to Treatment (SBIRT) has been recommended. We describe the development of a standardized patient (SP) assessment to measure SBIRT skills, resident perceptions of…

  6. Searches for Beyond Standard Model Physics with ATLAS and CMS

    CERN Document Server

    Rompotis, Nikolaos; The ATLAS collaboration

    2017-01-01

    The exploration of the high energy frontier with ATLAS and CMS experiments provides one of the best opportunities to look for physics beyond the Standard Model. In this talk, I review the motivation, the strategy and some recent results related to beyond Standard Model physics from these experiments. The review will cover beyond Standard Model Higgs boson searches, supersymmetry and searches for exotic particles.

  7. Connected formulas for amplitudes in standard model

    Energy Technology Data Exchange (ETDEWEB)

    He, Song [CAS Key Laboratory of Theoretical Physics,Institute of Theoretical Physics, Chinese Academy of Sciences,Beijing 100190 (China); School of Physical Sciences, University of Chinese Academy of Sciences,No. 19A Yuquan Road, Beijing 100049 (China); Zhang, Yong [Department of Physics, Beijing Normal University,Beijing 100875 (China); CAS Key Laboratory of Theoretical Physics,Institute of Theoretical Physics, Chinese Academy of Sciences,Beijing 100190 (China)

    2017-03-17

    Witten’s twistor string theory has led to new representations of the S-matrix in massless QFT as a single object, including Cachazo-He-Yuan formulas in general and connected formulas in four dimensions. As a first step towards more realistic processes of the standard model, we extend the construction to QCD tree amplitudes with massless quarks and those with a Higgs boson. For both cases, we find connected formulas in four dimensions for all multiplicities which are very similar to the one for Yang-Mills amplitudes. The formula for quark-gluon color-ordered amplitudes differs from the pure-gluon case only by a Jacobian factor that depends on flavors and orderings of the quarks. In the formula for Higgs plus multi-parton amplitudes, the massive Higgs boson is effectively described by two additional massless legs which do not appear in the Parke-Taylor factor. The latter also represents the first twistor-string/connected formula for form factors.

  8. Experimental tests of the standard model

    International Nuclear Information System (INIS)

    Nodulman, L.

    1998-01-01

    The title implies an impossibly broad field, as the Standard Model includes the fermion matter states, as well as the forces and fields of SU(3) x SU(2) x U(1). For practical purposes, I will confine myself to electroweak unification, as discussed in the lectures of M. Herrero. Quarks and mixing were discussed in the lectures of R. Aleksan, and leptons and mixing were discussed in the lectures of K. Nakamura. I will essentially assume universality, that is flavor independence, rather than discussing tests of it. I will not pursue tests of QED beyond noting the consistency and precision of measurements of α_EM in various processes including the Lamb shift, the anomalous magnetic moment (g-2) of the electron, and the quantum Hall effect. The fantastic precision and agreement of these predictions and measurements is something that convinces people that there may be something to this science enterprise. Also impressive is the success of the ''Universal Fermi Interaction'' description of beta decay processes, or in more modern parlance, weak charged current interactions. With one coupling constant G_F, most precisely determined in muon decay, a huge number of nuclear instabilities are described. The slightly slow rate for neutron beta decay was one of the initial pieces of evidence for Cabbibo mixing, now generalized so that all charged current decays of any flavor are covered
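
    For reference, the determination of G_F from muon decay mentioned above rests on the standard tree-level relation for the muon decay width (textbook expression, not quoted in the record; small radiative and phase-space corrections are omitted):

        Γ(μ → e ν ν̄) = G_F^2 m_μ^5 / (192 π^3),

    so the measured muon lifetime fixes G_F ≈ 1.166 x 10^-5 GeV^-2.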

  9. Standard Model theory calculations and experimental tests

    International Nuclear Information System (INIS)

    Cacciari, M.; Hamel de Monchenault, G.

    2015-01-01

    To present knowledge, all the physics at the Large Hadron Collider (LHC) can be described in the framework of the Standard Model (SM) of particle physics. Indeed the newly discovered Higgs boson with a mass close to 125 GeV seems to confirm the predictions of the SM. Thus, besides looking for direct manifestations of the physics beyond the SM, one of the primary missions of the LHC is to perform ever more stringent tests of the SM. This requires not only improved theoretical developments to produce testable predictions and provide experiments with reliable event generators, but also sophisticated analysis techniques to overcome the formidable experimental environment of the LHC and perform precision measurements. In the first section, we describe the state of the art of the theoretical tools and event generators that are used to provide predictions for the production cross sections of the processes of interest. In section 2, inclusive cross section measurements with jets, leptons and vector bosons are presented. Examples of differential cross sections, charge asymmetries and the study of lepton pairs are proposed in section 3. Finally, in section 4, we report studies on the multiple production of gauge bosons and constraints on anomalous gauge couplings

  10. Lepton radiative decays in supersymmetric standard model

    International Nuclear Information System (INIS)

    Volkov, G.G.; Liparteliani, A.G.

    1988-01-01

    Radiative decays of charged leptons l_i → l_j γ(γ*) have been discussed in the framework of the supersymmetric generalization of the standard model. The most general form of the form factors for the one-loop vertex function is written. Decay widths of the mentioned radiative decays are calculated. Scalar lepton masses are estimated at the maximal mixing angle in the scalar sector proceeding from the present upper limit for the branching of the decay μ → eγ. In case of the maximal mixing angle and the least mass degeneration of scalar leptons of various generations the following lower limit for the scalar electron mass has been obtained: m_ẽ > 1.5 TeV. The mass of the scalar neutrino is O(1) TeV, in case the charged gaugino is lighter than the scalar neutrino. The result obtained is sensitive to the choice of the lepton mixing angle in the scalar sector; namely, on decreasing the value of sin^2 θ by an order of magnitude, the limitation on the scalar electron mass may decrease more than 3 times. In the latter case the direct observation of scalar electrons at the e+e- collider (1x1 TeV) becomes available

  11. Geometrical basis for the Standard Model

    Science.gov (United States)

    Potter, Franklin

    1994-02-01

    The robust character of the Standard Model is confirmed. Examination of its geometrical basis in three equivalent internal symmetry spaces (the unitary plane C^2, the quaternion space Q, and the real space R^4) as well as the real space R^3 uncovers mathematical properties that predict the physical properties of leptons and quarks. The finite rotational subgroups of the gauge group SU(2)_L × U(1)_Y generate exactly three lepton families and four quark families and reveal how quarks and leptons are related. Among the physical properties explained are the mass ratios of the six leptons and eight quarks, the origin of the left-handed preference by the weak interaction, the geometrical source of color symmetry, and the zero neutrino masses. The (u, d) and (c, s) quark families team together to satisfy the triangle anomaly cancellation with the electron family, while the other families pair one-to-one for cancellation. The spontaneously broken symmetry is discrete and needs no Higgs mechanism. Predictions include all massless neutrinos, the top quark at 160 GeV/c^2, the b' quark at 80 GeV/c^2, and the t' quark at 2600 GeV/c^2.

  12. Making Use of the New Student Assessment Standards To Enhance Technological Literacy.

    Science.gov (United States)

    Russell, Jill

    2003-01-01

    Describes the student assessment standards outlined in "Advancing Excellence in Technological Literacy: Student Assessment, Professional Development, and Program Standards," a companion to the "Standards for Technological Literacy." Discusses how the standards apply to everyday teaching practices. (JOW)

  13. Standards for psychological assessment of nuclear facility personnel. Technical report

    International Nuclear Information System (INIS)

    Frank, F.D.; Lindley, B.S.; Cohen, R.A.

    1981-07-01

    The subject of this study was the development of standards for the assessment of emotional instability in applicants for nuclear facility positions. The investigation covered all positions associated with a nuclear facility. Conclusions reached in this investigation focused on the ingredients of an integrated selection system including the use of personality tests, situational simulations, and the clinical interview; the need for professional standards to ensure quality control; the need for a uniform selection system as organizations vary considerably in terms of instruments presently used; and the need for an on-the-job behavioral observation program

  14. Prediction of Phase Behavior of Spray-Dried Amorphous Solid Dispersions: Assessment of Thermodynamic Models, Standard Screening Methods and a Novel Atomization Screening Device with Regard to Prediction Accuracy

    Directory of Open Access Journals (Sweden)

    Aymeric Ousset

    2018-03-01

    Full Text Available The evaluation of drug–polymer miscibility in the early phase of drug development is essential to ensure successful amorphous solid dispersion (ASD) manufacturing. This work investigates the comparison of thermodynamic models, conventional experimental screening methods (solvent casting, quench cooling), and a novel atomization screening device based on their ability to predict drug–polymer miscibility, solid state properties (Tg value and width), and adequate polymer selection during the development of spray-dried amorphous solid dispersions (SDASDs). Binary ASDs of four drugs and seven polymers were produced at 20:80, 40:60, 60:40, and 80:20 (w/w). Samples were systematically analyzed using modulated differential scanning calorimetry (mDSC) and X-ray powder diffraction (XRPD). Principal component analysis (PCA) was used to qualitatively assess the predictability of screening methods with regards to SDASD development. Poor correlation was found between theoretical models and experimentally-obtained results. Additionally, the limited ability of usual screening methods to predict the miscibility of SDASDs did not guarantee the appropriate selection of lead excipient for the manufacturing of robust SDASDs. Contrary to standard approaches, our novel screening device allowed the selection of optimal polymer and drug loading and established insight into the final properties and performance of SDASDs at an early stage, therefore enabling the optimization of the scaled-up late-stage development.
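
    For context, the measured Tg values of such binary dispersions are commonly compared against an ideal-mixing estimate such as the Gordon-Taylor equation (a standard screening relation, not taken from this record):

        Tg,mix = (w_1 Tg_1 + K w_2 Tg_2) / (w_1 + K w_2),   with   K ≈ (ρ_1 Tg_1) / (ρ_2 Tg_2)

    (Simha-Boyer approximation), where w_i, Tg_i and ρ_i are the weight fractions, glass transition temperatures and densities of the two components; a single experimental Tg close to this estimate is usually read as a sign of miscibility.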

  15. Models for Pesticide Risk Assessment

    Science.gov (United States)

    EPA considers the toxicity of the pesticide as well as the amount of pesticide to which a person or the environments may be exposed in risk assessment. Scientists use mathematical models to predict pesticide concentrations in exposure assessment.

  16. Assessment of resveratrol, apocynin and taurine on mechanical-metabolic uncoupling and oxidative stress in a mouse model of duchenne muscular dystrophy: A comparison with the gold standard, α-methyl prednisolone.

    Science.gov (United States)

    Capogrosso, Roberta Francesca; Cozzoli, Anna; Mantuano, Paola; Camerino, Giulia Maria; Massari, Ada Maria; Sblendorio, Valeriana Teresa; De Bellis, Michela; Tamma, Roberto; Giustino, Arcangela; Nico, Beatrice; Montagnani, Monica; De Luca, Annamaria

    2016-04-01

    Antioxidants have a great potential as adjuvant therapeutics in patients with Duchenne muscular dystrophy, although systematic comparisons at pre-clinical level are limited. The present study is a head-to-head assessment, in the exercised mdx mouse model of DMD, of natural compounds, resveratrol and apocynin, and of the amino acid taurine, in comparison with the gold standard α-methyl prednisolone (PDN). The rationale was to target the overproduction of reactive oxygen species (ROS) via disease-related pathways that are worsened by mechanical-metabolic impairment such as inflammation and over-activity of NADPH oxidase (NOX) (taurine and apocynin, respectively) or the failing ROS detoxification mechanisms via sirtuin-1 (SIRT1)-peroxisome proliferator-activated receptor γ coactivator 1α (PGC-1α) (resveratrol). Resveratrol (100 mg/kg i.p. 5 days/week), apocynin (38 mg/kg/day per os), taurine (1 g/kg/day per os), and PDN (1 mg/kg i.p., 5 days/week) were administered for 4-5 weeks to mdx mice in parallel with a standard protocol of treadmill exercise and the outcome was evaluated with a multidisciplinary approach in vivo and ex vivo on pathology-related end-points and biomarkers of oxidative stress. Resveratrol ≥ taurine > apocynin enhanced in vivo mouse force similarly to PDN. All the compounds reduced the production of superoxide anion, assessed by dihydroethidium staining, with apocynin being as effective as PDN, and ameliorated electrophysiological biomarkers of oxidative stress. Resveratrol also significantly reduced plasma levels of creatine kinase and lactate dehydrogenase. Force of isolated muscles was little ameliorated. However, the three compounds improved histopathology of gastrocnemius muscle more than PDN. Taurine > apocynin > PDN significantly decreased activated NF-kB positive myofibers. Thus, compounds targeting NOX-ROS or SIRT1/PGC-1α pathways differently modulate clinically relevant DMD-related endpoints according to their mechanism of action. With the

  17. Higgs bosons in the standard model, the MSSM and beyond

    Indian Academy of Sciences (India)

    Abstract. I summarize the basic theory and selected phenomenology for the Higgs boson(s) of the standard model, the minimal supersymmetric model and some extensions thereof, including the next-to-minimal supersymmetric model.

  18. Neutrinos: in and out of the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen; /Fermilab

    2006-07-01

    The particle physics Standard Model has been tremendously successful in predicting the outcome of a large number of experiments. In this model Neutrinos are massless. Yet recent evidence points to the fact that neutrinos are massive particles with tiny masses compared to the other particles in the Standard Model. These tiny masses allow the neutrinos to change flavor and oscillate. In this series of Lectures, I will review the properties of Neutrinos In the Standard Model and then discuss the physics of Neutrinos Beyond the Standard Model. Topics to be covered include Neutrino Flavor Transformations and Oscillations, Majorana versus Dirac Neutrino Masses, the Seesaw Mechanism and Leptogenesis.
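
    For reference, the flavor oscillations referred to above are governed, in the two-flavor vacuum approximation, by the standard oscillation probability (textbook expression, not part of the record):

        P(ν_α → ν_β) = sin^2(2θ) · sin^2(1.27 Δm^2[eV^2] L[km] / E[GeV]),

    where θ is the mixing angle, Δm^2 the mass-squared splitting, L the baseline and E the neutrino energy.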

  19. Technical Standards on the Safety Assessment of a HLW Repository in Other Countries

    International Nuclear Information System (INIS)

    Lee, Sung Ho; Hwang, Yong Soo

    2009-01-01

    The basic function of a HLW disposal system is to prevent excessive radionuclides from being released from the repository within a short time. To do this, many technical standards should be developed and established for the components of the disposal system. Safety assessment of a repository is considered one of these technical standards, because it produces quantitative results on the future evolution of a repository based on a reasonably simplified model. In this paper, we investigated other countries' regulations related to safety assessment, focusing on the assessment period, radiation dose limits and uncertainties of the assessment. Especially, in the process of investigating the USA regulations, the USA regulatory bodies' approach to the assessment period and peak dose is worth taking into account in case of a conflict between the peak dose from a safety assessment and the limit value set in regulation.

  20. Gauge coupling unification in superstring derived standard-like models

    International Nuclear Information System (INIS)

    Faraggi, A.E.

    1992-11-01

    I discuss gauge coupling unification in a class of superstring standard-like models, which are derived in the free fermionic formulation. Recent calculations indicate that the superstring unification scale is at O(10^18 GeV) while the minimal supersymmetric standard model is consistent with LEP data if the unification scale is at O(10^16) GeV. A generic feature of the superstring standard-like models is the appearance of extra color triplets (D, D̄), and electroweak doublets (l, l̄), in vector-like representations, beyond the supersymmetric standard model. I show that the gauge coupling unification at O(10^18 GeV) in the superstring standard-like models can be consistent with LEP data. I present an explicit standard-like model that can realize superstring gauge coupling unification. (author)
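
    For orientation, unification scales such as those quoted above are usually obtained from the one-loop renormalization-group running of the three gauge couplings (standard expression, not reproduced in the record):

        1/α_i(μ) = 1/α_i(M_Z) - (b_i / 2π) ln(μ / M_Z),   i = 1, 2, 3,

    where the b_i are the one-loop beta-function coefficients of the particle spectrum; with the minimal supersymmetric particle content the three couplings meet near 2 x 10^16 GeV, the O(10^16) GeV scale referred to in the abstract.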

  1. Effects of tailored neck-shoulder pain treatment based on a decision model guided by clinical assessments and standardized functional tests. A study protocol of a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Björklund Martin

    2012-05-01

    Full Text Available Background: A major problem with rehabilitation interventions for neck pain is that the condition may have multiple causes, thus a single treatment approach is seldom efficient. The present study protocol outlines a single-blinded randomised controlled trial evaluating the effect of tailored treatment for neck-shoulder pain. The treatment is based on a decision model guided by standardized clinical assessment and functional tests with cut-off values. Our main hypothesis is that the tailored treatment has better short, intermediate and long-term effects than either non-tailored treatment or treatment-as-usual (TAU) on pain and function. We subsequently hypothesize that tailored and non-tailored treatment both have better effect than TAU. Methods/Design: 120 working women aged 20-65 with a minimum of six weeks of nonspecific neck-shoulder pain are allocated by minimisation, with the factors age, duration of pain, pain intensity and disability, into the groups tailored treatment (T), non-tailored treatment (NT) or treatment-as-usual (TAU). Treatment is given to the groups T and NT for 11 weeks (27 sessions evenly distributed). An extensive presentation of the tests and treatment decision model is provided. The main treatment components are manual therapy, cranio-cervical flexion exercise and strength training, EMG-biofeedback training, treatment for cervicogenic headache, and neck motor control training. A decision algorithm based on the baseline assessment determines the treatment components given to each participant of the T- and NT-groups. Primary outcome measures are physical functioning (Neck Disability Index) and average pain intensity during the last week (Numeric Rating Scale). Secondary outcomes are general improvement (Patient Global Impression of Change scale), symptoms (Profile Fitness Mapping neck questionnaire), capacity to work in the last 6 weeks (quality and quantity) and pressure pain threshold of m. trapezius. Primary and secondary outcomes will
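
    The allocation by minimisation described above can be illustrated with a small sketch (a hypothetical, simplified implementation for illustration only; the factor names follow the abstract, but the scoring, weights and tie-breaking are assumptions):

        import random

        # Pocock-Simon style minimisation over the four stratification factors
        # named in the protocol: age, duration of pain, pain intensity, disability.
        GROUPS = ["T", "NT", "TAU"]

        def assign(new_levels, allocated):
            """new_levels: dict factor -> categorized level for the new participant;
            allocated: list of (group, levels_dict) for already-allocated participants."""
            imbalance = {}
            for g in GROUPS:
                # For each factor, count how many participants already in group g
                # share the new participant's level, then sum over factors.
                imbalance[g] = sum(
                    sum(1 for grp, lv in allocated if grp == g and lv[f] == new_levels[f])
                    for f in new_levels
                )
            best = min(imbalance.values())
            # Allocate to the group that minimises imbalance; break ties at random.
            return random.choice([g for g in GROUPS if imbalance[g] == best])

    In practice minimisation schemes often allocate to the minimising group with high probability rather than deterministically, which this sketch omits for brevity.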

  2. Beyond the standard model with B and K physics

    International Nuclear Information System (INIS)

    Grossman, Y

    2003-01-01

    In the first part of the talk the flavor physics input to models beyond the standard model is described. One specific example of such new physics model is given: A model with bulk fermions in a non factorizable one extra dimension. In the second part of the talk we discuss several observables that are sensitive to new physics. We explain what type of new physics can produce deviations from the standard model predictions in each of these observables

  3. Positive animal welfare states and reference standards for welfare assessment.

    Science.gov (United States)

    Mellor, D J

    2015-01-01

    Developments in affective neuroscience and behavioural science during the last 10-15 years have together made it increasingly apparent that sentient animals are potentially much more sensitive to their environmental and social circumstances than was previously thought to be the case. It therefore seems likely that both the range and magnitude of welfare trade-offs that occur when animals are managed for human purposes have been underestimated even when minimalistic but arguably well-intentioned attempts have been made to maintain high levels of welfare. In light of these neuroscience-supported behaviour-based insights, the present review considers the extent to which the use of currently available reference standards might draw attention to these previously neglected areas of concern. It is concluded that the natural living orientation cannot provide an all-embracing or definitive welfare benchmark because of its primary focus on behavioural freedom. However assessments of this type, supported by neuroscience insights into behavioural motivation, may now carry greater weight when used to identify management practices that should be avoided, discontinued or substantially modified. Using currently accepted baseline standards as welfare reference points may result in small changes being accorded greater significance than would be the case if they were compared with higher standards, and this could slow the progress towards better levels of welfare. On the other hand, using "what animals want" as a reference standard has the appeal of focusing on the specific resources or conditions the animals would choose themselves and can potentially improve their welfare more quickly than the approach of making small increments above baseline standards. It is concluded that the cautious use of these approaches in different combinations could lead to recommendations that would more effectively promote positive welfare states in hitherto neglected areas of concern.

  4. Public School Finance Assessment Project Aligned with ELCC Standards

    Science.gov (United States)

    Risen, D. Michael

    2008-01-01

    This is a detailed description of an assessment that can be used in graduate-level study in the area of public school finance. It has been approved by NCATE as meeting all of the stipulated ELCC standards for which it is designed (1.1, 1.2, 1.3, 1.4, 1.5, 2.1, 2.2, 2.3, 2.4, 3.1, 3.2, 3.3, 4.1, 4.2, 4.3, 5.1, 5.2, 5.3). This course of…

  5. Standardized training in nurse model travel clinics.

    Science.gov (United States)

    Sofarelli, Theresa A; Ricks, Jane H; Anand, Rahul; Hale, Devon C

    2011-01-01

    International travel plays a significant role in the emergence and redistribution of major human diseases. The importance of travel medicine clinics for preventing morbidity and mortality has been increasingly appreciated, although few studies have thus far examined the management and staff training strategies that result in successful travel-clinic operations. Here, we describe an example of travel-clinic operation and management coordinated through the University of Utah School of Medicine, Division of Infectious Diseases. This program, which involves eight separate clinics distributed statewide, functions both to provide patient consultation and care services and to provide medical provider training and continuing medical education (CME). Initial training, the use of standardized forms and protocols, routine chart reviews and monthly continuing education meetings are the distinguishing attributes of this program. An Infectious Disease team consisting of one medical doctor (MD) and a physician assistant (PA) acts as consultant to travel nurses, who comprise the majority of clinic staff. Eight clinics distributed throughout the state of Utah serve approximately 6,000 travelers a year. Pre-travel medical services are provided by 11 nurses, including 10 registered nurses (RNs) and 1 licensed practical nurse (LPN). This trained nursing staff receives continuing travel medical education and participates in the training of new providers. All nurses have completed a full training program, and 7 of the 11 (64%) clinic nursing staff serve more than 10 patients a week. Quality assurance measures show that approximately 0.5% of charts reviewed contain a vaccine or prescription error that requires patient notification for correction. Using an initial training program, standardized patient intake forms, vaccine and prescription protocols, preprinted prescriptions, and regular CME, highly trained nurses at travel clinics are able to provide standardized pre-travel care to

  6. Basic Laparoscopic Skills Assessment Study: Validation and Standard Setting among Canadian Urology Trainees.

    Science.gov (United States)

    Lee, Jason Y; Andonian, Sero; Pace, Kenneth T; Grober, Ethan

    2017-06-01

    As urology training programs move to a competency-based medical education model, iterative assessments with objective standards will be required. To develop a valid set of technical skills standards we initiated a national skills assessment study focusing initially on laparoscopic skills. Between February 2014 and March 2016 the basic laparoscopic skill of Canadian urology trainees and attending urologists was assessed using 4 standardized tasks from the AUA (American Urological Association) BLUS (Basic Laparoscopic Urological Surgery) curriculum, including peg transfer, pattern cutting, suturing and knot tying, and vascular clip applying. All performances were video recorded and assessed using 3 methods, including time and error based scoring, expert global rating scores and C-SATS (Crowd-Sourced Assessments of Technical Skill Global Rating Scale), a novel, crowd sourced assessment platform. Different methods of standard setting were used to develop pass-fail cut points. Six attending urologists and 99 trainees completed testing. Reported laparoscopic experience and training level correlated with performance. Standard setting methods were then used to define pass-fail cut points for all 4 AUA BLUS tasks. The 4 AUA BLUS tasks demonstrated good construct validity evidence for use in assessing basic laparoscopic skill. Performance scores using the novel C-SATS platform correlated well with traditional time-consuming methods of assessment. Various standard setting methods were used to develop pass-fail cut points for educators to use when making formative and summative assessments of basic laparoscopic skill. Copyright © 2017 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
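
    The abstract does not specify which standard-setting methods were applied; as one common option, a contrasting-groups calculation is sketched below with hypothetical scores.

```python
import numpy as np

# Sketch of one common standard-setting method (contrasting groups): the
# pass-fail cut point is placed between the score distributions of experienced
# and novice performers. All scores here are hypothetical.
experts = np.array([88, 92, 85, 90, 94, 87])      # e.g. attending urologists
novices = np.array([60, 72, 65, 70, 58, 75, 68])  # e.g. junior trainees

def contrasting_groups_cut(expert_scores, novice_scores):
    """Midpoint between group means; a simple proxy for the intersection
    of two roughly symmetric score distributions."""
    return (expert_scores.mean() + novice_scores.mean()) / 2.0

print(f"Pass-fail cut score: {contrasting_groups_cut(experts, novices):.1f}")
```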

  7. Standard Model Constraints from the LHC

    International Nuclear Information System (INIS)

    Boonekamp, M.

    2007-01-01

    With our current knowledge limited by the absence of physics data, I review our expectations from standard processes measurements at the LHC. Focusing on charged and neutral current processes, I illustrate how their measurement will constrain our uncertainties on discovery physics, and give some arguments about our precision goal for the W mass measurement. Detailed analysis reveals that there is no reason to believe we can not measure this fundamental parameter to about 5 MeV. This sets a natural goal of about 500 MeV for the top mass; to decide whether this is realistic requires further investigation. (author)

  8. Assessment of the Japanese Energy Efficiency Standards Program

    Directory of Open Access Journals (Sweden)

    Jun Arakawa

    2015-03-01

    Full Text Available The Japanese energy efficiency standards program for appliances is a unique program that sets and revises mandatory standards based on the most energy-efficient products on the market. This study assessed the cost-effectiveness of the standard settings for air conditioners, a major residential appliance and a typical example in the program. Based on analyses of empirical data, the net costs and effects from 1999 to 2040 were estimated. When applying a discount rate of 3%, the cost of abating CO2 emissions realized through the considered standards was estimated to be -13700 JPY/t-CO2. The sensitivity analysis, however, showed that the cost becomes positive at a discount rate of 26% or higher. The authors also revealed that the standards’ “excellent” cost-effectiveness largely depends on that of the 1st standard setting, and the CO2 abatement cost through the 2nd standard was estimated to be as high as 26800 JPY/t-CO2. The results imply that the government needs to be careful about the possible economic burden imposed when considering introducing new, additional standards.
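
    The discounted abatement-cost reasoning described above can be sketched as follows. The yearly cost and emission figures are placeholders chosen only to show how the sign of the JPY/t-CO2 result can flip between a 3% and a 26% discount rate; they are not the study's data.

```python
# Sketch of a discounted CO2 abatement-cost calculation of the kind described
# in the abstract. Yearly incremental costs and emission reductions are
# placeholder values, not taken from the study.
def abatement_cost(costs_jpy, co2_savings_t, rate):
    """Net present cost per tonne of CO2 avoided (JPY/t-CO2)."""
    npv_cost = sum(c / (1 + rate) ** t for t, c in enumerate(costs_jpy))
    npv_co2 = sum(s / (1 + rate) ** t for t, s in enumerate(co2_savings_t))
    return npv_cost / npv_co2

years = range(1999, 2041)
# Positive purchase-cost premium in early years, running-cost savings later.
costs = [5e9 if y < 2005 else -2e9 for y in years]   # JPY per year (placeholder)
co2 = [3e5 for _ in years]                           # t-CO2 avoided per year (placeholder)

for r in (0.03, 0.26):
    print(f"discount rate {r:.0%}: {abatement_cost(costs, co2, r):,.0f} JPY/t-CO2")
```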

  9. Ecological models and pesticide risk assessment: current modeling practice.

    Science.gov (United States)

    Schmolke, Amelie; Thorbek, Pernille; Chapman, Peter; Grimm, Volker

    2010-04-01

    Ecological risk assessments of pesticides usually focus on risk at the level of individuals, and are carried out by comparing exposure and toxicological endpoints. However, in most cases the protection goal is populations rather than individuals. On the population level, effects of pesticides depend not only on exposure and toxicity, but also on factors such as life history characteristics, population structure, timing of application, presence of refuges in time and space, and landscape structure. Ecological models can integrate such factors and have the potential to become important tools for the prediction of population-level effects of exposure to pesticides, thus allowing extrapolations, for example, from laboratory to field. Indeed, a broad range of ecological models have been applied to chemical risk assessment in the scientific literature, but so far such models have only rarely been used to support regulatory risk assessments of pesticides. To better understand the reasons for this situation, the current modeling practice in this field was assessed in the present study. The scientific literature was searched for relevant models and assessed according to nine characteristics: model type, model complexity, toxicity measure, exposure pattern, other factors, taxonomic group, risk assessment endpoint, parameterization, and model evaluation. The present study found that, although most models were of a high scientific standard, many of them would need modification before they are suitable for regulatory risk assessments. The main shortcomings of currently available models in the context of regulatory pesticide risk assessments were identified. When ecological models are applied to regulatory risk assessments, we recommend reviewing these models according to the nine characteristics evaluated here. (c) 2010 SETAC.

  10. The thermal evolution of universe: standard model

    International Nuclear Information System (INIS)

    Nascimento, L.C.S. do.

    1975-08-01

    A description of the dynamical evolution of the Universe is given, following a model based on the theory of General Relativity. The model admits the Cosmological principle, the principle of Equivalence and the Robertson-Walker metric (of which an original derivation is presented). In this model, the universe is treated as a perfect fluid, ideal and symmetric with respect to the number of particles and antiparticles. The thermodynamic relations following from these hypotheses are derived, and from them the several eras of the thermal evolution of the universe are established. Finally, the problems arising from certain specific predictions of the model are studied, and the predicted abundances of the elements from nucleosynthesis and the actual behavior of the universe are analysed in detail. (author) [pt

  11. Integrated Environmental Assessment Modelling

    Energy Technology Data Exchange (ETDEWEB)

    Guardanz, R; Gimeno, B S; Bermejo, V; Elvira, S; Martin, F; Palacios, M; Rodriguez, E; Donaire, I [Ciemat, Madrid (Spain)

    2000-07-01

    This report describes the results of the Spanish participation in the project Coupling CORINAIR data to cost-effective emission reduction strategies based on critical thresholds (EU/LIFE97/ENV/FIN/336). The subproject focused on three tasks: developing tools to improve knowledge of the spatial and temporal details of air pollutant emissions in Spain; exploiting existing experimental information on plant response to air pollutants in temperate ecosystems; and integrating these findings in a modelling framework that can assess with more accuracy the impact of air pollutants on temperate ecosystems. The results obtained during the execution of this project have significantly improved the models of the impact of alternative emission control strategies on ecosystems and crops in the Iberian Peninsula. (Author) 375 refs.

  12. Self-assessment: Strategy for higher standards, consistency, and performance

    International Nuclear Information System (INIS)

    Ide, W.E.

    1996-01-01

    In late 1994, Palo Verde operations underwent a transformation from a unitized structure to a single functional unit. It was necessary to build consistency in watchstanding practices and create a shared mission. Because there was a lack of focus on actual plant operations and because personnel were deeply involved with administrative tasks, command and control of evolutions were weak. Improvement was needed. Consistent performance standards have been set for all three operating units. These expectations focus on nuclear, radiological, and industrial safety. Straightforward descriptions of watchstanding and monitoring practices have been provided to all department personnel. The desired professional and leadership qualities for employee conduct have been defined and communicated thoroughly. A healthy and competitive atmosphere developed with the successful implementation of these standards. Overall performance improved. The auxiliary operators demonstrated increased pride and ownership in the performance of their work activities. In addition, their morale improved. Crew teamwork improved, as did the quality of shift briefs. There was a decrease in the noise level and in the administrative functions in the control room. The use of self-assessment helped to anchor and define higher and more consistent standards. The proof of Palo Verde's success was evident when an Institute of Nuclear Power Operations finding was turned into a strength within 1 yr

  13. Radiation protection standards: A practical exercise in risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Clarke, Roger H [National Radiological Protection Board (United Kingdom)

    1992-07-01

    Within 12 months of the discovery of x-rays in 1895, it was reported that large doses of radiation were harmful to living human tissues. The first radiation protection standards were set to avoid the early effects of acute irradiation. By the 1950s, evidence was mounting for late somatic effects - mainly a small excess of cancers - in irradiated populations. In the late 1980's, sufficient human epidemiological data had been accumulated to allow a comprehensive assessment of carcinogenic radiation risks following the delivery of moderately high doses. Workers and the public are exposed to lower doses and dose-rates than the groups from whom good data are available, so that risks have had to be estimated for protection purposes. However, in the 1990s, some confirmation of these risk factors has been derived from occupationally exposed populations. If an estimate is made of the risk per unit dose, then in order to set dose limits, an unacceptable level of risk must be established for both workers and the public. There has been and continues to be a debate about the definitions of 'acceptable' and 'tolerable' and the attribution of numerical values to these definitions. This paper discusses the issues involved in the quantification of these terms and their application to setting dose limits on risk grounds. Conclusions are drawn about the present protection standards and the application of the methods to other fields of risk assessment. (author)
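
    A minimal sketch of the risk-based reasoning described here: given an assumed risk factor per unit dose and a maximum tolerable annual risk, an annual dose limit follows by division. The numerical values below are illustrative assumptions, not the standards discussed in the paper.

```python
# Sketch only: derive an annual dose limit from a risk factor and a
# tolerable annual risk. Both numbers are nominal illustrations.
RISK_PER_SV = 5e-2      # assumed fatal cancer risk per sievert (illustrative)
TOLERABLE_RISK = 1e-3   # assumed maximum tolerable annual risk (illustrative)

dose_limit_sv = TOLERABLE_RISK / RISK_PER_SV
print(f"Implied annual dose limit: {dose_limit_sv * 1000:.0f} mSv")
```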

  14. Radiation protection standards: A practical exercise in risk assessment

    International Nuclear Information System (INIS)

    Clarke, Roger H.

    1992-01-01

    Within 12 months of the discovery of x-rays in 1895, it was reported that large doses of radiation were harmful to living human tissues. The first radiation protection standards were set to avoid the early effects of acute irradiation. By the 1950s, evidence was mounting for late somatic effects - mainly a small excess of cancers - in irradiated populations. In the late 1980's, sufficient human epidemiological data had been accumulated to allow a comprehensive assessment of carcinogenic radiation risks following the delivery of moderately high doses. Workers and the public are exposed to lower doses and dose-rates than the groups from whom good data are available, so that risks have had to be estimated for protection purposes. However, in the 1990s, some confirmation of these risk factors has been derived from occupationally exposed populations. If an estimate is made of the risk per unit dose, then in order to set dose limits, an unacceptable level of risk must be established for both workers and the public. There has been and continues to be a debate about the definitions of 'acceptable' and 'tolerable' and the attribution of numerical values to these definitions. This paper discusses the issues involved in the quantification of these terms and their application to setting dose limits on risk grounds. Conclusions are drawn about the present protection standards and the application of the methods to other fields of risk assessment. (author)

  15. A 'theory of everything'? [Extending the Standard Model]

    International Nuclear Information System (INIS)

    Ross, G.G.

    1993-01-01

    The Standard Model provides us with an amazingly successful theory of the strong, weak and electromagnetic interactions. Despite this, many physicists believe it represents only a step towards understanding the ultimate "theory of everything". In this article we describe why the Standard Model is thought to be incomplete and some of the suggestions for its extension. (Author)

  16. Standard Model Higgs boson searches with the ATLAS detector

    Indian Academy of Sciences (India)

    The experimental results on the search for the Standard Model Higgs boson with 1 to 2 fb-1 of proton–proton collision data at √s = 7 TeV recorded by the ATLAS detector are presented and discussed. No significant excess of events is found with respect to the expectations from Standard Model processes, and the production ...

  17. Can An Amended Standard Model Account For Cold Dark Matter?

    International Nuclear Information System (INIS)

    Goldhaber, Maurice

    2004-01-01

    It is generally believed that one has to invoke theories beyond the Standard Model to account for cold dark matter particles. However, there may be undiscovered universal interactions that, if added to the Standard Model, would lead to new members of the three generations of elementary fermions that might be candidates for cold dark matter particles

  18. Electroweak symmetry breaking beyond the Standard Model

    International Nuclear Information System (INIS)

    Bhattacharyya, Gautam

    2012-01-01

    In this paper, two key issues related to electroweak symmetry breaking are addressed. First, how fine-tuned are the different models that trigger this phenomenon? Second, even if a light Higgs boson exists, does it necessarily have to be elementary? After a brief introduction, the fine-tuning aspects of the MSSM, NMSSM, generalized NMSSM and GMSB scenarios are reviewed; then the little Higgs, composite Higgs and Higgsless models are compared. Finally, a broad overview is given of where we stand at the end of 2011. (author)

  19. Psychological distress and streamlined BreastScreen follow-up assessment versus standard assessment.

    Science.gov (United States)

    Sherman, Kerry A; Winch, Caleb J; Borecky, Natacha; Boyages, John

    2013-11-04

    To establish whether altered protocol characteristics of streamlined StepDown breast assessment clinics heightened or reduced the psychological distress of women in attendance compared with standard assessment. Willingness to attend future screening was also compared between the assessment groups. Observational, prospective study of women attending either a mammogram-only StepDown or a standard breast assessment clinic. Women completed questionnaires on the day of assessment and 1 month later. Women attending StepDown (136 women) or standard assessment clinics (148 women) at a BreastScreen centre between 10 November 2009 and 7 August 2010. Breast cancer worries; positive and negative psychological consequences of assessment (Psychological Consequences Questionnaire); breast cancer-related intrusion and avoidance (Impact of Event Scale); and willingness to attend, and uneasiness about, future screening. At 1-month follow-up, no group differences were evident between those attending standard and StepDown clinics on breast cancer worries (P = 0.44), positive (P = 0.88) and negative (P = 0.65) consequences, intrusion (P = 0.64), and avoidance (P = 0.87). Willingness to return for future mammograms was high, and did not differ between groups (P = 0.16), although higher levels of unease were associated with lessened willingness to rescreen (P = 0.04). There was no evidence that attending streamlined StepDown assessments had different outcomes in terms of distress than attending standard assessment clinics for women with a BreastScreen-detected abnormality. However, unease about attending future screening was generally associated with less willingness to do so in both groups; thus, there is a role for psycho-educational intervention to address these concerns.

  20. Modeling and Simulation Network Data Standards

    Science.gov (United States)

    2011-09-30

    approaches. 2.3. JNAT. JNAT is a Web application that provides connectivity and network analysis capability. JNAT uses propagation models and low-fidelity... COMBATXXI Movement Logger Data Output Dictionary fields: Geocentric Coordinates (GCC) Heading, Geodetic Coordinates (GDC) Heading, Universal Transverse Mercator (UTM) Heading.

  1. Electroweak symmetry breaking beyond the Standard Model

    Indian Academy of Sciences (India)

    words, now that the gauge symmetry is established with a significant ..... picture, the Higgs is some kind of a composite bound state emerging from a strongly .... (i) Little Higgs vs. composite: Little Higgs models were introduced to solve the little ...

  2. Development of the Test Of Astronomy STandards (TOAST) Assessment Instrument

    Science.gov (United States)

    Slater, Timothy F.; Slater, S. J.

    2008-05-01

    Considerable effort in the astronomy education research (AER) community over the past several years has focused on developing assessment tools in the form of multiple-choice conceptual diagnostics and content knowledge surveys. This has been critically important in advancing the AER discipline so that researchers could establish the initial knowledge state of students as well as attempt to measure some of the impacts of innovative instructional interventions. Unfortunately, few of the existing instruments were constructed upon a solid list of clearly articulated and widely agreed upon learning objectives. This was not an oversight, but rather a result of the relative youth of AER as a discipline. Now that several important science education reform documents exist and are generally accepted by the AER community, we are in a position to develop, validate, and disseminate a new assessment instrument which is tightly aligned to the consensus learning goals stated by the American Astronomical Society - Chair's Conference on ASTRO 101, the American Association for the Advancement of Science's Project 2061 Benchmarks, and the National Research Council's National Science Education Standards. In response, researchers from the Cognition in Astronomy, Physics and Earth sciences Research (CAPER) Team at the University of Wyoming's Science & Math Teaching Center (UWYO SMTC) have designed a criterion-referenced assessment tool, called the Test Of Astronomy STandards (TOAST). Through iterative development, this instrument has a high degree of reliability and validity for instructors and researchers needing information on students’ initial knowledge state at the beginning of a course and can be used, in aggregate, to help measure the impact of course-length instructional strategies for courses with learning goals tightly aligned to the consensus goals of our community.

  3. Uncertainties in radioecological assessment models

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.; Ng, Y.C.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because models are inexact representations of real systems. The major sources of this uncertainty are related to bias in model formulation and imprecision in parameter estimation. The magnitude of uncertainty is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, health risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible. 41 references, 4 figures, 4 tables
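
    The stochastic procedure recommended above, translating uncertain parameter estimates into a distribution of predicted values and ranking parameter importance, might look like the following sketch; the transfer model and the parameter distributions are illustrative assumptions, not taken from the report.

```python
import numpy as np
from scipy.stats import spearmanr

# Sketch of Monte Carlo uncertainty propagation through a very simple
# radioecological transfer model. Model form and distributions are assumed.
rng = np.random.default_rng(42)
n = 10_000

transfer_factor = rng.lognormal(mean=np.log(0.01), sigma=0.5, size=n)  # Bq/kg per Bq/m2
intake_rate = rng.normal(loc=200, scale=30, size=n)                    # kg/yr of foodstuff
dose_coeff = 1.3e-8                                                    # Sv/Bq (fixed here)
deposition = 1.0e4                                                     # Bq/m2 (fixed here)

dose = deposition * transfer_factor * intake_rate * dose_coeff         # Sv/yr
print(f"median dose: {np.median(dose):.2e} Sv/yr")
print(f"95th percentile: {np.percentile(dose, 95):.2e} Sv/yr")

# Rank parameter importance by rank correlation with the predicted dose.
for name, x in [("transfer_factor", transfer_factor), ("intake_rate", intake_rate)]:
    print(name, f"rho = {spearmanr(x, dose)[0]:.2f}")
```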

  4. Big bang nucleosynthesis - The standard model and alternatives

    Science.gov (United States)

    Schramm, David N.

    1991-01-01

    The standard homogeneous-isotropic calculation of the big bang cosmological model is reviewed, and alternate models are discussed. The standard model is shown to agree with the light element abundances for He-4, H-2, He-3, and Li-7 that are available. Improved observational data from recent LEP collider and SLC results are discussed. The data agree with the standard model in terms of the number of neutrinos, and provide improved information regarding neutron lifetimes. Alternate models are reviewed which describe different scenarios for decaying matter or quark-hadron induced inhomogeneities. The baryonic density relative to the critical density in the alternate models is similar to that of the standard model when they are made to fit the abundances. This reinforces the conclusion that the baryonic density relative to critical density is about 0.06, and also reinforces the need for both nonbaryonic dark matter and dark baryonic matter.

  5. The Beyond the standard model working group: Summary report

    Energy Technology Data Exchange (ETDEWEB)

    G. Azuelos et al.

    2004-03-18

    dimensions. There, we considered: constraints on Kaluza Klein (KK) excitations of the SM gauge bosons from existing data (part XIII) and the corresponding projected LHC reach (part XIV); techniques for discovering and studying the radion field which is generic in most extra-dimensional scenarios (part XV); the impact of mixing between the radion and the Higgs sector, a fully generic possibility in extra-dimensional models (part XVI); production rates and signatures of universal extra dimensions at hadron colliders (part XVII); black hole production at hadron colliders, which would lead to truly spectacular events (part XVIII). The above contributions represent a tremendous amount of work on the part of the individuals involved and represent the state of the art for many of the currently most important phenomenological research avenues. Of course, much more remains to be done. For example, one should continue to work on assessing the extent to which the discovery reach will be extended if one goes beyond the LHC to the super-high-luminosity LHC (SLHC) or to a very large hadron collider (VLHC) with √s ≈ 40 TeV. Overall, we believe our work shows that the LHC and future hadronic colliders will play a pivotal role in the discovery and study of any kind of new physics beyond the Standard Model. They provide tremendous potential for incredibly exciting new discoveries.

  6. Sporulation in Bacteria: Beyond the Standard Model.

    Science.gov (United States)

    Hutchison, Elizabeth A; Miller, David A; Angert, Esther R

    2014-10-01

    Endospore formation follows a complex, highly regulated developmental pathway that occurs in a broad range of Firmicutes. Although Bacillus subtilis has served as a powerful model system to study the morphological, biochemical, and genetic determinants of sporulation, fundamental aspects of the program remain mysterious for other genera. For example, it is entirely unknown how most lineages within the Firmicutes regulate entry into sporulation. Additionally, little is known about how the sporulation pathway has evolved novel spore forms and reproductive schemes. Here, we describe endospore and internal offspring development in diverse Firmicutes and outline progress in characterizing these programs. Moreover, comparative genomics studies are identifying highly conserved sporulation genes, and predictions of sporulation potential in new isolates and uncultured bacteria can be made from these data. One surprising outcome of these comparative studies is that core regulatory and some structural aspects of the program appear to be universally conserved. This suggests that a robust and sophisticated developmental framework was already in place in the last common ancestor of all extant Firmicutes that produce internal offspring or endospores. The study of sporulation in model systems beyond B. subtilis will continue to provide key information on the flexibility of the program and provide insights into how changes in this developmental course may confer advantages to cells in diverse environments.

  7. Physics Beyond the Standard Model: Supersymmetry

    Energy Technology Data Exchange (ETDEWEB)

    Nojiri, M.M.; /KEK, Tsukuba /Tsukuba, Graduate U. Adv. Studies /Tokyo U.; Plehn, T.; /Edinburgh U.; Polesello, G.; /INFN, Pavia; Alexander, John M.; /Edinburgh U.; Allanach, B.C.; /Cambridge U.; Barr, Alan J.; /Oxford U.; Benakli, K.; /Paris U., VI-VII; Boudjema, F.; /Annecy, LAPTH; Freitas, A.; /Zurich U.; Gwenlan, C.; /University Coll. London; Jager, S.; /CERN /LPSC, Grenoble

    2008-02-01

    This collection of studies on new physics at the LHC constitutes the report of the supersymmetry working group at the Workshop 'Physics at TeV Colliders', Les Houches, France, 2007. They cover the wide spectrum of phenomenology in the LHC era, from alternative models and signatures to the extraction of relevant observables, the study of the MSSM parameter space and finally to the interplay of LHC observations with additional data expected on a similar time scale. The special feature of this collection is that while not each of the studies is explicitly performed together by theoretical and experimental LHC physicists, all of them were inspired by and discussed in this particular environment.

  8. Theory of Time beyond the standard model

    International Nuclear Information System (INIS)

    Poliakov, Eugene S.

    2008-01-01

    A frame of non-uniform time is discussed. A concept of 'flow of time' is presented. The principle of time relativity, in analogy with the Galilean principle of relativity, is set out. An equivalence principle is set out, stating that the outcome of non-uniform time in an inertial frame of reference is equivalent to the outcome of a fictitious gravity force external to the frame of reference. Thus it is the flow of time that causes gravity, rather than mass. The latter claim is compared to experimental data, achieving precision of up to 0.0003%. It is shown that the law of energy conservation is inapplicable to frames of non-uniform time. A theoretical model of a physical entity (point mass, photon) travelling in the field of non-uniform time is considered. A generalized law that allows the flow of time to replace classical energy conservation is introduced on the basis of the experiment of Pound and Rebka. It is shown that a linear dependence of the flow of time on the spatial coordinate conforms to the inverse square law of universal gravitation and Keplerian mechanics. Momentum is shown to still be conserved

  9. Simulation and Modeling Capability for Standard Modular Hydropower Technology

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Kevin M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Smith, Brennan T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Witt, Adam M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); DeNeale, Scott T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bevelhimer, Mark S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pries, Jason L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Burress, Timothy A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kao, Shih-Chieh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Mobley, Miles H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lee, Kyutae [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Curd, Shelaine L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tsakiris, Achilleas [Univ. of Tennessee, Knoxville, TN (United States); Mooneyham, Christian [Univ. of Tennessee, Knoxville, TN (United States); Papanicolaou, Thanos [Univ. of Tennessee, Knoxville, TN (United States); Ekici, Kivanc [Univ. of Tennessee, Knoxville, TN (United States); Whisenant, Matthew J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Welch, Tim [US Department of Energy, Washington, DC (United States); Rabon, Daniel [US Department of Energy, Washington, DC (United States)

    2017-08-01

    Grounded in the stakeholder-validated framework established in Oak Ridge National Laboratory’s SMH Exemplary Design Envelope Specification, this report on Simulation and Modeling Capability for Standard Modular Hydropower (SMH) Technology provides insight into the concepts, use cases, needs, gaps, and challenges associated with modeling and simulating SMH technologies. The SMH concept envisions a network of generation, passage, and foundation modules that achieve environmentally compatible, cost-optimized hydropower using standardization and modularity. The development of standardized modeling approaches and simulation techniques for SMH (as described in this report) will pave the way for reliable, cost-effective methods for technology evaluation, optimization, and verification.

  10. The Standard Model is Natural as Magnetic Gauge Theory

    DEFF Research Database (Denmark)

    Sannino, Francesco

    2011-01-01

    We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem also leads to a new insight on the mystery of the observed number of fundamental fermion generations.

  11. Alignment between South African mathematics assessment standards and the TIMSS assessment frameworks

    Directory of Open Access Journals (Sweden)

    Mdutshekelwa Ndlovu

    2012-12-01

    Full Text Available South Africa’s performance in international benchmark tests is a major cause for concern amongst educators and policymakers, raising questions about the effectiveness of the curriculum reform efforts of the democratic era. The purpose of the study reported in this article was to investigate the degree of alignment between the TIMSS 2003 Grade 8 Mathematics assessment frameworks and the Revised National Curriculum Statements (RNCS) assessment standards for Grade 8 Mathematics, later revised to become the Curriculum and Assessment Policy Statements (CAPS). Such an investigation could help to partly shed light on why South African learners do not perform well and point out discrepancies that need to be attended to. The methodology of document analysis was adopted for the study, with the RNCS and the TIMSS 2003 Grade 8 Mathematics frameworks forming the principal documents. Porter’s moderately complex index of alignment was adopted for its simplicity. The computed index of 0.751 for the alignment between the RNCS assessment standards and the TIMSS assessment objectives was found to be statistically significantly low, at the alpha level of 0.05, according to Fulmer’s critical values for 20 cells and 90 or 120 standard points. The study suggests that inadequate attention has been paid to the alignment of the South African mathematics curriculum to the successive TIMSS assessment frameworks in terms of the cognitive level descriptions. The study recommends that participation in TIMSS should rigorously and critically inform ongoing curriculum reform efforts.
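
    Porter's alignment index used in the study is computed as one minus half the sum of absolute differences between two matrices of cell proportions. The sketch below uses a 4x5 cell layout and placeholder counts, both of which are assumptions for illustration rather than the article's actual data.

```python
import numpy as np

# Porter's alignment index: P = 1 - (sum of absolute cell differences) / 2,
# where the cells hold the proportion of points in each content-by-cognitive
# category for the two documents being compared.
def porter_alignment(x, y):
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    x = x / x.sum()   # convert counts to proportions
    y = y / y.sum()
    return 1.0 - np.abs(x - y).sum() / 2.0

rncs_points = np.random.default_rng(1).integers(0, 10, size=(4, 5))   # placeholder counts
timss_points = np.random.default_rng(2).integers(0, 10, size=(4, 5))  # placeholder counts
print(f"Alignment index: {porter_alignment(rncs_points, timss_points):.3f}")
```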

  12. Exploring standardized precipitation evapotranspiration index for drought assessment in Bangladesh.

    Science.gov (United States)

    Miah, Md Giashuddin; Abdullah, Hasan Muhammad; Jeong, Changyoon

    2017-10-09

    Drought is a critical issue, and it has a pressing, negative impact on agriculture, ecosystems, livelihoods, food security, and sustainability. The problem has been studied globally, but its regional or even local dimension is sometimes overlooked. Local-level drought assessment is necessary for developing adaptation and mitigation strategies for that particular region. With this in mind, an attempt was made to create a detailed assessment of drought characteristics at the local scale in Bangladesh. The standardized precipitation evapotranspiration index (SPEI) is a relatively new drought index that mainly considers rainfall and evapotranspiration data. Globally, SPEI has become a useful drought index, but its local-scale application is not common. SPEI base (0.5° grid data) for 110 years (1901-2011) was utilized to overcome the lack of long-term climate data in Bangladesh. Available weather data (1955-2011) from the Bangladesh Meteorology Department (BMD) were analyzed to calculate station-level SPEI using the SPEI calculator. The drivers of climate change-induced droughts were characterized by residual temperature and residual rainfall data from different BMD stations. Grid data (SPEI base) of 26 stations of BMD were used for drought mapping. The findings revealed that the frequency and intensity of drought are higher in the northwestern part of the country, which makes it vulnerable to both extreme and severe droughts. Based on the results, SPEI-based drought intensity and frequency analyses were carried out, emphasizing Rangpur (northwest region) as a hot spot, to gain insight into drought assessment in Bangladesh. The findings of this study revealed that SPEI could be a valuable tool to understand the evolution and evaluation of drought induced by climate change in the country. The study also justified the immediate need for drought risk reduction strategies that should lead to relevant policy formulations and agricultural innovations for developing
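
    An SPEI-style calculation can be sketched as follows: the climatic water balance (precipitation minus PET) is aggregated over a timescale, fitted with a log-logistic distribution, and transformed to the standard normal. The synthetic input series and the simplified per-month fitting are assumptions for illustration; the operational SPEI and the SPEI calculator use further refinements (e.g. unbiased probability-weighted-moment fitting).

```python
import numpy as np
from scipy import stats

def spei(precip, pet, scale=3):
    """Simplified SPEI: water balance -> rolling sum -> log-logistic fit ->
    standard-normal transform. A sketch, not the operational algorithm."""
    d = precip - pet
    d_k = np.convolve(d, np.ones(scale), mode="valid")   # rolling k-month sum
    out = np.full(len(d_k), np.nan)
    months = np.arange(len(d_k)) % 12
    for m in range(12):
        sample = d_k[months == m]
        shift = sample.min() - 1.0                        # keep values positive for Fisk
        c, loc, sc = stats.fisk.fit(sample - shift, floc=0)
        p = stats.fisk.cdf(sample - shift, c, loc=loc, scale=sc)
        out[months == m] = stats.norm.ppf(np.clip(p, 1e-6, 1 - 1e-6))
    return out

rng = np.random.default_rng(0)
precip = rng.gamma(2.0, 40.0, size=600)   # synthetic monthly rainfall (mm)
pet = rng.normal(90, 15, size=600)        # synthetic monthly PET (mm)
print(spei(precip, pet)[:12].round(2))
```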

  13. Consistent Conformal Extensions of the Standard Model arXiv

    CERN Document Server

    Loebbert, Florian; Plefka, Jan

    The question of whether classically conformal modifications of the standard model are consistent with experimental observations has recently been subject to renewed interest. The method of Gildener and Weinberg provides a natural framework for the study of the effective potential of the resulting multi-scalar standard model extensions. This approach relies on the assumption of the ordinary loop hierarchy $\lambda_s \sim g_g^2$ of scalar and gauge couplings. On the other hand, Andreassen, Frost and Schwartz recently argued that in the (single-scalar) standard model, gauge invariant results require the consistent scaling $\lambda_s \sim g_g^4$. In the present paper we contrast these two hierarchy assumptions and illustrate the differences in the phenomenological predictions of minimal conformal extensions of the standard model.

  14. Status of the Standard Model at the LHC start

    International Nuclear Information System (INIS)

    Altarelli, G.

    2008-01-01

    I present a concise review of where we stand in particle physics today. First, I will discuss QCD, then the electroweak sector and finally the motivations and the avenues for new physics beyond the Standard Model.

  15. Enhancements to ASHRAE Standard 90.1 Prototype Building Models

    Energy Technology Data Exchange (ETDEWEB)

    Goel, Supriya; Athalye, Rahul A.; Wang, Weimin; Zhang, Jian; Rosenberg, Michael I.; Xie, YuLong; Hart, Philip R.; Mendon, Vrushali V.

    2014-04-16

    This report focuses on enhancements to prototype building models used to determine the energy impact of various versions of ANSI/ASHRAE/IES Standard 90.1. Since the last publication of the prototype building models, PNNL has made numerous enhancements to the original prototype models compliant with the 2004, 2007, and 2010 editions of Standard 90.1. Those enhancements are described here and were made for several reasons: (1) to change or improve prototype design assumptions; (2) to improve the simulation accuracy; (3) to improve the simulation infrastructure; and (4) to add additional detail to the models needed to capture certain energy impacts from Standard 90.1 improvements. These enhancements impact simulated prototype energy use, and consequently impact the savings estimated from edition to edition of Standard 90.1.

  16. Tests of the standard electroweak model in beta decay

    Energy Technology Data Exchange (ETDEWEB)

    Severijns, N.; Beck, M. [Universite Catholique de Louvain (UCL), Louvain-la-Neuve (Belgium); Naviliat-Cuncic, O. [Caen Univ., CNRS-ENSI, 14 (France). Lab. de Physique Corpusculaire

    2006-05-15

    We review the current status of precision measurements in allowed nuclear beta decay, including neutron decay, with emphasis on their potential to look for new physics beyond the standard electroweak model. The experimental results are interpreted in the framework of phenomenological model-independent descriptions of nuclear beta decay as well as in some specific extensions of the standard model. The values of the standard couplings and the constraints on the exotic couplings of the general beta decay Hamiltonian are updated. For the ratio between the axial and the vector couplings we obtain C{sub A}/C{sub V} = -1.26992(69) under the standard model assumptions. Particular attention is devoted to the discussion of the sensitivity and complementarity of different precision experiments in direct beta decay. The prospects and the impact of recent developments of precision tools and of high intensity low energy beams are also addressed. (author)

  17. Modern elementary particle physics explaining and extending the standard model

    CERN Document Server

    Kane, Gordon

    2017-01-01

    This book is written for students and scientists wanting to learn about the Standard Model of particle physics. Only introductory-course knowledge of quantum theory is needed. The text provides a pedagogical description of the theory, and incorporates the recent Higgs boson and top quark discoveries. With its clear and engaging style, this new edition retains its essential simplicity. Long and detailed calculations are replaced by simple approximate ones. It includes introductions to accelerators, colliders, and detectors, and several main experimental tests of the Standard Model are explained. Descriptions of some well-motivated extensions of the Standard Model prepare the reader for new developments. It emphasizes the concepts of gauge theories and Higgs physics, electroweak unification and symmetry breaking, and how force strengths vary with energy, providing a solid foundation for those working in the field, and for those who simply want to learn about the Standard Model.

  18. Overview of the Higgs and Standard Model physics at ATLAS

    CERN Document Server

    Vazquez Schroeder, Tamara; The ATLAS collaboration

    2018-01-01

    This talk presents selected aspects of recent physics results from the ATLAS collaboration in the Standard Model and Higgs sectors, with a focus on the recent evidence for the associated production of the Higgs boson and a top quark pair.

  19. The Beyond the Standard Model Working Group: Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Rizzo, Thomas G.

    2002-08-08

    Various theoretical aspects of physics beyond the Standard Model at hadron colliders are discussed. Our focus will be on those issues that most immediately impact the projects pursued as part of the BSM group at this meeting.

  20. Tests of the standard electroweak model in beta decay

    International Nuclear Information System (INIS)

    Severijns, N.; Beck, M.; Naviliat-Cuncic, O.

    2006-05-01

    We review the current status of precision measurements in allowed nuclear beta decay, including neutron decay, with emphasis on their potential to look for new physics beyond the standard electroweak model. The experimental results are interpreted in the framework of phenomenological model-independent descriptions of nuclear beta decay as well as in some specific extensions of the standard model. The values of the standard couplings and the constraints on the exotic couplings of the general beta decay Hamiltonian are updated. For the ratio between the axial and the vector couplings we obtain C_A/C_V = -1.26992(69) under the standard model assumptions. Particular attention is devoted to the discussion of the sensitivity and complementarity of different precision experiments in direct beta decay. The prospects and the impact of recent developments of precision tools and of high intensity low energy beams are also addressed. (author)

  1. Army Model and Simulation Standards Report FY98

    National Research Council Canada - National Science Library

    1997-01-01

    ...) standards efforts as work progresses towards the objective Army M&S environment. This report specifically documents projects approved for funding through the Army Model and Improvement Program (AMIP...

  2. Standard model status (in search of ''new physics'')

    International Nuclear Information System (INIS)

    Marciano, W.J.

    1993-03-01

    A perspective on successes and shortcomings of the standard model is given. The complementarity between direct high energy probes of new physics and lower energy searches via precision measurements and rare reactions is described. Several illustrative examples are discussed

  3. CP violation and electroweak baryogenesis in the Standard Model

    Directory of Open Access Journals (Sweden)

    Brauner Tomáš

    2014-04-01

    Full Text Available One of the major unresolved problems in current physics is understanding the origin of the observed asymmetry between matter and antimatter in the Universe. It has become common lore to claim that the Standard Model of particle physics cannot produce sufficient asymmetry to explain the observation. Our results suggest that this conclusion can be alleviated in the so-called cold electroweak baryogenesis scenario. On the Standard Model side, we continue the program initiated by Smit eight years ago; one derives the effective CP-violating action for the Standard Model bosons and uses the resulting effective theory in numerical simulations. We address a disagreement between two previous computations performed effectively at zero temperature, and demonstrate that it is very important to include temperature effects properly. Our conclusion is that the cold electroweak baryogenesis scenario within the Standard Model is tightly constrained, yet producing enough baryon asymmetry using just known physics still seems possible.

  4. Almost-commutative geometries beyond the standard model

    International Nuclear Information System (INIS)

    Stephan, Christoph A

    2006-01-01

    In Iochum et al (2004 J. Math. Phys. 45 5003), Jureit and Stephan (2005 J. Math. Phys. 46 043512), Schuecker T (2005 Preprint hep-th/0501181) and Jureit et al (2005 J. Math. Phys. 46 072303), a conjecture is presented that almost-commutative geometries, with respect to sensible physical constraints, allow only the standard model of particle physics and electro-strong models as Yang-Mills-Higgs theories. In this paper, a counter-example will be given. The corresponding almost-commutative geometry leads to a Yang-Mills-Higgs model which consists of the standard model of particle physics and two new fermions of opposite electro-magnetic charge. This is the second Yang-Mills-Higgs model within noncommutative geometry, after the standard model, which could be compatible with experiments. Combined to a hydrogen-like composite particle, these new particles provide a novel dark matter candidate

  5. NASA Standard for Models and Simulations: Philosophy and Requirements Overview

    Science.gov (United States)

    Blattnig, Steve R.; Luckring, James M.; Morrison, Joseph H.; Sylvester, Andre J.; Tripathi, Ram K.; Zang, Thomas A.

    2013-01-01

    Following the Columbia Accident Investigation Board report, the NASA Administrator chartered an executive team (known as the Diaz Team) to identify those CAIB report elements with NASA-wide applicability and to develop corrective measures to address each element. One such measure was the development of a standard for the development, documentation, and operation of models and simulations. This report describes the philosophy and requirements overview of the resulting NASA Standard for Models and Simulations.

  6. Implementation of IEC standard models for power system stability studies

    Energy Technology Data Exchange (ETDEWEB)

    Margaris, Ioannis D.; Hansen, Anca D.; Soerensen, Poul [Technical Univ. of Denmark, Roskilde (Denmark). Dept. of Wind Energy; Bech, John; Andresen, Bjoern [Siemens Wind Power A/S, Brande (Denmark)

    2012-07-01

    This paper presents the implementation of the generic wind turbine generator (WTG) electrical simulation models proposed in the IEC 61400-27 standard, which is currently in preparation. A general overview of the different WTG types is given, while the main focus is on the Type 4B WTG standard model, namely a model for a variable speed wind turbine with a full-scale power converter, including a 2-mass mechanical model. The generic models for fixed and variable speed WTGs are suitable for fundamental frequency positive sequence response simulations during short events in the power system such as voltage dips. The general configuration of the models is presented and discussed; model implementation in the simulation software platform DIgSILENT PowerFactory is presented in order to illustrate the range of applicability of the generic models under discussion. A typical voltage dip is simulated, and results for the basic electrical variables of the WTG are presented and discussed. (orig.)

  7. Assessment and Next Generation Standards: An Interview with Olivia Gude

    Science.gov (United States)

    Sweeny, Robert

    2014-01-01

    This article provides a transcript of an interview with Olivia Gude, member of the National Coalition for Core Arts Standards Writing Team. In the interview, Gude provides an overview of the process for writing the new visual arts standards.

  8. A standard library for modeling satellite orbits on a microcomputer

    Science.gov (United States)

    Beutel, Kenneth L.

    1988-03-01

    Introductory students of astrodynamics and the space environment are required to have a fundamental understanding of the kinematic behavior of satellite orbits. This thesis develops a standard library that contains the basic formulas for modeling earth orbiting satellites. This library is used as a basis for implementing a satellite motion simulator that can be used to demonstrate orbital phenomena in the classroom. Surveyed are the equations of orbital elements, coordinate systems and analytic formulas, which are made into a standard method for modeling earth orbiting satellites. The standard library is written in the C programming language and is designed to be highly portable between a variety of computer environments. The simulation draws heavily on the standards established by the library to produce a graphics-based orbit simulation program written for the Apple Macintosh computer. The simulation demonstrates the utility of the standard library functions but, because of its extensive use of the Macintosh user interface, is not portable to other operating systems.
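
    One routine such a standard orbit library typically contains is the solution of Kepler's equation for the eccentric anomaly. The generic textbook sketch below (Newton iteration, then conversion to true anomaly) is illustrative only and is not the library described in the thesis.

```python
import math

# Solve Kepler's equation M = E - e*sin(E) for the eccentric anomaly E
# by Newton iteration, then convert to the true anomaly.
def eccentric_anomaly(mean_anomaly, e, tol=1e-10):
    E = mean_anomaly if e < 0.8 else math.pi   # common starting guess
    for _ in range(50):
        delta = (E - e * math.sin(E) - mean_anomaly) / (1 - e * math.cos(E))
        E -= delta
        if abs(delta) < tol:
            break
    return E

def true_anomaly(mean_anomaly, e):
    E = eccentric_anomaly(mean_anomaly, e)
    return 2.0 * math.atan2(math.sqrt(1 + e) * math.sin(E / 2),
                            math.sqrt(1 - e) * math.cos(E / 2))

# Example: mean anomaly of 30 degrees, eccentricity 0.1
print(math.degrees(true_anomaly(math.radians(30.0), 0.1)))
```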

  9. Conformal Extensions of the Standard Model with Veltman Conditions

    DEFF Research Database (Denmark)

    Antipin, Oleg; Mojaza, Matin; Sannino, Francesco

    2014-01-01

    Using the renormalisation group framework we classify different extensions of the standard model according to their degree of naturality. A new relevant class of perturbative models involving elementary scalars is the one in which the theory simultaneously satisfies the Veltman conditions and is conformal at the classical level. We term these extensions perturbative natural conformal (PNC) theories. We show that PNC models are very constrained and thus highly predictive. Among the several PNC examples that we exhibit, we discover a remarkably simple PNC extension of the standard model in which...

  10. Searches for non-Standard Model Higgs bosons

    CERN Document Server

    Dumitriu, Ana Elena; The ATLAS collaboration

    2018-01-01

    This presentation focuses on searches for non-Standard Model Higgs bosons using 36.1 fb-1 of data collected by the ATLAS experiment. Several theoretical models with an extended Higgs sector are considered: two-Higgs-doublet models (2HDM); supersymmetry (SUSY), which brings along super-partners of the SM particles, including the Minimal Supersymmetric Standard Model (MSSM), whose Higgs sector is equivalent to that of a constrained type-II 2HDM, and the next-to-MSSM (NMSSM); as well as general searches and invisibly decaying Higgs bosons.

  11. Training in Vocational Assessment: Preparing Rehabilitation Counselors and Meeting the Requirements of the CORE Standards

    Science.gov (United States)

    Tansey, Timothy N.

    2008-01-01

    Assessment represents a foundational component of rehabilitation counseling services. The revised Council on Rehabilitation Education (CORE) standards implemented in 2004 resulted in the redesign of the knowledge and outcomes under the Assessment standard. The author reviews the current CORE standard for training in assessment within the context…

  12. Assessing the standard Molybdenum projector augmented wave VASP potentials

    Energy Technology Data Exchange (ETDEWEB)

    Mattsson, Ann E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Multi-Scale Science

    2014-07-01

    Density Functional Theory (DFT) based Equation of State (EOS) construction is a prominent part of Sandia’s capabilities to support engineering sciences. This capability is based on augmenting experimental data with information gained from computational investigations, especially in those parts of the phase space where experimental data is hard, dangerous, or expensive to obtain. A key part of the success of the Sandia approach is the fundamental science work supporting the computational capability. Not only does this work enhance the capability to perform highly accurate calculations but it also provides crucial insight into the limitations of the computational tools, providing high confidence in the results even where results cannot be, or have not yet been, validated by experimental data. This report concerns the key ingredient of projector augmented-wave (PAW) potentials for use in pseudo-potential computational codes. Using the tools discussed in SAND2012-7389 we assess the standard Vienna Ab-initio Simulation Package (VASP) PAWs for Molybdenum.

  13. Assessing the costs and benefits of US renewable portfolio standards

    Science.gov (United States)

    Wiser, Ryan; Mai, Trieu; Millstein, Dev; Barbose, Galen; Bird, Lori; Heeter, Jenny; Keyser, David; Krishnan, Venkat; Macknick, Jordan

    2017-09-01

    Renewable portfolio standards (RPS) exist in 29 US states and the District of Columbia. This article summarizes the first national-level, integrated assessment of the future costs and benefits of existing RPS policies; the same metrics are evaluated under a second scenario in which widespread expansion of these policies is assumed to occur. Depending on assumptions about renewable energy technology advancement and natural gas prices, existing RPS policies increase electric system costs by as much as $31 billion, on a present-value basis over 2015-2050. The expanded renewable deployment scenario yields incremental costs that range from $23 billion to $194 billion, depending on the assumptions employed. The monetized value of improved air quality and reduced climate damages exceeds these costs. Using central assumptions, existing RPS policies yield $97 billion in air-pollution health benefits and $161 billion in climate damage reductions. Under the expanded RPS case, health benefits total $558 billion and climate benefits equal $599 billion. These scenarios also yield benefits in the form of reduced water use. RPS programs are not likely to represent the most cost effective path towards achieving air quality and climate benefits. Nonetheless, the findings suggest that US RPS programs are, on a national basis, cost effective when considering externalities.

  14. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)
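
    The mixture-model idea emphasized in the report can be illustrated with a minimal sketch: the predictive failure probability is a weighted average over candidate reliability models. The two candidate distributions, their parameters and the weights below are invented for illustration and are not the report's case studies.

```python
# Minimal sketch of the mixture-model idea for model uncertainty: the predictive
# failure probability is a weighted average over candidate models. All numbers
# below are illustrative only.
import math

def exp_cdf(t, lam):
    """Failure probability by time t under a constant failure intensity lam."""
    return 1.0 - math.exp(-lam * t)

def weibull_cdf(t, scale, shape):
    """Failure probability by time t under a Weibull (ageing) model."""
    return 1.0 - math.exp(-(t / scale) ** shape)

# Candidate models and subjective weights expressing model uncertainty
weights = {"exponential": 0.6, "weibull": 0.4}
t = 1000.0  # hours
p_exp = exp_cdf(t, lam=2.0e-4)
p_wei = weibull_cdf(t, scale=6000.0, shape=1.5)

# Mixture predictive probability propagates the model uncertainty
p_mix = weights["exponential"] * p_exp + weights["weibull"] * p_wei
print(f"P(failure by {t:.0f} h): exp={p_exp:.3f}, weibull={p_wei:.3f}, mixture={p_mix:.3f}")
```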

  15. Model uncertainty in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkinen, U; Huovinen, T [VTT Automation, Espoo (Finland). Industrial Automation

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.).

  16. Towards a standard model for research in agent-based modeling and simulation

    Directory of Open Access Journals (Sweden)

    Nuno Fachada

    2015-11-01

    Agent-based modeling (ABM) is a bottom-up modeling approach, where each entity of the system being modeled is uniquely represented as an independent decision-making agent. ABMs are very sensitive to implementation details. Thus, it is very easy to inadvertently introduce changes which modify model dynamics. Such problems usually arise due to the lack of transparency in model descriptions, which constrains how models are assessed, implemented and replicated. In this paper, we present PPHPC, a model which aims to serve as a standard in agent-based modeling research, namely, but not limited to, conceptual model specification, statistical analysis of simulation output, model comparison and parallelization studies. This paper focuses on the first two aspects (conceptual model specification and statistical analysis of simulation output), also providing a canonical implementation of PPHPC. The paper serves as a complete reference to the presented model, and can be used as a tutorial for simulation practitioners who wish to improve the way they communicate their ABMs.
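
    The statistical analysis of simulation output that PPHPC prescribes amounts to comparing summary (focal) measures across replicated runs of different implementations. The following minimal sketch shows such a comparison with a nonparametric test; the stand-in model and the choice of focal measure are illustrative, not those defined in the paper.

```python
# Minimal sketch of comparing simulation output across two implementations:
# run several replications of each, summarise every run by a focal measure,
# and test whether the two samples could come from the same distribution.
# The stand-in model and data are synthetic.
import numpy as np
from scipy import stats

def run_model(seed, bias=0.0):
    """Stand-in for one simulation replication; returns a time series of a model output."""
    r = np.random.default_rng(seed)
    return 100.0 + bias + r.normal(0.0, 5.0, size=500).cumsum() * 0.01

n_reps = 30
impl_a = [run_model(s).mean() for s in range(n_reps)]                     # reference implementation
impl_b = [run_model(s + 1000, bias=0.2).mean() for s in range(n_reps)]    # replicated implementation

u_stat, p_value = stats.mannwhitneyu(impl_a, impl_b, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.3f}")
if p_value < 0.01:
    print("Focal measure differs; the replication may not be aligned with the original model.")
```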

  17. Simple standard problem for the Preisach moving model

    International Nuclear Information System (INIS)

    Morentin, F.J.; Alejos, O.; Francisco, C. de; Munoz, J.M.; Hernandez-Gomez, P.; Torres, C.

    2004-01-01

    The present work proposes a simple magnetic system as a candidate for a Standard Problem for Preisach-based models. The system consists of a regular square array of magnetic particles, all oriented along the direction of application of an external magnetic field. The behavior of such a system was numerically simulated for different values of the interaction between particles and of the standard deviation of the critical fields of the particles. The characteristic parameters of the Preisach moving model were worked out during the simulations, i.e., the mean value and the standard deviation of the interaction field. For this system, the results reveal that the mean interaction field depends linearly on the system magnetization, as the Preisach moving model predicts. Nevertheless, the standard deviation cannot be considered as independent of the magnetization. In fact, the standard deviation shows a maximum at demagnetization and two minima at magnetization saturation. Furthermore, not all the demagnetization states are equivalent. The plot of standard deviation vs. magnetization is a multi-valued curve when the system undergoes an AC demagnetization procedure. In this way, the standard deviation increases as the system goes from coercivity to the AC demagnetized state.
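
    The moving-model assumption that the mean interaction field is proportional to the magnetization can be illustrated with a minimal mean-field hysteron sketch such as the one below. It is not the authors' simulation code, and all parameter values (critical-field spread, interaction strength) are placeholders.

```python
# Minimal sketch (not the paper's code) of a mean-field hysteron array in the
# spirit of the Preisach moving model: each particle switches at +/- h_c and
# feels an interaction field whose mean is proportional to the magnetization.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
h_c = rng.normal(1.0, 0.2, n)          # particle critical fields (spread sigma_c)
h_local = rng.normal(0.0, 0.1, n)      # frozen local interaction contribution
alpha = 0.05                           # moving-model coefficient: <H_int> = alpha * M
state = -np.ones(n)                    # start from negative saturation

def relax(H, state, n_iter=50):
    """Self-consistently update hysteron states at applied field H."""
    for _ in range(n_iter):
        M = state.mean()
        h_eff = H + alpha * M + h_local          # applied + mean interaction + local term
        up = h_eff > h_c                          # switch up above +h_c
        down = h_eff < -h_c                       # switch down below -h_c
        new_state = np.where(up, 1.0, np.where(down, -1.0, state))
        if np.array_equal(new_state, state):
            break
        state = new_state
    return state

for H in np.concatenate([np.linspace(-2, 2, 41), np.linspace(2, -2, 41)]):
    state = relax(H, state)
    M = state.mean()
    print(f"H = {H:+.2f}  M = {M:+.3f}  <H_int> = {alpha * M + h_local.mean():+.4f}")
```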

  18. Conformal standard model with an extended scalar sector

    Energy Technology Data Exchange (ETDEWEB)

    Latosiński, Adam [Max-Planck-Institut für Gravitationsphysik (Albert-Einstein-Institut),Mühlenberg 1, D-14476 Potsdam (Germany); Lewandowski, Adrian; Meissner, Krzysztof A. [Faculty of Physics, University of Warsaw,Pasteura 5, 02-093 Warsaw (Poland); Nicolai, Hermann [Max-Planck-Institut für Gravitationsphysik (Albert-Einstein-Institut),Mühlenberg 1, D-14476 Potsdam (Germany)

    2015-10-26

    We present an extended version of the Conformal Standard Model (characterized by the absence of any new intermediate scales between the electroweak scale and the Planck scale) with an enlarged scalar sector coupling to right-chiral neutrinos. The scalar potential and the Yukawa couplings involving only right-chiral neutrinos are invariant under a new global symmetry SU(3)_N that complements the standard U(1)_{B−L} symmetry, and is broken explicitly only by the Yukawa interaction, of order O(10⁻⁶), coupling right-chiral neutrinos and the electroweak lepton doublets. We point out four main advantages of this enlargement, namely: (1) the economy of the (non-supersymmetric) Standard Model, and thus its observational success, is preserved; (2) thanks to the enlarged scalar sector the RG improved one-loop effective potential is everywhere positive with a stable global minimum, thereby avoiding the notorious instability of the Standard Model vacuum; (3) the pseudo-Goldstone bosons resulting from spontaneous breaking of the SU(3)_N symmetry are natural Dark Matter candidates with calculable small masses and couplings; and (4) the Majorana Yukawa coupling matrix acquires a form naturally adapted to leptogenesis. The model is made perturbatively consistent up to the Planck scale by imposing the vanishing of quadratic divergences at the Planck scale (‘softly broken conformal symmetry’). Observable consequences of the model occur mainly via the mixing of the new scalars and the standard model Higgs boson.

  19. On the Estimation of Standard Errors in Cognitive Diagnosis Models

    Science.gov (United States)

    Philipp, Michel; Strobl, Carolin; de la Torre, Jimmy; Zeileis, Achim

    2018-01-01

    Cognitive diagnosis models (CDMs) are an increasingly popular method to assess mastery or nonmastery of a set of fine-grained abilities in educational or psychological assessments. Several inference techniques are available to quantify the uncertainty of model parameter estimates, to compare different versions of CDMs, or to check model…

  20. Standard model for safety analysis report of fuel fabrication plants

    International Nuclear Information System (INIS)

    1980-09-01

    A standard model for a safety analysis report of fuel fabrication plants is established. This model shows the presentation format, the origin, and the details of the minimal information required by CNEN (Comissao Nacional de Energia Nuclear) aiming to evaluate the requests of construction permits and operation licenses made according to the legislation in force. (E.G.) [pt

  1. Standard model for safety analysis report of fuel reprocessing plants

    International Nuclear Information System (INIS)

    1979-12-01

    A standard model for a safety analysis report of fuel reprocessing plants is established. This model shows the presentation format, the origin, and the details of the minimal information required by CNEN (Comissao Nacional de Energia Nuclear) aiming to evaluate the requests of construction permits and operation licenses made according to the legislation in force. (E.G.) [pt

  2. Making Validated Educational Models Central in Preschool Standards.

    Science.gov (United States)

    Schweinhart, Lawrence J.

    This paper presents some ideas to preschool educators and policy makers about how to make validated educational models central in standards for preschool education and care programs that are available to all 3- and 4-year-olds. Defining an educational model as a coherent body of program practices, curriculum content, program and child, and teacher…

  3. Introduction to gauge theories and the Standard Model

    CERN Document Server

    de Wit, Bernard

    1995-01-01

    The conceptual basis of gauge theories is introduced to enable the construction of generic models. Spontaneous symmetry breaking is discussed and its relevance for the renormalization of theories with massive vector fields is explained. Subsequently the standard model is described. When time permits we will address more practical questions that arise in the evaluation of quantum corrections.

  4. Precision tests of the standard model at LEP

    International Nuclear Information System (INIS)

    Mele, Barbara; Universita La Sapienza, Rome

    1994-01-01

    Recent LEP results on electroweak precision measurements are reviewed. Line-shape and asymmetries analysis on the Z⁰ peak is described. Then, the consistency of the Standard Model predictions with experimental data and consequent limits on the top mass are discussed. Finally, the possibility of extracting information and constraints on new theoretical models from present data is examined. (author). 20 refs., 5 tabs

  5. Exponential models applied to automated processing of radioimmunoassay standard curves

    International Nuclear Information System (INIS)

    Morin, J.F.; Savina, A.; Caroff, J.; Miossec, J.; Legendre, J.M.; Jacolot, G.; Morin, P.P.

    1979-01-01

    An improved computer procedure is described for fitting radioimmunoassay standard curves by means of an exponential model on a desk-top calculator. This method has been applied to a variety of radioassays and the results are in accordance with those obtained by more sophisticated models [fr
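
    A minimal sketch of this kind of fit, using a generic exponential standard-curve form and inverting it to read off unknown concentrations, is shown below. The functional form and the calibrator data are illustrative assumptions, not the authors' model or data.

```python
# Minimal sketch of fitting a radioimmunoassay standard curve with an
# exponential model and inverting it for unknown samples. The functional form
# B(c) = B0 * exp(-k * c) + Bn and the data below are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def standard_curve(conc, b0, k, bn):
    """Bound counts as an exponentially decreasing function of concentration."""
    return b0 * np.exp(-k * conc) + bn

# Calibrator concentrations (e.g. ng/mL) and measured counts (hypothetical)
conc = np.array([0.0, 1.0, 2.5, 5.0, 10.0, 25.0, 50.0])
counts = np.array([9800, 8300, 6600, 4800, 3000, 1500, 1100])

params, _ = curve_fit(standard_curve, conc, counts, p0=(9000.0, 0.1, 1000.0))
b0, k, bn = params

def invert(measured_counts):
    """Read a concentration off the fitted curve for a sample's counts."""
    return -np.log((measured_counts - bn) / b0) / k

print("Fitted B0, k, Bn:", np.round(params, 3))
print("Sample with 4000 counts ->", round(float(invert(4000.0)), 2), "ng/mL")
```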

  6. Dark matter, constrained minimal supersymmetric standard model, and lattice QCD.

    Science.gov (United States)

    Giedt, Joel; Thomas, Anthony W; Young, Ross D

    2009-11-13

    Recent lattice measurements have given accurate estimates of the quark condensates in the proton. We use these results to significantly improve the dark matter predictions in benchmark models within the constrained minimal supersymmetric standard model. The predicted spin-independent cross sections are at least an order of magnitude smaller than previously suggested and our results have significant consequences for dark matter searches.

  7. Numerical Models of Sewage Dispersion and Statistica Bathing Water Standards

    DEFF Research Database (Denmark)

    Petersen, Ole; Larsen, Torben

    1991-01-01

    As bathing water standards are usually founded on statistical methods, the numerical models used in outfall design should reflect this. A statistical approach, where stochastic variations in source strength and bacterial disappearance are incorporated into a numerical dilution model, is presented. ...
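
    Because such standards are typically expressed as percentile limits on bacterial concentration, the statistical approach can be sketched as a Monte Carlo propagation of stochastic source strength and die-off through a simple dilution model, as below. The dilution factor, T90 range and limit value are placeholders, not design values from the paper.

```python
# Minimal sketch of the statistical approach: propagate stochastic source
# strength and bacterial die-off through a simple dilution model and check a
# percentile-based bathing water criterion. All numbers are placeholders.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

source = rng.lognormal(mean=np.log(1e7), sigma=0.5, size=n)   # outfall concentration (cfu/100 mL)
dilution = 500.0                                              # physical dilution to the beach
t90 = rng.uniform(2.0, 12.0, size=n)                          # hours for 90% die-off (T90)
travel_time = 3.0                                             # hours from outfall to beach

decay = 10 ** (-travel_time / t90)                            # surviving fraction after transport
beach_conc = source / dilution * decay

p95 = np.percentile(beach_conc, 95)
limit = 500.0                                                 # e.g. cfu/100 mL, 95th-percentile standard
print(f"95th percentile at the beach: {p95:.0f} cfu/100 mL "
      f"({'complies' if p95 <= limit else 'fails'} against a limit of {limit:.0f})")
```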

  8. When standards become business models: Reinterpreting "failure" in the standardization paradigm

    NARCIS (Netherlands)

    Hawkins, R.; Ballon, P.

    2007-01-01

    Purpose - This paper aims to explore the question: 'What is the relationship between standards and business models?' and illustrate the conceptual linkage with reference to developments in the mobile communications industry. Design/methodology/approach - A succinct overview of literature on

  9. Standard Model Vacuum Stability and Weyl Consistency Conditions

    DEFF Research Database (Denmark)

    Antipin, Oleg; Gillioz, Marc; Krog, Jens

    2013-01-01

    At high energy the standard model possesses conformal symmetry at the classical level. This is reflected at the quantum level by relations between the different beta functions of the model. These relations are known as the Weyl consistency conditions. We show that it is possible to satisfy them order by order in perturbation theory, provided that a suitable coupling constant counting scheme is used. As a direct phenomenological application, we study the stability of the standard model vacuum at high energies and compare with previous computations violating the Weyl consistency conditions.

  10. ATLAS Z Excess in Minimal Supersymmetric Standard Model

    International Nuclear Information System (INIS)

    Lu, Xiaochuan; Terada, Takahiro

    2015-06-01

    Recently the ATLAS collaboration reported a 3 sigma excess in the search for the events containing a dilepton pair from a Z boson and large missing transverse energy. Although the excess is not sufficiently significant yet, it is quite tempting to explain this excess by a well-motivated model beyond the standard model. In this paper we study a possibility of the minimal supersymmetric standard model (MSSM) for this excess. Especially, we focus on the MSSM spectrum where the sfermions are heavier than the gauginos and Higgsinos. We show that the excess can be explained by the reasonable MSSM mass spectrum.

  11. Standard model Higgs boson-inflaton and dark matter

    International Nuclear Information System (INIS)

    Clark, T. E.; Liu Boyang; Love, S. T.; Veldhuis, T. ter

    2009-01-01

    The standard model Higgs boson can serve as the inflaton field of slow roll inflationary models provided it exhibits a large nonminimal coupling with the gravitational scalar curvature. The Higgs boson self interactions and its couplings with a standard model singlet scalar serving as the source of dark matter are then subject to cosmological constraints. These bounds, which can be more stringent than those arising from vacuum stability and perturbative triviality alone, still allow values for the Higgs boson mass which should be accessible at the LHC. As the Higgs boson coupling to the dark matter strengthens, lower values of the Higgs boson mass consistent with the cosmological data are allowed.

  12. Search for Higgs bosons beyond the Standard Model

    Directory of Open Access Journals (Sweden)

    Mankel Rainer

    2015-01-01

    While the existence of a Higgs boson with a mass near 125 GeV has been clearly established, the detailed structure of the entire Higgs sector is yet unclear. Beyond the standard model interpretation, various scenarios for extended Higgs sectors are being considered. Such options include the minimal and next-to-minimal supersymmetric extensions (MSSM and NMSSM) of the standard model, more generic Two-Higgs Doublet models (2HDM), as well as truly exotic Higgs bosons decaying e.g. into totally invisible final states. This article presents recent results from the CMS experiment.

  13. Safety standards for near surface disposal and the safety case and supporting safety assessment for demonstrating compliance with the standards

    International Nuclear Information System (INIS)

    Metcalf, P.

    2003-01-01

    The report presents the safety standards for near surface disposal (ICRP guidance and IAEA standards) and the safety case and supporting safety assessment for demonstrating compliance with the standards. Special attention is paid to the recommendations for disposal of long-lived solid radioactive waste. The requirements are based on the principle that future individuals should receive the same level of protection as the current generation. Two types of exposure are considered, human intrusion and natural processes, and protection measures are discussed. Safety requirements for near surface disposal are discussed, including requirements for protection of human health and the environment, requirements for safety assessments, waste acceptance requirements, etc.

  14. Precision calculations in supersymmetric extensions of the Standard Model

    International Nuclear Information System (INIS)

    Slavich, P.

    2013-01-01

    This dissertation is organized as follows: in the next chapter I will summarize the structure of the supersymmetric extensions of the standard model (SM), namely the MSSM (Minimal Supersymmetric Standard Model) and the NMSSM (Next-to-Minimal Supersymmetric Standard Model); I will provide a brief overview of different patterns of SUSY (supersymmetry) breaking and discuss some issues on the renormalization of the input parameters that are common to all calculations of higher-order corrections in SUSY models. In chapter 3 I will review and describe computations on the production of MSSM Higgs bosons in gluon fusion. In chapter 4 I will review results on the radiative corrections to the Higgs boson masses in the NMSSM. In chapter 5 I will review the calculation of BR(B → X_s γ) in the MSSM with Minimal Flavor Violation (MFV). Finally, in chapter 6 I will briefly summarize the outlook of my future research. (author)

  15. The effective Standard Model after LHC Run I

    International Nuclear Information System (INIS)

    Ellis, John; Sanz, Verónica; You, Tevong

    2015-01-01

    We treat the Standard Model as the low-energy limit of an effective field theory that incorporates higher-dimensional operators to capture the effects of decoupled new physics. We consider the constraints imposed on the coefficients of dimension-6 operators by electroweak precision tests (EWPTs), applying a framework for the effects of dimension-6 operators on electroweak precision tests that is more general than the standard S,T formalism, and use measurements of Higgs couplings and the kinematics of associated Higgs production at the Tevatron and LHC, as well as triple-gauge couplings at the LHC. We highlight the complementarity between EWPTs, Tevatron and LHC measurements in obtaining model-independent limits on the effective Standard Model after LHC Run 1. We illustrate the combined constraints with the example of the two-Higgs doublet model.

  16. Minimal extension of the standard model scalar sector

    International Nuclear Information System (INIS)

    O'Connell, Donal; Wise, Mark B.; Ramsey-Musolf, Michael J.

    2007-01-01

    The minimal extension of the scalar sector of the standard model contains an additional real scalar field with no gauge quantum numbers. Such a field does not couple to the quarks and leptons directly but rather through its mixing with the standard model Higgs field. We examine the phenomenology of this model focusing on the region of parameter space where the new scalar particle is significantly lighter than the usual Higgs scalar and has small mixing with it. In this region of parameter space most of the properties of the additional scalar particle are independent of the details of the scalar potential. Furthermore the properties of the scalar that is mostly the standard model Higgs can be drastically modified since its dominant branching ratio may be to a pair of the new lighter scalars

  17. The Effective Standard Model after LHC Run I

    CERN Document Server

    Ellis, John; You, Tevong

    2015-01-01

    We treat the Standard Model as the low-energy limit of an effective field theory that incorporates higher-dimensional operators to capture the effects of decoupled new physics. We consider the constraints imposed on the coefficients of dimension-6 operators by electroweak precision tests (EWPTs), applying a framework for the effects of dimension-6 operators on electroweak precision tests that is more general than the standard $S,T$ formalism, and use measurements of Higgs couplings and the kinematics of associated Higgs production at the Tevatron and LHC, as well as triple-gauge couplings at the LHC. We highlight the complementarity between EWPTs, Tevatron and LHC measurements in obtaining model-independent limits on the effective Standard Model after LHC Run~1. We illustrate the combined constraints with the example of the two-Higgs doublet model.

  18. Assessment of liquefaction-induced hazards using Bayesian networks based on standard penetration test data

    Science.gov (United States)

    Tang, Xiao-Wei; Bai, Xu; Hu, Ji-Lei; Qiu, Jiang-Nan

    2018-05-01

    Liquefaction-induced hazards such as sand boils, ground cracks, settlement, and lateral spreading are responsible for considerable damage to engineering structures during major earthquakes. Presently, there is no effective empirical approach that can assess different liquefaction-induced hazards in one model. This is because of the uncertainties and complexity of the factors related to seismic liquefaction and liquefaction-induced hazards. In this study, Bayesian networks (BNs) are used to integrate multiple factors related to seismic liquefaction, sand boils, ground cracks, settlement, and lateral spreading into a model based on standard penetration test data. The constructed BN model can assess four different liquefaction-induced hazards together. In a case study, the BN method outperforms an artificial neural network and Ishihara and Yoshimine's simplified method in terms of accuracy, Brier score, recall, precision, and area under the curve (AUC) of the receiver operating characteristic (ROC). This demonstrates that the BN method is a good alternative tool for the risk assessment of liquefaction-induced hazards. Furthermore, the performance of the BN model in estimating liquefaction-induced hazards in Japan's 2011 Tōhoku earthquake confirms its correctness and reliability compared with the liquefaction potential index approach. The proposed BN model can also predict whether the soil becomes liquefied after an earthquake and can deduce the chain reaction process of liquefaction-induced hazards and perform backward reasoning. The assessment results from the proposed model provide informative guidelines for decision-makers to detect the damage state of a field following liquefaction.
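
    The inference step of such a Bayesian network can be illustrated with a minimal hand-rolled sketch: evidence on shaking intensity and SPT blow count is propagated to liquefaction and onward to one induced hazard. The network structure and all conditional probabilities below are invented for illustration and are not the tables estimated in the paper.

```python
# Minimal sketch of Bayesian-network-style inference for liquefaction hazards.
# The structure (Shaking, SPT -> Liquefaction -> SandBoils) and every
# probability below are made up for illustration only.
from itertools import product

p_shaking = {"strong": 0.3, "weak": 0.7}                 # P(Shaking)
p_spt = {"low": 0.4, "high": 0.6}                        # P(SPT blow count class)
p_liq = {                                                # P(Liquefaction=yes | Shaking, SPT)
    ("strong", "low"): 0.80, ("strong", "high"): 0.25,
    ("weak", "low"): 0.20, ("weak", "high"): 0.02,
}
p_boils_given_liq = {"yes": 0.6, "no": 0.02}             # P(SandBoils=yes | Liquefaction)

def posterior_sand_boils(shaking=None, spt=None):
    """P(SandBoils=yes | evidence) by brute-force enumeration over the joint."""
    num = den = 0.0
    for s, n, liq in product(p_shaking, p_spt, ("yes", "no")):
        if shaking is not None and s != shaking:
            continue
        if spt is not None and n != spt:
            continue
        p_l = p_liq[(s, n)] if liq == "yes" else 1.0 - p_liq[(s, n)]
        joint = p_shaking[s] * p_spt[n] * p_l
        den += joint
        num += joint * p_boils_given_liq[liq]
    return num / den

print("P(sand boils | strong shaking, low SPT) =", round(posterior_sand_boils("strong", "low"), 3))
print("P(sand boils | weak shaking, high SPT)  =", round(posterior_sand_boils("weak", "high"), 3))
```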

  19. A Student Assessment Tool for Standardized Patient Simulations (SAT-SPS): Psychometric analysis.

    Science.gov (United States)

    Castro-Yuste, Cristina; García-Cabanillas, María José; Rodríguez-Cornejo, María Jesús; Carnicer-Fuentes, Concepción; Paloma-Castro, Olga; Moreno-Corral, Luis Javier

    2018-05-01

    The evaluation of the level of clinical competence acquired by the student is a complex process that must meet various requirements to ensure its quality. The psychometric analysis of the data collected by the assessment tools used is a fundamental aspect to guarantee the student's competence level. To conduct a psychometric analysis of an instrument which assesses clinical competence in nursing students at simulation stations with standardized patients in OSCE-format tests. The construct of clinical competence was operationalized as a set of observable and measurable behaviors, measured by the newly-created Student Assessment Tool for Standardized Patient Simulations (SAT-SPS), which was comprised of 27 items. The categories assigned to the items were 'incorrect or not performed' (0), 'acceptable' (1), and 'correct' (2). 499 nursing students. Data were collected by two independent observers during the assessment of the students' performance at a four-station OSCE with standardized patients. Descriptive statistics were used to summarize the variables. The difficulty levels and floor and ceiling effects were determined for each item. Reliability was analyzed using internal consistency and inter-observer reliability. The validity analysis was performed considering face validity, content and construct validity (through exploratory factor analysis), and criterion validity. Internal reliability and inter-observer reliability were higher than 0.80. The construct validity analysis suggested a three-factor model accounting for 37.1% of the variance. These three factors were named 'Nursing process', 'Communication skills', and 'Safe practice'. A significant correlation was found between the scores obtained and the students' grades in general, as well as with the grades obtained in subjects with clinical content. The assessment tool has proven to be sufficiently reliable and valid for the assessment of the clinical competence of nursing students using standardized patients
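
    The internal-consistency figure reported above is typically computed as Cronbach's alpha over the students-by-items score matrix. The sketch below shows the calculation on synthetic 0/1/2 scores; it is illustrative only and does not use the SAT-SPS data.

```python
# Minimal sketch of an internal-consistency check: Cronbach's alpha for a
# students x items score matrix with items scored 0/1/2. Data are synthetic.
import numpy as np

rng = np.random.default_rng(7)
n_students, n_items = 499, 27
ability = rng.normal(0.0, 1.0, size=(n_students, 1))
scores = np.clip(np.round(1.0 + 0.6 * ability + rng.normal(0.0, 0.5, (n_students, n_items))), 0, 2)

def cronbach_alpha(x):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

print(f"Cronbach's alpha = {cronbach_alpha(scores):.3f}")
```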

  20. Social Moderation, Assessment and Assuring Standards for Accounting Graduates

    Science.gov (United States)

    Watty, Kim; Freeman, Mark; Howieson, Bryan; Hancock, Phil; O'Connell, Brendan; de Lange, Paul; Abraham, Anne

    2014-01-01

    Evidencing student achievement of standards is a growing imperative worldwide. Key stakeholders (including current and prospective students, government, regulators and employers) want confidence that threshold learning standards in an accounting degree have been assured. Australia's new higher education regulatory environment requires that student…

  1. Tying Together the Common Core of Standards, Instruction, and Assessments

    Science.gov (United States)

    Phillips, Vicki; Wong, Carina

    2010-01-01

    Clear, high standards will enable us to develop an education system that ensures that high school graduates are ready for college. The Bill & Melinda Gates Foundation has been working with other organizations to develop a Common Core of Standards. The partners working with the foundation are developing tools that will show teachers what is…

  2. The standard model on non-commutative space-time

    International Nuclear Information System (INIS)

    Calmet, X.; Jurco, B.; Schupp, P.; Wohlgenannt, M.; Wess, J.

    2002-01-01

    We consider the standard model on a non-commutative space and expand the action in the non-commutativity parameter θ^{μν}. No new particles are introduced; the structure group is SU(3) × SU(2) × U(1). We derive the leading order action. At zeroth order the action coincides with the ordinary standard model. At leading order in θ^{μν} we find new vertices which are absent in the standard model on commutative space-time. The most striking features are couplings between quarks, gluons and electroweak bosons and many new vertices in the charged and neutral currents. We find that parity is violated in non-commutative QCD. The Higgs mechanism can be applied. QED is not deformed in the minimal version of the NCSM to the order considered. (orig.)

  3. The standard model on non-commutative space-time

    Energy Technology Data Exchange (ETDEWEB)

    Calmet, X.; Jurco, B.; Schupp, P.; Wohlgenannt, M. [Sektion Physik, Universitaet Muenchen (Germany); Wess, J. [Sektion Physik, Universitaet Muenchen (Germany); Max-Planck-Institut fuer Physik, Muenchen (Germany)

    2002-03-01

    We consider the standard model on a non-commutative space and expand the action in the non-commutativity parameter θ^{μν}. No new particles are introduced; the structure group is SU(3) × SU(2) × U(1). We derive the leading order action. At zeroth order the action coincides with the ordinary standard model. At leading order in θ^{μν} we find new vertices which are absent in the standard model on commutative space-time. The most striking features are couplings between quarks, gluons and electroweak bosons and many new vertices in the charged and neutral currents. We find that parity is violated in non-commutative QCD. The Higgs mechanism can be applied. QED is not deformed in the minimal version of the NCSM to the order considered. (orig.)

  4. Solar Luminosity on the Main Sequence, Standard Model and Variations

    Science.gov (United States)

    Ayukov, S. V.; Baturin, V. A.; Gorshkov, A. B.; Oreshina, A. V.

    2017-05-01

    Our Sun became a Main Sequence star 4.6 Gyr ago according to the Standard Solar Model. At that time the solar luminosity was 30% lower than its current value. This conclusion is based on the assumption that the Sun is fueled by thermonuclear reactions. If Earth's albedo and infrared emissivity were unchanged during Earth's history, the oceans would have been frozen 2.3 Gyr ago. This contradicts the geological data: there was liquid water on Earth 3.6-3.8 Gyr ago. This problem is known as the Faint Young Sun Paradox. We analyze the luminosity change in standard solar evolution theory. The increase of the mean molecular weight in the central part of the Sun due to the conversion of hydrogen to helium leads to a gradual increase of luminosity with time on the Main Sequence. We also consider several exotic models: a fully mixed Sun; a drastic change of the pp reaction rate; a Sun consisting of hydrogen and helium only. Solar neutrino observations, however, exclude most non-standard solar models.
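
    The standard-model luminosity history underlying the paradox is often summarized by Gough's (1981) analytic approximation, evaluated in the sketch below. This is a common fit to standard solar models, offered here for illustration rather than as the authors' evolution code.

```python
# Minimal sketch of the standard-model luminosity history using Gough's (1981)
# approximation L(t) = L_sun / (1 + (2/5) * (1 - t/t_sun)); a common fit to
# standard solar models, not the authors' calculation.
T_SUN = 4.6  # Gyr, present solar age

def luminosity(t_gyr):
    """Solar luminosity (in present-day units) at age t_gyr on the Main Sequence."""
    return 1.0 / (1.0 + 0.4 * (1.0 - t_gyr / T_SUN))

for t in (0.0, 2.3, 4.6):
    print(f"t = {t:.1f} Gyr: L = {luminosity(t):.2f} L_sun")
# t = 0.0 (arrival on the Main Sequence) gives ~0.71 L_sun, i.e. roughly 30% fainter than today.
```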

  5. Lattice Gauge Theories Within and Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Gelzer, Zechariah John [Iowa U.

    2017-01-01

    The Standard Model of particle physics has been very successful in describing fundamental interactions up to the highest energies currently probed in particle accelerator experiments. However, the Standard Model is incomplete and currently exhibits tension with experimental data for interactions involving B mesons. Consequently, B-meson physics is of great interest to both experimentalists and theorists. Experimentalists worldwide are studying the decay and mixing processes of B mesons in particle accelerators. Theorists are working to understand the data by employing lattice gauge theories within and beyond the Standard Model. This work addresses the theoretical effort and is divided into two main parts. In the first part, I present a lattice-QCD calculation of form factors for exclusive semileptonic decays of B mesons that are mediated by both charged currents (B → π ℓ …

  6. Beyond the Standard Model Higgs searches at the LHC

    CERN Document Server

    Meridiani, P

    2015-01-01

    The Run I at the LHC marks the birth of the "Higgs physics", a path which will be followed at its full extent in the future runs of the LHC. Indeed there are two complementary paths to be followed to new physics in the Higgs sector: precision measurements of the Higgs properties (couplings, mass, spin and parity), where new physics can manifest as deviation from the Standard Model, or direct search for processes not foreseen in the Standard Model (Higgs decays not foreseen in the Standard Model, additional scalars which would indicate an extended Higgs sector). The current status of these studies at the LHC is presented, focussing in particular on the direct searches for rare or invisible Higgs decays or for an extended Higgs sector. The results are based on the analysis of the proton-proton collisions at 7 and 8 TeV center-of-mass energy at the LHC by the ATLAS and CMS collaborations.

  7. Constraints on Nc in extensions of the standard model

    International Nuclear Information System (INIS)

    Shrock, Robert

    2007-01-01

    We consider a class of theories involving an extension of the standard model gauge group to an a priori arbitrary number of colors, N_c, and derive constraints on N_c. One motivation for this is the string theory landscape. For two natural classes of embeddings of this N_c-extended standard model in a supersymmetric grand unified theory, we show that requiring unbroken electromagnetic gauge invariance, asymptotic freedom of color, and three generations of quarks and leptons forces one to choose N_c = 3. Similarly, we show that for a theory combining the N_c-extended standard model with a one-family SU(2)_TC technicolor theory, only the value N_c = 3 is allowed.

  8. Genetic Programming and Standardization in Water Temperature Modelling

    Directory of Open Access Journals (Sweden)

    Maritza Arganis

    2009-01-01

    An application of Genetic Programming (an evolutionary computational tool), without and with standardization of the data, is presented with the aim of modeling the behavior of the water temperature in a river in terms of meteorological variables that are easily measured, to explore their explanatory power and to emphasize the utility of the standardization of variables in order to reduce the effect of those with large variance. Recorded data corresponding to the water temperature behavior of the Ebro River, Spain, are used as the analysis case, showing a performance improvement in the developed model when the data are standardized. This improvement is reflected in a reduction of the mean square error. Finally, the models obtained in this document were applied to estimate the water temperature in 2004, in order to provide evidence about their applicability for forecasting purposes.
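
    The standardization step referred to above is ordinary z-scoring of each predictor, so that a variable with a large variance does not dominate one with a small variance. The sketch below illustrates it on synthetic meteorological series; the variable names and values are placeholders.

```python
# Minimal sketch of the standardization step: z-scoring predictors so that a
# variable with a large variance (e.g. discharge) does not dominate one with a
# small variance (e.g. air temperature). Data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(3)
air_temp = rng.normal(15.0, 5.0, 365)        # deg C, modest variance
discharge = rng.normal(300.0, 120.0, 365)    # m3/s, much larger variance

def standardize(x):
    """Return the z-scored series together with the mean/std needed to undo it."""
    mu, sigma = x.mean(), x.std(ddof=0)
    return (x - mu) / sigma, mu, sigma

z_temp, *_ = standardize(air_temp)
z_q, *_ = standardize(discharge)
print("raw variances:   ", round(air_temp.var(), 1), round(discharge.var(), 1))
print("scaled variances:", round(z_temp.var(), 3), round(z_q.var(), 3))   # both ~1 after scaling
```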

  9. CP violation in the standard model and beyond

    International Nuclear Information System (INIS)

    Buras, A.J.

    1984-01-01

    The present status of CP violation in the standard six-quark model is reviewed and a combined analysis with B-meson decays is presented. The theoretical uncertainties in the analysis are discussed and the resulting KM weak mixing angles, the phase δ and the ratio ε'/ε are given as functions of τ_B, Γ(b → u)/Γ(b → c), m_t and the B parameter. For certain ranges of the values of these parameters the standard model is not capable of reproducing the experimental values of the ε' and ε parameters. Anticipating possible difficulties we discuss various alternatives to the standard explanation of CP violation such as horizontal interactions, left-right symmetric models and supersymmetry. CP violation outside the kaon system is also briefly discussed. (orig.)

  10. Climate Modeling Computing Needs Assessment

    Science.gov (United States)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  11. The Wada Test: contributions to standardization of the stimulus for language and memory assessment

    Directory of Open Access Journals (Sweden)

    Mäder Maria Joana

    2004-01-01

    The Wada Test (WT) is part of the presurgical evaluation for refractory epilepsy. The WT is not standardized and the protocols differ in important ways, including the type of stimulus material presented for memory testing, the timing of presentations and the methods of assessment. The aim of this study was to help establish parameters for a WT for the Brazilian population by investigating the performance of 100 normal subjects, without medication. Two parallel models were used based on the Montreal Procedure, adapted from Gail Risse's (MEG-MN, USA) protocol. The proportions of correct responses of normal subjects submitted to the two parallel WT models were investigated and the two models were compared. The results showed that the two models are similar but significant differences among the stimulus types were observed. The results suggest that the stimulus type may influence the results of the WT and should be considered when constructing models and comparing different protocols.

  12. Constraining new physics with collider measurements of Standard Model signatures

    Energy Technology Data Exchange (ETDEWEB)

    Butterworth, Jonathan M. [Department of Physics and Astronomy, University College London,Gower St., London, WC1E 6BT (United Kingdom); Grellscheid, David [IPPP, Department of Physics, Durham University,Durham, DH1 3LE (United Kingdom); Krämer, Michael; Sarrazin, Björn [Institute for Theoretical Particle Physics and Cosmology, RWTH Aachen University,Sommerfeldstr. 16, 52056 Aachen (Germany); Yallup, David [Department of Physics and Astronomy, University College London,Gower St., London, WC1E 6BT (United Kingdom)

    2017-03-14

    A new method providing general consistency constraints for Beyond-the-Standard-Model (BSM) theories, using measurements at particle colliders, is presented. The method, ‘Constraints On New Theories Using Rivet’, CONTUR, exploits the fact that particle-level differential measurements made in fiducial regions of phase-space have a high degree of model-independence. These measurements can therefore be compared to BSM physics implemented in Monte Carlo generators in a very generic way, allowing a wider array of final states to be considered than is typically the case. The CONTUR approach should be seen as complementary to the discovery potential of direct searches, being designed to eliminate inconsistent BSM proposals in a context where many (but perhaps not all) measurements are consistent with the Standard Model. We demonstrate, using a competitive simplified dark matter model, the power of this approach. The CONTUR method is highly scaleable to other models and future measurements.

  13. Using Microsoft Excel to Assess Standards: A "Techtorial". Article #2 in a 6-Part Series

    Science.gov (United States)

    Mears, Derrick

    2009-01-01

    Standards-based assessment is a term currently being used quite often in educational reform discussions. The philosophy behind this initiative is to utilize "standards" or "benchmarks" to focus instruction and assessments of student learning. The National Standards for Physical Education (NASPE, 2004) provide a framework to guide this process for…

  14. ATLAS discovery potential of the Standard Model Higgs boson

    CERN Document Server

    Weiser, C; The ATLAS collaboration

    2009-01-01

    The Standard Model of elementary particles is remarkably successful in describing experimental data. The Higgs mechanism as the origin of electroweak symmetry breaking and mass generation, however, has not yet been confirmed experimentally. The search for the Higgs boson is thus one of the most important tasks of the ATLAS experiment at the Large Hadron Collider (LHC). This talk will present an overview of the potential of the ATLAS detector for the discovery of the Standard Model Higgs boson. Different production processes and decay channels, covering a wide mass range, will be discussed.

  15. ATLAS Discovery Potential of the Standard Model Higgs Boson

    CERN Document Server

    Weiser, C; The ATLAS collaboration

    2010-01-01

    The Standard Model of elementary particles is remarkably successful in describing experimental data. The Higgs mechanism as the origin of electroweak symmetry breaking and mass generation, however, has not yet been confirmed experimentally. The search for the Higgs boson is thus one of the most important tasks of the ATLAS experiment at the Large Hadron Collider (LHC). This talk will present an overview of the potential of the ATLAS detector for the discovery of the Standard Model Higgs boson. Different production processes and decay channels, covering a wide mass range, will be discussed.

  16. Loop Corrections to Standard Model fields in inflation

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xingang [Institute for Theory and Computation, Harvard-Smithsonian Center for Astrophysics,60 Garden Street, Cambridge, MA 02138 (United States); Department of Physics, The University of Texas at Dallas,800 W Campbell Rd, Richardson, TX 75080 (United States); Wang, Yi [Department of Physics, The Hong Kong University of Science and Technology,Clear Water Bay, Kowloon, Hong Kong (China); Xianyu, Zhong-Zhi [Center of Mathematical Sciences and Applications, Harvard University,20 Garden Street, Cambridge, MA 02138 (United States)

    2016-08-08

    We calculate 1-loop corrections to the Schwinger-Keldysh propagators of Standard-Model-like fields of spin-0, 1/2, and 1, with all renormalizable interactions during inflation. We pay special attention to the late-time divergences of loop corrections, and show that the divergences can be resummed into finite results in the late-time limit using dynamical renormalization group method. This is our first step toward studying both the Standard Model and new physics in the primordial universe.

  17. Majorana neutrinos in a warped 5D standard model

    International Nuclear Information System (INIS)

    Huber, S.J.; Shafi, Q.

    2002-05-01

    We consider neutrino oscillations and neutrinoless double beta decay in a five dimensional standard model with warped geometry. Although the see-saw mechanism in its simplest form cannot be implemented because of the warped geometry, the bulk standard model neutrinos can acquire the desired (Majorana) masses from dimension five interactions. We discuss how large mixings can arise, why the large mixing angle MSW solution for solar neutrinos is favored, and provide estimates for the mixing angle U_{e3}. Implications for neutrinoless double beta decay are also discussed. (orig.)

  18. Primordial alchemy: a test of the standard model

    International Nuclear Information System (INIS)

    Steigman, G.

    1987-01-01

    Big Bang Nucleosynthesis provides the only probe of the early evolution of the Universe constrained by observational data. The standard, hot, big bang model predicts the synthesis of the light elements (D, ³He, ⁴He, ⁷Li) in astrophysically interesting abundances during the first few minutes in the evolution of the Universe. A quantitative comparison of the predicted abundances with those observed astronomically confirms the consistency of the standard model and yields valuable constraints on the parameters of cosmology and elementary particle physics. The current status of the comparison between theory and observation will be reviewed and the opportunities for future advances outlined.

  19. Development of proliferation resistance assessment methodology based on international standard

    International Nuclear Information System (INIS)

    Ko, W. I.; Chang, H. L.; Lee, Y. D.; Lee, J. W.; Park, J. H.; Kim, Y. I.; Ryu, J. S.; Ko, H. S.; Lee, K. W.

    2012-04-01

    Nonproliferation is one of the main requirements to be satisfied by the advanced future nuclear energy systems that have been developed in the Generation IV and INPRO studies. Methodologies to evaluate proliferation resistance (PR) have been developed since the 1980s; however, systematic evaluation approaches began around 2000. Domestically, a study to develop a national method to evaluate the PR of advanced future nuclear energy systems started in 2007 as one of the long-term nuclear R and D subjects, in order to promote export and the international credibility and transparency of national nuclear energy systems and the nuclear fuel cycle technology development program. In the first phase (2007-2010), development and improvement of intrinsic evaluation parameters for the evaluation of proliferation resistance, quantification of evaluation parameters, development of evaluation models, and development of permissible ranges of evaluation parameters were carried out. In the second phase (2010-2012), the generic principle to evaluate PR was established, and technical guidelines, a nuclear material diversion pathway analysis method, and a method to integrate evaluation parameters were developed; these were applied to 5 alternative nuclear fuel cycles to estimate their applicability and objectivity. In addition, measures to enhance the PR of advanced future nuclear energy systems and technical guidelines for PR assessment using intrinsic PR evaluation parameters were developed. Lastly, regulatory requirements were developed to secure the nonproliferation requirements of nuclear energy systems from the early design stage, through operation, to decommissioning, which will support the export of newly developed advanced future nuclear energy systems.

  20. Assessment of non-standard HIV antiretroviral therapy regimens at ...

    African Journals Online (AJOL)

    2016-03-06

    Most patients were transitioned to standard regimens. In cases of first-line regimen treatment failure, …

  1. Environmental assessment for the Consumer Products Efficiency Standards program

    Energy Technology Data Exchange (ETDEWEB)

    1980-05-23

    The Energy Policy and Conservation Act of 1975, as amended by the National Energy Conservation Policy Act of 1978, requires the DOE to prescribe energy efficiency standards for thirteen consumer products. The Consumer Products Efficiency Standards (CPES) program covers the following products: refrigerators and refrigerator-freezers; freezers; clothes dryers; water heaters; room air conditioners; home heating equipment (not including furnaces); kitchen ranges and ovens; central air conditioners (cooling and heat pumps); furnaces; dishwashers; television sets; clothes washers; and humidifiers and dehumidifiers. DOE is proposing two sets of standards for all thirteen consumer products: intermediate standards to become effective in 1981 for the first nine products and in 1982 for the second four products, and final standards to become effective in 1986 and 1987, respectively. The final standards are more restrictive than the intermediate standards and will provide manufacturers with the maximum time permitted under the Act to plan and develop extensive new lines of efficient consumer products. The final standards proposed by DOE require the maximum improvements in efficiency which are technologically feasible and economically justified, as required by Section 325(c) of EPCA. The thirteen consumer products account for approximately 90% of all the energy consumed in the nation's residences, or more than 20% of the nation's energy needs. Increases in the energy efficiency of these consumer products can help to narrow the gap between the nation's increasing demand for energy and decreasing supplies of domestic oil and natural gas. Improvements in the efficiency of consumer products can thus help to solve the nation's energy crisis.

  2. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general and the CORAS application of MBRA in particular have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures and also in other application domains such as the nuclear field can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  3. Assessment of the Impacts of Standards and Labeling Programs inMexico (four products).

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, Itha; Pulido, Henry; McNeil, Michael A.; Turiel, Isaac; della Cava, Mirka

    2007-06-12

    This study analyzes impacts from energy efficiency standards and labeling in Mexico from 1994 through 2005 for four major products: household refrigerators, room air conditioners, three-phase (squirrel cage) induction motors, and clothes washers. It is a retrospective analysis, seeking to assess verified impacts on product efficiency in the Mexican market in the first ten years after standards were implemented. Such an analysis allows the Mexican government to compare actual to originally forecast program benefits. In addition, it provides an extremely valuable benchmark for other countries considering standards, and to the energy policy community as a whole. The methodology for evaluation begins with historical test data taken for a large number of models of each product type between 1994 and 2005. The pre-standard efficiency of models in 1994 is taken as a baseline throughout the analysis. Model efficiency data were provided by an independent certification laboratory (ANCE), which tested products as part of the certification and enforcement mechanism defined by the standards program. Using this data, together with economic and market data provided by both government and private sector sources, the analysis considers several types of national level program impacts. These include: Energy savings; Environmental (emissions) impacts, and Net financial impacts to consumers, manufacturers and utilities. Energy savings impacts are calculated using the same methodology as the original projections, allowing a comparison. Other impacts are calculated using a robust and sophisticated methodology developed by the Instituto de Investigaciones Electricas (IIE) and Lawrence Berkeley National Laboratory (LBNL), in a collaboration supported by the Collaborative Labeling and Standards Program (CLASP).
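
    The savings methodology described above is essentially stock accounting: each model-year cohort of shipments locks in the gap between the baseline and post-standard unit energy consumption for the product lifetime. The sketch below illustrates the logic with placeholder shipments, UEC values and lifetime, not the Mexican program data.

```python
# Minimal sketch of the stock-accounting logic behind national savings
# estimates: each year's shipments lock in the gap between baseline and
# post-standard unit energy consumption (UEC) for the product lifetime.
# Shipments, UECs and lifetime are placeholders, not Mexican data.
lifetime = 15                                            # years a unit stays in the stock
shipments = {y: 1_500_000 for y in range(1995, 2006)}    # units sold per year
uec_baseline = 650.0                                     # kWh/yr for a pre-standard unit
uec_standard = {y: 650.0 - 15.0 * (y - 1994) for y in shipments}  # gradual improvement

def national_savings(year):
    """GWh saved in `year` by all post-standard units still in the stock."""
    saved_kwh = 0.0
    for vintage, units in shipments.items():
        if vintage <= year < vintage + lifetime:
            saved_kwh += units * (uec_baseline - uec_standard[vintage])
    return saved_kwh / 1e6  # kWh -> GWh

for y in (1996, 2000, 2005):
    print(f"{y}: ~{national_savings(y):.0f} GWh saved")
```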

  4. Reconsidering the risk assessment concept: Standardizing the impact description as a building block for vulnerability assessment

    Directory of Open Access Journals (Sweden)

    K. Hollenstein

    2005-01-01

    Risk assessments for natural hazards are becoming more widely used and accepted. Using an extended definition of risk, it becomes obvious that performant procedures for vulnerability assessments are vital for the success of the risk concept. However, there are large gaps in knowledge about vulnerability. To alleviate the situation, a conceptual extension of the scope of existing and new models is suggested. The basis of the suggested concept is a standardization of the output of hazard assessments. This is achieved by defining states of the target objects that depend on the impact and at the same time affect the object's performance characteristics. The possible state variables can be related to a limited set of impact descriptors termed the generic impact description interface. The concept suggests that both hazard and vulnerability assessment models are developed according to the specification of this interface, thus facilitating modularized risk assessments. Potential problems related to the application of the concept include acceptance issues and the limited accuracy of transforming the outputs of existing models. Potential applications and simple examples for adapting existing models are briefly discussed.
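
    The proposed interface can be pictured as a small, fixed data structure passed from any hazard module to any vulnerability module. The sketch below is one illustrative way to encode such an interface; the field names and the toy hazard and vulnerability models are inventions, not the descriptor set defined in the paper.

```python
# Minimal sketch of a "generic impact description interface" separating hazard
# and vulnerability modules. Field names and the example models are illustrative
# inventions, not the descriptor set proposed in the paper.
from dataclasses import dataclass

@dataclass
class ImpactDescription:
    """Standardized hand-over from a hazard model to a vulnerability model."""
    process: str          # e.g. "flood", "rockfall"
    intensity: float      # process-specific intensity measure (e.g. flow depth in m)
    duration_h: float     # duration of the impact in hours
    probability: float    # annual exceedance probability of this scenario

def flood_hazard_model(return_period: float) -> ImpactDescription:
    """Toy hazard module: maps a return period to a standardized impact."""
    return ImpactDescription("flood", intensity=0.5 + 0.3 * (return_period / 100),
                             duration_h=24.0, probability=1.0 / return_period)

def building_vulnerability_model(impact: ImpactDescription) -> float:
    """Toy vulnerability module: degree of loss (0-1) given a standardized impact."""
    if impact.process != "flood":
        return 0.0
    return min(1.0, 0.4 * impact.intensity)

impact = flood_hazard_model(return_period=100.0)
loss_fraction = building_vulnerability_model(impact)
risk = impact.probability * loss_fraction          # expected annual degree of loss
print(impact, f"loss={loss_fraction:.2f}", f"risk={risk:.4f}")
```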

  5. Variation in Students' Conceptions of Self-Assessment and Standards

    Directory of Open Access Journals (Sweden)

    Heng Kiat Kelvin Tan

    2011-01-01

    This paper reports the results of a phenomenographic study on the different ways that secondary students understood and utilized student self-assessment and how various ego types could affect the accuracy of self-assessment. The study sought to contribute to the growing literature which recognizes the critical role that students play in assessment processes, and in particular the different roles that they assume in student self-assessment. The results of the study provide insights into how different students experience self-assessment by articulating the variation in the perception and purposes of assessing one's own learning. This variation is depicted as a hierarchy of logically related students' conceptions of self-assessment.

  6. A CDO option market model on standardized CDS index tranches

    DEFF Research Database (Denmark)

    Dorn, Jochen

    We provide a market model which implies a dynamic for standardized CDS index tranche spreads. This model is useful for pricing options on tranches with future Issue Dates as well as for modeling emerging options on structured credit derivatives. With the upcoming regulation of the CDS market in perspective, the model presented here is also an attempt to address the effects on pricing approaches provoked by an eventual Clearing Chamber. It also becomes possible to calibrate Index Tranche Options with bespoke tenors/tranche subordination to market data obtained from more liquid Index Tranche Options...

  7. Supersymmetric standard model from the heterotic string (II)

    International Nuclear Information System (INIS)

    Buchmueller, W.; Hamaguchi, K.; Tokyo Univ.; Lebedev, O.; Ratz, M.

    2006-06-01

    We describe in detail a Z_6 orbifold compactification of the heterotic E_8 × E_8 string which leads to the (supersymmetric) standard model gauge group and matter content. The quarks and leptons appear as three 16-plets of SO(10), two of which are localized at fixed points with local SO(10) symmetry. The model has supersymmetric vacua without exotics at low energies and is consistent with gauge coupling unification. Supersymmetry can be broken via gaugino condensation in the hidden sector. The model has large vacuum degeneracy. Certain vacua with approximate B-L symmetry have attractive phenomenological features. The top quark Yukawa coupling arises from gauge interactions and is of the order of the gauge couplings. The other Yukawa couplings are suppressed by powers of standard model singlet fields, similarly to the Froggatt-Nielsen mechanism. (Orig.)

  8. Search for the standard model Higgs boson in $l\

    Energy Technology Data Exchange (ETDEWEB)

    Li, Dikai [Pierre and Marie Curie Univ., Paris (France)

    2013-01-01

    Humans have always attempted to understand the mystery of Nature, and more recently physicists have established theories to describe the observed phenomena. The most recent theory is a gauge quantum field theory framework, called the Standard Model (SM), which proposes a model composed of elementary matter particles and interaction particles, the fundamental force carriers, in the most unified way. The Standard Model contains the internal symmetries of the unitary product group SU(3)_C × SU(2)_L × U(1)_Y and describes the electromagnetic, weak and strong interactions; the model also describes how quarks interact with each other through all three of these interactions, how leptons interact with each other through the electromagnetic and weak forces, and how force carriers mediate the fundamental interactions.

  9. Higgs detectability in the extended supersymmetric standard model

    International Nuclear Information System (INIS)

    Kamoshita, Jun-ichi

    1995-01-01

    Higgs detectability at a future linear collider is discussed in the minimal supersymmetric standard model (MSSM) and in a supersymmetric standard model with a gauge singlet Higgs field (NMSSM). First, in the MSSM at least one of the neutral scalar Higgs bosons is shown to be detectable, irrespective of the parameters of the model, at a future e+e- linear collider at √s = 300-500 GeV. Next the Higgs sector of the NMSSM is considered; since the lightest Higgs boson can be singlet dominated and therefore decouple from the Z0 boson, it is important to consider the production of heavier Higgses. It is shown that also in this case at least one of the neutral scalar Higgs bosons will be detectable at a future linear collider. We extend the analysis and show that the same is true even if three singlets are included. Thus the detectability of the Higgs bosons of these models is guaranteed. (author)

  10. Astrophysical neutrinos flavored with beyond the Standard Model physics

    Energy Technology Data Exchange (ETDEWEB)

    Rasmussen, Rasmus W.; Ackermann, Markus; Winter, Walter [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Lechner, Lukas [Vienna Univ. of Technology (Austria). Dept. of Physics; Kowalski, Marek [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik

    2017-07-15

    We systematically study the allowed parameter space for the flavor composition of astrophysical neutrinos measured at Earth, including beyond the Standard Model theories at production, during propagation, and at detection. One motivation is to illustrate the discrimination power of the next-generation neutrino telescopes such as IceCube-Gen2. We identify several examples that lead to potential deviations from the standard neutrino mixing expectation such as significant sterile neutrino production at the source, effective operators modifying the neutrino propagation at high energies, dark matter interactions in neutrino propagation, or non-standard interactions in Earth matter. IceCube-Gen2 can exclude about 90% of the allowed parameter space in these cases, and hence will make it possible to efficiently test and discriminate between models. More detailed information can be obtained from additional observables such as the energy dependence of the effect, the fraction of electron antineutrinos at the Glashow resonance, or the number of tau neutrino events.

  11. Non-perturbative effective interactions in the standard model

    CERN Document Server

    Arbuzov, Boris A

    2014-01-01

    This monograph is devoted to the nonperturbative dynamics in the Standard Model (SM), the basic theory of all fundamental interactions in nature except gravity. The Standard Model is divided into two parts: Quantum Chromodynamics (QCD) and the Electroweak Theory (EWT) are well-defined renormalizable theories in which perturbation theory is valid. However, for an adequate description of the real physics, nonperturbative effects are inevitable. This book describes how these nonperturbative effects may be obtained in the framework of spontaneous generation of effective interactions. A well-known example of such an effective interaction is provided by the famous Nambu--Jona-Lasinio effective interaction. The spontaneous generation of this interaction in the framework of QCD is also described, and the method is applied to other effective interactions in QCD and EWT. The method is based on N.N. Bogoliubov's conception of compensation equations. As a result we then describe the principal features of the Standard...

  12. Astrophysical neutrinos flavored with beyond the Standard Model physics

    International Nuclear Information System (INIS)

    Rasmussen, Rasmus W.; Ackermann, Markus; Winter, Walter; Lechner, Lukas; Kowalski, Marek; Humboldt-Universitaet, Berlin

    2017-07-01

    We systematically study the allowed parameter space for the flavor composition of astrophysical neutrinos measured at Earth, including beyond the Standard Model theories at production, during propagation, and at detection. One motivation is to illustrate the discrimination power of the next-generation neutrino telescopes such as IceCube-Gen2. We identify several examples that lead to potential deviations from the standard neutrino mixing expectation such as significant sterile neutrino production at the source, effective operators modifying the neutrino propagation at high energies, dark matter interactions in neutrino propagation, or non-standard interactions in Earth matter. IceCube-Gen2 can exclude about 90% of the allowed parameter space in these cases, and hence will make it possible to efficiently test and discriminate between models. More detailed information can be obtained from additional observables such as the energy dependence of the effect, the fraction of electron antineutrinos at the Glashow resonance, or the number of tau neutrino events.

  13. ATLAS Standard Model Measurements Using Jet Grooming and Substructure

    CERN Document Server

    Ucchielli, Giulia; The ATLAS collaboration

    2017-01-01

    Boosted topologies make it possible to explore Standard Model processes in kinematical regimes never tested before. In such challenging LHC environments, standard reconstruction techniques quickly reach their limits. Targeting hadronic final states means properly reconstructing the energy and multiplicity of the jets in the event. In order to identify the decay products of boosted objects, e.g. W bosons, $t\bar{t}$ pairs or Higgs bosons produced in association with $t\bar{t}$ pairs, the ATLAS experiment is currently exploiting several algorithms using jet grooming and jet substructure. This contribution mainly covers the following ATLAS measurements: the $t\bar{t}$ differential production cross section and the jet mass using the soft-drop procedure. Standard Model measurements offer the perfect field in which to test the performance of new jet-tagging techniques, which will become even more important in the search for new physics in highly boosted topologies.

  14. Safety and efficacy assessment of standardized herbal formula PM012

    Science.gov (United States)

    2012-01-01

    Background This study was conducted to evaluate the efficacy of the herbal formula PM012 in an Alzheimer's disease model, human presenilin 2 mutant transgenic mice (hPS2m), and also to evaluate the toxicity of PM012 in Sprague-Dawley rats after 4 or 26 weeks of treatment by repeated oral administration. Methods Spatial learning and memory capacities of hPS2m transgenic mice were evaluated using the Morris Water Maze. Simultaneously, PM012 was repeatedly administered orally to male and female SD rats (15/sex/group) at doses of 0 (vehicle control), 500, 1,000 and 2,000 mg/kg/day for 4 or 26 weeks. To evaluate the recovery potential, 5 animals of each sex were assigned to the vehicle control and 2,000 mg/kg/day groups during the 4-week recovery period. Results The results showed that PM012-treated hPS2m transgenic mice showed significantly reduced escape latency when compared with the hPS2m transgenic mice. The repeated oral administration of PM012 over 26 weeks in male and female rats induced an increase and an increasing trend in thymus weight in the female treatment groups (main and recovery groups), but the change was judged to be toxicologically insignificant. In addition, the oral administration of the herbal medicine PM012 did not cause adverse effects as assessed by clinical signs, mortality, body weight, food and water consumption, ophthalmology, urinalysis, hematology, serum biochemistry, blood clotting time, organ weights and histopathology. The No Observed Adverse Effect Level of PM012 was determined to be 2,000 mg/kg/day for both sexes, and the target organ was not identified. Conclusion These results suggest that PM012 has potential for use in the treatment of Alzheimer's disease without serious adverse effects. PMID:22458507

  15. Transformative Shifts in Art History Teaching: The Impact of Standards-Based Assessment

    Science.gov (United States)

    Ormond, Barbara

    2011-01-01

    This article examines pedagogical shifts in art history teaching that have developed as a response to the implementation of a standards-based assessment regime. The specific characteristics of art history standards-based assessment in the context of New Zealand secondary schools are explained to demonstrate how an exacting form of assessment has…

  16. Comparing Panelists' Understanding of Standard Setting across Multiple Levels of an Alternate Science Assessment

    Science.gov (United States)

    Hansen, Mary A.; Lyon, Steven R.; Heh, Peter; Zigmond, Naomi

    2013-01-01

    Large-scale assessment programs, including alternate assessments based on alternate achievement standards (AA-AAS), must provide evidence of technical quality and validity. This study provides information about the technical quality of one AA-AAS by evaluating the standard setting for the science component. The assessment was designed to have…

  17. Newly graduated doctors' competence in managing cardiopulmonary arrests assessed using a standardized Advanced Life Support (ALS) assessment

    DEFF Research Database (Denmark)

    Jensen, Morten Lind; Hesselfeldt, Rasmus; Rasmussen, Maria Birkvad

    2008-01-01

    Aim of the study: Several studies using a variety of assessment approaches have demonstrated that young doctors possess insufficient resuscitation competence. The aims of this study were to assess newly graduated doctors’ resuscitation competence against an internationally recognised standard and...

  18. Assessment of technologies to meet a low carbon fuel standard.

    Science.gov (United States)

    Yeh, Sonia; Lutsey, Nicholas P; Parker, Nathan C

    2009-09-15

    California's low carbon fuel standard (LCFS) was designed to incentivize a diverse array of available strategies for reducing transportation greenhouse gas (GHG) emissions. It provides strong incentives for fuels with lower GHG emissions, while explicitly requiring a 10% reduction in California's transportation fuel GHG intensity by 2020. This paper investigates the potential for cost-effective GHG reductions from electrification and expanded use of biofuels. The analysis indicates that fuel providers could meet the standard using a portfolio approach that employs both biofuels and electricity, which would reduce the risks and uncertainties associated with the progress of cellulosic and battery technologies, feedstock prices, land availability, and the sustainability of the various compliance approaches. Our analysis is based on the details of California's development of an LCFS; however, this research approach could be generalizable to a national U.S. standard and to similar programs in Europe and Canada.
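    The compliance arithmetic behind such a standard can be illustrated by comparing the energy-weighted average carbon intensity of a fuel portfolio against the intensity target. The short Python sketch below is a hypothetical example with placeholder intensity values; it is not taken from the paper.

```python
# Hypothetical fuel portfolio: fuel -> (energy share, carbon intensity in gCO2e/MJ)
portfolio = {
    "gasoline":           (0.80, 96.0),
    "cellulosic_ethanol": (0.12, 30.0),
    "electricity":        (0.08, 40.0),
}

baseline_intensity = 96.0                      # assumed baseline gCO2e/MJ
target_intensity = 0.90 * baseline_intensity   # 10% reduction target

# Energy-weighted average intensity of the blended portfolio
avg_intensity = sum(share * ci for share, ci in portfolio.values())

print(f"portfolio intensity: {avg_intensity:.1f} gCO2e/MJ (target {target_intensity:.1f})")
print("compliant" if avg_intensity <= target_intensity else "deficit")
```

    With these illustrative numbers the blend comes in at 83.6 gCO2e/MJ against a target of 86.4 gCO2e/MJ, showing how a mixed biofuel/electricity portfolio can meet the intensity reduction even while most of the energy is still supplied by gasoline.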

  19. Challenging the Standard Model with the muon g − 2

    Indian Academy of Sciences (India)

    The discrepancy between experiment and the Standard Model prediction of ... The measurement of the anomalous magnetic moment of the muon, a_µ = (g−2)/2 ... to evaluate the leading-order hadronic term (see [3,4] for recent reviews) ... update of their previous analysis and a new preliminary one based on data collected.

  20. Radiation therapy: model standards for determination of need

    International Nuclear Information System (INIS)

    Lagasse, L.G.; Devins, T.B.

    1982-03-01

    Contents: Health planning process; Health care requirements (model for projecting need for megavoltage radiation therapy); Operational objectives (manpower, megavoltage therapy and treatment planning equipment, support services, management and evaluation of patient care, organization and administration); Compliance with other standards imposed by law; Financial feasibility and capability; Reasonableness of expenditures and costs; Relative merit; Environmental impact

  1. Searches for physics beyond the Standard Model at the Tevatron

    Indian Academy of Sciences (India)

    Publications ... Beyond Standard Model Physics, Volume 79, Issue 4, October 2012, pp. 703-717 ... a centre-of-mass energy of 1.96 TeV that the CDF and DØ Collaborations have scrutinized, looking for new physics in a wide range of final states.

  2. Precision tests of quantum chromodynamics and the standard model

    International Nuclear Information System (INIS)

    Brodsky, S.J.; Lu, H.J.

    1995-06-01

    The authors discuss three topics relevant to testing the Standard Model to high precision: commensurate scale relations, which relate observables to each other in perturbation theory without renormalization scale or scheme ambiguity, the relationship of compositeness to anomalous moments, and new methods for measuring the anomalous magnetic and quadrupole moments of the W and Z

  3. Mathematical Modeling, Sense Making, and the Common Core State Standards

    Science.gov (United States)

    Schoenfeld, Alan H.

    2013-01-01

    On October 14, 2013 the Mathematics Education Department at Teachers College hosted a full-day conference focused on the Common Core Standards Mathematical Modeling requirements to be implemented in September 2014 and in honor of Professor Henry Pollak's 25 years of service to the school. This article is adapted from my talk at this conference…

  4. Searches for phenomena beyond the Standard Model at the Large Hadron Collider

    Indian Academy of Sciences (India)

    The LHC has delivered several fb⁻¹ of data in spring and summer 2011, opening new windows of opportunity for discovering phenomena beyond the Standard Model. A summary of the searches conducted by the ATLAS and CMS experiments based on about 1 fb⁻¹ of data is presented.

  5. Charged and neutral minimal supersymmetric standard model Higgs ...

    Indian Academy of Sciences (India)

    pp. 759–763. Charged and neutral minimal supersymmetric standard model Higgs boson decays and measurement of tan β at the compact linear collider. E. Coniavitis and A. Ferrari, Department of Nuclear and Particle Physics, Uppsala University, 75121 Uppsala, Sweden. E-mail: ferrari@tsl.uu.se.

  6. Search for Higgs boson in beyond standard model scenarios

    Indian Academy of Sciences (India)

    The principal physics motivation of the LHC experiments is to search for the Higgs boson and to probe physics at the TeV energy scale. The discovery potential for Higgs bosons in various scenarios beyond the standard model has been estimated for both the CMS and ATLAS experiments through detailed detector simulations.

  7. Land administration domain model is an ISO standard now

    NARCIS (Netherlands)

    Lemmen, Christiaan; van Oosterom, Peter; Uitermark, Harry; de Zeeuw, Kees

    2013-01-01

    A group of land administration professionals initiated the development of a data model that facilitates the quick and efficient set-up of land registrations. Just like social issues benefit from proper land administration, land administration systems themselves benefit from proper data standards. In

  8. The Dawn of physics beyond the standard model

    CERN Multimedia

    Kane, Gordon

    2003-01-01

    "The Standard Model of particle physics is at a pivotal moment in its history: it is both at the height of its success and on the verge of being surpassed [...] A new era in particle physics could soon be heralded by the detection of supersymmetric particles at the Tevatron collider at Fermi National Accelerator Laboratory in Batavia, Ill." (8 pages)

  9. Renormalization of seesaw neutrino masses in the standard model ...

    Indian Academy of Sciences (India)

    the neutrino-mass-operator in the standard model with two-Higgs doublets, and also the QCD–QED ... data of atmospheric muon deficits, thereby suggesting a large mixing angle ... One method consists of running the gauge ...

  10. Systematics of quark mass matrices in the standard electroweak model

    International Nuclear Information System (INIS)

    Frampton, P.H.; Jarlskog, C.; Stockholm Univ.

    1985-01-01

    It is shown that the quark mass matrices in the standard electroweak model satisfy the empirical relation M = M' + O(λ²), where M (M') refers to the mass matrix of the charge 2/3 (-1/3) quarks normalized to the largest eigenvalue, m_t (m_b), and λ = V_us ≈ 0.22. (orig.)

  11. Particle dark matter from physics beyond the standard model

    International Nuclear Information System (INIS)

    Matchev, Konstantin

    2004-01-01

    In this talk I contrast three different particle dark matter candidates, all motivated by new physics beyond the Standard Model: supersymmetric dark matter, Kaluza-Klein dark matter, and scalar dark matter. I then discuss the prospects for their discovery and identification in both direct detection as well as collider experiments

  12. Standardizing measurement, sampling and reporting for public exposure assessments

    Energy Technology Data Exchange (ETDEWEB)

    Rochedo, Elaine R.R. [Instituto de Radioprotecao e Dosimetria, Comissao Nacional de Energia Nuclear, Av. Salvador Allende s/No. CEP 22780-160 Rio de Janeiro, RJ (Brazil)], E-mail: elaine@ird.gov.br

    2008-11-15

    UNSCEAR assesses worldwide public exposure from natural and man-made sources of ionizing radiation based on information submitted to UNSCEAR by United Nations Member States and from peer reviewed scientific literature. These assessments are used as a basis for radiation protection programs of international and national regulatory and research organizations. Although UNSCEAR describes its assessment methodologies, the data are based on various monitoring approaches. In order to reduce uncertainties and improve confidence in public exposure assessments, it would be necessary to harmonize the methodologies used for sampling, measuring and reporting of environmental results.

  13. Standardized binomial models for risk or prevalence ratios and differences.

    Science.gov (United States)

    Richardson, David B; Kinlaw, Alan C; MacLehose, Richard F; Cole, Stephen R

    2015-10-01

    Epidemiologists often analyse binary outcomes in cohort and cross-sectional studies using multivariable logistic regression models, yielding estimates of adjusted odds ratios. It is widely known that the odds ratio closely approximates the risk or prevalence ratio when the outcome is rare, and it does not do so when the outcome is common. Consequently, investigators may decide to directly estimate the risk or prevalence ratio using a log binomial regression model. We describe the use of a marginal structural binomial regression model to estimate standardized risk or prevalence ratios and differences. We illustrate the proposed approach using data from a cohort study of coronary heart disease status in Evans County, Georgia, USA. The approach reduces problems with model convergence typical of log binomial regression by shifting all explanatory variables except the exposures of primary interest from the linear predictor of the outcome regression model to a model for the standardization weights. The approach also facilitates evaluation of departures from additivity in the joint effects of two exposures. Epidemiologists should consider reporting standardized risk or prevalence ratios and differences in cohort and cross-sectional studies. These are readily obtained using the SAS, Stata and R statistical software packages. The proposed approach estimates the exposure effect in the total population. © The Author 2015; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
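    The computational idea of shifting covariates into a model for standardization weights can be sketched with inverse-probability-of-exposure weights. The Python/statsmodels example below is a minimal illustration on simulated data, with assumed column names (exposure, x1, x2, y); it is not the authors' SAS/Stata/R implementation, and a robust variance estimator would be needed in practice.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulate a small cohort with one exposure and two confounders.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({"x1": rng.normal(size=n), "x2": rng.binomial(1, 0.4, n)})
df["exposure"] = rng.binomial(1, 1 / (1 + np.exp(-(0.3 * df.x1 + 0.5 * df.x2))))
df["y"] = rng.binomial(1, 1 / (1 + np.exp(-(-1.0 + 0.7 * df.exposure + 0.4 * df.x1))))

# Model for the standardization weights: covariates predict exposure only.
ps = smf.glm("exposure ~ x1 + x2", df, family=sm.families.Binomial()).fit()
p = ps.fittedvalues
df["w"] = np.where(df.exposure == 1, 1 / p, 1 / (1 - p))

# Weighted binomial model with only the exposure in the linear predictor;
# a log link yields a standardized risk ratio (an identity link would give a risk difference).
rr_model = smf.glm("y ~ exposure", df,
                   family=sm.families.Binomial(sm.families.links.Log()),
                   freq_weights=df.w).fit()
print("standardized risk ratio:", np.exp(rr_model.params["exposure"]))
```

    Because the outcome model contains only the exposure, the log-binomial fit avoids the convergence problems that typically arise when many covariates are placed in its linear predictor.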

  14. Assessing validity of a depression screening instrument in the absence of a gold standard.

    Science.gov (United States)

    Gelaye, Bizu; Tadesse, Mahlet G; Williams, Michelle A; Fann, Jesse R; Vander Stoep, Ann; Andrew Zhou, Xiao-Hua

    2014-07-01

    We evaluated the extent to which use of a hypothesized imperfect gold standard, the Composite International Diagnostic Interview (CIDI), biases the estimates of diagnostic accuracy of the Patient Health Questionnaire-9 (PHQ-9). We also evaluated how statistical correction can be used to address this bias. The study was conducted among 926 adults; structured interviews were used to collect information about participants' current major depressive disorder using the PHQ-9 and CIDI instruments. First, we evaluated the relative psychometric properties of the PHQ-9 using the CIDI as a gold standard. Next, we used a Bayesian latent class model to correct for the bias. In comparison with the CIDI, the relative sensitivity and specificity of the PHQ-9 for detecting major depressive disorder at a cut point of 10 or more were 53.1% (95% confidence interval: 45.4%-60.8%) and 77.5% (95% confidence interval, 74.5%-80.5%), respectively. Using a Bayesian latent class model to correct for the bias arising from the use of an imperfect gold standard increased the sensitivity and specificity of the PHQ-9 to 79.8% (95% Bayesian credible interval, 64.9%-90.8%) and 79.1% (95% Bayesian credible interval, 74.7%-83.7%), respectively. Our results provided evidence that assessing the diagnostic validity of a mental health screening instrument, where application of a gold standard might not be available, can be accomplished by using appropriate statistical methods. Copyright © 2014 Elsevier Inc. All rights reserved.
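    The direction of this bias can be written down directly under the usual conditional-independence assumption: the apparent sensitivity of the index test against an imperfect reference mixes the reference's true and false positives. The Python sketch below evaluates that relationship with made-up sensitivity, specificity and prevalence values; it is only an illustration of the mechanism, not the Bayesian latent class model used in the study.

```python
def apparent_accuracy(se_t, sp_t, se_r, sp_r, prev):
    """Apparent sensitivity/specificity of an index test T when judged against
    an imperfect reference R, assuming the two tests err independently given
    true disease status (prevalence = prev)."""
    p_tp_rp = se_t * se_r * prev + (1 - sp_t) * (1 - sp_r) * (1 - prev)
    p_rp = se_r * prev + (1 - sp_r) * (1 - prev)
    p_tn_rn = sp_t * sp_r * (1 - prev) + (1 - se_t) * (1 - se_r) * prev
    p_rn = sp_r * (1 - prev) + (1 - se_r) * prev
    return p_tp_rp / p_rp, p_tn_rn / p_rn

# Made-up values: an index test with true Se = Sp = 0.80, judged against a
# reference with Se = 0.70 and Sp = 0.95, at 15% prevalence.
print(apparent_accuracy(0.80, 0.80, 0.70, 0.95, 0.15))
```

    Even with these arbitrary inputs, the apparent sensitivity falls well below the true 0.80, which is the qualitative pattern the statistical correction is designed to undo.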

  15. Development and Application of Assessment Standards to Advanced Written Assignments

    Science.gov (United States)

    Miihkinen, Antti; Virtanen, Tuija

    2018-01-01

    This study describes the results of a project that focused on developing an assessment rubric to be used as the assessment criteria for the written thesis of accounting majors and the quality of the coursework during the seminar. We used descriptive analysis and the survey method to collect information for the development work and to examine the…

  16. Improving Flood Damage Assessment Models in Italy

    Science.gov (United States)

    Amadio, M.; Mysiak, J.; Carrera, L.; Koks, E.

    2015-12-01

    The use of Stage-Damage Curve (SDC) models is prevalent in ex-ante assessments of flood risk. To assess the potential damage of a flood event, SDCs describe a relation between water depth and the associated potential economic damage over land use. This relation is normally developed and calibrated through site-specific analysis based on ex-post damage observations. In some cases (e.g. Italy) SDCs are transferred from other countries, undermining the accuracy and reliability of simulation results. Against this background, we developed a refined SDC model for Northern Italy, underpinned by damage compensation records from a recent flood event. Our analysis considers both damage to physical assets and production losses from business interruptions. While the first is calculated based on land use information, production losses are measured through the spatial distribution of Gross Value Added (GVA). An additional component of the model assesses crop-specific agricultural losses as a function of flood seasonality. Our results show an overestimation of asset damage from non-calibrated SDC values up to a factor of 4.5 for tested land use categories. Furthermore, we estimate that production losses amount to around 6 per cent of the annual GVA. Also, maximum yield losses are less than a half of the amount predicted by the standard SDC methods.
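    The core of an SDC model is a simple lookup: for each exposed asset, interpolate a land-use-specific depth-damage fraction and multiply by the asset value. The Python sketch below is a generic illustration with invented curve points; it does not reproduce the calibrated curves developed in the study.

```python
import numpy as np

# Hypothetical depth-damage curves: water depth [m] -> damage fraction [0..1]
depths = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
curves = {
    "residential": np.array([0.0, 0.15, 0.30, 0.55, 0.85]),
    "industrial":  np.array([0.0, 0.10, 0.25, 0.50, 0.80]),
}

def flood_damage(land_use: str, depth_m: float, asset_value: float) -> float:
    """Interpolate the damage fraction for a land-use class at a given water depth."""
    frac = np.interp(depth_m, depths, curves[land_use])
    return frac * asset_value

# Example: a residential asset worth 200,000 flooded to 1.5 m -> about 85,000
print(flood_damage("residential", 1.5, asset_value=200_000))
```

    Calibration against ex-post compensation records, as done in the study, amounts to adjusting the curve points so that modelled damages reproduce observed losses for the tested land-use categories.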

  17. The SAVI vulnerability assessment model

    International Nuclear Information System (INIS)

    Winblad, A.E.

    1987-01-01

    The assessment model "Systematic Analysis of Vulnerability to Intrusion" (SAVI) presented in this report is a PC-based path analysis model. It can provide estimates of protection system effectiveness (or vulnerability) against a spectrum of outsider threats including collusion with an insider adversary. It calculates one measure of system effectiveness, the probability of interruption P(I), for all potential adversary paths. SAVI can perform both theft and sabotage vulnerability analyses. For theft, the analysis is based on the assumption that adversaries should be interrupted either before they can accomplish removal of the target material from its normal location or removal from the site boundary. For sabotage, the analysis is based on the assumption that adversaries should be interrupted before completion of their sabotage task
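    The notion of a probability of interruption along an adversary path can be illustrated with a simplified timely-detection calculation: detection at a path element only counts if the delay remaining after that element exceeds the response force time. The Python sketch below is a greatly simplified, generic illustration of that idea and is not the SAVI algorithm itself; all numbers are placeholders.

```python
# Each path element: (probability of detection, delay time [s] remaining after it)
path = [(0.5, 90.0), (0.7, 60.0), (0.9, 20.0)]
response_time = 70.0  # assumed guard response time [s]

def probability_of_interruption(path, response_time):
    """P(I) = probability that the first detection occurs early enough for the
    response force to arrive before the adversary completes the path."""
    p_interrupt = 0.0
    p_undetected = 1.0
    for i, (p_det, _) in enumerate(path):
        remaining_delay = sum(delay for _, delay in path[i:])
        if remaining_delay > response_time:
            p_interrupt += p_undetected * p_det
        p_undetected *= 1 - p_det
    return p_interrupt

print(probability_of_interruption(path, response_time))  # 0.85 for these inputs
```

    A path-analysis tool of this kind evaluates such a quantity for every candidate adversary path and reports the weakest (lowest P(I)) paths as the system's vulnerabilities.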

  18. Innovation Process Planning Model in the Bpmn Standard

    Directory of Open Access Journals (Sweden)

    Jurczyk-Bunkowska Magdalena

    2013-12-01

    Full Text Available The aim of the article is to show the relations in the innovation process planning model. The relations discussed here guarantee a stable and reliable way to achieve the result, in the form of increased competitiveness, through professionally directed development of the company. The manager needs to specify the intended effect when initiating the process, and this effect has to be achieved through a system of intermediate goals. The original model proposed here shows a standard of dependence between the plans of the fragments of the innovation process which together lead to its final goal. The relations in the present article are shown using the Business Process Model and Notation standard. This enables the specification of interrelations between the decision levels at which subsequent fragments of the innovation process are planned, giving the possibility of better coordination of the process and reducing the time needed to achieve its effect. The model has been compiled on the basis of practices followed in Polish companies. It is not, however, a reflection of these practices, but rather an idealised standard of proceedings which aims at improving the effectiveness of innovation management at the operational level. The model could be the basis for the creation of systems supporting decision making, knowledge management or communication in innovation processes.

  19. Neutron electric dipole moment and extension of the standard model

    International Nuclear Information System (INIS)

    Oshimo, Noriyuki

    2001-01-01

    A nonvanishing value for the electric dipole moment (EDM) of the neutron is a prominent signature for CP violation. The EDM induced by the Kobayashi-Maskawa mechanism of the standard model (SM) has a small magnitude and its detection will be very difficult. However, since the baryon asymmetry of the universe cannot be accounted for by the SM, there should exist some other source of CP violation, which may generate a large magnitude for the EDM. One of the most promising candidates for physics beyond the SM is the supersymmetric standard model, which contains such sources of CP violation. This model suggests that the EDM has a magnitude not much smaller than the present experimental bounds. Progress in measuring the EDM provides very interesting information about extensions of the SM. (author)

  20. Distinguishing standard model extensions using monotop chirality at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Allahverdi, Rouzbeh [Department of Physics and Astronomy, University of New Mexico, Albuquerque, NM 87131 (United States); Dalchenko, Mykhailo; Dutta, Bhaskar [Department of Physics and Astronomy, Mitchell Institute for Fundamental Physics and Astronomy, Texas A&M University, College Station, TX 77843-4242 (United States); Flórez, Andrés [Departamento de Física, Universidad de los Andes, Bogotá, Carrera 1 18A-10, Bloque IP (Colombia); Gao, Yu [Department of Physics and Astronomy, Mitchell Institute for Fundamental Physics and Astronomy, Texas A&M University, College Station, TX 77843-4242 (United States); Kamon, Teruki [Department of Physics and Astronomy, Mitchell Institute for Fundamental Physics and Astronomy, Texas A&M University, College Station, TX 77843-4242 (United States); Department of Physics, Kyungpook National University, Daegu 702-701 (Korea, Republic of); Kolev, Nikolay [Department of Physics, University of Regina, SK, S4S 0A2 (Canada); Mueller, Ryan [Department of Physics and Astronomy, Mitchell Institute for Fundamental Physics and Astronomy, Texas A&M University, College Station, TX 77843-4242 (United States); Segura, Manuel [Departamento de Física, Universidad de los Andes, Bogotá, Carrera 1 18A-10, Bloque IP (Colombia)

    2016-12-13

    We present two minimal extensions of the standard model, each giving rise to baryogenesis. They include heavy color-triplet scalars interacting with a light Majorana fermion that can be the dark matter (DM) candidate. The electroweak charges of the new scalars govern their couplings to quarks of different chirality, which leads to different collider signals. These models predict monotop events at the LHC and the energy spectrum of decay products of highly polarized top quarks can be used to establish the chiral nature of the interactions involving the heavy scalars and the DM. Detailed simulation of signal and standard model background events is performed, showing that top quark chirality can be distinguished in hadronic and leptonic decays of the top quarks.

  1. Overview of the Standard Model Measurements with the ATLAS Detector

    CERN Document Server

    Liu, Yanwen; The ATLAS collaboration

    2017-01-01

    The ATLAS Collaboration is engaged in precision measurements of fundamental Standard Model parameters, such as the W boson mass, the weak mixing angle and the strong coupling constant. In addition, the production cross-sections of a large variety of final states involving highly energetic jets and photons, as well as single and multiple vector bosons, are measured multi-differentially at several centre-of-mass energies. This allows perturbative QCD calculations to be tested with the highest precision. These measurements also allow tests of models beyond the SM, e.g. those leading to anomalous gauge couplings. In this talk, we give a broad overview of the Standard Model measurement campaign of the ATLAS Collaboration, and selected topics are discussed in more detail.

  2. Yukawa couplings in Superstring derived Standard-like models

    International Nuclear Information System (INIS)

    Faraggi, A.E.

    1991-01-01

    I discuss Yukawa couplings in Standard-like models which are derived from superstrings in the free fermionic formulation. I introduce new notation for the construction of these models. I show how the choice of boundary conditions selects a trilevel Yukawa coupling either for the +2/3 charged quark or for the -1/3 charged quark, and I prove this selection rule. I make the conjecture that in this class of standard-like models a possible connection may exist between the requirements of F and D flatness at the string level and the heaviness of the top quark relative to the lighter quarks and leptons. I discuss how the choice of boundary conditions determines the nonvanishing mass terms at quartic order. I discuss the implications for the mass of the top quark. (author)

  3. Our sun. I. The standard model: Successes and failures

    International Nuclear Information System (INIS)

    Sackmann, I.J.; Boothroyd, A.I.; Fowler, W.A.

    1990-01-01

    The results of computing a number of standard solar models are reported. A presolar helium content of Y = 0.278 is obtained, and a Cl-37 capture rate of 7.7 SNUs, consistently several times the observed rate of 2.1 SNUs, is determined. Thus, the solar neutrino problem remains. The solar Z value is determined primarily by the observed Z/X ratio and is affected very little by differences in solar models. Even large changes in the low-temperature molecular opacities have no effect on Y, nor even on conditions at the base of the convective envelope. Large molecular opacities do cause a large increase in the mixing-length parameter alpha but do not cause the convective envelope to reach deeper. The temperature remains too low for lithium burning, and there is no surface lithium depletion; thus, the lithium problem of the standard solar model remains. 103 refs

  4. Irrigation in dose assessments models

    Energy Technology Data Exchange (ETDEWEB)

    Bergstroem, Ulla; Barkefors, Catarina [Studsvik RadWaste AB, Nykoeping (Sweden)

    2004-05-01

    SKB has carried out several safety analyses for repositories for radioactive waste, one of which was SR 97, a multi-site study concerned with a future deep bedrock repository for high-level waste. In case of future releases due to unforeseen failure of the protective multiple barrier system, radionuclides may be transported with groundwater and may reach the biosphere. Assessments of doses have to be carried out with a long-term perspective. Specific models are therefore employed to estimate consequences to man. It has been determined that the main pathway for nuclides from groundwater or surface water to soil is via irrigation. Irrigation may cause contamination of crops directly by e.g. interception or rain-splash, and indirectly via root-uptake from contaminated soil. The exposed people are in many safety assessments assumed to be self-sufficient, i.e. their food is produced locally where the concentration of radionuclides may be the highest. Irrigation therefore plays an important role when estimating consequences. The present study is therefore concerned with a more extensive analysis of the role of irrigation for possible future doses to people living in the area surrounding a repository. Current irrigation practices in Sweden are summarised, showing that vegetables and potatoes are the most common crops for irrigation. In general, however, irrigation is not so common in Sweden. The irrigation model used in the latest assessments is described. A sensitivity analysis is performed showing that, as expected, interception of irrigation water and retention on vegetation surfaces are important parameters. The parameters used to describe this are discussed. A summary is also given how irrigation is proposed to be handled in the international BIOMASS (BIOsphere Modelling and ASSessment) project and in models like TAME and BIOTRAC. Similarities and differences are pointed out. Some numerical results are presented showing that surface contamination in general gives the

  5. Irrigation in dose assessments models

    International Nuclear Information System (INIS)

    Bergstroem, Ulla; Barkefors, Catarina

    2004-05-01

    SKB has carried out several safety analyses for repositories for radioactive waste, one of which was SR 97, a multi-site study concerned with a future deep bedrock repository for high-level waste. In case of future releases due to unforeseen failure of the protective multiple barrier system, radionuclides may be transported with groundwater and may reach the biosphere. Assessments of doses have to be carried out with a long-term perspective. Specific models are therefore employed to estimate consequences to man. It has been determined that the main pathway for nuclides from groundwater or surface water to soil is via irrigation. Irrigation may cause contamination of crops directly by e.g. interception or rain-splash, and indirectly via root-uptake from contaminated soil. The exposed people are in many safety assessments assumed to be self-sufficient, i.e. their food is produced locally where the concentration of radionuclides may be the highest. Irrigation therefore plays an important role when estimating consequences. The present study is therefore concerned with a more extensive analysis of the role of irrigation for possible future doses to people living in the area surrounding a repository. Current irrigation practices in Sweden are summarised, showing that vegetables and potatoes are the most common crops for irrigation. In general, however, irrigation is not so common in Sweden. The irrigation model used in the latest assessments is described. A sensitivity analysis is performed showing that, as expected, interception of irrigation water and retention on vegetation surfaces are important parameters. The parameters used to describe this are discussed. A summary is also given how irrigation is proposed to be handled in the international BIOMASS (BIOsphere Modelling and ASSessment) project and in models like TAME and BIOTRAC. Similarities and differences are pointed out. Some numerical results are presented showing that surface contamination in general gives the
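    The irrigation pathway described in these assessments combines direct interception of contaminated irrigation water on the crop with root uptake from the irrigated soil. The Python sketch below is a generic, highly simplified illustration of such a calculation; the parameter names and values are assumptions for illustration only and are not taken from the SKB models.

```python
def crop_concentration(c_water, irrigation_rate, interception, yield_kg_m2,
                       c_soil, transfer_factor):
    """Very simplified activity concentration in a crop [Bq/kg fresh weight].

    c_water         : activity concentration in irrigation water [Bq/m3]
    irrigation_rate : water applied per growing season [m3/m2]
    interception    : fraction of applied activity retained on vegetation [-]
    yield_kg_m2     : crop yield [kg/m2]
    c_soil          : activity concentration in soil [Bq/kg dry soil]
    transfer_factor : soil-to-plant transfer factor [Bq/kg plant per Bq/kg soil]
    """
    direct_deposit = c_water * irrigation_rate * interception / yield_kg_m2
    root_uptake = c_soil * transfer_factor
    return direct_deposit + root_uptake

# Example with placeholder numbers
print(crop_concentration(c_water=100.0, irrigation_rate=0.15, interception=0.3,
                         yield_kg_m2=2.0, c_soil=5.0, transfer_factor=0.1))
```

    In this toy example the interception term dominates, which mirrors the sensitivity-analysis finding above that interception and retention on vegetation surfaces are important parameters.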

  6. Stable Asymptotically Free Extensions (SAFEs) of the Standard Model

    International Nuclear Information System (INIS)

    Holdom, Bob; Ren, Jing; Zhang, Chen

    2015-01-01

    We consider possible extensions of the standard model that are not only completely asymptotically free, but are such that the UV fixed point is completely UV attractive. All couplings flow towards a set of fixed ratios in the UV. Motivated by low scale unification, semi-simple gauge groups with elementary scalars in various representations are explored. The simplest model is a version of the Pati-Salam model. The Higgs boson is truly elementary but dynamical symmetry breaking from strong interactions may be needed at the unification scale. A hierarchy problem, much reduced from grand unified theories, is still in need of a solution.

  7. Quantum gravity and Standard-Model-like fermions

    International Nuclear Information System (INIS)

    Eichhorn, Astrid; Lippoldt, Stefan

    2017-01-01

    We discover that chiral symmetry does not act as an infrared attractor of the renormalization group flow under the impact of quantum gravity fluctuations. Thus, observationally viable quantum gravity models must respect chiral symmetry. In our truncation, asymptotically safe gravity does, as a chiral fixed point exists. A second non-chiral fixed point with massive fermions provides a template for models with dark matter. This fixed point disappears for more than 10 fermions, suggesting that an asymptotically safe ultraviolet completion for the standard model plus gravity enforces chiral symmetry.

  8. From the standard model to composite quarks and leptons

    International Nuclear Information System (INIS)

    Harari, H.

    1981-01-01

    An updated version of lectures delivered at the SLAC Summer Institute, 1980 is presented. Part I describes the present status of the standard model and gives a short survey of topics such as extensions of the electroweak group, grand unification, the generation puzzle and the connection between quark masses and generalized Cabibbo angles. Part II is devoted to the possibility that quarks and leptons are composite. The general theoretical difficulties are described and several published models are reviewed, including the dynamical rishon model. (H.K.)

  9. On light dilaton extensions of the Standard Model

    International Nuclear Information System (INIS)

    Megías, Eugenio; Pujolàs, Oriol; Quirós, Mariano

    2016-01-01

    We discuss the presence of a light dilaton in Conformal Field Theories deformed by a single scalar operator, in the holographic realization consisting of confining Renormalization Group flows. Then, we apply this formalism to study the extension of the Standard Model with a light dilaton in a 5D warped model. We study the spectrum of scalar and vector perturbations, compare the model predictions with Electroweak Precision Tests and find the corresponding bounds for the lightest modes. Finally, we analyze the possibility that the Higgs resonance found at the LHC is a dilaton.

  10. 49 CFR 1572.5 - Standards for security threat assessments.

    Science.gov (United States)

    2010-10-01

    ... assessment includes biometric identification and a biometric credential. (2) To apply for a comparability... process and provide biometric information to obtain a TWIC, if the applicant seeks unescorted access to a...

  11. 75 FR 66038 - Planning Resource Adequacy Assessment Reliability Standard

    Science.gov (United States)

    2010-10-27

    ..., providing a common framework for resource adequacy analysis, assessment, and documentation) effectively and...://www.nerc.com/commondocs.php?cd=2 . D. Proposed Effective Date 24. Proposed regional Reliability...

  12. Assessing cultural validity in standardized tests in stem education

    Science.gov (United States)

    Gassant, Lunes

    This quantitative ex post facto study examined how race and gender, as elements of culture, influence the development of common misconceptions among STEM students. Primary data came from a standardized test: the Digital Logic Concept Inventory (DLCI) developed by Drs. Geoffrey L. Herman, Michael C. Louis, and Craig Zilles from the University of Illinois at Urbana-Champaign. The sample consisted of a cohort of 82 STEM students recruited from three universities in Northern Louisiana. Microsoft Excel and the Statistical Package for the Social Sciences (SPSS) were used for data computation. Two key concepts, several sub concepts, and 19 misconceptions were tested through 11 items in the DLCI. Statistical analyses based on both the Classical Test Theory (Spearman, 1904) and the Item Response Theory (Lord, 1952) yielded similar results: some misconceptions in the DLCI can reliably be predicted by the Race or the Gender of the test taker. The research is significant because it has shown that some misconceptions in a STEM discipline attracted students with similar ethnic backgrounds differently; thus, leading to the existence of some cultural bias in the standardized test. Therefore the study encourages further research in cultural validity in standardized tests. With culturally valid tests, it will be possible to increase the effectiveness of targeted teaching and learning strategies for STEM students from diverse ethnic backgrounds. To some extent, this dissertation has contributed to understanding, better, the gap between high enrollment rates and low graduation rates among African American students and also among other minority students in STEM disciplines.

  13. Searches for Physics Beyond Standard Model at LHC with ATLAS

    CERN Document Server

    Soni, N; The ATLAS collaboration

    2013-01-01

    This contribution summarises some of the recent results of searches for physics beyond the Standard Model using the pp-collision data collected at the Large Hadron Collider (LHC) with the ATLAS detector at a centre-of-mass energy of sqrt{s} = 8 TeV. The search for supersymmetry (SUSY) is carried out in a large variety of production modes, such as strong production of squarks and gluinos, weak production of sleptons and gauginos, or production of massive long-lived particles through R-parity violation. No excess above the Standard Model background expectation is observed, and exclusion limits are derived on the production of new physics. The results are interpreted as lower limits on sparticle masses in SUSY breaking scenarios. Searches for new exotic phenomena such as dark matter, large extra dimensions and black holes are also performed at ATLAS. As in the case of the SUSY searches, no new exotic phenomena are observed and results are presented as upper limits on event yields from non-Standard-Model processes in a model i...

  14. Simple standard model extension by heavy charged scalar

    Science.gov (United States)

    Boos, E.; Volobuev, I.

    2018-05-01

    We consider a Standard Model (SM) extension by a heavy charged scalar gauged only under the U(1)_Y weak hypercharge gauge group. Such an extension, being gauge invariant with respect to the SM gauge group, is a simple special case of the well-known Zee model. Since the interactions of the charged scalar with the Standard Model fermions turn out to be significantly suppressed compared to the Standard Model interactions, the charged scalar provides an example of a long-lived charged particle that is interesting to search for at the LHC. We present the pair and single production cross sections of the charged scalar at different colliders and the possible decay widths for various boson masses. It is shown that the current ATLAS and CMS searches at 8 and 13 TeV collision energy lead to bounds on the scalar boson mass of about 300-320 GeV. The limits are expected to be much larger for higher collision energies and, assuming 15 ab⁻¹ of integrated luminosity, reach about 2.7 TeV at a future 27 TeV LHC, thus covering the most interesting mass region.

  15. The Assessment Of The Level Of Management Control Standards Completion In Treasury Sector

    Directory of Open Access Journals (Sweden)

    Kulińska Ewa

    2015-06-01

    Full Text Available This paper concerns the functioning of management control standards in the Treasury Control Office. Its purpose is to present the results of research conducted in the years 2013–2014 in Polish Treasury Control Offices. The results were obtained by applying the authors' model for assessing the implementation of management control. The research was conducted separately for management personnel and for the remaining office employees, and significant discrepancies between these two groups of respondents were found. Based on the results, the areas of deviation from the expected level of management control standards were identified, along with areas where control mechanisms should be implemented: increasing the supervision of the board of directors over managers, providing permanent and efficient supervision of managers over subordinate employees, making the purposes and tasks assigned to the Treasury Control Office for a given year more precise and familiarizing employees with them, carrying out training, and a series of other corrective measures.

  16. SLHAplus: A library for implementing extensions of the standard model

    Science.gov (United States)

    Bélanger, G.; Christensen, Neil D.; Pukhov, A.; Semenov, A.

    2011-03-01

    We provide a library to facilitate the implementation of new models in codes such as matrix element and event generators or codes for computing dark matter observables. The library contains an SLHA reader routine as well as diagonalisation routines. This library is available in CalcHEP and micrOMEGAs. The implementation of models based on this library is supported by LanHEP and FeynRules. Program summary: Program title: SLHAplus_1.3. Catalogue identifier: AEHX_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHX_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 6283. No. of bytes in distributed program, including test data, etc.: 52 119. Distribution format: tar.gz. Programming language: C. Computer: IBM PC, MAC. Operating system: UNIX (Linux, Darwin, Cygwin). RAM: 2000 MB. Classification: 11.1. Nature of problem: Implementation of extensions of the standard model in matrix element and event generators and codes for dark matter observables. Solution method: For generic extensions of the standard model we provide routines for reading files that adopt the standard format of the SUSY Les Houches Accord (SLHA) file. The procedure has been generalized to take into account an arbitrary number of blocks so that the reader can be used in generic models, including non-supersymmetric ones. The library also contains routines to diagonalize real and complex mass matrices with either unitary or bi-unitary transformations, as well as routines for evaluating the running strong coupling constant, running quark masses and effective quark masses. Running time: 0.001 sec
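    The central task of an SLHA reader, parsing named blocks of indexed numerical entries, can be sketched in a few lines. The Python example below is a hypothetical, minimal illustration of that file layout; it handles only purely numeric blocks and is unrelated to the C implementation provided by SLHAplus.

```python
def read_slha_blocks(path):
    """Parse an SLHA-style file into {block_name: {index_tuple: value}}.

    Minimal sketch: comments after '#' are stripped, and only blocks whose
    entries are '<indices...> <numeric value>' are handled.
    """
    blocks, current = {}, None
    with open(path) as f:
        for raw in f:
            line = raw.split("#", 1)[0].strip()   # drop comments and blanks
            if not line:
                continue
            tokens = line.split()
            if tokens[0].upper() == "BLOCK":
                current = tokens[1].upper()
                blocks[current] = {}
            elif current is not None:
                *idx, value = tokens
                blocks[current][tuple(int(i) for i in idx)] = float(value)
    return blocks

# Usage (assuming a spectrum file 'spectrum.slha' exists):
# masses = read_slha_blocks("spectrum.slha")["MASS"]
# print(masses[(25,)])   # e.g. the entry for PDG code 25
```

    A production-quality reader additionally has to cope with DECAY tables, string-valued blocks such as SPINFO, and scale-dependent blocks, which is part of what a dedicated library takes care of.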

  17. Behavior model for performance assessment

    International Nuclear Information System (INIS)

    Brown-VanHoozer, S. A.

    1999-01-01

    Every individual channels information differently, based on the sensory modality or representational system (visual, auditory, or kinesthetic) that we tend to favor most (our primary representational system, PRS). Therefore, some of us access and store our information primarily visually first, some auditorily, and others kinesthetically (through feel and touch), which in turn establishes our information processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task: the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive and motor systems stimulated and influenced by the three sensory modalities: visual, auditory and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance toward reducing human errors. Existing models give predictions based on time values or response times for a particular event, and these may be summed and averaged for a generalization of behavior(s). However, without establishing a basic understanding of how the behavior was predicated on a decision-making strategy process, predictive models are overall inefficient in their analysis of the means by which behavior was generated. What is seen is the end result.

  18. Behavior model for performance assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Brown-VanHoozer, S. A.

    1999-07-23

    Every individual channels information differently, based on the sensory modality or representational system (visual, auditory, or kinesthetic) that we tend to favor most (our primary representational system, PRS). Therefore, some of us access and store our information primarily visually first, some auditorily, and others kinesthetically (through feel and touch), which in turn establishes our information processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task: the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive and motor systems stimulated and influenced by the three sensory modalities: visual, auditory and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance toward reducing human errors. Existing models give predictions based on time values or response times for a particular event, and these may be summed and averaged for a generalization of behavior(s). However, without establishing a basic understanding of how the behavior was predicated on a decision-making strategy process, predictive models are overall inefficient in their analysis of the means by which behavior was generated. What is seen is the end result.

  19. Assessment of Usability Benchmarks: Combining Standardized Scales with Specific Questions

    Directory of Open Access Journals (Sweden)

    Stephanie Bettina Linek

    2011-12-01

    Full Text Available The usability of Web sites and online services is of rising importance. When creating a completely new Web site, qualitative data are adequate for identifying most of the usability problems. However, changes to an existing Web site should be evaluated through a quantitative benchmarking process. This paper describes the creation of a questionnaire that allows quantitative usability benchmarking, i.e. a direct comparison of the different versions of a Web site and an orientation towards general usability standards. The questionnaire is also open for qualitative data. The methodology is explained using the digital library services of the ZBW.

  20. Assessment of Offshore Wind System Design, Safety, and Operation Standards

    Energy Technology Data Exchange (ETDEWEB)

    Sirnivas, Senu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Musial, Walt [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bailey, Bruce [AWS Trupower LLC, Albany, NY (United States); Filippelli, Matthew [AWS Trupower LLC, Albany, NY (United States)

    2014-01-01

    This report is a deliverable for a project sponsored by the U.S. Department of Energy (DOE) entitled National Offshore Wind Energy Resource and Design Data Campaign -- Analysis and Collaboration (contract number DE-EE0005372; prime contractor -- AWS Truepower). The project objective is to supplement, facilitate, and enhance ongoing multiagency efforts to develop an integrated national offshore wind energy data network. The results of this initiative are intended to 1) produce a comprehensive definition of relevant met-ocean resource assets and needs and design standards, and 2) provide a basis for recommendations for meeting offshore wind energy industry data and design certification requirements.

  1. Upgrade of internal events PSA model using the AESJ level-1 PSA standard for operating state

    International Nuclear Information System (INIS)

    Sato, Teruyoshi; Yoneyama, Mitsuru; Hirokawa, Naoki; Sato, Chikahiro; Sato, Eisuke; Tomizawa, Shigeatsu

    2009-01-01

    In 2003, the Atomic Energy Society of Japan (AESJ) started to develop the Level-1 Probabilistic Safety Assessment (PSA) standard for internal events during the operating state (AESJ standard). The AESJ standard has been completed and released for public comment. Using the AESJ standard (draft version), the authors have upgraded the PSA model for a Tokyo Electric Power Company (TEPCO) BWR-5 plant, not only to reflect the latest knowledge but also to ensure a high-quality PSA model (not yet peer-reviewed), for the purpose of better operation and maintenance management of TEPCO BWR plants. For example, the categorization of structures, systems and components (SSCs) will be performed to improve nuclear reactor safety using risk-importance information. (author)

  2. Standards for Documenting Finite‐Fault Earthquake Rupture Models

    KAUST Repository

    Mai, Paul Martin

    2016-04-06

    In this article, we propose standards for documenting and disseminating finite‐fault earthquake rupture models, and related data and metadata. A comprehensive documentation of the rupture models, a detailed description of the data processing steps, and facilitating the access to the actual data that went into the earthquake source inversion are required to promote follow‐up research and to ensure interoperability, transparency, and reproducibility of the published slip‐inversion solutions. We suggest a formatting scheme that describes the kinematic rupture process in an unambiguous way to support subsequent research. We also provide guidelines on how to document the data, metadata, and data processing. The proposed standards and formats represent a first step to establishing best practices for comprehensively documenting input and output of finite‐fault earthquake source studies.

  3. Precision Electroweak Measurements and Constraints on the Standard Model

    CERN Document Server

    ,

    2010-01-01

    This note presents constraints on Standard Model parameters using published and preliminary precision electroweak results measured at the electron-positron colliders LEP and SLC. The results are compared with precise electroweak measurements from other experiments, notably CDF and DØ at the Tevatron. Constraints on the input parameters of the Standard Model are derived from the combined set of results obtained in high-$Q^2$ interactions, and used to predict results in low-$Q^2$ experiments, such as atomic parity violation, Møller scattering, and neutrino-nucleon scattering. The main changes with respect to the experimental results presented in 2009 are new combinations of results on the width of the W boson and the mass of the top quark.

  4. Lorentz-violating theories in the standard model extension

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira Junior, Manoel Messias [Universidade Federal do Maranhao (UFMA), Sao Luis, MA (Brazil)

    2012-07-01

    Lorentz-violating theories have been an issue of permanent interest in recent years. Many of these investigations are developed under the theoretical framework of the Standard Model Extension (SME), a broad extension of the minimal Standard Model embracing Lorentz-violating (LV) terms, generated as vacuum expectation values of tensor quantities, in all sectors of interaction. In this talk, we comment on some general properties of the SME, concerning mainly the gauge and fermion sectors, focusing on new phenomena induced by Lorentz violation. The LV terms are usually separated in accordance with their behavior under discrete symmetries, being classified as CPT-odd or CPT-even, parity-even or parity-odd. We follow this classification scheme, discussing some features and new properties of the CPT-even and CPT-odd parts of the gauge and fermion sectors. We conclude by presenting some upper bounds imposed on the corresponding LV coefficients. (author)

  5. Search for the Standard Model Higgs Boson at LEP

    CERN Document Server

    Barate, R.; De Bonis, I.; Decamp, D.; Goy, C.; Jezequel, S.; Lees, J.P.; Martin, F.; Merle, E.; Minard, M.N.; Pietrzyk, B.; Trocme, B.; Boix, G.; Bravo, S.; Casado, M.P.; Chmeissani, M.; Crespo, J.M.; Fernandez, E.; Fernandez-Bosman, M.; Garrido, L.; Grauges, E.; Lopez, J.; Martinez, M.; Merino, G.; Miquel, R.; Mir, L.M.; Pacheco, A.; Paneque, D.; Ruiz, H.; Heister, A.; Schael, S.; Colaleo, A.; Creanza, D.; De Filippis, N.; de Palma, M.; Iaselli, G.; Maggi, G.; Maggi, M.; Nuzzo, S.; Ranieri, A.; Raso, G.; Ruggieri, F.; Selvaggi, G.; Silvestris, L.; Tempesta, P.; Tricomi, A.; Zito, G.; Huang, X.; Lin, J.; Quyang, Q.; Wang, T.; Xie, Y.; Xu, R.; Xue, S.; Zhang, J.; Zhang, L.; Zhao, W.; Abbaneo, D.; Azzurri, P.; Barklow, T.; Buchmuller, O.; Cattaneo, M.; Cerutti, F.; Clerbaux, B.; Drevermann, H.; Forty, R.W.; Frank, M.; Gianotti, F.; Greening, T.C.; Hansen, J.B.; Harvey, J.; Hutchcroft, D.E.; Janot, P.; Jost, B.; Kado, M.; Maley, P.; Mato, P.; Moutoussi, A.; Ranjard, F.; Rolandi, Gigi; Schlatter, D.; Sguazzoni, G.; Tejessy, W.; Teubert, F.; Valassi, A.; Videau, I.; Ward, J.J.; Badaud, F.; Dessagne, S.; Falvard, A.; Fayolle, D.; Gay, P.; Jousset, J.; Michel, B.; Monteil, S.; Pallin, D.; Pascolo, J.M.; Perret, P.; Hansen, J.D.; Hansen, J.R.; Hansen, P.H.; Nilsson, B.S.; Waananen, A.; Kyriakis, A.; Markou, C.; Simopoulou, E.; Vayaki, A.; Zachariadou, K.; Blondel, A.; Brient, J.C.; Machefert, F.; Rouge, A.; Swynghedauw, M.; Tanaka, R.; Videau, H.; Ciulli, V.; Focardi, E.; Parrini, G.; Antonelli, A.; Antonelli, M.; Bencivenni, G.; Bologna, G.; Bossi, F.; Campana, P.; Capon, G.; Chiarella, V.; Laurelli, P.; Mannocchi, G.; Murtas, F.; Murtas, G.P.; Passalacqua, L.; Pepe-Altarelli, M.; Spagnolo, P.; Kennedy, J.; Lynch, J.G.; Negus, P.; O'Shea, V.; Smith, D.; Thompson, A.S.; Wasserbaech, S.; Cavanaugh, R.; Dhamotharan, S.; Geweniger, C.; Hanke, P.; Hepp, V.; Kluge, E.E.; Leibenguth, G.; Putzer, A.; Stenzel, H.; Tittel, K.; Werner, S.; Wunsch, M.; Beuselinck, R.; Binnie, D.M.; Cameron, W.; Davies, G.; Dornan, P.J.; Girone, M.; Hill, R.D.; Marinelli, N.; Nowell, J.; Przysiezniak, H.; Rutherford, S.A.; Sedgbeer, J.K.; Thompson, J.C.; White, R.; Ghete, V.M.; Girtler, P.; Kneringer, E.; Kuhn, D.; Rudolph, G.; Bouhova-Thacker, E.; Bowdery, C.K.; Clarke, D.P.; Ellis, G.; Finch, A.J.; Foster, F.; Hughes, G.; Jones, R.W.L.; Pearson, M.R.; Robertson, N.A.; Smizanska, M.; Lemaitre, V.; Blumenschein, U.; Holldorfer, F.; Jakobs, K.; Kayser, F.; Kleinknecht, K.; Muller, A.S.; Quast, G.; Renk, B.; Sander, H.G.; Schmeling, S.; Wachsmuth, H.; Zeitnitz, C.; Ziegler, T.; Bonissent, A.; Carr, J.; Coyle, P.; Curtil, C.; Ealet, A.; Fouchez, D.; Leroy, O.; Kachelhoffer, T.; Payre, P.; Rousseau, D.; Tilquin, A.; Ragusa, F.; David, A.; Dietl, H.; Ganis, G.; Huttmann, K.; Lutjens, G.; Mannert, C.; Manner, W.; Moser, H.G.; Settles, R.; Wolf, G.; Boucrot, J.; Callot, O.; Davier, M.; Duflot, L.; Grivaz, J.F.; Heusse, P.; Jacholkowska, A.; Loomis, C.; Serin, L.; Veillet, J.J.; de Vivie de Regie, J.B.; Yuan, C.; Bagliesi, Giuseppe; Boccali, T.; Foa, L.; Giammanco, A.; Giassi, A.; Ligabue, F.; Messineo, A.; Palla, F.; Sanguinetti, G.; Sciaba, A.; Tenchini, R.; Venturi, A.; Verdini, P.G.; Awunor, O.; Blair, G.A.; Coles, J.; Cowan, G.; Garcia-Bellido, A.; Green, M.G.; Jones, L.T.; Medcalf, T.; Misiejuk, A.; Strong, J.A.; Teixeira-Dias, P.; Clifft, R.W.; Edgecock, T.R.; Norton, P.R.; Tomalin, I.R.; Bloch-Devaux, Brigitte; Boumediene, D.; Colas, P.; Fabbro, B.; Lancon, E.; Lemaire, M.C.; Locci, E.; Perez, P.; Rander, J.; Renardy, 
J.F.; Rosowsky, A.; Seager, P.; Trabelsi, A.; Tuchming, B.; Vallage, B.; Konstantinidis, N.; Litke, A.M.; Taylor, G.; Booth, C.N.; Cartwright, S.; Combley, F.; Hodgson, P.N.; Lehto, M.; Thompson, L.F.; Affholderbach, K.; Boehrer, Armin; Brandt, S.; Grupen, C.; Hess, J.; Ngac, A.; Prange, G.; Sieler, U.; Borean, C.; Giannini, G.; He, H.; Putz, J.; Rothberg, J.; Armstrong, S.R.; Berkelman, Karl; Cranmer, K.; Ferguson, D.P.S.; Gao, Y.; Gonzalez, S.; Hayes, O.J.; Hu, H.; Jin, S.; Kile, J.; McNamara, P.A., III; Nielsen, J.; Pan, Y.B.; von Wimmersperg-Toeller, J.H.; Wiedenmann, W.; Wu, J.; Wu, S.L.; Wu, X.; Zobernig, G.; Dissertori, G.; Abdallah, J.; Abreu, P.; Adam, W.; Adzic, P.; Albrecht, T.; Alderweireld, T.; Alemany-Fernandez, R.; Allmendinger, T.; Allport, P.P.; Amaldi, U.; Amapane, N.; Amato, S.; Anashkin, E.; Andreazza, A.; Andringa, S.; Anjos, N.; Antilogus, P.; Apel, W.D.; Arnoud, Y.; Ask, S.; Asman, B.; Augustin, J.E.; Augustinus, A.; Baillon, P.; Ballestrero, A.; Bambade, P.; Barbier, R.; Bardin, D.; Barker, G.J.; Baroncelli, A.; Battaglia, M.; Baubillier, M.; Becks, K.H.; Begalli, M.; Behrmann, A.; Ben-Haim, E.; Benekos, N.; Benvenuti, A.; Berat, C.; Berggren, M.; Berntzon, L.; Bertrand, D.; Besancon, M.; Besson, N.; Bloch, D.; Blom, M.; Bluj, M.; Bonesini, M.; Boonekamp, M.; Booth, P.S.L.; Borisov, G.; Botner, O.; Bouquet, B.; Bowcock, T.J.V.; Boyko, I.; Bracko, M.; Brenner, R.; Brodet, E.; Bruckman, P.; Brunet, J.M.; Bugge, L.; Buschmann, P.; Calvi, M.; Camporesi, T.; Canale, V.; Carena, F.; Castro, Nuno Filipe; Cavallo, F.; Chapkin, M.; Charpentier, P.; Checchia, P.; Chierici, R.; Chliapnikov, P.; Chudoba, J.; Chung, S.U.; Cieslik, K.; Collins, P.; Contri, R.; Cosme, G.; Cossutti, F.; Costa, M.J.; Crawley, B.; Crennell, D.; Cuevas, J.; DHondt, J.; Dalmau, J.; da Silva, T.; Da Silva, W.; Della Ricca, G.; De Angelis, A.; De Boer, W.; De Clercq, C.; De Lotto, B.; De Maria, N.; De Min, A.; de Paula, L.; Di Ciaccio, L.; Di Simone, A.; Doroba, K.; Drees, J.; Dris, M.; Eigen, G.; Ekelof, T.; Ellert, M.; Elsing, M.; Espirito Santo, M.C.; Fanourakis, G.; Fassouliotis, D.; Feindt, M.; Fernandez, J.; Ferrer, A.; Ferro, F.; Flagmeyer, U.; Foeth, H.; Fokitis, E.; Fulda-Quenzer, F.; Fuster, J.; Gandelman, M.; Garcia, C.; Gavillet, P.; Gazis, Evangelos; Gokieli, R.; Golob, B.; Gomez-Ceballos, G.; Goncalves, P.; Graziani, E.; Grosdidier, G.; Grzelak, K.; Guy, J.; Haag, C.; Hallgren, A.; Hamacher, K.; Hamilton, K.; Hansen, J.; Haug, S.; Hauler, F.; Hedberg, V.; Hennecke, M.; Herr, H.; Hoffman, J.; Holmgren, S.O.; Holt, P.J.; Houlden, M.A.; Hultqvist, K.; Jackson, John Neil; Jarlskog, G.; Jarry, P.; Jeans, D.; Johansson, Erik Karl; Johansson, P.D.; Jonsson, P.; Joram, C.; Jungermann, L.; Kapusta, Frederic; Katsanevas, S.; Katsoufis, E.; Kernel, G.; Kersevan, B.P.; Kiiskinen, A.; King, B.T.; Kjaer, N.J.; Kluit, P.; Kokkinias, P.; Kourkoumelis, C.; Kouznetsov, O.; Krumstein, Z.; Kucharczyk, M.; Lamsa, J.; Leder, G.; Ledroit, Fabienne; Leinonen, L.; Leitner, R.; Lemonne, J.; Lepeltier, V.; Lesiak, T.; Liebig, W.; Liko, D.; Lipniacka, A.; Lopes, J.H.; Lopez, J.M.; Loukas, D.; Lutz, P.; Lyons, L.; MacNaughton, J.; Malek, A.; Maltezos, S.; Mandl, F.; Marco, J.; Marco, R.; Marechal, B.; Margoni, M.; Marin, J.C.; Mariotti, C.; Markou, A.; Martinez-Rivero, C.; Masik, J.; Mastroyiannopoulos, N.; Matorras, F.; Matteuzzi, C.; Mazzucato, F.; Mazzucato, M.; McNulty, R.; Meroni, C.; Meyer, W.T.; Migliore, E.; Mitaroff, W.; Mjoernmark, U.; Moa, T.; Moch, M.; Monig, Klaus; Monge, R.; Montenegro, J.; Moraes, D.; 
Moreno, S.; Morettini, P.; Mueller, U.; Muenich, K.; Mulders, M.; Mundim, L.; Murray, W.; Muryn, B.; Myatt, G.; Myklebust, T.; Nassiakou, M.; Navarria, F.; Nawrocki, K.; Nicolaidou, R.; Nikolenko, M.; Oblakowska-Mucha, A.; Obraztsov, V.; Olshevski, A.; Onofre, A.; Orava, R.; Osterberg, K.; Ouraou, A.; Oyanguren, A.; Paganoni, M.; Paiano, S.; Palacios, J.P.; Palka, H.; Papadopoulou, T.D.; Pape, L.; Parkes, C.; Parodi, F.; Parzefall, U.; Passeri, A.; Passon, O.; Peralta, L.; Perepelitsa, V.; Perrotta, A.; Petrolini, A.; Piedra, J.; Pieri, L.; Pierre, F.; Pimenta, M.; Piotto, E.; Podobnik, T.; Poireau, V.; Pol, M.E.; Polok, G.; Poropat, P.; Pozdniakov, V.; Pukhaeva, N.; Pullia, A.; Rames, J.; Ramler, L.; Read, Alexander L.; Rebecchi, P.; Rehn, J.; Reid, D.; Reinhardt, R.; Renton, P.; Richard, F.; Ridky, J.; Rivero, M.; Rodriguez, D.; Romero, A.; Ronchese, P.; Rosenberg, E.; Roudeau, P.; Rovelli, T.; Ruhlmann-Kleider, V.; Ryabtchikov, D.; Sadovsky, A.; Salmi, L.; Salt, J.; Savoy-Navarro, A.; Schwickerath, U.; Segar, A.; Sekulin, R.; Siebel, M.; Sisakian, A.; Smadja, G.; Smirnova, O.; Sokolov, A.; Sopczak, A.; Sosnowski, R.; Spassov, T.; Stanitzki, M.; Stocchi, A.; Strauss, J.; Stugu, B.; Szczekowski, M.; Szeptycka, M.; Szumlak, T.; Tabarelli, T.; Taffard, A.C.; Tegenfeldt, F.; Timmermans, Jan; Tkatchev, L.; Tobin, M.; Todorovova, S.; Tome, B.; Tonazzo, A.; Tortosa, P.; Travnicek, P.; Treille, D.; Tristram, G.; Trochimczuk, M.; Troncon, C.; Turluer, M.L.; Tyapkin, I.A.; Tyapkin, P.; Tzamarias, S.; Uvarov, V.; Valenti, G.; Van Dam, Piet; Van Eldik, J.; Van Lysebetten, A.; Van Remortel, N.; Van Vulpen, I.; Vegni, G.; Veloso, F.; Venus, W.; Verbeure, F.; Verdier, P.; Verzi, V.; Vilanova, D.; Vitale, L.; Vrba, V.; Wahlen, H.; Washbrook, A.J.; Weiser, C.; Wicke, D.; Wickens, J.; Wilkinson, G.; Winter, M.; Witek, M.; Yushchenko, O.; Zalewska, A.; Zalewski, P.; Zavrtanik, D.; Zhuravlov, V.; Zimine, N.I.; Zintchenko, A.; Zupan, M.; Achard, P.; Adriani, O.; Aguilar-Benitez, M.; Alcaraz, J.; Alemanni, G.; Allaby, J.; Aloisio, A.; Alviggi, M.G.; Anderhub, H.; Andreev, Valery P.; Anselmo, F.; Arefiev, A.; Azemoon, T.; Aziz, T.; Baarmand, M.; Bagnaia, P.; Bajox, A.; Baksay, G.; Baksay, L.; Baldew, S.V.; Banerjee, S.; Barczyk, A.; Barillere, R.; Bartalini, P.; Basile, M.; Batalova, N.; Battiston, R.; Bay, A.; Becattini, F.; Becker, U.; Behner, F.; Bellucci, L.; Berbeco, R.; Berdugo, J.; Berges, P.; Bertucci, B.; Betev, B.L.; Biasini, M.; Biglietti, M.; Biland, A.; Blaising, J.J.; Blyth, S.C.; Bobbink, G.J.; Bohm, A.; Boldizsar, L.; Borgia, B.; Bourilkov, D.; Bourquin, M.; Braccini, S.; Branson, J.G.; Brochu, F.; Buijs, A.; Burger, J.D.; Burger, W.J.; Cai, X.D.; Capell, M.; Cara Romeo, G.; Carlino, G.; Cartacci, A.; Casau, J.; Cavallari, F.; Cavallo, N.; Cecchi, C.; Cerrada, M.; Chamizo, M.; Chang, Y.H.; Chemarin, M.; Chen, A.; Chen, G.; Chen, G.M.; Chen, H.F.; Chen, H.S.; Chiefari, G.; Cifarelli, L.; Cindolo, F.; Clare, I.; Clare, R.; Coignet, G.; Colino, N.; Costantini, S.; de la Cruz, B.; Cucciarelli, S.; Dai, T.S.; van Dalen, J.A.; de Asmundis, R.; Deglont, P.; Debreczeni, J.; Degre, A.; Deiters, K.; della Volpe, D.; Delmeire, E.; Denes, P.; De Notaristefani, F.; De Salvo, A.; Diemoz, M.; Dierckxsens, M.; van Dierendonck, D.; Dionisi, C.; Dittmar, M.; Doria, A.; Dova, M.T.; Duchesneau, D.; Duinker, P.; Echenard, B.; Eline, A.; El Mamouni, H.; Engler, A.; Eppling, F.J.; Ewers, A.; Extermann, P.; Falagan, M.A.; Falciano, S.; Favara, A.; Fay, J.; Fedin, O.; Felcini, M.; Ferguson, T.; Fesefeldt, H.; 
Fiandrini, E.; Field, J.H.; Filthaut, F.; Fisher, P.H.; Fisher, W.; Fisk, I.; Forconi, G.; Freudenreich, K.; Furetta, C.; Galaktionov, Iouri; Ganguli, S.N.; Garcia-Abia, Pablo; Gataullin, M.; Gentile, S.; Giagu, S.; Gong, Z.F.; Grenier, Gerald Jean; Grimm, O.; Gruenewald, M.W.; Guida, M.; van Gulik, R.; Gupta, V.K.; Gurtu, A.; Gutay, L.J.; Haas, D.; Hatzifotiadou, D.; Hebbeker, T.; Herve, Alain; Hirschfelder, J.; Hofer, H.; Holzner, G.; Hou, S.R.; Hu, Y.; Jin, B.N.; Jones, Lawrence W.; de Jong, P.; Josa-Mutuberria, I.; Kafer, D.; Kaur, M.; Kienzle-Focacci, M.N.; Kim, J.K.; Kirkby, Jasper; Kittel, W.; Klimentov, A.; Konig, A.C.; Kopal, M.; Koutsenko, V.; Kraber, M.; Kraemer, R.W.; Krenz, W.; Kruger, A.; Kunin, A.; Ladron de Guevara, P.; Laktineh, I.; Landi, G.; Lebeau, M.; Lebedev, A.; Lebrun, P.; Lecomte, P.; Lecoq, P.; Le Coultre, P.; Lee, H.J.; Le Goff, J.M.; Leiste, R.; Levtchenko, P.; Li, C.; Likhoded, S.; Lin, C.H.; Lin, W.T.; Linde, F.L.; Lista, L.; Liu, Z.A.; Lohmann, W.; Longo, E.; Lu, Y.S.; Lubelsmeyer, K.; Luci, C.; Luckey, David; Luminari, L.; Lustermann, W.; Ma, W.G.; Malgeri, L.; Malinin, A.; Mana, C.; Mangeol, D.; Mans, J.; Martin, J.P.; Marzano, F.; Mazumdar, K.; McNeil, R.R.; Mele, S.; Merola, L.; Meschini, M.; Metzger, W.J.; Mihul, A.; Milcent, H.; Mirabelli, G.; Mnich, J.; Mohanty, G.B.; Muanza, G.S.; Muijs, A.J.M.; Musicar, B.; Musy, M.; Nagy, S.; Napolitano, M.; Nessi-Tedaldi, F.; Newman, H.; Niessen, T.; Nisati, A.; Kluge, Hannelies; Ofierzynski, R.; Organtini, G.; Palomares, C.; Pandoulas, D.; Paolucci, P.; Paramatti, R.; Passaleva, G.; Patricelli, S.; Paul, Thomas Cantzon; Pauluzzi, M.; Paus, C.; Pauss, F.; Pedace, M.; Pensotti, S.; Perret-Gallix, D.; Petersen, B.; Piccolo, D.; Pierella, F.; Piroue, P.A.; Pistolesi, E.; Plyaskin, V.; Pohl, M.; Pojidaev, V.; Postema, H.; Pothier, J.; Prokofiev, D.O.; Prokofiev, D.; Quartieri, J.; Rahal-Callot, G.; Rahaman, Mohammad Azizur; Raics, P.; Raja, N.; Ramelli, R.; Rancoita, P.G.; Ranieri, R.; Raspereza, A.; Razis, P.; Ren, D.; Rescigno, M.; Reucroft, S.; Riemann, S.; Riles, Keith; Roe, B.P.; Romero, L.; Rosca, A.; Rosier-Lee, S.; Roth, Stefan; Rosenbleck, C.; Roux, B.; Rubio, J.A.; Ruggiero, G.; Rykaczewski, H.; Sakharov, A.; Saremi, S.; Sarkar, S.; Salicio, J.; Sanchez, E.; Sanders, M.P.; Schafer, C.; Schegelsky, V.; Schmidt-Kaerst, S.; Schmitz, D.; Schopper, H.; Schotanus, D.J.; Schwering, G.; Sciacca, C.; Servoli, L.; Shevchenko, S.; Shivarov, N.; Shoutko, V.; Shumilov, E.; Shvorob, A.; Siedenburg, T.; Son, D.; Spillantini, P.; Steuer, M.; Stickland, D.P.; Stoyanov, B.; Straessner, A.; Sudhakar, K.; Sultanov, G.; Sun, L.Z.; Sushkov, S.; Suter, H.; Swain, J.D.; Szillasi, Z.; Tang, X.W.; Tarjan, P.; Tauscher, L.; Taylor, L.; Tellili, B.; Teyssier, D.; Timmermans, Charles; Ting, Samuel C.C.; Ting, S.M.; Tonwar, S.C.; Toth, J.; Tully, C.; Tung, K.L.; Uchida, Y.; Ulbricht, J.; Valente, E.; Van de Walle, R.T.; Veszpremi, V.; Vesztergombi, G.; Vetlitsky, I.; Vicinanza, D.; Viertel, G.; Villa, S.; Vivargent, M.; Vlachos, S.; Vodopianov, I.; Vogel, H.; Vogt, H.; Vorobiev, I.; Vorobyov, A.A.; Wadhwa, M.; Wallraff, W.; Wang, M.; Wang, X.L.; Wang, Z.M.; Weber, M.; Wienemann, P.; Wilkens, H.; Wu, S.X.; Wynhoff, S.; Xia, L.; Xu, Z.Z.; Yamamoto, J.; Yang, B.Z.; Yang, C.G.; Yang, H.J.; Yang, M.; Yeh, S.C.; Zalite, A.; Zalite, Yu.; Zhang, Z.P.; Zhao, J.; Zhu, G.Y.; Zhu, R.Y.; Zhuang, H.L.; Zichichi, A.; Zilizi, G.; Zimmermann, B.; Zoller, M.; Abbiendi, G.; Ainsley, C.; Akesson, P.F.; Alexander, G.; Allison, John; Amaral, P.; Anagnostou, 
G.; Anderson, K.J.; Arcelli, S.; Asai, S.; Axen, D.; Azuelos, G.; Bailey, I.; Barberio, E.; Barlow, R.J.; Batley, R.J.; Bechtle, P.; Behnke, T.; Bell, Kenneth Watson; Bell, P.J.; Bella, G.; Bellerive, A.; Benelli, G.; Bethke, S.; Biebel, O.; Bloodworth, I.J.; Boeriu, O.; Bock, P.; Bonacorsi, D.; Boutemeur, M.; Braibant, S.; Brigliadori, L.; Brown, Robert M.; Buesser, K.; Burckhart, H.J.; Campana, S.; Carnegie, R.K.; Caron, B.; Carter, A.A.; Carter, J.R.; Chang, C.Y.; Charlton, David G.; Csilling, A.; Cuffiani, M.; Dado, S.; Dallavalle, G.Marco; Dallison, S.; De Roeck, A.; De Wolf, E.A.; Desch, K.; Dienes, B.; Donkers, M.; Dubbert, J.; Duchovni, E.; Duckeck, G.; Duerdoth, I.P.; Elfgren, E.; Etzion, E.; Fabbri, F.; Feld, L.; Ferrari, P.; Fiedler, F.; Fleck, I.; Ford, M.; Frey, A.; Furtjes, A.; Gagnon, P.; Gary, John William; Gaycken, G.; Geich-Gimbel, C.; Giacomelli, G.; Giacomelli, P.; Giunta, Marina; Goldberg, J.; Gross, E.; Grunhaus, J.; Gruwe, M.; Gunther, P.O.; Gupta, A.; Hajdu, C.; Hamann, M.; Hanson, G.G.; Harder, K.; Harel, A.; Harin-Dirac, M.; Hauschild, M.; Hauschildt, J.; Hawkes, C.M.; Hawkings, R.; Hemingway, R.J.; Hensel, C.; Herten, G.; Heuer, R.D.; Hill, J.C.; Hoffman, Kara Dion; Homer, R.J.; Horvath, D.; Howard, R.; Huntemeyer, P.; Igo-Kemenes, P.; Ishii, K.; Jeremie, H.; Jovanovic, P.; Junk, T.R.; Kanaya, N.; Kanzaki, J.; Karapetian, G.; Karlen, D.; Kartvelishvili, V.; Kawagoe, K.; Kawamoto, T.; Keeler, R.K.; Kellogg, R.G.; Kennedy, B.W.; Kim, D.H.; Klein, K.; Klier, A.; Kluth, S.; Kobayashi, T.; Kobel, M.; Komamiya, S.; Kormos, Laura L.; Kowalewski, Robert V.; Kramer, T.; Kress, T.; Krieger, P.; von Krogh, J.; Krop, D.; Kruger, K.; Kupper, M.; Lafferty, G.D.; Landsman, H.; Lanske, D.; Layter, J.G.; Leins, A.; Lellouch, D.; Letts, J.; Levinson, L.; Lillich, J.; Lloyd, S.L.; Loebinger, F.K.; Lu, J.; Ludwig, J.; Macpherson, A.; Mader, W.; Marcellini, S.; Marchant, T.E.; Martin, A.J.; Masetti, G.; Mashimo, T.; Mattig, Peter; McDonald, W.J.; McKenna, J.; McMahon, T.J.; McPherson, R.A.; Meijers, F.; Mendez-Lorenzo, P.; Menges, W.; Merritt, F.S.; Mes, H.; Michelini, A.; Mihara, S.; Mikenberg, G.; Miller, D.J.; Moed, S.; Mohr, W.; Mori, T.; Mutter, A.; Nagai, K.; Nakamura, I.; Neal, H.A.; Nisius, R.; ONeale, S.W.; Oh, A.; Okpara, A.; Oreglia, M.J.; Orito, S.; Pahl, C.; Pasztor, G.; Pater, J.R.; Patrick, G.N.; Pilcher, J.E.; Pinfold, J.; Plane, David E.; Poli, B.; Polok, J.; Pooth, O.; Przybycien, M.; Quadt, A.; Rabbertz, K.; Rembser, C.; Renkel, P.; Rick, H.; Roney, J.M.; Rosati, S.; Rozen, Y.; Runge, K.; Sachs, K.; Saeki, T.; Sahr, O.; Sarkisyan, E.K.G.; Schaile, A.D.; Schaile, O.; Scharff-Hansen, P.; Schieck, J.; Schoerner-Sadenius, Thomas; Schroder, Matthias; Schumacher, M.; Schwick, C.; Scott, W.G.; Seuster, R.; Shears, T.G.; Shen, B.C.; Shepherd-Themistocleous, C.H.; Sherwood, P.; Siroli, G.; Skuja, A.; Smith, A.M.; Sobie, R.; Soldner-Rembold, S.; Spagnolo, S.; Spano, F.; Stahl, A.; Stephens, K.; Strom, David M.; Strohmer, R.; Tarem, S.; Tasevsky, M.; Taylor, R.J.; Teuscher, R.; Thomson, M.A.; Torrence, E.; Toya, D.; Tran, P.; Trefzger, T.; Tricoli, A.; Trigger, I.; Trocsanyi, Z.; Tsur, E.; Turner-Watson, M.F.; Ueda, I.; Ujvari, B.; Vachon, B.; Vollmer, C.F.; Vannerem, P.; Verzocchi, M.; Voss, H.; Vossebeld, J.; Waller, D.; Ward, C.P.; Ward, D.R.; Watkins, P.M.; Watson, A.T.; Watson, N.K.; Wells, P.S.; Wengler, T.; Wermes, N.; Wetterling, D.; Wilson, G.W.; Wilson, J.A.; Wyatt, T.R.; Yamashita, S.; Zer-Zion, D.; Zivkovic, Lidija; Heinemeyer, S.; Weiglein, G.

    2003-01-01

    The four LEP collaborations, ALEPH, DELPHI, L3 and OPAL, have collected a total of 2461 pb⁻¹ of e+e- collision data at centre-of-mass energies between 189 and 209 GeV. The data are used to search for the Standard Model Higgs boson. The search results of the four collaborations are combined and examined in a likelihood test for their consistency with two hypotheses: the background hypothesis and the signal plus background hypothesis. The corresponding confidences have been computed as functions of the hypothetical Higgs boson mass. A lower bound of 114.4 GeV/c² is established, at the 95% confidence level, on the mass of the Standard Model Higgs boson. The LEP data are also used to set upper bounds on the HZZ coupling for various assumptions concerning the decay of the Higgs boson.
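
    The statistical machinery mentioned above (a likelihood-ratio test of the background and signal-plus-background hypotheses, turned into confidences as a function of the Higgs mass) can be illustrated with a deliberately simplified sketch. The single Poisson counting channel, the event numbers and the toy-Monte-Carlo approach below are illustrative assumptions, not the actual LEP combination, which uses many channels, mass-dependent discriminants and systematic uncertainties.

      # Toy CLs computation for one Poisson counting channel (illustrative only).
      import numpy as np
      from scipy.stats import poisson

      rng = np.random.default_rng(0)

      def minus_two_ln_q(n, s, b):
          """-2 ln Q with Q = L(n | s+b) / L(n | b); larger values are more background-like."""
          return -2.0 * (poisson.logpmf(n, s + b) - poisson.logpmf(n, b))

      def cls(n_obs, s, b, n_toys=100_000):
          q_obs = minus_two_ln_q(n_obs, s, b)
          q_sb = minus_two_ln_q(rng.poisson(s + b, n_toys), s, b)  # toys under signal+background
          q_b = minus_two_ln_q(rng.poisson(b, n_toys), s, b)       # toys under background only
          cl_sb = np.mean(q_sb >= q_obs)  # confidence in the signal+background hypothesis
          cl_b = np.mean(q_b >= q_obs)    # confidence in the background hypothesis
          return cl_sb / cl_b             # a mass hypothesis is excluded at 95% CL if CLs < 0.05

      b_expected, n_observed = 10.0, 9    # hypothetical numbers
      for s_expected in (2.0, 5.0, 10.0):
          print(f"s = {s_expected:4.1f}  ->  CLs = {cls(n_observed, s_expected, b_expected):.3f}")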

  6. Noncommutative geometry and its application to the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Martinetti, Pierre [Georg-August Universitaet, Goettingen (Germany)

    2009-07-01

    We give an overview of the description of the standard model of particle physics minimally coupled to gravity within the framework of noncommutative geometry. In particular, we study in detail the metric structure of spacetime that emerges from the spectral triple recently proposed by Chamseddine, Connes and Marcolli. Within this framework, points of spacetime acquire an internal structure inherited from the gauge group of the standard model. A distance is defined on this generalized spacetime which is fully encoded by the Yang-Mills gauge fields together with the Higgs field. We focus on some explicit examples, underlining the link between this distance and other distances well known to physicists and mathematicians, such as the Carnot-Caratheodory horizontal distance or the Monge-Kantorovich transport distance.

  7. Standards for Documenting Finite‐Fault Earthquake Rupture Models

    KAUST Repository

    Mai, Paul Martin; Shearer, Peter; Ampuero, Jean‐Paul; Lay, Thorne

    2016-01-01

    In this article, we propose standards for documenting and disseminating finite‐fault earthquake rupture models, and related data and metadata. A comprehensive documentation of the rupture models, a detailed description of the data processing steps, and facilitating the access to the actual data that went into the earthquake source inversion are required to promote follow‐up research and to ensure interoperability, transparency, and reproducibility of the published slip‐inversion solutions. We suggest a formatting scheme that describes the kinematic rupture process in an unambiguous way to support subsequent research. We also provide guidelines on how to document the data, metadata, and data processing. The proposed standards and formats represent a first step to establishing best practices for comprehensively documenting input and output of finite‐fault earthquake source studies.

  8. Standard model fermions and K(E10)

    Directory of Open Access Journals (Sweden)

    Axel Kleinschmidt

    2015-07-01

    In recent work [1] it was shown how to rectify Gell-Mann's proposal for identifying the 48 quarks and leptons of the Standard Model with the 48 spin-1/2 fermions of maximal SO(8) gauged supergravity remaining after the removal of eight Goldstinos, by deforming the residual U(1) symmetry at the SU(3) × U(1) stationary point of N=8 supergravity, so as to also achieve agreement of the electric charge assignments. In this Letter we show that the required deformation, while not in SU(8), does belong to K(E10), the ‘maximal compact’ subgroup of E10, which is a possible candidate symmetry underlying M theory. The incorporation of infinite-dimensional Kac–Moody symmetries of hyperbolic type, apparently unavoidable for the present scheme to work, opens up completely new perspectives on embedding Standard Model physics into a Planck scale theory of quantum gravity.

  9. Challenges to the standard model of Big Bang nucleosynthesis

    International Nuclear Information System (INIS)

    Steigman, G.

    1993-01-01

    Big Bang nucleosynthesis provides a unique probe of the early evolution of the Universe and a crucial test of the consistency of the standard hot Big Bang cosmological model. Although the primordial abundances of ²H, ³He, ⁴He, and ⁷Li inferred from current observational data are in agreement with those predicted by Big Bang nucleosynthesis, recent analysis has severely restricted the consistent range for the nucleon-to-photon ratio: 3.7 ≤ η₁₀ ≤ 4.0. Increased accuracy in the estimate of primordial ⁴He and observations of Be and B in Pop II stars are offering new challenges to the standard model and suggest that no new light particles may be allowed (N_ν^BBN ≤ 3.0, where N_ν is the number of equivalent light neutrinos). 23 refs
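
    For orientation, the quoted nucleon-to-photon range can be translated into a baryon density using the standard conversion $\Omega_b h^2 \simeq \eta_{10}/274$ (a textbook relation, not part of the abstract), so $3.7 \le \eta_{10} \le 4.0$ corresponds to roughly $\Omega_b h^2 \approx 0.013$-$0.015$.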

  10. Assessment matrix for timber structures : basis for standardized building checks

    NARCIS (Netherlands)

    Abels, M.

    2011-01-01

    How can the future be secured so that people can permanently and everywhere enter stable, usable buildings in which to stay, live and work safely? In order to avoid catastrophes like the collapse of the Bad Reichenhall ice pavilion (Germany) in 2006, a method of evaluation (= assessment

  11. Asymptotically Safe Standard Model Extensions

    CERN Document Server

    Pelaggi, Giulio Maria; Salvio, Alberto; Sannino, Francesco; Smirnov, Juri; Strumia, Alessandro

    We consider theories with a large number NF of charged fermions and compute the renormalisation group equations for the gauge, Yukawa and quartic couplings resummed at leading order in NF. We construct extensions of the Standard Model where SU(2) and/or SU(3) are asymptotically safe. When the same procedure is applied to the Abelian U(1) factor, we find that the Higgs quartic can not be made asymptotically safe and stay perturbative at the same time.

  12. Quantum Gravity and Maximum Attainable Velocities in the Standard Model

    International Nuclear Information System (INIS)

    Alfaro, Jorge

    2007-01-01

    A main difficulty in the quantization of the gravitational field is the lack of experiments that discriminate among the theories proposed to quantize gravity. Recently we showed that the Standard Model (SM) itself contains tiny Lorentz invariance violation (LIV) terms coming from QG. All terms depend on one arbitrary parameter α that sets the scale of QG effects. In this talk we review the LIV for mesons, nucleons and leptons and apply it to study several effects, including the GZK anomaly

  13. Framework for an asymptotically safe standard model via dynamical breaking

    DEFF Research Database (Denmark)

    Abel, Steven; Sannino, Francesco

    2017-01-01

    We present a consistent embedding of the matter and gauge content of the Standard Model into an underlying asymptotically safe theory that has a well-determined interacting UV fixed point in the large color/flavor limit. The scales of symmetry breaking are determined by two mass-squared parameters...... with the breaking of electroweak symmetry being driven radiatively. There are no other free parameters in the theory apart from gauge couplings....

  14. Standard model parameters and the search for new physics

    International Nuclear Information System (INIS)

    Marciano, W.J.

    1988-04-01

    In these lectures, my aim is to present an up-to-date status report on the standard model and some key tests of electroweak unification. Within that context, I also discuss how and where hints of new physics may emerge. To accomplish those goals, I have organized my presentation as follows: I discuss the standard model parameters with particular emphasis on the gauge coupling constants and vector boson masses. Examples of new physics appendages are also briefly commented on. In addition, because these lectures are intended for students and thus somewhat pedagogical, I have included an appendix on dimensional regularization and a simple computational example that employs that technique. Next, I focus on weak charged current phenomenology. Precision tests of the standard model are described and up-to-date values for the Cabibbo-Kobayashi-Maskawa (CKM) mixing matrix parameters are presented. Constraints implied by those tests for a 4th generation, supersymmetry, extra $Z'$ bosons, and compositeness are also discussed. I discuss weak neutral current phenomenology and the extraction of $\sin^2\theta_W$ from experiment. The results presented there are based on a recently completed global analysis of all existing data. I have chosen to concentrate that discussion on radiative corrections, the effect of a heavy top quark mass, and implications for grand unified theories (GUTs). The potential for further experimental progress is also commented on. I depart from the narrowest version of the standard model and discuss effects of neutrino masses and mixings. I have chosen to concentrate on oscillations, the Mikheyev-Smirnov-Wolfenstein (MSW) effect, and electromagnetic properties of neutrinos. On the latter topic, I will describe some recent work on resonant spin-flavor precession. Finally, I conclude with a prospectus on hopes for the future. 76 refs

  15. DPF'96: the triumph of the standard model

    International Nuclear Information System (INIS)

    Dawson, S.

    1996-01-01

    I summarize some of the highlights of the 1996 DPF meeting, paying particular attention to new measurements of the W, Z, and top quark masses. Precision electroweak measurements from LEP are discussed with emphasis on recent measurements of $R_b$, and values of the coupling constants $\alpha(M_Z^2)$ and $\alpha_s(M_Z^2)$ are presented. Taken as a whole, the data are in spectacular agreement with the predictions of the Standard Model

  16. Standard model physics with the ATLAS early data

    CERN Document Server

    Bruckman de Renstrom, Pawel

    2006-01-01

    The Standard Model, despite its open questions, has proved its consistency and predictive power to very high accuracy within the currently available energy reach. LHC, with its high CM energy and luminosity, will give us insight into new processes, possibly showing evidence of “new physics”. Excellent understanding of the SM processes will also be a key to discriminate against any new phenomena. Prospects of selected SM measurements with the ATLAS detector using early LHC luminosity are presented.

  17. Gravity, CPT, and the standard-model extension

    Energy Technology Data Exchange (ETDEWEB)

    Tasson, Jay D., E-mail: tasson1@stolaf.edu [St. Olaf College (United States)

    2015-08-15

    Exotic atoms provide unique opportunities to search for new physics. The search for CPT and Lorentz violation in the context of the general field-theory based framework of the gravitational Standard-Model Extension (SME) is one such opportunity. This work summarizes the implications of Lorentz and CPT violation for gravitational experiments with antiatoms and atoms containing higher-generation matter as well as recent nongravitational proposals to test CPT and Lorentz symmetry with muons and muonic systems.

  18. LEP asymmetries and fits of the standard model

    International Nuclear Information System (INIS)

    Pietrzyk, B.

    1994-01-01

    The lepton and quark asymmetries measured at LEP are presented. The results of the Standard Model fits to the electroweak data presented at this conference are given. The top mass obtained from the fit to the LEP data is $172^{+13}_{-14}{}^{+18}_{-20}$ GeV; it is $177^{+11}_{-11}{}^{+18}_{-19}$ GeV when the collider, ν and $A_{LR}$ data are also included. (author). 10 refs., 3 figs., 2 tabs
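
    For orientation (a textbook relation, not part of the abstract): the forward-backward asymmetries referred to here are defined as $A_{FB} = (\sigma_F - \sigma_B)/(\sigma_F + \sigma_B)$, and at the Z pole the Standard Model predicts $A_{FB}^{0,f} = \tfrac{3}{4} A_e A_f$, which is how the measured asymmetries constrain $\sin^2\theta_W$ and, through radiative corrections, the fitted top mass.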

  19. Observations in particle physics: from two neutrinos to standard model

    International Nuclear Information System (INIS)

    Lederman, L.M.

    1990-01-01

    Experiments that contributed to the creation of the standard model are discussed. Observations of the following are considered: long-lived neutral V-particles; the violation of parity and charge-conjugation invariance in meson decays; high-energy neutrino reactions and the existence of two types of neutrinos; partons and dynamic quarks; and the dimuon resonance at 9.5 GeV in 400 GeV proton-nucleus collisions

  20. The Standard Model and the neutron beta-decay

    CERN Document Server

    Abele, H

    2000-01-01

    This article reviews the relationship between the observables in neutron beta-decay and the accepted modern theory of particle physics known as the Standard Model. Recent neutron-decay measurements of various mixed American-British-French-German-Russian collaborations try to shed light on the following topics: the coupling strength of charged weak currents, the universality of the electroweak interaction and the origin of parity violation.

  1. Standard model fermion hierarchies with multiple Higgs doublets

    International Nuclear Information System (INIS)

    Solaguren-Beascoa Negre, Ana

    2016-01-01

    The hierarchies between the Standard Model (SM) fermion masses and mixing angles and the origin of neutrino masses are two of the biggest mysteries in particle physics. We extend the SM with new Higgs doublets to address these issues. The lightest fermion masses and the mixing angles are generated through radiative effects, correctly reproducing the hierarchy pattern. Neutrino masses are generated via the see-saw mechanism.

  2. Asymptotically Safe Standard Model Extensions?

    CERN Document Server

    Pelaggi, Giulio Maria; Salvio, Alberto; Sannino, Francesco; Smirnov, Juri; Strumia, Alessandro

    2018-05-15

    We consider theories with a large number NF of charged fermions and compute the renormalization group equations for the gauge, Yukawa and quartic couplings resummed at leading order in 1/NF. We construct extensions of the standard model where SU(2) and/or SU(3) are asymptotically safe. When the same procedure is applied to the Abelian U(1) factor, we find that the Higgs quartic can not be made asymptotically safe and stay perturbative at the same time.

  3. Exploring and testing the Standard Model and beyond

    International Nuclear Information System (INIS)

    West, G.; Cooper, F.; Ginsparg, P.; Habib, S.; Gupta, R.; Mottola, E.; Nieto, M.; Mattis, M.

    1998-01-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). The goal of this project was to extend and develop the predictions of the Standard Model of particle physics in several different directions. This includes various aspects of the strong nuclear interactions in quantum chromodynamics (QCD), electroweak interactions and the origin of baryon asymmetry in the universe, as well as gravitational physics

  4. Non-perturbative effective interactions in the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Arbuzov, Boris A. [Moscow Lomonosov State Univ. (Russian Federation). Skobeltsyn Inst. of Nuclear Physics

    2014-07-01

    This monograph is devoted to nonperturbative dynamics in the Standard Model (SM), the basic theory of all fundamental interactions in nature except gravity. The Standard Model comprises two parts: quantum chromodynamics (QCD) and the electroweak theory (EWT), both well-defined renormalizable theories in which perturbation theory is valid. However, an adequate description of the real physics requires nonperturbative effects. This book describes how these nonperturbative effects may be obtained in the framework of the spontaneous generation of effective interactions. A well-known example of such an effective interaction is provided by the famous Nambu-Jona-Lasinio interaction. The spontaneous generation of this interaction in the framework of QCD is also described, and the method is applied to other effective interactions in QCD and EWT. The method is based on N.N. Bogolyubov's conception of compensation equations. As a result, the principal features of the Standard Model, e.g. the Higgs sector, and significant nonperturbative effects are described, including recent results obtained at the LHC and the Tevatron.
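
    For orientation, the Nambu-Jona-Lasinio interaction mentioned above is the classic four-fermion effective interaction, usually written in its two-flavor form as $\mathcal{L}_{\mathrm{NJL}} = G\left[(\bar\psi\psi)^2 + (\bar\psi\, i\gamma_5\vec\tau\,\psi)^2\right]$; this textbook expression is quoted here for context and is not taken from the monograph.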

  5. Review of Current Standard Model Results in ATLAS

    CERN Document Server

    Brandt, Gerhard; The ATLAS collaboration

    2018-01-01

    This talk highlights results selected from the Standard Model research programme of the ATLAS Collaboration at the Large Hadron Collider. Results using data from $p-p$ collisions at $\sqrt{s}=7,8$ TeV in LHC Run-1 as well as results using data at $\sqrt{s}=13$ TeV in LHC Run-2 are covered. The status of cross section measurements from soft QCD processes, jet production and photon production is presented. The presentation extends to vector boson production with associated jets. Precision measurements of the production of $W$ and $Z$ bosons, including a first measurement of the mass of the $W$ boson, $m_W$, are discussed. The programme to measure electroweak processes with di-boson and tri-boson final states is outlined. All presented measurements are compatible with Standard Model descriptions and allow it to be further constrained. In addition, they allow probing of new physics that would manifest itself through extra gauge couplings or Standard Model gauge couplings deviating from their predicted values.

  6. Domain walls in the extensions of the Standard Model

    Science.gov (United States)

    Krajewski, Tomasz; Lalak, Zygmunt; Lewicki, Marek; Olszewski, Paweł

    2018-05-01

    Our main interest is the evolution of domain walls of the Higgs field in the early Universe. The aim of this paper is to understand how the dynamics of Higgs domain walls could be influenced by yet unknown interactions from beyond the Standard Model. We assume that the Standard Model is valid up to a certain high energy scale Λ and use the framework of the effective field theory to describe physics below that scale. Performing numerical simulations with different values of the scale Λ, we are able to extend our previous analysis [1]. Our recent numerical simulations show that the evolution of Higgs domain walls is rather insensitive to interactions beyond the Standard Model as long as the masses of new particles are greater than $10^{12}$ GeV. For lower values of Λ the RG-improved effective potential is strongly modified at field strengths crucial to the evolution of domain walls. However, we find that even for low values of Λ, Higgs domain walls decayed shortly after their formation for generic initial conditions. On the other hand, in simulations with specifically chosen initial conditions Higgs domain walls can live longer and enter the scaling regime. We also determine the energy spectrum of gravitational waves produced by decaying domain walls of the Higgs field. For generic initial field configurations the amplitude of the signal is too small to be observed in planned detectors.

  7. Impersonating the Standard Model Higgs boson: alignment without decoupling

    International Nuclear Information System (INIS)

    Carena, Marcela; Low, Ian; Shah, Nausheen R.; Wagner, Carlos E.M.

    2014-01-01

    In models with an extended Higgs sector there exists an alignment limit, in which the lightest CP-even Higgs boson mimics the Standard Model Higgs. The alignment limit is commonly associated with the decoupling limit, where all non-standard scalars are significantly heavier than the Z boson. However, alignment can occur irrespective of the mass scale of the rest of the Higgs sector. In this work we discuss the general conditions that lead to “alignment without decoupling”, therefore allowing for the existence of additional non-standard Higgs bosons at the weak scale. The values of tan β for which this happens are derived in terms of the effective Higgs quartic couplings in general two-Higgs-doublet models as well as in supersymmetric theories, including the MSSM and the NMSSM. Moreover, we study the information encoded in the variations of the SM Higgs-fermion couplings to explore regions in the $m_A$-tan β parameter space.
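
    As a reminder of standard two-Higgs-doublet notation (textbook material, not taken from the article): the light CP-even state $h$ couples to gauge bosons as $g_{hVV} = \sin(\beta-\alpha)\, g^{\rm SM}_{hVV}$, so the alignment limit discussed above corresponds to $\cos(\beta-\alpha) \to 0$, a condition that can be satisfied whether or not the remaining Higgs states are heavy enough to decouple.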

  8. Supersymmetry and String Theory: Beyond the Standard Model

    International Nuclear Information System (INIS)

    Rocek, Martin

    2007-01-01

    When I was asked to review Michael Dine's new book, 'Supersymmetry and String Theory', I was pleased to have a chance to read a book by such an established authority on how string theory might become testable. The book is most useful as a list of current topics of interest in modern theoretical physics. It gives a succinct summary of a huge variety of subjects, including the standard model, symmetry, Yang-Mills theory, quantization of gauge theories, the phenomenology of the standard model, the renormalization group, lattice gauge theory, effective field theories, anomalies, instantons, solitons, monopoles, dualities, technicolor, supersymmetry, the minimal supersymmetric standard model, dynamical supersymmetry breaking, extended supersymmetry, Seiberg-Witten theory, general relativity, cosmology, inflation, bosonic string theory, the superstring, the heterotic string, string compactifications, the quintic, string dualities, large extra dimensions, and, in the appendices, Goldstone's theorem, path integrals, and exact beta-functions in supersymmetric gauge theories. Its breadth is both its strength and its weakness: it is not (and could not possibly be) either a definitive reference for experts, where the details of thorny technical issues are carefully explored, or a textbook for graduate students, with detailed pedagogical expositions. As such, it complements rather than replaces the much narrower and more focussed String Theory I and II volumes by Polchinski, with their deep insights, as well as the two older volumes by Green, Schwarz, and Witten, which develop string theory pedagogically. (book review)

  9. Stress-testing the Standard Model at the LHC

    CERN Document Server

    2016-01-01

    With the high-energy run of the LHC now underway, and clear manifestations of beyond-Standard-Model physics not yet seen in data from the previous run, the search for new physics at the LHC may be a quest for small deviations with big consequences. If clear signals are present, precise predictions and measurements will again be crucial for extracting the maximum information from the data, as in the case of the Higgs boson. Precision will therefore remain a key theme for particle physics research in the coming years. The conference will provide a forum for experimentalists and theorists to identify the challenges and refine the tools for high-precision tests of the Standard Model and searches for signals of new physics at Run II of the LHC. Topics to be discussed include: pinning down Standard Model corrections to key LHC processes; combining fixed-order QCD calculations with all-order resummations and parton showers; new developments in jet physics concerning jet substructure, associated jets and boosted jets...

  10. No Evidence for Extensions to the Standard Cosmological Model

    Science.gov (United States)

    Heavens, Alan; Fantaye, Yabebal; Sellentin, Elena; Eggers, Hans; Hosenie, Zafiirah; Kroon, Steve; Mootoovaloo, Arrykrishna

    2017-09-01

    We compute the Bayesian evidence for models considered in the main analysis of Planck cosmic microwave background data. By utilizing carefully defined nearest-neighbor distances in parameter space, we reuse the Monte Carlo Markov chains already produced for parameter inference to compute Bayes factors B for many different model-data set combinations. The standard 6-parameter flat cold dark matter model with a cosmological constant (ΛCDM) is favored over all other models considered, with curvature being mildly favored only when cosmic microwave background lensing is not included. Many alternative models are strongly disfavored by the data, including primordial correlated isocurvature models (ln B = -7.8), nonzero tensor-to-scalar ratio (ln B = -4.3), running of the spectral index (ln B = -4.7), curvature (ln B = -3.6), nonstandard numbers of neutrinos (ln B = -3.1), nonstandard neutrino masses (ln B = -3.2), nonstandard lensing potential (ln B = -4.6), evolving dark energy (ln B = -3.2), sterile neutrinos (ln B = -6.9), and extra sterile neutrinos with a nonzero tensor-to-scalar ratio (ln B = -10.8). Other models are less strongly disfavored with respect to flat ΛCDM. As with all analyses based on Bayesian evidence, the final numbers depend on the widths of the parameter priors. We adopt the priors used in the Planck analysis, while performing a prior sensitivity analysis. Our quantitative conclusion is that extensions beyond the standard cosmological model are disfavored by Planck data. Only when newer Hubble constant measurements are included does ΛCDM become disfavored, and only mildly, compared with a dynamical dark energy model (ln B ≈ +2).
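
    As a small worked illustration of what these numbers mean (the interpretation step below is added here and assumes equal prior odds between each extension and flat ΛCDM; the ln B values themselves are quoted from the abstract), a Bayes factor B converts into a posterior model probability of B/(1+B):

      # Convert the quoted ln(Bayes factor) values into posterior probabilities
      # under equal prior model odds (illustrative post-processing, not the
      # evidence computation itself).
      import math

      ln_bayes_factors = {
          "correlated isocurvature": -7.8,
          "nonzero tensor-to-scalar ratio": -4.3,
          "running spectral index": -4.7,
          "curvature": -3.6,
          "nonstandard neutrino number": -3.1,
          "nonstandard neutrino mass": -3.2,
          "nonstandard lensing potential": -4.6,
          "evolving dark energy": -3.2,
          "sterile neutrinos": -6.9,
          "sterile neutrinos + tensor modes": -10.8,
      }

      for model, ln_b in ln_bayes_factors.items():
          b = math.exp(ln_b)            # Bayes factor of the extension vs. flat LambdaCDM
          p = b / (1.0 + b)             # posterior probability of the extension, equal priors
          print(f"{model:32s}  B = {b:9.2e}   P(extension | data) = {p:.4f}")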

  11. Secure Certificateless Signature with Revocation in the Standard Model

    Directory of Open Access Journals (Sweden)

    Tung-Tso Tsai

    2014-01-01

    Previously proposed certificateless signature schemes were insecure under a considerably strong security model, in the sense that they suffered from outsiders' key-replacement attacks or attacks from the key generation center (KGC). In this paper, we propose a certificateless signature scheme without random oracles. Moreover, our scheme is secure under the strong security model and provides a public revocation mechanism, called revocable certificateless signature (RCLS). Under the standard computational Diffie-Hellman assumption, we formally demonstrate that our scheme possesses existential unforgeability against adaptive chosen-message attacks.

  12. R parity in standard-like superstring models

    International Nuclear Information System (INIS)

    Halyo, Edi.

    1994-01-01

    We investigate the R symmetries of standard-like superstring models. At the level of the cubic superpotential there are three global U(1) R symmetries. These are broken explicitly by N > 3 terms in the superpotential and spontaneously by the scalar vacuum expectation values necessary to preserve supersymmetry at $M_P$. A $Z_2$ discrete symmetry remains but is equivalent to fermion number modulo 2. These models possess an effective R parity which arises from the interplay between the gauged $U(1)_{B-L}$ and $U(1)_{r_{j+3}}$. (author). 14 refs

  13. Efficient Lattice-Based Signcryption in Standard Model

    Directory of Open Access Journals (Sweden)

    Jianhua Yan

    2013-01-01

    Signcryption is a cryptographic primitive that can perform digital signature and public-key encryption simultaneously at a significantly reduced cost. This advantage makes it highly useful in many applications. However, most existing signcryption schemes are seriously challenged by the booming of quantum computations. As an interesting stepping stone in the post-quantum cryptographic community, two lattice-based signcryption schemes were proposed recently. But both of them were merely proved to be secure in the random oracle model. Therefore, the main contribution of this paper is to propose a new lattice-based signcryption scheme that can be proved to be secure in the standard model.

  14. Elementary particles, dark matter candidate and new extended standard model

    Science.gov (United States)

    Hwang, Jaekwang

    2017-01-01

    Elementary particle decays and reactions are discussed in terms of the three-dimensional quantized space model beyond the standard model. Three generations of the leptons and quarks correspond to the lepton charges. Three heavy leptons and three heavy quarks are introduced, and the bastons (new particles) are proposed as possible candidates for the dark matter. The dark matter force, weak force and strong force are explained consistently. Possible rest masses of the new particles are tentatively proposed for experimental searches. For more details, see the conference paper at https://www.researchgate.net/publication/308723916.

  15. Framework for Assessing the ICT Competency in Teachers up to the Requirements of "Teacher" Occupational Standard

    Science.gov (United States)

    Avdeeva, Svetlana; Zaichkina, Olga; Nikulicheva, Nataliya; Khapaeva, Svetlana

    2016-01-01

    The paper deals with problems of working out a test framework for the assessment of teachers' ICT competency in line with the requirements of "Teacher" occupational standard. The authors have analyzed the known approaches to assessing teachers' ICT competency--ISTE Standards and UNESCO ICT CFT and have suggested their own approach to…

  16. 77 FR 23250 - HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations

    Science.gov (United States)

    2012-04-18

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations AGENCY: Office of the National Coordinator for Health Information... 2009 mandates that the HIT Standards Committee develop a schedule for the assessment of policy...

  17. 76 FR 25355 - HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations

    Science.gov (United States)

    2011-05-04

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations AGENCY: Office of the National Coordinator for Health Information... 2009 mandates that the HIT Standards Committee develop a schedule for the assessment of policy...

  18. 78 FR 29134 - HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations

    Science.gov (United States)

    2013-05-17

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations AGENCY: Office of the National Coordinator for Health Information... 2009 mandates that the HIT Standards Committee develop a schedule for the assessment of policy...

  19. 7 CFR 319.40-11 - Plant pest risk assessment standards.

    Science.gov (United States)

    2010-01-01

    ... analysis to determine the plant pest risks associated with each requested importation in order to determine... 7 Agriculture 5 2010-01-01 2010-01-01 false Plant pest risk assessment standards. 319.40-11... Unmanufactured Wood Articles § 319.40-11 Plant pest risk assessment standards. When evaluating a request to...

  20. EMBEDding the CEFR in Academic Writing Assessment : A case study in training and standardization

    NARCIS (Netherlands)

    Haines, Kevin; Lowie, Wander; Jansma, Petra; Schmidt, Nicole

    2013-01-01

    The CEFR is increasingly being used as the framework of choice for the assessment of language proficiency at universities across Europe. However, to attain consistent assessment, familiarization and standardization are essential. In this paper we report a case study of embedding a standardization

  1. The Universal Thermal Climate Index UTCI compared to ergonomics standards for assessing the thermal environment.

    Science.gov (United States)

    Bröde, Peter; Błazejczyk, Krzysztof; Fiala, Dusan; Havenith, George; Holmér, Ingvar; Jendritzky, Gerd; Kuklane, Kalev; Kampmann, Bernhard

    2013-01-01

    The growing need for valid assessment procedures of the outdoor thermal environment in the fields of public weather services, public health systems, urban planning, tourism & recreation and climate impact research raised the idea to develop the Universal Thermal Climate Index UTCI based on the most recent scientific progress both in thermo-physiology and in heat exchange theory. Following extensive validation of accessible models of human thermoregulation, the advanced multi-node 'Fiala' model was selected to form the basis of UTCI. This model was coupled with an adaptive clothing model which considers clothing habits by the general urban population and behavioral changes in clothing insulation related to actual environmental temperature. UTCI was developed conceptually as an equivalent temperature. Thus, for any combination of air temperature, wind, radiation, and humidity, UTCI is defined as the air temperature in the reference condition which would elicit the same dynamic response of the physiological model. This review analyses the sensitivity of UTCI to humidity and radiation in the heat and to wind in the cold and compares the results with observational studies and internationally standardized assessment procedures. The capabilities, restrictions and potential future extensions of UTCI are discussed.
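
    The equivalent-temperature construction described above can be sketched schematically: find the reference air temperature whose modelled physiological response matches the response under the actual conditions. In the sketch below the response function is a made-up linear placeholder standing in for the Fiala multi-node model and the adaptive clothing model, so only the root-finding structure of the definition is illustrated, not the operational UTCI procedure.

      # Conceptual equivalent-temperature sketch (placeholder response model, not UTCI).
      from scipy.optimize import brentq

      def response(t_air, wind=0.5, radiation_offset=0.0, vapour_pressure=1.0):
          """Placeholder scalar 'physiological strain' in arbitrary units."""
          return t_air + 0.4 * radiation_offset - 1.2 * (wind - 0.5) + 0.6 * vapour_pressure

      def equivalent_temperature(t_air, wind, radiation_offset, vapour_pressure):
          target = response(t_air, wind, radiation_offset, vapour_pressure)
          # Reference condition: calm air, no extra radiant load, moderate humidity.
          return brentq(lambda t: response(t) - target, -80.0, 80.0)

      print(equivalent_temperature(t_air=25.0, wind=5.0, radiation_offset=20.0, vapour_pressure=2.0))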

  2. Measuring Life Skills: Standardizing the Assessment of Youth Development Indicators

    Directory of Open Access Journals (Sweden)

    Mat D. Duerden

    2012-03-01

    While the development of life skills (e.g., communication, problem solving, etc.) is a commonly targeted youth program outcome, the lack of standardized conceptualizations and instrumentation makes it difficult to compare impacts across programs and develop validated best practices. In order to promote a more unified approach to life skill development, literature reviews were conducted for 10 life skill domains to identify common definitions and, if available, appropriate outcome measures. Data were then collected from an ethnically diverse sample (N = 758) of elementary, middle, and high school aged youth for the 10 identified instruments. Analyses were conducted to ascertain the psychometric qualities of each measure, the interrelationships among measures, and the measures’ relationships with gender, ethnicity, and school level. Results are discussed in terms of their relevance to life skill theory and measurement.

  3. Standard guide for three methods of assessing buried steel tanks

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1998-01-01

    1.1 This guide covers procedures to be implemented prior to the application of cathodic protection for evaluating the suitability of a tank for upgrading by cathodic protection alone. 1.2 Three procedures are described and identified as Methods A, B, and C. 1.2.1 Method A—Noninvasive with primary emphasis on statistical and electrochemical analysis of external site environment corrosion data. 1.2.2 Method B—Invasive ultrasonic thickness testing with external corrosion evaluation. 1.2.3 Method C—Invasive permanently recorded visual inspection and evaluation including external corrosion assessment. 1.3 This guide presents the methodology and the procedures utilizing site and tank specific data for determining a tank's condition and the suitability for such tanks to be upgraded with cathodic protection. 1.4 The tank's condition shall be assessed using Method A, B, or C. Prior to assessing the tank, a preliminary site survey shall be performed pursuant to Section 8 and the tank shall be tightness test...

  4. Risk assessment of manual material handling activities (case study: PT BRS Standard Industry)

    Science.gov (United States)

    Deviani; Triyanti, V.

    2017-12-01

    The process of moving material manually has the potential to injure workers. The risk of injury increases if working conditions are not taken into account. The purpose of this study is to assess and analyze the injury risk level in manual material handling activity, as well as to improve the working conditions. The observed manual material handling activities are pole lifting and goods loading. These activities were analyzed using the Job Strain Index method, the Rapid Entire Body Assessment, and Chaffin’s 2D Planar Static Model. The results show that most workers performing these activities face a high risk level, with JSI and REBA scores exceeding 9 points. For some activities, the estimated compression forces in the lumbar area also exceed the standard limit of 3400 N. Concerning this condition, several suggestions for improvement were made: improving the composition of packing, improving body posture, and making guideline posters.
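
    A minimal sketch of the screening logic implied above, assuming the thresholds mentioned in the abstract (JSI or REBA scores above 9, lumbar compression above 3400 N) and using invented task data purely for illustration:

      # Flag manual material handling tasks as high-risk from assessment scores.
      from dataclasses import dataclass

      @dataclass
      class TaskAssessment:
          name: str
          jsi: float            # Job Strain Index score
          reba: int             # Rapid Entire Body Assessment score
          compression_n: float  # estimated lumbar compression force in newtons

          def high_risk(self) -> bool:
              return self.jsi > 9 or self.reba > 9 or self.compression_n > 3400

      tasks = [
          TaskAssessment("pole lifting", jsi=13.5, reba=11, compression_n=3900.0),   # invented values
          TaskAssessment("goods loading", jsi=8.0, reba=7, compression_n=2800.0),    # invented values
      ]

      for task in tasks:
          print(task.name, "-> high risk" if task.high_risk() else "-> acceptable")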

  5. Utility of Social Modeling for Proliferation Assessment - Preliminary Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-06-01

    This Preliminary Assessment draft report will present the results of a literature search and preliminary assessment of the body of research, analysis methods, models and data deemed to be relevant to the Utility of Social Modeling for Proliferation Assessment research. This report will provide: 1) a description of the problem space and the kinds of information pertinent to the problem space, 2) a discussion of key relevant or representative literature, 3) a discussion of models and modeling approaches judged to be potentially useful to the research, and 4) the next steps of this research that will be pursued based on this preliminary assessment. This draft report represents a technical deliverable for the NA-22 Simulations, Algorithms, and Modeling (SAM) program. Specifically this draft report is the Task 1 deliverable for project PL09-UtilSocial-PD06, Utility of Social Modeling for Proliferation Assessment. This project investigates non-traditional use of social and cultural information to improve nuclear proliferation assessment, including nonproliferation assessment, proliferation resistance assessments, safeguards assessments and other related studies. These assessments often use and create technical information about the State’s posture towards proliferation, the vulnerability of a nuclear energy system to an undesired event, and the effectiveness of safeguards. This project will find and fuse social and technical information by explicitly considering the role of cultural, social and behavioral factors relevant to proliferation. The aim of this research is to describe and demonstrate if and how social science modeling has utility in proliferation assessment.

  6. The renewables portfolio standard in Texas: an early assessment

    International Nuclear Information System (INIS)

    Langniss, Ole; Wiser, Ryan

    2003-01-01

    Texas has rapidly emerged as one of the leading wind power markets in the United States. This development can be largely traced to a well-designed and carefully implemented renewables portfolio standard (RPS). The RPS is a new policy mechanism that has received increasing attention as an attractive approach to support renewable power generation. Though replacing existing renewable energy policies with an as-of-yet untested approach in the RPS is risky, early experience from Texas suggests that an RPS can effectively spur renewables development and encourage competition among renewable energy producers. Initial RPS targets in Texas were well exceeded by the end of 2001, with 915 MW of wind installed in that year alone. RPS compliance costs appear negligible with new wind projects reportedly contracted for well under 3(US) cents/kWh, in part as a result of a 1.7(US) cents/kWh production tax credit, an outstanding wind resource and an RPS that is sizable enough to drive project economies of scale. Obliged retail suppliers have been willing to enter into long-term contracts with renewable generators, reducing important risks for both the developer and the retail supplier. Finally, the country's first comprehensive renewable energy certificate program has been put into place to monitor and track RPS compliance

  7. Prostate motion during standard radiotherapy as assessed by fiducial markers

    International Nuclear Information System (INIS)

    Raymond, Y.; Crook, J.M.; Salhani, D.; Yang, H.; Esche, B.

    1995-01-01

    From November 1993 to August 1994, 55 patients with localized prostate carcinoma had three gold seeds placed in the prostate under transrectal ultrasound guidance prior to the start of radiotherapy in order to track prostate motion. Patients had a planning CT scan before initial simulation and again at about 40 Gy, just prior to simulation of a field reduction. Seed position relative to fixed bony landmarks (pubic symphysis and both ischial tuberosities) was digitized from each pair of orthogonal films from the initial and boost simulation using the Nucletron brachytherapy planning system. Vector analysis was performed to rule out the possibility of independent seed migration within the prostate between the time of initial and boost simulation. Prostate motion was seen in the posterior (mean: 0.56 cm; SD: 0.41 cm) and inferior directions (mean: 0.59 cm; SD: 0.45 cm). The base of the prostate was displaced more than 1 cm posteriorly in 30% of patients and in 11% in the inferior direction. Prostate position is related to rectal and bladder filling. Distension of these organs displaces the prostate in an anterosuperior direction, with lesser degrees of filling allowing the prostate to move posteriorly and inferiorly. Conformal therapy planning must take this motion into consideration. Changes in prostate position of this magnitude preclude the use of standard margins

  8. Standard Format for Chromatographic-polarimetric System small samples assessment

    International Nuclear Information System (INIS)

    Naranjo, S.; Fajer, V.; Fonfria, C.; Patinno, R.

    2012-01-01

    The treatment of samples containing optically active substances, evaluated as part of the quality control of raw material entering an industrial process and during the modifications applied to reach the desired final composition, is still an unsolved problem for many industries. That is the case for the sugarcane industry. The difficulties are sometimes compounded because the samples to be evaluated are no larger than one millilitre. Reduction of the gel beds in G-10 and G-50 chromatographic columns, using an inner diameter of 16 mm instead of 25 mm and bed heights adjustable to requirements by means of sliding stoppers to increase analytical power, was evaluated with glucose and sucrose standards at concentrations from 1 to 10 g/dL, using aliquots of 1 mL without undesirable dilutions that could affect either detection or the chromatographic profile. Assays with seaweed extracts gave good results, which are shown. The advantage of determining the concentration of a separated substance from the height of its peak, and the resulting savings in time and reagents, are established. Sample expanded uncertainty in the two systems is compared. Several programs for data acquisition, storage and processing are also presented. (Author)

  9. The renewables portfolio standard in Texas: An early assessment; TOPICAL

    International Nuclear Information System (INIS)

    Wiser, Ryan H.; Langniss, Ole

    2001-01-01

    Texas has rapidly emerged as one of the leading wind power markets in the United States. This development can be largely traced to a well-designed and carefully implemented renewables portfolio standard (RPS). The RPS is a new policy mechanism that has received increasing attention as an attractive approach to support renewable power generation. Though replacing existing renewable energy policies with an as-of-yet largely untested approach in the RPS is risky, early experience from Texas suggests that an RPS can effectively spur renewables development and encourage competition among renewable energy producers. Initial RPS targets in Texas will be far exceeded by the end of 2001, with as much as 930 MW of wind slated for installation this year. RPS compliance costs appear negligible, with new wind projects reportedly contracted for under 3(US) cents/kWh, in part as a result of a 1.7(US) cents/kWh production tax credit, an outstanding wind resource, and an RPS that is sizable enough to drive project economies of scale. Obliged retail suppliers have been willing to enter into long-term contracts with renewable generators, reducing important risks for both the developer and the retail supplier. Finally, the country's first comprehensive renewable energy certificate program has been put into place to monitor and track RPS compliance

  10. Non-generic couplings in supersymmetric standard models

    Directory of Open Access Journals (Sweden)

    Evgeny I. Buchbinder

    2015-09-01

    Full Text Available We study two phases of a heterotic standard model, obtained from a Calabi–Yau compactification of the E8×E8 heterotic string, in the context of the associated four-dimensional effective theories. In the first phase we have a standard model gauge group, an MSSM spectrum, four additional U(1) symmetries and singlet fields. In the second phase, obtained from the first by continuing along the singlet directions, three of the additional U(1) symmetries are spontaneously broken and the remaining one is a B–L symmetry. In this second phase, dimension five operators inducing proton decay are consistent with all symmetries and as such, they are expected to be present. We show that, contrary to this expectation, these operators are forbidden due to the additional U(1) symmetries present in the first phase of the model. We emphasise that such “unexpected” absences of operators, due to symmetry enhancement at specific loci in the moduli space, can be phenomenologically relevant and, in the present case, protect the model from fast proton decay.

  11. Flavour alignment in physics beyond the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Braeuninger, Carolin Barbara

    2012-11-21

    There are numerous reasons to think that the Standard Model of physics is not the ultimate theory of nature on very small scales. However, attempts to construct theories that go beyond the Standard Model generically lead to high rates of flavour changing neutral processes that are in conflict with experiment: Quarks are the fundamental constituents of protons and neutrons. Together with electrons they form the visible matter of the universe. They come in three generations or "flavours". In interactions, quarks of different generations can mix, i.e. a quark of one flavour can transform into a quark of another flavour. In the Standard Model, at first order in perturbation theory, such processes occur only via the exchange of a charged particle. Flavour changing neutral processes can only arise in processes involving loops of charged particles. This is due to the fact that all couplings of two quarks to a neutral particle are diagonal in the basis of the mass eigenstates of the quarks. There is thus no mixing of quarks of different flavour at first order. Since the loop processes are suppressed by a loop factor, the Standard Model predicts very low rates for neutral processes that change the flavour of quarks. So far, this is in agreement with experiment. In extensions of the Standard Model, new couplings to the quarks are usually introduced. In general there is no reason why the new coupling matrices should be diagonal in the mass basis of the quarks. These models therefore predict high rates for processes that mix quarks of different flavour. Extensions of the Standard Model must therefore have a non-trivial flavour structure. A possibility to avoid flavour violation is to assume that the new couplings are aligned with the mass matrices of the quarks, i.e. diagonal in the same basis. This alignment could be due to a flavour symmetry. In this thesis, two extensions of the Standard Model with alignment are studied. The first is a simple

  12. Flavour alignment in physics beyond the standard model

    International Nuclear Information System (INIS)

    Braeuninger, Carolin Barbara

    2012-01-01

    There are numerous reasons to think that the Standard Model of physics is not the ultimate theory of nature on very small scales. However, attempts to construct theories that go beyond the Standard Model generically lead to high rates of flavour changing neutral processes that are in conflict with experiment: Quarks are the fundamental constituents of protons and neutrons. Together with electrons they form the visible matter of the universe1. They come in three generations or ''flavours''. In interactions, quarks of different generations can mix, i.e. a quark of one flavour can transform into a quark of another flavour. In the Standard Model, at first order in perturbation theory, such processes occur only via the exchange of a charged particle. Flavour changing neutral processes can only arise in processes involving loops of charged particles. This is due to the fact that all couplings of two quarks to a neutral particle are diagonal in the basis of the mass eigenstates of the quarks. There is thus no mixing of quarks of different flavour at first order. Since the loop processes are suppressed by a loop factor, the Standard Model predicts very low rates for neutral processes that change the flavour of quarks. So far, this is in agreement with experiment. In extensions of the Standard Model, new couplings to the quarks are usually introduced. In general there is no reason why the new coupling matrices should be diagonal in the mass basis of the quarks. These models therefore predict high rates for processes that mix quarks of different flavour. Extensions of the Standard Model must therefore have a non-trivial flavour structure. A possibility to avoid flavour violation is to assume that the new couplings are aligned with the mass matrices of the quarks, i.e. diagonal in the same basis. This alignment could be due to a flavour symmetry. In this thesis, two extensions of the Standard Model with alignment are studied. The first is a simple extension of the Standard

  13. Standardization of natural phenomena risk assessment methodology at the Savannah River Plant

    International Nuclear Information System (INIS)

    Huang, J.C.; Hsu, Y.S.

    1985-01-01

    Safety analyses at the Savannah River Plant (SRP) normally require consideration of the risks of incidents caused by natural events such as high-velocity straight winds, tornadic winds, and earthquakes. The probabilities for these events to occur at SRP had been studied independently by several investigators, but the results of their studies were never systematically evaluated. As part of the endeavor to standardize our environmental risk assessment methodology, these independent studies have been thoroughly reviewed and critiqued, and appropriate probability models for these natural events have been selected. The selected probability models for natural phenomena, high-velocity straight winds and tornadic winds in particular, are in agreement with those being used at other DOE sites, and have been adopted as a guide for all safety studies conducted for SRP operations and facilities. 7 references, 3 figures
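    The probability models selected in such studies feed directly into simple risk arithmetic, for example the chance that a design-basis event is exceeded at least once during a facility's life. The sketch below is generic risk arithmetic under an assumed annual exceedance probability; it is not the SRP-specific models discussed in the abstract.

```python
# Generic risk arithmetic, illustrative only: probability that an event with a
# given annual exceedance probability occurs at least once in an n-year period,
# assuming independent years.

def lifetime_exceedance(p_annual: float, years: int) -> float:
    return 1.0 - (1.0 - p_annual) ** years

# Example with an assumed 1e-4 per-year design-basis tornado probability
# over a 40-year facility life.
print(f"{lifetime_exceedance(1e-4, 40):.4%}")  # about 0.40%
```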

  14. Symmetry and the Standard Model mathematics and particle physics

    CERN Document Server

    Robinson, Matthew

    2011-01-01

    While elementary particle physics is an extraordinarily fascinating field, the huge amount of knowledge necessary to perform cutting-edge research poses a formidable challenge for students. The leap from the material contained in the standard graduate course sequence to the frontiers of M-theory, for example, is tremendous. To make substantial contributions to the field, students must first confront a long reading list of texts on quantum field theory, general relativity, gauge theory, particle interactions, conformal field theory, and string theory. Moreover, waves of new mathematics are required at each stage, spanning a broad set of topics including algebra, geometry, topology, and analysis. Symmetry and the Standard Model: Mathematics and Particle Physics, by Matthew Robinson, is the first volume of a series intended to teach math in a way that is catered to physicists. Following a brief review of classical physics at the undergraduate level and a preview of particle physics from an experimentalist's per...

  15. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    Science.gov (United States)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making under uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors from disposal of low-level waste (LLW). Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model, which supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example. A
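    The probabilistic workflow described above (sample uncertain inputs, propagate them through the model, compare output distributions with performance objectives) can be illustrated with a deliberately toy calculation. Everything below is an assumption made for illustration: the dose formula, the distributions, the parameter values and the 25 mrem objective; it is not the GoldSim model discussed in the abstract.

```python
# Toy Monte Carlo sketch of a probabilistic performance assessment:
# sample uncertain inputs, propagate through a simple dose model, and
# summarize the output distribution against a performance objective.
import random
import statistics

def annual_dose_mrem(release_ci: float, dilution: float, dcf: float) -> float:
    """Toy dose model: release (Ci/yr) x dilution factor x dose conversion factor."""
    return release_ci * dilution * dcf

def run_monte_carlo(n: int = 10_000, objective_mrem: float = 25.0, seed: int = 1):
    random.seed(seed)
    doses = []
    for _ in range(n):
        release = random.lognormvariate(mu=0.0, sigma=0.5)      # Ci/yr (assumed)
        dilution = random.uniform(1e-4, 1e-3)                   # dimensionless (assumed)
        dcf = random.triangular(low=5.0, high=50.0, mode=20.0)  # mrem per Ci (assumed)
        doses.append(annual_dose_mrem(release, dilution, dcf))
    doses.sort()
    return {
        "mean_mrem": statistics.mean(doses),
        "p95_mrem": doses[int(0.95 * n)],
        "fraction_above_objective": sum(d > objective_mrem for d in doses) / n,
    }

print(run_monte_carlo())
```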

  16. Standardization of Thermo-Fluid Modeling in Modelica.Fluid

    Energy Technology Data Exchange (ETDEWEB)

    Franke, Rudiger; Casella, Francesco; Sielemann, Michael; Proelss, Katrin; Otter, Martin; Wetter, Michael

    2009-09-01

    This article discusses the Modelica.Fluid library that has been included in the Modelica Standard Library 3.1. Modelica.Fluid provides interfaces and basic components for the device-oriented modeling of one-dimensional thermo-fluid flow in networks containing vessels, pipes, fluid machines, valves and fittings. A unique feature of Modelica.Fluid is that the component equations and the media models as well as pressure loss and heat transfer correlations are decoupled from each other. All components are implemented such that they can be used for media from the Modelica.Media library. This means that an incompressible or compressible medium, a single or a multiple substance medium with one or more phases might be used with one and the same model as long as the modeling assumptions made hold. Furthermore, trace substances are supported. Modeling assumptions can be configured globally in an outer System object. This covers in particular the initialization, uni- or bi-directional flow, and dynamic or steady-state formulation of mass, energy, and momentum balance. All assumptions can be locally refined for every component. While Modelica.Fluid contains a reasonable set of component models, the goal of the library is not to provide a comprehensive set of models, but rather to provide interfaces and best practices for the treatment of issues such as connector design and implementation of energy, mass and momentum balances. Applications from various domains are presented.
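    The decoupling of component equations from media models described above is essentially an interface-based design: a component is written against an abstract medium, and any medium satisfying that interface can be plugged in. The sketch below illustrates the idea only, in Python rather than Modelica; the class and function names are invented and do not correspond to the Modelica.Fluid or Modelica.Media APIs.

```python
# Illustration of the design idea (not the Modelica.Fluid API): a component
# model written against an abstract medium interface works unchanged with
# an incompressible or a compressible medium.
from abc import ABC, abstractmethod

class Medium(ABC):
    @abstractmethod
    def density(self, p: float, T: float) -> float:
        """Density in kg/m^3 at pressure p (Pa) and temperature T (K)."""

class IncompressibleWater(Medium):
    def density(self, p: float, T: float) -> float:
        return 995.0  # roughly constant

class IdealGasAir(Medium):
    R = 287.0  # specific gas constant, J/(kg*K)
    def density(self, p: float, T: float) -> float:
        return p / (self.R * T)

def pipe_mass_flow(medium: Medium, p: float, T: float, velocity: float, area: float) -> float:
    """Mass flow rate m_dot = rho * v * A; the pipe model never sees the medium internals."""
    return medium.density(p, T) * velocity * area

print(pipe_mass_flow(IncompressibleWater(), 1e5, 293.15, 1.2, 0.01))
print(pipe_mass_flow(IdealGasAir(), 1e5, 293.15, 1.2, 0.01))
```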

  17. On a radiative origin of the Standard Model from trinification

    Science.gov (United States)

    Camargo-Molina, José Eliel; Morais, António P.; Pasechnik, Roman; Wessén, Jonas

    2016-09-01

    In this work, we present a trinification-based grand unified theory incorporating a global SU(3) family symmetry that after a spontaneous breaking leads to a left-right symmetric model. Already at the classical level, this model can accommodate the matter content and the quark Cabibbo mixing in the Standard Model (SM) with only one Yukawa coupling at the unification scale. Considering the minimal low-energy scenario with the least amount of light states, we show that the resulting effective theory enables dynamical breaking of its gauge group down to that of the SM by means of radiative corrections accounted for by the renormalisation group evolution at one loop. This result paves the way for a consistent explanation of the SM breaking scale and fermion mass hierarchies.

  18. Development of Proliferation Resistance Assessment Methodology Based on International Standards

    International Nuclear Information System (INIS)

    Lee, Yong Deok; Lee, Jung Won; Lee, Kwang Seok

    2009-03-01

    Proliferation resistance is one of the requirements to be met in GEN IV and INPRO for next-generation nuclear energy systems. Internationally, work on PR evaluation methodology was initiated as early as 1980, but systematic development only started in the 2000s. In Korea, independent development of a PR evaluation methodology was started in 2007 as a long-term nuclear R&D project, to support the export of nuclear energy systems and to increase the international credibility and transparency of domestic nuclear system and fuel cycle development; the work is focused on developing a model for PR evaluation. In the first year, a comparative study of GEN-IV/INPRO, PR indicator development, indicator quantification, evaluation model development, and an analysis of technology systems and international technology development trends were performed. In the second year, a feasibility study of the indicators, their allowable limits, and a review of their technical requirements were carried out. The results of a PR evaluation must be applied at the beginning of the conceptual design of a nuclear system. Once the PR evaluation methodology is developed, it will also be applied in the regulatory requirements for authorization and permission that are to be developed

  19. Patients’ vs. Physicians’ Assessments of Emergencies: The Prudent Layperson Standard

    Directory of Open Access Journals (Sweden)

    Langdorf, Mark I

    2003-01-01

    Full Text Available Objective: To compare perception of the need for emergency care by emergency department (ED) patients vs. emergency physicians (EPs). Methods: Mailed survey to EPs and a convenience sample of ED patients. Survey rated urgency of acute sore throat, ankle injury, abdominal pain, and hemiparesis, as well as the best definition of “emergency.” Responses were compared with chi-square (p < .05). Results: 119/140 (85%) of EPs and 1453 ED patients responded. EPs were more likely to judge acute abdominal pain (79.8% vs. 43.4%, p < 0.001, odds ratio (OR) 5.16, 95% confidence interval (CI) 3.19-8.40) and hemiparesis (100% vs. 82.6%, p < 0.001, OR 24.9, 95% CI 3.75-94.4) as an emergency. Similar proportions of ED patients and EPs considered sore throat (12.2% vs. 7.6%, p = 0.18, OR 0.59, CI 0.27-1.23) and ankle injury (46.9% vs. 38.6%, p = 0.10, OR 0.71, CI 0.48-1.06) an emergency. EPs (35%) and ED patients (40%) agreed to a similar degree with the “prudent layperson” definition, “a condition that may result in death, permanent disability, or severe pain” (p = .36, OR 1.22, CI 0.81-1.84). EPs were more likely to add “the condition prevented work” (27% vs. 16%, p = 0.003, OR 0.51, CI 0.33-0.81). Patients more often added “occurred outside business hours” (15% vs. 4%, p = 0.002, OR 4.0, CI = 1.5-11.3). Conclusion: For serious complaints, ED patients’ thresholds for seeking care are higher than judged appropriate by EPs. Stroke is not uniformly recognized as an emergency. Absent consensus for the “correct” threshold, the prudent layperson standard is appropriate.
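    The comparisons above are standard 2x2 table statistics (proportions, chi-square p-values, odds ratios with 95% confidence intervals). A minimal sketch of how such an odds ratio and its approximate confidence interval can be computed, with invented counts rather than the study data:

```python
# Odds ratio and approximate 95% CI for a 2x2 table [[a, b], [c, d]] using the
# standard log-odds (Woolf) method. Counts below are invented for illustration.
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    low = math.exp(math.log(or_) - z * se_log_or)
    high = math.exp(math.log(or_) + z * se_log_or)
    return or_, (low, high)

# Example: 95 of 119 physicians vs. 630 of 1453 patients rating a complaint an emergency.
print(odds_ratio_ci(a=95, b=119 - 95, c=630, d=1453 - 630))
```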

  20. Comparative life cycle assessment of standard and green roofs.

    Science.gov (United States)

    Saiz, Susana; Kennedy, Christopher; Bass, Brad; Pressnail, Kim

    2006-07-01

    Life cycle assessment (LCA) is used to evaluate the benefits, primarily from reduced energy consumption, resulting from the addition of a green roof to an eight story residential building in Madrid. Building energy use is simulated and a bottom-up LCA is conducted assuming a 50 year building life. The key property of a green roof is its low solar absorptance, which causes lower surface temperature, thereby reducing the heat flux through the roof. Savings in annual energy use are just over 1%, but summer cooling load is reduced by over 6% and reductions in peak hour cooling load in the upper floors reach 25%. By replacing the common flat roof with a green roof, environmental impacts are reduced by between 1.0 and 5.3%. Similar reductions might be achieved by using a white roof with additional insulation for winter, but more substantial reductions are achieved if common use of green roofs leads to reductions in the urban heat island.

  1. Early universe cosmology. In supersymmetric extensions of the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Baumann, Jochen Peter

    2012-03-19

    In this thesis we investigate possible connections between cosmological inflation and leptogenesis on the one side and particle physics on the other side. We work in supersymmetric extensions of the Standard Model. A key role is played by the right-handed sneutrino, the superpartner of the right-handed neutrino involved in the type I seesaw mechanism. We study a combined model of inflation and non-thermal leptogenesis that is a simple extension of the Minimal Supersymmetric Standard Model (MSSM) with conserved R-parity, where we add three right-handed neutrino super fields. The inflaton direction is given by the imaginary components of the corresponding scalar component fields, which are protected from the supergravity (SUGRA) η-problem by a shift symmetry in the Kaehler potential. We discuss the model first in a globally supersymmetric (SUSY) and then in a supergravity context and compute the inflationary predictions of the model. We also study reheating and non-thermal leptogenesis in this model. A numerical simulation shows that shortly after the waterfall phase transition that ends inflation, the universe is dominated by right-handed sneutrinos and their out-of-equilibrium decay can produce the desired matter-antimatter asymmetry. Using a simplified time-averaged description, we derive analytical expressions for the model predictions. Combining the results from inflation and leptogenesis allows us to constrain the allowed parameter space from two different directions, with implications for low energy neutrino physics. As a second thread of investigation, we discuss a generalisation of the inflationary model discussed above to include gauge non-singlet fields as inflatons. This is motivated by the fact that in left-right symmetric, supersymmetric Grand Unified Theories (SUSY GUTs), like SUSY Pati-Salam unification or SUSY SO(10) GUTs, the right-handed (s)neutrino is an indispensable ingredient and does not have to be put in by hand as in the MSSM. We discuss

  2. Early universe cosmology. In supersymmetric extensions of the standard model

    International Nuclear Information System (INIS)

    Baumann, Jochen Peter

    2012-01-01

    In this thesis we investigate possible connections between cosmological inflation and leptogenesis on the one side and particle physics on the other side. We work in supersymmetric extensions of the Standard Model. A key role is played by the right-handed sneutrino, the superpartner of the right-handed neutrino involved in the type I seesaw mechanism. We study a combined model of inflation and non-thermal leptogenesis that is a simple extension of the Minimal Supersymmetric Standard Model (MSSM) with conserved R-parity, where we add three right-handed neutrino super fields. The inflaton direction is given by the imaginary components of the corresponding scalar component fields, which are protected from the supergravity (SUGRA) η-problem by a shift symmetry in the Kaehler potential. We discuss the model first in a globally supersymmetric (SUSY) and then in a supergravity context and compute the inflationary predictions of the model. We also study reheating and non-thermal leptogenesis in this model. A numerical simulation shows that shortly after the waterfall phase transition that ends inflation, the universe is dominated by right-handed sneutrinos and their out-of-equilibrium decay can produce the desired matter-antimatter asymmetry. Using a simplified time-averaged description, we derive analytical expressions for the model predictions. Combining the results from inflation and leptogenesis allows us to constrain the allowed parameter space from two different directions, with implications for low energy neutrino physics. As a second thread of investigation, we discuss a generalisation of the inflationary model discussed above to include gauge non-singlet fields as inflatons. This is motivated by the fact that in left-right symmetric, supersymmetric Grand Unified Theories (SUSY GUTs), like SUSY Pati-Salam unification or SUSY SO(10) GUTs, the righthanded (s)neutrino is an indispensable ingredient and does not have to be put in by hand as in the MSSM. We discuss the

  3. Baryon number dissipation at finite temperature in the standard model

    International Nuclear Information System (INIS)

    Mottola, E.; Raby, S.; Starkman, G.

    1990-01-01

    We analyze the phenomenon of baryon number violation at finite temperature in the standard model, and derive the relaxation rate for the baryon density in the high temperature electroweak plasma. The relaxation rate γ is given in terms of real time correlation functions of the operator E·B, and is directly proportional to the sphaleron transition rate Γ: γ ≲ n_f Γ/T³. Hence it is not instanton suppressed, as claimed by Cohen, Dugan and Manohar (CDM). We show explicitly how this result is consistent with the methods of CDM, once it is recognized that a new anomalous commutator is required in their approach. 19 refs., 2 figs
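    The statement that γ is a relaxation rate can be written out schematically; the exact expression in terms of the E·B correlation functions is derived in the paper, and the form below only records the scaling quoted in the abstract.

```latex
% Schematic relaxation of the baryon density n_B, with the rate bounded by the
% sphaleron transition rate Gamma as quoted in the abstract.
\frac{\mathrm{d} n_B}{\mathrm{d} t} = -\gamma\, n_B
\quad\Longrightarrow\quad
n_B(t) = n_B(0)\, e^{-\gamma t},
\qquad
\gamma \;\lesssim\; \frac{n_f\, \Gamma}{T^3}.
```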

  4. Standard Model-like corrections to Dilatonic Dynamics

    DEFF Research Database (Denmark)

    Antipin, Oleg; Krog, Jens; Mølgaard, Esben

    2013-01-01

    the same non-abelian global symmetries as a technicolor-like theory with matter in a complex representation of the gauge group. We then embed the electroweak gauge group within the global flavor structure and add also ordinary quark-like states to mimic the effects of the top. We find that the standard model-like induced corrections modify the original phase diagram and the details of the dilatonic spectrum. In particular, we show that the corrected theory exhibits near-conformal behavior for a smaller range of flavors and colors. For this range of values, however, our results suggest that near

  5. B_{s,d} -> l+ l- in the Standard Model

    CERN Document Server

    Bobeth, Christoph; Hermann, Thomas; Misiak, Mikolaj; Stamou, Emmanuel; Steinhauser, Matthias

    2014-01-01

    We combine our new results for the O(alpha_em) and O(alpha_s^2) corrections to B_{s,d} -> l^+ l^-, and present updated branching ratio predictions for these decays in the standard model. Inclusion of the new corrections removes major theoretical uncertainties of perturbative origin that have just begun to dominate over the parametric ones. For the recently observed muonic decay of the B_s meson, our calculation gives BR(B_s -> mu^+ mu^-) = (3.65 ± 0.23) * 10^(-9).

  6. Dark Matter and Color Octets Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Krnjaic, Gordan Zdenko [Johns Hopkins Univ., Baltimore, MD (United States)

    2012-07-01

    Although the Standard Model (SM) of particles and interactions has survived forty years of experimental tests, it does not provide a complete description of nature. From cosmological and astrophysical observations, it is now clear that the majority of matter in the universe is not baryonic and interacts very weakly (if at all) via non-gravitational forces. The SM does not provide a dark matter candidate, so new particles must be introduced. Furthermore, recent Tevatron results suggest that SM predictions for benchmark collider observables are in tension with experimental observations. In this thesis, we will propose extensions to the SM that address each of these issues.

  7. Modeling RHIC using the standard machine formal accelerator description

    International Nuclear Information System (INIS)

    Pilat, F.; Trahern, C.G.; Wei, J.

    1997-01-01

    The Standard Machine Format (SMF) is a structured description of accelerator lattices which supports both the hierarchy of beam lines and generic lattice objects as well as those deviations (field errors, alignment errors, etc.) associated with each component of the as-installed machine. In this paper we discuss the use of SMF to describe the Relativistic Heavy Ion Collider (RHIC) as well as the ancillary data structures (such as field quality measurements) that are necessarily incorporated into the RHIC SMF model. Future applications of SMF are outlined, including its use in the RHIC operational environment

  8. What is special about the group of the standard model?

    International Nuclear Information System (INIS)

    Nielsen, H.B.; Brene, N.

    1989-03-01

    The standard model is based on the algebra of U(1)×SU(2)×SU(3). The systematics of charges of the fundamental fermions seems to suggest the importance of a particular group having this algebra, viz. S(U(2)×U(3)). This group is distinguished from all other connected compact non semisimple groups with dimensionality up to 12 by a characteristic property: it is very 'skew'. By this we mean that the group has relatively few 'generalised outer automorphisms'. One may speculate about physical reasons for this fact. (orig.)

  9. Developments in standard model: electroweak theory/phenomenology

    International Nuclear Information System (INIS)

    Deshpande, N.G.

    1986-01-01

    The authors review new developments in four topics. Higgs detection in the intermediate mass range (between 100 GeV and 2M_W) is discussed in detail. It is found that the backgrounds are a serious problem in hadronic colliders except for purely leptonic signals, which unfortunately have low event rates. Recent work on topological solutions to the standard model, with new states in the TeV range, is discussed. A large rate of BB vector production at the SSC may allow determination of rare modes of B decay. The fourth topic concerns the feasibility of detecting Horizontal gauge bosons at the SSC. 17 references, 9 figures

  10. The strong interactions beyond the standard model of particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Bergner, Georg [Muenster Univ. (Germany). Inst. for Theoretical Physics

    2016-11-01

    SuperMUC is one of the most convenient high performance machines for our project since it offers a high performance and flexibility regarding different applications. This is of particular importance for investigations of new theories, where on the one hand the parameters and systematic uncertainties have to be estimated in smaller simulations and on the other hand a large computational performance is needed for the estimations of the scale at zero temperature. Our project is just the first investigation of the new physics beyond the standard model of particle physics and we hope to proceed with our studies towards more involved Technicolour candidates, supersymmetric QCD, and extended supersymmetry.

  11. Detecting physics beyond the Standard Model with the REDTOP experiment

    Science.gov (United States)

    González, D.; León, D.; Fabela, B.; Pedraza, M. I.

    2017-10-01

    REDTOP is an experiment at its proposal stage. It belongs to the High Intensity class of experiments. REDTOP will use a 1.8 GeV continuous proton beam impinging on a fixed target. It is expected to produce about 10¹³ η mesons per year. The main goal of REDTOP is to look for physics beyond the Standard Model by detecting rare η decays. The detector is designed with innovative technologies based on the detection of prompt Cherenkov light, such that interesting events can be observed and the background events are efficiently rejected. The experimental design, the physics program and the running plan of the experiment are presented.

  12. Coset Space Dimensional Reduction approach to the Standard Model

    International Nuclear Information System (INIS)

    Farakos, K.; Kapetanakis, D.; Koutsoumbas, G.; Zoupanos, G.

    1988-01-01

    We present a unified theory in ten dimensions based on the gauge group E_8, which is dimensionally reduced to the Standard Model group SU(3)_c×SU(2)_L×U(1), which breaks further spontaneously to SU(3)_c×U(1)_em. The model gives predictions for sin²θ_W and proton decay similar to those of the minimal SU(5) GUT, while a natural choice of the coset space radii predicts light Higgs masses a la Coleman-Weinberg

  13. What is special about the group of the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, H.B.; Brene, N.

    1989-06-15

    The standard model is based on the algebra of U(1)×SU(2)×SU(3). The systematics of charges of the fundamental fermions seems to suggest the importance of a particular group having this algebra, viz. S(U(2)×U(3)). This group is distinguished from all other connected compact non semisimple groups with dimensionality up to 12 by a characteristic property: it is very ''skew''. By this we mean that the group has relatively few ''generalised outer automorphisms''. One may speculate about physical reasons for this fact. (orig.).

  14. What is special about the group of the standard model?

    Science.gov (United States)

    Nielsen, H. B.; Brene, N.

    1989-06-01

    The standard model is based on the algebra of U(1)×SU(2)×SU(3). The systematics of charges of the fundamental fermions seems to suggest the importance of a particular group having this algebra, viz. S(U(2)×U(3)). This group is distinguished from all other connected compact non semisimple groups with dimensionality up to 12 by a characteristic property: it is very “skew”. By this we mean that the group has relatively few “generalised outer automorphisms”. One may speculate about physical reasons for this fact.

  15. High Mass Standard Model Higgs searches at the Tevatron

    Directory of Open Access Journals (Sweden)

    Petridis Konstantinos A.

    2012-06-01

    Full Text Available We present the results of searches for the Standard Model Higgs boson decaying predominantly to W⁺W⁻ pairs, at a center-of-mass energy of √s = 1.96 TeV, using up to 8.2 fb⁻¹ of data collected with the CDF and D0 detectors at the Fermilab Tevatron collider. The analysis techniques and the various channels considered are discussed. These searches result in exclusions across the Higgs mass ranges 156.5 < m_H < 173.7 GeV for CDF and 161 < m_H < 170 GeV for D0.

  16. Future high precision experiments and new physics beyond Standard Model

    International Nuclear Information System (INIS)

    Luo, Mingxing.

    1993-01-01

    High precision (< 1%) electroweak experiments that have been done, or are likely to be done in this decade, are examined on the basis of Standard Model (SM) predictions, at the one-loop level, of fourteen weak neutral current observables and fifteen W and Z properties; the implications of the corresponding experimental measurements for various types of possible new physics entering at tree or loop level are investigated. Certain experiments appear to have special promise as probes of the new physics considered here

  17. Assessing the reliability of the borderline regression method as a standard setting procedure for objective structured clinical examination

    Directory of Open Access Journals (Sweden)

    Sara Mortaz Hejri

    2013-01-01

    Full Text Available Background: One of the methods used for standard setting is the borderline regression method (BRM). This study aims to assess the reliability of BRM when the pass-fail standard in an objective structured clinical examination (OSCE) was calculated by averaging the BRM standards obtained for each station separately. Materials and Methods: In nine stations of the OSCE with direct observation, the examiners gave each student a checklist score and a global score. Using a linear regression model for each station, we calculated the checklist score cut-off from the regression equation at the global scale cut-off, set at 2. The OSCE pass-fail standard was defined as the average of all stations' standards. To determine the reliability, the root mean square error (RMSE) was calculated. The R² coefficient and the inter-grade discrimination were calculated to assess the quality of the OSCE. Results: The mean total test score was 60.78. The OSCE pass-fail standard and its RMSE were 47.37 and 0.55, respectively. The R² coefficients ranged from 0.44 to 0.79. The inter-grade discrimination score varied greatly among stations. Conclusion: The RMSE of the standard was very small, indicating that BRM is a reliable method of setting standards for an OSCE, which has the advantage of providing data for quality assurance.
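    The calculation described above is easy to make concrete. The sketch below uses invented scores and a hypothetical helper function; only the procedure (per-station ordinary least squares of checklist scores on global ratings, cut score read off at the borderline global rating of 2, then averaged across stations) follows the abstract.

```python
# Borderline regression sketch: regress checklist scores on global ratings per
# station, take the regression prediction at the borderline global rating, and
# average the station cut scores to get the OSCE pass-fail standard.
# All scores below are invented for illustration.

def station_cut_score(checklist: list[float], global_ratings: list[float],
                      borderline: float = 2.0) -> float:
    n = len(checklist)
    mean_x = sum(global_ratings) / n
    mean_y = sum(checklist) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(global_ratings, checklist))
    sxx = sum((x - mean_x) ** 2 for x in global_ratings)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept + slope * borderline

# Two illustrative stations: checklist scores (out of 100) and global ratings (1-5).
stations = [
    ([35, 48, 62, 75, 88], [1, 2, 3, 4, 5]),
    ([40, 52, 60, 71, 83], [1, 2, 3, 4, 5]),
]
cut_scores = [station_cut_score(c, g) for c, g in stations]
osce_standard = sum(cut_scores) / len(cut_scores)
print(cut_scores, osce_standard)
```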

  18. Dose assessment models. Annex A

    International Nuclear Information System (INIS)

    1982-01-01

    The models presented in this chapter have been separated into 2 general categories: environmental transport models which describe the movement of radioactive materials through all sectors of the environment after their release, and dosimetric models to calculate the absorbed dose following an intake of radioactive materials or exposure to external irradiation. Various sections of this chapter also deal with atmospheric transport models, terrestrial models, and aquatic models.

  19. Standard Model CP-violation and baryon asymmetry

    CERN Document Server

    Gavela, M.B.; Orloff, J.; Pene, O.

    1994-01-01

    Simply based on CP arguments, we argue against a Standard Model explanation of the baryon asymmetry of the universe in the presence of a first order phase transition. A CP-asymmetry is found in the reflection coefficients of quarks hitting the phase boundary created during the electroweak transition. The problem is analyzed both in an academic zero temperature case and in the realistic finite temperature one. The building blocks are similar in both cases: Kobayashi-Maskawa CP-violation, CP-even phases in the reflection coefficients of quarks, and physical transitions due to fermion self-energies. In both cases an effect is present at order $\\alpha_W^2$ in rate. A standard GIM behaviour is found as intuitively expected. In the finite temperature case, a crucial role is played by the damping rate of quasi-particles in a hot plasma, which is a relevant scale together with $M_W$ and the temperature. The effect is many orders of magnitude below what observation requires, and indicates that non standard physics is ...

  20. Big bang nucleosynthesis: The standard model and alternatives

    Science.gov (United States)

    Schramm, David N.

    1991-01-01

    Big bang nucleosynthesis provides (with the microwave background radiation) one of the two quantitative experimental tests of the big bang cosmological model. This paper reviews the standard homogeneous-isotropic calculation and shows how it fits the light element abundances ranging from He-4 at 24% by mass through H-2 and He-3 at parts in 10⁵ down to Li-7 at parts in 10¹⁰. Furthermore, the recent Large Electron Positron (LEP) (and Stanford Linear Collider (SLC)) results on the number of neutrinos are discussed as a positive laboratory test of the standard scenario. Discussion is presented on the improved observational data as well as the improved neutron lifetime data. Alternate scenarios of decaying matter or of quark-hadron induced inhomogeneities are discussed. It is shown that when these scenarios are made to fit the observed abundances accurately, the resulting conclusions on the baryonic density relative to the critical density, Ω_b, remain approximately the same as in the standard homogeneous case, thus adding to the robustness of the conclusion that Ω_b approximately equals 0.06. This latter point is the driving force behind the need for non-baryonic dark matter (assuming Ω_total = 1) and the need for dark baryonic matter, since Ω_visible is less than Ω_b.

  1. Big bang nucleosynthesis: The standard model and alternatives

    International Nuclear Information System (INIS)

    Schramm, D.N.

    1991-01-01

    Big bang nucleosynthesis provides (with the microwave background radiation) one of the two quantitative experimental tests of the big bang cosmological model. This paper reviews the standard homogeneous-isotropic calculation and shows how it fits the light element abundances ranging from ⁴He at 24% by mass through ²H and ³He at parts in 10⁵ down to ⁷Li at parts in 10¹⁰. Furthermore, the recent LEP (and SLC) results on the number of neutrinos are discussed as a positive laboratory test of the standard scenario. Discussion is presented on the improved observational data as well as the improved neutron lifetime data. Alternate scenarios of decaying matter or of quark-hadron induced inhomogeneities are discussed. It is shown that when these scenarios are made to fit the observed abundances accurately, the resulting conclusions on the baryonic density relative to the critical density, Ω_b, remain approximately the same as in the standard homogeneous case, thus adding to the robustness of the conclusion that Ω_b ≅ 0.06. This latter point is the driving force behind the need for non-baryonic dark matter (assuming Ω_total = 1) and the need for dark baryonic matter, since Ω_visible < Ω_b. (orig.)

  2. Characteristics of States' Alternate Assessments Based on Modified Academic Achievement Standards in 2008. Synthesis Report 72

    Science.gov (United States)

    Albus, Deb; Lazarus, Sheryl S.; Thurlow, Martha L.; Cormier, Damien

    2009-01-01

    In April 2007, Federal No Child Left Behind regulations were finalized that provided states with additional flexibility for assessing some students with disabilities. The regulations allowed states to offer another assessment option, alternate assessments based on modified academic achievement standards (AA-MAS). States are not required to have…

  3. Physics beyond the standard model in the non-perturbative unification scheme

    International Nuclear Information System (INIS)

    Kapetanakis, D.; Zoupanos, G.

    1990-01-01

    The non-perturbative unification scenario predicts reasonably well the low energy gauge couplings of the standard model. Agreement with the measured low energy couplings is obtained by assuming a certain kind of physics beyond the standard model. A number of possibilities for physics beyond the standard model are examined. The best candidates so far are the standard model with eight fermionic families and a similar number of Higgs doublets, and the supersymmetric standard model with five families. (author)

  4. Development of a standard equipment management model for nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Hee Seung; Ju, Tae Young; Kim, Jung Wun [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2012-10-15

    Most utilities that have achieved high performance have introduced a management model to improve performance and operate plants safely. The Nuclear Energy Institute has developed and updated its Standard Nuclear Performance Model (SNPM) in order to provide a summary of nuclear processes, cost definitions, and key business performance measures for business performance comparison and benchmarking. Over the past decade, Korea Hydro and Nuclear Power Co. (KHNP) has introduced and implemented many engineering processes such as Equipment Reliability (ER), Maintenance Rule (MR), Single Point Vulnerability (SPV), Corrective Action Program (CAP), and Self Assessment (SA) to improve plant performance and to sustain high performance. Some processes, however, are not well interfaced with other processes, because they were developed separately and were focused on the process itself. KHNP is developing a Standard Equipment Management Model (SEMM) to integrate these engineering processes and to improve the interrelation among the processes. In this paper, a draft model and attributes of the SEMM are discussed.

  5. Development of a standard equipment management model for nuclear power plants

    International Nuclear Information System (INIS)

    Chang, Hee Seung; Ju, Tae Young; Kim, Jung Wun

    2012-01-01

    Most utilities that have achieved high performance have introduced a management model to improve performance and operate plants safely. The Nuclear Energy Institute has developed and updated its Standard Nuclear Performance Model (SNPM) in order to provide a summary of nuclear processes, cost definitions, and key business performance measures for business performance comparison and benchmarking. Over the past decade, Korea Hydro and Nuclear Power Co. (KHNP) has introduced and implemented many engineering processes such as Equipment Reliability (ER), Maintenance Rule (MR), Single Point Vulnerability (SPV), Corrective Action Program (CAP), and Self Assessment (SA) to improve plant performance and to sustain high performance. Some processes, however, are not well interfaced with other processes, because they were developed separately and were focused on the process itself. KHNP is developing a Standard Equipment Management Model (SEMM) to integrate these engineering processes and to improve the interrelation among the processes. In this paper, a draft model and attributes of the SEMM are discussed

  6. The hadronic standard model for strong and electroweak interactions

    International Nuclear Information System (INIS)

    Raczka, R.

    1993-01-01

    We propose a new model for strong and electro-weak interactions. First, we review various QCD predictions for hadron-hadron and lepton-hadron processes. We indicate that the present formulation of strong interactions in the framework of Quantum Chromodynamics encounters serious conceptual and numerical difficulties in a reliable description of hadron-hadron and lepton-hadron interactions. Next we propose to replace the strong sector of the Standard Model, based on unobserved quarks and gluons, by a strong sector based on the set of observed baryons and mesons, determined by the spontaneously broken SU(6) gauge field theory model. We analyse various properties of this model such as asymptotic freedom, Reggeization of gauge bosons and fundamental fermions, baryon-baryon and meson-baryon high energy scattering, generation of Λ-polarization in inclusive processes and others. Finally we extend this model by an electro-weak sector. We demonstrate a remarkable lepton and hadron anomaly cancellation and we analyse a series of important lepton-hadron and hadron-hadron processes such as e⁺e⁻ → hadrons, e⁺e⁻ → W⁺W⁻, e⁺e⁻ → p + anti-p, e + p → e + p and p + anti-p → p + anti-p. We obtained a series of interesting new predictions in this model, especially for processes with polarized particles. We estimated the value of the strong coupling constant α(M_Z) and predicted the top baryon mass M_{Λt} ≅ 240 GeV. Since in our model the proton, neutron, Λ-particles, vector mesons like ρ, ω, φ, J/ψ etc. and leptons are elementary, most of the experimentally analysed lepton-hadron and hadron-hadron processes in the LEP1, LEP2, LEAR, HERA, HERMES, LHC and SSC experiments may be relatively easily analysed in our model. (author). 252 refs, 65 figs, 1 tab

  7. New phenomena in the standard no-scale supergravity model

    CERN Document Server

    Kelley, S; Nanopoulos, Dimitri V; Zichichi, Antonino; Kelley, S; Lopez, J L; Nanopoulos, D V; Zichichi, A

    1994-01-01

    We revisit the no-scale mechanism in the context of the simplest no-scale supergravity extension of the Standard Model. This model has the usual five-dimensional parameter space plus an additional parameter ξ_{3/2} ≡ m_{3/2}/m_{1/2}. We show how predictions of the model may be extracted over the whole parameter space. A necessary condition for the potential to be stable is Str M⁴ > 0, which is satisfied if m_{3/2} ≲ 2 m_{q̃}. Order of magnitude calculations reveal a no-lose theorem guaranteeing interesting and potentially observable new phenomena in the neutral scalar sector of the theory which would constitute a “smoking gun” of the no-scale mechanism. This new phenomenology is model-independent and divides into three scenarios, depending on the ratio of the weak scale to the vev at the minimum of the no-scale direction. We also calculate the residual vacuum energy at the unification scale (C₀ m⁴_{3/2}), and find that in typical models one must require C₀ > 10. Such constrai...

  8. How to use the Standard Model effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Henning, Brian; Lu, Xiaochuan [Department of Physics, University of California, Berkeley,Berkeley, California 94720 (United States); Theoretical Physics Group, Lawrence Berkeley National Laboratory,Berkeley, California 94720 (United States); Murayama, Hitoshi [Department of Physics, University of California, Berkeley,Berkeley, California 94720 (United States); Theoretical Physics Group, Lawrence Berkeley National Laboratory,Berkeley, California 94720 (United States); Kavli Institute for the Physics and Mathematics of the Universe (WPI),Todai Institutes for Advanced Study, University of Tokyo,Kashiwa 277-8583 (Japan)

    2016-01-05

    We present a practical three-step procedure of using the Standard Model effective field theory (SM EFT) to connect ultraviolet (UV) models of new physics with weak scale precision observables. With this procedure, one can interpret precision measurements as constraints on a given UV model. We give a detailed explanation for calculating the effective action up to one-loop order in a manifestly gauge covariant fashion. This covariant derivative expansion method dramatically simplifies the process of matching a UV model with the SM EFT, and also makes available a universal formalism that is easy to use for a variety of UV models. A few general aspects of RG running effects and choosing operator bases are discussed. Finally, we provide mapping results between the bosonic sector of the SM EFT and a complete set of precision electroweak and Higgs observables to which present and near future experiments are sensitive. Many results and tools which should prove useful to those wishing to use the SM EFT are detailed in several appendices.
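    For orientation, the covariant derivative expansion mentioned above starts from the gauge-covariant one-loop effective action obtained by integrating out a heavy multiplet of mass M; written very schematically (coefficients, spin factors and the precise operator structure are spelled out in the paper):

```latex
% Schematic one-loop effective action for a heavy multiplet of mass M with
% field-dependent mass term U(x); P_mu = i D_mu is the covariant momentum.
\Delta S_{\mathrm{eff}} \;\propto\; \mathrm{Tr}\,\log\!\left(-P^2 + M^2 + U(x)\right),
\qquad P_\mu \equiv i D_\mu .
```

    Expanding the logarithm in inverse powers of M then yields the Wilson coefficients of the SM EFT operators, which is the matching step referred to in the abstract.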

  9. Electroweak baryogenesis in extensions of the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Fromme, L.

    2006-07-07

    We investigate the generation of the baryon asymmetry in two extensions of the Standard Model; these are the φ⁶ and the two-Higgs-doublet model. Analyzing the thermal potential in the presence of CP violation, we find a strong first order phase transition for a wide range of parameters in both models. We compute the relevant bubble wall properties which then enter the transport equations. In non-supersymmetric models electroweak baryogenesis is dominated by top transport, which we treat in the WKB approximation. We calculate the CP-violating source terms starting from the Dirac equation. We show how to resolve discrepancies between this treatment and the computation in the Schwinger-Keldysh formalism. Furthermore, we keep inelastic scatterings of quarks and W bosons at a finite rate, which considerably affects the amount of the generated baryon asymmetry depending on the bubble wall velocity. In addition, we improve the transport equations by novel source terms which are generated by CP-conserving perturbations in the plasma. It turns out that their effect is relatively small. Both models under consideration predict a baryon to entropy ratio close to the observed value for a large part of the parameter space without being in conflict with constraints on electric dipole moments. (orig.)

  10. Electroweak baryogenesis in extensions of the standard model

    International Nuclear Information System (INIS)

    Fromme, L.

    2006-01-01

    We investigate the generation of the baryon asymmetry in two extensions of the Standard Model; these are the Φ 6 and the two-Higgs-doublet model. Analyzing the thermal potential in the presence of CP violation, we find a strong first order phase transition for a wide range of parameters in both models. We compute the relevant bubble wall properties which then enter the transport equations. In non-supersymmetric models electroweak baryogenesis is dominated by top transport, which we treat in the WKB approximation. We calculate the CP-violating source terms starting from the Dirac equation. We show how to resolve discrepancies between this treatment and the computation in the Schwinger-Keldysh formalism. Furthermore, we keep inelastic scatterings of quarks and W bosons at a finite rate, which considerably affects the amount of the generated baryon asymmetry depending on the bubble wall velocity. In addition, we improve the transport equations by novel source terms which are generated by CP-conserving perturbations in the plasma. It turns out that their effect is relatively small. Both models under consideration predict a baryon to entropy ratio close to the observed value for a large part of the parameter space without being in conflict with constraints on electric dipole moments. (orig.)

  11. NASA Standard for Models and Simulations (M and S): Development Process and Rationale

    Science.gov (United States)

    Zang, Thomas A.; Blattnig, Steve R.; Green, Lawrence L.; Hemsch, Michael J.; Luckring, James M.; Morison, Joseph H.; Tripathi, Ram K.

    2009-01-01

    After the Columbia Accident Investigation Board (CAIB) report, the NASA Administrator at that time chartered an executive team (known as the Diaz Team) to identify the CAIB report elements with Agency-wide applicability, and to develop corrective measures to address each element. This report documents the chronological development and release of an Agency-wide Standard for Models and Simulations (M&S) (NASA Standard 7009) in response to Action #4 from the report, "A Renewed Commitment to Excellence: An Assessment of the NASA Agency-wide Applicability of the Columbia Accident Investigation Board Report, January 30, 2004".

  12. Neutron electric dipole moment in the minimal supersymmetric standard model

    International Nuclear Information System (INIS)

    Inui, T.; Mimura, Y.; Sakai, N.; Sasaki, T.

    1995-01-01

    The neutron electric dipole moment (EDM) due to the single quark EDM and to the transition EDM is calculated in the minimal supersymmetric standard model. Assuming that the Cabibbo-Kobayashi-Maskawa matrix at the grand unification scale is the only source of CP violation, complex phases are induced in the parameters of soft supersymmetry breaking at low energies. The chargino one-loop diagram is found to give the dominant contribution of the order of 10^-27 to 10^-29 e·cm for the quark EDM, assuming the light chargino mass and the universal scalar mass to be 50 GeV and 100 GeV, respectively. Therefore the neutron EDM in this class of model is difficult to measure experimentally. The gluino one-loop diagram also contributes due to the flavor changing gluino coupling. The transition EDM is found to give dominant contributions for certain parameter regions. (orig.)

  13. Can the "standard" unitarized Regge models describe the TOTEM data?

    CERN Document Server

    Alkin, A; Martynov, E

    2013-01-01

    The standard Regge poles are considered as inputs for two unitarization methods: eikonal and U-matrix. It is shown that only models with three input pomerons and two input odderons can describe the high energy data on $pp$ and $\bar pp$ elastic scattering, including the new data from the Tevatron and the LHC. However, it seems that both of the considered models require a further modification (e.g. nonlinear reggeon trajectories and/or nonexponential vertex functions) for a more satisfactory description of the data at 19.0 GeV$\leq \sqrt{s}\leq$ 7 TeV and 0.01 $\leq |t|\leq $14.2 GeV$^{2}$.

  14. Electroweak symmetry breaking and beyond the standard model

    International Nuclear Information System (INIS)

    Barklow, T.; Dawson, S.; Haber, H.E.

    1995-05-01

    The development of the Standard Model of particle physics is a remarkable success story. Its many facets have been tested at present day accelerators; no significant unambiguous deviations have yet been found. In some cases, the model has been verified at an accuracy of better than one part in a thousand. This state of affairs presents our field with a challenge. Where do we go from here? What is our vision for future developments in particle physics? Are particle physicists' recent successes a signal of the field's impending demise, or do real long-term prospects exist for further progress? We assert that the long-term health and intellectual vitality of particle physics depends crucially on the development of a new generation of particle colliders that push the energy frontier by an order of magnitude beyond present capabilities. In this report, we address the scientific issues underlying this assertion.

  15. Alive and well: A short review about standard solar models

    Energy Technology Data Exchange (ETDEWEB)

    Serenelli, Aldo [Campus UAB, Carrer de Can Magrans S/N, Instituto de Ciencias del Espacio (ICE/CSIC-IEEC), Cerdanyola del Valles (Spain)

    2016-04-15

    Standard solar models (SSMs) provide a reference framework across a number of research fields: solar and stellar models, solar neutrinos, and particle physics being the most conspicuous among them. The accuracy of the physical description of the global properties of the Sun that SSMs provide has been challenged in the last decade by a number of developments in stellar spectroscopic techniques. Over the same period of time, solar neutrino experiments, and Borexino in particular, have measured the four solar neutrino fluxes from the pp-chains that are associated with 99% of the nuclear energy generated in the Sun. Borexino has also set the most stringent limit on CNO energy generation, only ∼40% larger than predicted by SSMs. More recently, and for the first time, radiative opacity experiments have been performed at conditions that closely resemble those at the base of the solar convective envelope. In this article, we review these developments and discuss the current status of SSMs, including their intrinsic limitations. (orig.)

  16. Angular correlations in top quark decays in standard model extensions

    International Nuclear Information System (INIS)

    Batebi, S.; Etesami, S. M.; Mohammadi-Najafabadi, M.

    2011-01-01

    The CMS Collaboration at the CERN LHC has searched for t-channel single top quark production using the spin correlation characteristic of the t-channel. The signal extraction and cross section measurement rely on the angular distribution of the charged lepton in top quark decays, i.e. the angle between the charged lepton momentum and the top spin in the top rest frame. This angular distribution has a distinct slope for t-channel single top (signal) while it is flat for the backgrounds. In this Brief Report, we investigate the contributions which this spin correlation may receive from a two-Higgs-doublet model, a top-color-assisted technicolor (TC2) model, and the noncommutative extension of the standard model.
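
    For reference, the distribution exploited in such analyses has the textbook form (a general result for a spin-analysing decay product, not a formula quoted from this record) $\frac{1}{\Gamma}\frac{d\Gamma}{d\cos\theta_l} = \frac{1}{2}\left(1 + P\,\alpha_l \cos\theta_l\right)$, where $P$ is the top polarization along the chosen spin axis and $\alpha_l \simeq 1$ for the charged lepton at leading order in the Standard Model; a polarized sample ($P \neq 0$) gives the sloped distribution, while unpolarized backgrounds ($P \approx 0$) are flat.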

  17. Supersymmetric quantum mechanics, spinors and the standard model

    International Nuclear Information System (INIS)

    Woit, P.

    1988-01-01

    The quantization of the simplest supersymmetric quantum mechanical theory of a free fermion on a Riemannian manifold requires the introduction of a complex structure on the tangent space. In 4 dimensions, the subgroup of the group of frame rotations that preserves the complex structure is SU(2) x U(1), and it is argued that this symmetry can be consistently interpreted as an internal gauge symmetry for the analytically continued theory in Minkowski space. The states of the theory carry the quantum numbers of a generation of leptons in the Weinberg-Salam model. Examination of the geometry of spinors in four dimensions also provides a natural SU(3) symmetry and a very simple construction of a multiplet with the standard model quantum numbers. (orig.)

  18. Decay of the standard model Higgs field after inflation

    CERN Document Server

    Figueroa, Daniel G; Torrenti, Francisco

    2015-01-01

    We study the nonperturbative dynamics of the Standard Model (SM) after inflation, in the regime where the SM is decoupled from (or weakly coupled to) the inflationary sector. We use classical lattice simulations in an expanding box in (3+1) dimensions, modeling the SM gauge interactions with both global and Abelian-Higgs analogue scenarios. We consider different post-inflationary expansion rates. During inflation, the Higgs forms a condensate, which starts oscillating soon after inflation ends. Via nonperturbative effects, the oscillations lead to a fast decay of the Higgs into the SM species, transferring most of the energy into $Z$ and $W^{\\pm}$ bosons. All species are initially excited far away from equilibrium, but their interactions lead them into a stationary stage, with exact equipartition among the different energy components. From there on the system eventually reaches equilibrium. We have characterized in detail, in the different expansion histories considered, the evolution of the Higgs and of its ...

  19. From the CERN web: Standard Model, SESAME and more

    CERN Multimedia

    2015-01-01

    This section highlights articles, blog posts and press releases published in the CERN web environment over the past weeks. This way, you won’t miss a thing...   Left: ATLAS non-leptonic MWZ data. Right: ATLAS σ × B exclusion for W’ → WZ. Is the Standard Model about to crater? 28 October – CERN Courier The Standard Model is coming under more and more pressure from experiments. New results from the analysis of LHC’s Run 1 data show effects that, if confirmed, would be the signature of new interactions at the TeV scale. Continue to read…      Students and teachers participate in lectures about CERN science at the first ever SESAME teacher and students school. New CERN programme to develop network between SESAME schools 22 October - by Harriet Jarlett In September CERN welcomed 28 visitors from the Middle East for the first ever student and teacher school f...

  20. Collider physics within the standard model a primer

    CERN Document Server

    Altarelli, Guido

    2017-01-01

    With this graduate-level primer, the principles of the standard model of particle physics receive a particularly skillful, personal and enduring exposition by one of the great contributors to the field. In 2013 the late Prof. Altarelli wrote: The discovery of the Higgs boson and the non-observation of new particles or exotic phenomena have made a big step towards completing the experimental confirmation of the standard model of fundamental particle interactions. It is thus a good moment for me to collect, update and improve my graduate lecture notes on quantum chromodynamics and the theory of electroweak interactions, with main focus on collider physics. I hope that these lectures can provide an introduction to the subject for the interested reader, assumed to be already familiar with quantum field theory and some basic facts in elementary particle physics as taught in undergraduate courses. “These lecture notes are a beautiful example of Guido’s unique pedagogical abilities and scientific vision”. From...

  1. CP violation and flavour mixing in the standard model

    International Nuclear Information System (INIS)

    Ali, A.; London, D.

    1995-08-01

    We review and update the constraints on the parameters of the quark flavour mixing matrix V_CKM in the standard model and estimate the resulting CP asymmetries in B decays, taking into account recent experimental and theoretical developments. In performing our fits, we use inputs from the measurements of the following quantities: (i) |ε|, the CP-violating parameter in K decays, (ii) ΔM_d, the mass difference due to B^0_d - anti-B^0_d mixing, (iii) the matrix elements |V_cb| and |V_ub|, (iv) B-hadron lifetimes, and (v) the top quark mass. The experimental input in points (ii)-(v) has improved compared to our previous fits. With the updated CKM matrix we present the currently allowed range of the ratios |V_td/V_ts| and |V_td/V_ub|, as well as the standard model predictions for the B^0_s - anti-B^0_s mixing parameter x_s (or, equivalently, ΔM_s) and the quantities sin 2α, sin 2β and sin 2γ, which characterize the CP asymmetries in B decays. Various theoretical issues related to the so-called 'penguin pollution', which are of importance for the determination of the phases α and γ from the CP asymmetries in B decays, are also discussed. (orig.)
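
    For orientation, fits of this kind are often quoted in terms of the Wolfenstein parametrization of the CKM matrix (a standard textbook form, not taken from the record above): $V_{CKM} \approx \begin{pmatrix} 1-\lambda^2/2 & \lambda & A\lambda^3(\rho-i\eta) \\ -\lambda & 1-\lambda^2/2 & A\lambda^2 \\ A\lambda^3(1-\rho-i\eta) & -A\lambda^2 & 1 \end{pmatrix} + \mathcal{O}(\lambda^4)$, with $\lambda \approx 0.22$; the CP-violating phase resides in the parameter $\eta$, and the angles α, β and γ are the angles of the unitarity triangle constructed from these elements.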

  2. Penguin-like diagrams from the standard model

    International Nuclear Information System (INIS)

    Ping, Chia Swee

    2015-01-01

    The Standard Model is highly successful in describing the interactions of leptons and quarks. There are, however, rare processes that involve higher order effects in electroweak interactions. One specific class of processes is the penguin-like diagram. This class of diagrams involves the neutral change of quark flavours accompanied by the emission of a gluon (gluon penguin), a photon (photon penguin), a gluon and a photon (gluon-photon penguin), a Z-boson (Z penguin), or a Higgs-boson (Higgs penguin). Such diagrams do not arise at the tree level in the Standard Model. They are, however, induced by one-loop effects. In this paper, we present an exact calculation of the penguin diagram vertices in the 't Hooft-Feynman gauge. Renormalization of the vertex is effected by a prescription by Chia and Chong which gives an expression for the counter term identical to that obtained by employing the Ward-Takahashi identity. The on-shell vertex functions for the penguin diagram vertices are obtained. The various penguin diagram vertex functions are related to one another via the Ward-Takahashi identity. From these, a set of relations is obtained connecting the vertex form factors of various penguin diagrams. Explicit expressions for the gluon-photon penguin vertex form factors are obtained, and their contributions to the flavour changing processes estimated.

  3. Penguin-like diagrams from the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Ping, Chia Swee [High Impact Research, University of Malaya, 50603 Kuala Lumpur (Malaysia)

    2015-04-24

    The Standard Model is highly successful in describing the interactions of leptons and quarks. There are, however, rare processes that involve higher order effects in electroweak interactions. One specific class of processes is the penguin-like diagram. This class of diagrams involves the neutral change of quark flavours accompanied by the emission of a gluon (gluon penguin), a photon (photon penguin), a gluon and a photon (gluon-photon penguin), a Z-boson (Z penguin), or a Higgs-boson (Higgs penguin). Such diagrams do not arise at the tree level in the Standard Model. They are, however, induced by one-loop effects. In this paper, we present an exact calculation of the penguin diagram vertices in the 't Hooft-Feynman gauge. Renormalization of the vertex is effected by a prescription by Chia and Chong which gives an expression for the counter term identical to that obtained by employing the Ward-Takahashi identity. The on-shell vertex functions for the penguin diagram vertices are obtained. The various penguin diagram vertex functions are related to one another via the Ward-Takahashi identity. From these, a set of relations is obtained connecting the vertex form factors of various penguin diagrams. Explicit expressions for the gluon-photon penguin vertex form factors are obtained, and their contributions to the flavour changing processes estimated.

  4. Nuclear anapole moment and tests of the standard model

    International Nuclear Information System (INIS)

    Flambaum, V. V.

    1999-01-01

    There are two sources of parity nonconservation (PNC) in atoms: the electron-nucleus weak interaction and the magnetic interaction of electrons with the nuclear anapole moment. A nuclear anapole moment has recently been observed. This is the first discovery of an electromagnetic moment violating fundamental symmetries: the anapole moment violates parity and charge-conjugation invariance. We describe the anapole moment and how it can be produced. The anapole moment creates a circular magnetic field inside the nucleus. The interesting point is that measurements of the anapole allow one to study parity violation inside the nucleus through atomic experiments. We use the experimental result for the nuclear anapole moment of 133Cs to find the strengths of the parity-violating proton-nucleus and meson-nucleon forces. Measurements of the weak charge characterizing the strength of the electron-nucleon weak interaction provide tests of the Standard Model and a way of searching for new physics beyond the Standard Model. Atomic experiments give limits on an extra Z-boson, leptoquarks, composite fermions, and radiative corrections produced by particles that are predicted by new theories. The weak charge and nuclear anapole moment can be measured in the same experiment. The weak charge gives the mean value of the PNC effect, while the anapole gives the difference of the PNC effects for the different hyperfine components of an electromagnetic transition. The interaction between atomic electrons and the nuclear anapole moment may be called the 'PNC hyperfine interaction'.

  5. Background and derivation of ANS-5.4 standard fission product release model. Technical report

    International Nuclear Information System (INIS)

    1982-01-01

    ANS Working Group 5.4 was established in 1974 to examine fission product releases from UO2 fuel. The scope of ANS-5.4 was narrowly defined to include the following: (1) Review available experimental data on release of volatile fission products from UO2 and mixed-oxide fuel; (2) Survey existing analytical models currently being applied to light-water reactors; and (3) Develop a standard analytical model for volatile fission product release to the fuel rod void space. Emphasis was placed on obtaining a model for radioactive fission product releases to be used in assessing the radiological consequences of postulated accidents.
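
    As illustrative background (a generic Booth-type equivalent-sphere result often used as the starting point for such models, not the equations of the ANS-5.4 standard itself), the steady-state release-to-birth ratio of a short-lived volatile fission product diffusing out of a fuel grain can be written as $R/B = \frac{3}{\mu}\left(\coth\mu - \frac{1}{\mu}\right)$ with $\mu = \sqrt{\lambda/D'}$, where $\lambda$ is the decay constant and $D'$ the reduced diffusion coefficient; for $\mu \gg 1$ this reduces to $R/B \approx 3\sqrt{D'/\lambda}$.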

  6. JPL Thermal Design Modeling Philosophy and NASA-STD-7009 Standard for Models and Simulations - A Case Study

    Science.gov (United States)

    Avila, Arturo

    2011-01-01

    Standard JPL thermal engineering practice prescribes worst-case methodologies for design. In this process, environmental and key uncertain thermal parameters (e.g., thermal blanket performance, interface conductance, optical properties) are stacked in a worst-case fashion to yield the most hot- or cold-biased temperature. Thus, these simulations would represent the upper and lower bounds. This, effectively, represents the JPL thermal design margin philosophy. Uncertainty in the margins and the absolute temperatures is usually estimated by sensitivity analyses and/or by comparing the worst-case results with "expected" results. Applicability of the analytical model for specific design purposes, along with any temperature requirement violations, is documented in peer and project design review material. In 2008, NASA released NASA-STD-7009, Standard for Models and Simulations. The scope of this standard covers the development and maintenance of models, the operation of simulations, the analysis of the results, training, recommended practices, the assessment of Modeling and Simulation (M&S) credibility, and the reporting of the M&S results. The Mars Exploration Rover (MER) project thermal control system M&S activity was chosen as a case study to determine whether JPL practice is in line with the standard and to identify areas of non-compliance. This paper summarizes the results and makes recommendations regarding the application of this standard to JPL thermal M&S practices.
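
    As a rough illustration of the worst-case stacking described above (a minimal sketch; the parameter names, bounds, and toy temperature model are invented for this example and are not JPL values), one can evaluate the model at every corner of the uncertain-parameter box and keep the extremes:

    from itertools import product

    # Hypothetical uncertain thermal parameters with (min, max) bounds.
    params = {
        "blanket_effective_emissivity": (0.005, 0.03),
        "interface_conductance_w_per_k": (0.5, 2.0),
        "solar_absorptivity": (0.20, 0.35),
    }

    def predicted_temperature_k(p):
        # Placeholder for a thermal model evaluation; a real analysis would
        # call a thermal solver here instead of this toy linear response.
        return (300.0
                + 400.0 * p["solar_absorptivity"]
                - 800.0 * p["blanket_effective_emissivity"]
                - 5.0 * p["interface_conductance_w_per_k"])

    # Stack every parameter at a bound (worst-case corners) and keep the extremes.
    corners = [dict(zip(params, combo)) for combo in product(*params.values())]
    temps = [predicted_temperature_k(c) for c in corners]
    print(f"cold-biased bound: {min(temps):.1f} K, hot-biased bound: {max(temps):.1f} K")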

  7. Environmental assessment in support of proposed voluntary energy conservation standard for new residential buildings

    Energy Technology Data Exchange (ETDEWEB)

    Hadley, D.L.; Parker, G.B.; Callaway, J.W.; Marsh, S.J.; Roop, J.M.; Taylor, Z.T.

    1989-06-01

    The objective of this environmental assessment (EA) is to identify the potential environmental impacts that could result from the proposed voluntary residential standard (VOLRES) on private sector construction of new residential buildings. 49 refs., 15 tabs.

  8. Standardized Handwriting to Assess Bradykinesia, Micrographia and Tremor in Parkinson's Disease

    NARCIS (Netherlands)

    Smits, Esther J.; Tolonen, Antti J.; Cluitmans, Luc; van Gils, Mark; Conway, Bernard A.; Zietsma, Rutger C.; Leenders, Klaus L.; Maurits, Natasha M.

    2014-01-01

    Objective: To assess whether standardized handwriting can provide quantitative measures to distinguish patients diagnosed with Parkinson's disease from age- and gender-matched healthy control participants. Design: Exploratory study. Pen tip trajectories were recorded during circle, spiral and line

  9. The Impact of Early Exposure of Eighth Grade Math Standards on End of Grade Assessments

    Science.gov (United States)

    Robertson, Tonjai E.

    2016-01-01

    The purpose of this study was to examine the Cumberland County Schools district-wide issue surrounding the disproportional performance of eighth grade Math I students' proficiency scores on standardized end-of-grade and end-of-course assessments. The study focused on the impact of the school district incorporating eighth grade math standards in…

  10. Convergence or Divergence: Alignment of Standards, Assessment, and Issues of Diversity.

    Science.gov (United States)

    Carter, Norvella, Ed.

    In this report, teacher educators scrutinize the relationships between the standards and assessment movement in education and the United States' increasingly multicultural population. The papers include: "Foreword" (Jacqueline Jordan Irvine); (1) "Diversity and Standards: Defining the Issues" (Norvella P. Carter); (2) "Accountability and…

  11. Assessing MBA Student Teamwork under the AACSB Assurance of Learning Standards

    Science.gov (United States)

    Procino, Matthew C.

    2012-01-01

    Since the 2003 release of the AACSB's Assurance of Learning standards, outcomes assessment has been a required practice for business schools wishing to receive their endorsement. While most accredited institutions had been dabbling with the measurement of student learning, the new standards raised the bar considerably. It is now necessary to…

  12. Future Directions in Assessment: Influences of Standards and Implications for Language Learning

    Science.gov (United States)

    Cox, Troy L.; Malone, Margaret E.; Winke, Paula

    2018-01-01

    As "Foreign Language Annals" concludes its 50th anniversary, it is fitting to review the past and peer into the future of standards-based education and assessment. Standards are a common yardstick used by educators and researchers as a powerful framework for conceptualizing teaching and measuring learner success. The impact of standards…

  13. Rare B-decays in the standard model

    International Nuclear Information System (INIS)

    Ali, A.; Greub, C.; Mannel, T.

    1993-02-01

    We review theoretical work done in studies of the Flavour Changing Neutral Current (FCNC) B-decays in the context of the Standard Model. Making use of the QCD-improved effective Hamiltonian describing the so-called |ΔB|=1 and |ΔB|=2, |ΔQ|=0 transitions, we calculate the rates and differential distributions in a large number of B-decays. The FCNC processes discussed here include the radiative decays B → X_s + γ, B → X_d + γ, and the semileptonic decays B → X_s l^+ l^-, B → X_d l^+ l^-, B → X_s ν_l anti-ν_l, and B → X_d ν_l anti-ν_l. We also discuss the inclusive photon energy spectrum calculated from the Charged Current (CC) decays B → X_c + γ and B → X_u + γ and the mentioned FCNC radiative decays. The importance of carrying out measurements of the inclusive photon energy spectrum in B-decays is emphasized. Using phenomenological potential models and the Heavy Quark Effective Theory (HQET) we estimate decay branching ratios in a number of exclusive FCNC B-decays. Purely leptonic and photonic decays (B_d, B_s) → l^+ l^- and (B_d, B_s) → γγ are also estimated. The principal interest in the studies of FCNC B-decays lies in their use in determining the parameters of the Standard Model, in particular the CKM matrix elements and the top quark mass. The parametric dependence of these and other QCD-specific parameters on the rates and distributions is worked out numerically. (orig.)
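
    As a reminder of the generic structure involved (a schematic textbook form, not quoted from the record above), the QCD-improved $|\Delta B|=1$ effective Hamiltonian used in such analyses can be written as $\mathcal{H}_{\rm eff} = \frac{4 G_F}{\sqrt{2}}\, V_{tb} V_{ts}^{*} \sum_i C_i(\mu)\, O_i(\mu)$, where the $C_i$ are Wilson coefficients evaluated at the scale $\mu$ and the $O_i$ are local operators, e.g. the electromagnetic dipole operator $O_7$ that governs $b \to s\gamma$.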

  14. Assessing uncertainty in mechanistic models

    Science.gov (United States)

    Edwin J. Green; David W. MacFarlane; Harry T. Valentine

    2000-01-01

    Concern over potential global change has led to increased interest in the use of mechanistic models for predicting forest growth. The rationale for this interest is that empirical models may be of limited usefulness if environmental conditions change. Intuitively, we expect that mechanistic models, grounded as far as possible in an understanding of the biology of tree...

  15. Modeling for operational event risk assessment

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The U.S. Nuclear Regulatory Commission has been using risk models to evaluate the risk significance of operational events in U.S. commercial nuclear power plants for more than seventeen years. During that time, the models have evolved in response to advances in risk assessment technology and insights gained with experience. Evaluation techniques fall into two categories: initiating event assessments and condition assessments. The models used for these analyses have become uniquely specialized for just this purpose.
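
    A minimal sketch of the two evaluation categories mentioned above (the quantities, independence assumption, and numbers are illustrative inventions, not the NRC's actual models):

    # Initiating event assessment: an event has occurred; the measure is the
    # conditional core damage probability (CCDP) given that event.
    def initiating_event_ccdp(sequence_probabilities):
        # Probability that at least one core-damage sequence follows the event,
        # treating sequences as independent (an illustrative simplification).
        p_no_damage = 1.0
        for p in sequence_probabilities:
            p_no_damage *= (1.0 - p)
        return 1.0 - p_no_damage

    # Condition assessment: equipment was degraded for some exposure time; the
    # measure is the incremental core damage probability over that period.
    def condition_iccdp(cdf_degraded_per_yr, cdf_nominal_per_yr, exposure_yr):
        return (cdf_degraded_per_yr - cdf_nominal_per_yr) * exposure_yr

    print(initiating_event_ccdp([1e-4, 5e-5, 2e-5]))   # ~1.7e-4
    print(condition_iccdp(3e-5, 1e-5, 14 / 365.0))     # ~7.7e-7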

  16. Adherence of pain assessment to the German national standard for pain management in 12 nursing homes

    OpenAIRE

    Osterbrink, Jürgen; Bauer, Zsuzsa; Mitterlehner, Barbara; Gnass, Irmela; Kutschar, Patrick

    2014-01-01

    BACKGROUND: Pain is very common among nursing home residents. The assessment of pain is a prerequisite for effective multiprofessional pain management. Within the framework of the German health services research project, ‘Action Alliance Pain-Free City Muenster’, the authors investigated pain assessment adherence according to the German national Expert Standard for Pain Management in Nursing, which is a general standard applicable to all chronic/acute pain-affected persons and highly recommen...

  17. Adherence of Pain Assessment to the German National Standard for Pain Management in 12 Nursing Homes

    Directory of Open Access Journals (Sweden)

    Jürgen Osterbrink

    2014-01-01

    BACKGROUND: Pain is very common among nursing home residents. The assessment of pain is a prerequisite for effective multiprofessional pain management. Within the framework of the German health services research project, ‘Action Alliance Pain-Free City Muenster’, the authors investigated pain assessment adherence according to the German national Expert Standard for Pain Management in Nursing, which is a general standard applicable to all chronic/acute pain-affected persons and highly recommended for practice.

  18. The development of methodological tools to assess the health sector with the resulting standardized index

    Directory of Open Access Journals (Sweden)

    Hansuvarova Evgenia Adolfovna

    2016-10-01

    The proposed methodology for assessing a resulting standardized health index across the countries of the world makes it possible to identify which countries are implementing effective management strategies in the health sector. The leading positions belong to countries whose state health policies have proved most effective. The technique can be used not only to score the resulting standardized health index worldwide, but also to carry out the assessment within a particular country.
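
    A minimal sketch of how a resulting standardized (z-score based) index of this kind can be computed (the indicators, countries, and figures below are invented for illustration and are not drawn from the study):

    import statistics

    # Hypothetical health-sector indicators per country (oriented so higher is better).
    data = {
        "Country A": {"life_expectancy": 81.0, "infant_survival": 99.7, "coverage": 95.0},
        "Country B": {"life_expectancy": 74.0, "infant_survival": 98.9, "coverage": 80.0},
        "Country C": {"life_expectancy": 68.0, "infant_survival": 97.5, "coverage": 65.0},
    }
    indicators = sorted(next(iter(data.values())))

    def z_scores(values):
        mean, sd = statistics.mean(values), statistics.pstdev(values)
        return [(v - mean) / sd for v in values]

    # Standardize each indicator across countries, then average into one index.
    standardized = {country: [] for country in data}
    for ind in indicators:
        column = [data[country][ind] for country in data]
        for country, z in zip(data, z_scores(column)):
            standardized[country].append(z)

    resulting_index = {c: statistics.mean(zs) for c, zs in standardized.items()}
    for country, value in sorted(resulting_index.items(), key=lambda kv: -kv[1]):
        print(f"{country}: {value:+.2f}")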

  19. Sensitivity Assessment of Ozone Models

    Energy Technology Data Exchange (ETDEWEB)

    Shorter, Jeffrey A.; Rabitz, Herschel A.; Armstrong, Russell A.

    2000-01-24

    The activities under this contract effort were aimed at developing sensitivity analysis techniques and fully equivalent operational models (FEOMs) for applications in the DOE Atmospheric Chemistry Program (ACP). MRC developed a new model representation algorithm that uses a hierarchical, correlated function expansion containing a finite number of terms. A full expansion of this type is an exact representation of the original model, and each of the expansion functions is explicitly calculated using the original model. After calculating the expansion functions, they are assembled into a fully equivalent operational model (FEOM) that can directly replace the original model.
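
    A rough sketch of the idea behind such a hierarchical expansion (a first-order, cut-HDMR-style surrogate; the toy model function and reference point are arbitrary assumptions, not the algorithm developed under this contract):

    import math

    def original_model(x, y, z):
        # Stand-in for an expensive original model.
        return math.sin(x) + 0.5 * y * y + 0.1 * x * z

    reference = (0.5, 1.0, -0.3)          # reference ("cut") point
    f0 = original_model(*reference)       # zeroth-order term of the expansion

    def first_order_surrogate(x, y, z):
        # f(x) ~ f0 + sum_i [ f(x_i, others at reference) - f0 ]
        total = f0
        for i, value in enumerate((x, y, z)):
            args = list(reference)
            args[i] = value
            total += original_model(*args) - f0
        return total

    point = (0.7, 1.4, 0.2)
    print("original :", original_model(*point))
    print("surrogate:", first_order_surrogate(*point))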

  20. The role of Health Impact Assessment in the setting of air quality standards: An Australian perspective

    Energy Technology Data Exchange (ETDEWEB)

    Spickett, Jeffery, E-mail: J.Spickett@curtin.edu.au [WHO Collaborating Centre for Environmental Health Impact Assessment (Australia); Faculty of Health Sciences, School of Public Health, Curtin University, Perth, Western Australia (Australia); Katscherian, Dianne [WHO Collaborating Centre for Environmental Health Impact Assessment (Australia); Faculty of Health Sciences, School of Public Health, Curtin University, Perth, Western Australia (Australia); Harris, Patrick [CHETRE — UNSW Research Centre for Primary Health Care and Equity, University of New South Wales (Australia)

    2013-11-15

    The approaches used for setting or reviewing air quality standards vary from country to country. The purpose of this research was to consider the potential to improve decision-making through integration of HIA into the processes to review and set air quality standards used in Australia. To assess the value of HIA in this policy process, its strengths and weaknesses were evaluated alongside a review of international processes for setting air quality standards. Air quality standard setting programmes elsewhere have either used HIA or have amalgamated and incorporated factors normally found within HIA frameworks. They clearly demonstrate the value of a formalised HIA process for setting air quality standards in Australia. The following elements should be taken into consideration when using HIA in standard setting. (a) The adequacy of a mainly technical approach in current standard setting procedures to consider social determinants of health. (b) The importance of risk assessment criteria and information within the HIA process. The assessment of risk should consider equity, the distribution of variations in air quality in different locations and the potential impacts on health. (c) The uncertainties in extrapolating evidence from one population to another or to subpopulations, especially the more vulnerable, due to differing environmental factors and population variables. (d) The significance of communication with all potential stakeholders on issues associated with the management of air quality. In Australia there is also an opportunity for HIA to be used in conjunction with the NEPM to develop local air quality standard measures. The outcomes of this research indicated that the use of HIA for air quality standard setting at the national and local levels would prove advantageous. -- Highlights: • The Health Impact Assessment framework has been applied to a policy development process. • The HIA process was evaluated for application in air quality standard setting.